Breaking Analysis: Cutting Through the Noise of Full Stack Observability


 

>> From theCUBE Studios in Palo Alto and in Boston, bringing you data-driven insights from theCUBE and ETR, this is Breaking Analysis with Dave Vellante. >> Full stack observability is the new buzz phrase. As businesses go digital, customer experience becomes ever more important. Why? Because fickle consumers can switch brands in the blink of an eye, or the click of a mouse. Every vendor wants a piece of the action in this market, including companies that have provided traditional monitoring, log analytics, application performance management, et cetera, and they're joined by a slew of new entrants claiming end-to-end visibility across the so-called modern tech stack. Recent survey research from ETR, however, confirms our thesis that no one company has it all. New entrants have a vision, and they're not encumbered with legacy technical debt; however, their offerings are immature. On the other hand, established players with deep feature sets in one segment are pivoting through M&A and some organic development to fill gaps. Meanwhile, the cloud players are well positioned and are participating through a combination of their own native tooling and strong ecosystems in their respective marketplaces to address this opportunity. Hello everyone, and welcome to this week's Wikibon CUBE Insights, powered by ETR. In this Breaking Analysis we dive into a recent ETR drill-down study on full stack observability, and to do so we once again welcome in our colleague Eric Bradley, chief engagement strategist and director of research at ETR. Eric, good to see you, my friend. Thanks for coming on. >> Always good to be here, Dave. Thank you so much for having us; we appreciate it. >> All right, before we get into the survey, Eric, I want to talk a little bit about full stack observability and define what it is. Let me start, and then you can chime in. When people talk about full stack observability, they're referring to the need to understand the behavior of all the technology components that support an application, i.e., the stack, throughout the entire system, meaning the full piece of the equation. So we're talking about the compute, the storage, the network (and of course that's all software-defined today), the containers that are running the software, the database, other middleware components, the pipeline of data, and then of course the client-side code: the HTML, the CSS, everything down to the mobile device. And the idea is to give the people who can fix problems full visibility into the system, with a dashboard of metrics that can be visualized at a high level and then drilled into to see the logs or traces or events, all the metrics that could help remediate an issue. So a simple way to think about this, Eric: I like to think of it as the ability to see everything in the tech stack that could impact the customer experience. How do you see it? >> If only it were that simple, right? It's a huge thing that we're trying to encompass there with full stack observability, and even though the vendors might tell you on the first sales call that they can do it, it's really not that simple, based on everything you just said. In this particular survey we tried our best to look at it, and we'll go into it later, but we had to survey on the application side, the infrastructure side, the database side, log management, security, network. It's a very difficult thing to encompass. The holy grail would be to do it with one vendor and with one dashboard. I don't think we're there anytime soon.
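To make the definition above a bit more concrete, here is a minimal Python sketch of the idea: one place that collects metrics, logs, traces, and events from every layer of the stack, with a high-level dashboard view that can be drilled into. The layer names, fields, and sample signals are purely illustrative assumptions, not any particular vendor's data model.

```python
# A minimal sketch of "full stack" signal collection. Layer names and record
# fields are assumptions for illustration only.
import time
from collections import defaultdict

LAYERS = ["compute", "storage", "network", "container", "database",
          "middleware", "data-pipeline", "client"]

class ObservabilityStore:
    def __init__(self):
        self.signals = defaultdict(list)   # layer -> list of signal records

    def record(self, layer, kind, name, value=None, attrs=None):
        # Four signal kinds discussed above: metrics, logs, traces, events.
        assert layer in LAYERS and kind in {"metric", "log", "trace", "event"}
        self.signals[layer].append({
            "ts": time.time(), "kind": kind, "name": name,
            "value": value, "attrs": attrs or {},
        })

    def dashboard(self):
        # High-level view: a count per layer; drill into self.signals[layer]
        # to see the individual logs, traces, and events behind it.
        return {layer: len(records) for layer, records in self.signals.items()}

store = ObservabilityStore()
store.record("client", "metric", "page_load_ms", 1840)
store.record("database", "log", "slow_query", attrs={"duration_ms": 950})
store.record("network", "event", "packet_loss_spike", value=0.07)
print(store.dashboard())
```

In a real deployment this role is played by a commercial or open-source observability platform rather than an in-memory dictionary, but the shape of the data, a timestamped signal tied to a layer of the stack, is the same idea Dave describes.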
>> All right, so let's get into this drill-down survey, the results, and talk about what you've learned. First, explain what a drill-down study is. How often does ETR conduct these types of things? Who responds? What can you tell us? >> Yeah, sure. The drill-downs are basically a custom type of survey work, and they can be customized in two different ways. Either our clients will come to us with a particular topic, and we will hold their hands and make sure they get the responses they need, or, more often than not, it's actually us as a research department wanting to dig into trends that our larger data encompasses, and then we'll say, hey, we really need to look into that. We've done it with everything from RPA to identity access to, now, observability, and also vendor-specific and macro trends. As you know, David, for this particular one the genesis was really a large amount of interest, not only from our community, the end users, but from clients. I can't tell you how much interest there is in observability right now; we're constantly getting questions and demands for more research and deeper research in this space. >> Yeah, so our audience will be familiar with the concept of Net Score. That's the periodic survey ETR does every quarter, like clockwork. Then, in addition, there are hot topics, like in this case full stack observability. So we're talking about respondents in the ETR community who have a deep understanding of observability and related topics, and they had varying degrees of knowledge about each vendor's offering. So you asked the respondents to concentrate on the ones that they knew well, correct? >> Yes, that is correct. This was a smaller survey that we did; the N was a little under 188, I believe. Essentially what we did was take people who responded in the bigger study on these observability vendors and then send this drill-down out. So they were specifically people who have purview over their spend on observability. Now, some of it might be more database, infrastructure, application, or security, but everyone here is already qualified as an expert to answer these questions. That's correct, Dave. >> Yeah, so the first data point is the one we're showing you right here. The respondents were asked who uses observability tools, and Eric, I've highlighted AppOps and the site reliability engineers because, given the emphasis on customer centricity that we hear all the time from the vendor community, you would think these roles would be more highly represented. But it's the folks in the boiler room that are using these tools, highly technical and specialized roles. What are your thoughts on this data? >> You know, I was a little surprised as well. I kind of thought the SREs would be a little bit higher on this, but it really just comes down to the infrastructure, DevOps, and SecOps teams that seem to be using it the most. I thought maybe the application operations teams would be a little bit more involved as well, so I agree with you, I was a little bit surprised. But they're the experts, so we have to take the data at their word. I think what's really happening here is you're recognizing that the work is being done across the entire enterprise. As you mentioned before about full stack, this isn't just one aspect; it's touching every aspect of the enterprise, and that includes the internal IT teams. >> Well, and I think too, Eric, that what I took away from this drill-down, and we'll get more into it, is that the vendor marketing is not aligned with what's actually happening in the field.
It's early days; we'll talk about that some more. Okay, next question, and I thought this was very interesting. ETR asked, on a scale of one to three, three being most preferred, which pricing model the respondents preferred: host-based, user-based, or pricing based on the amount of data ingested. Eric, what are your thoughts on this? Because just doing a quick scan, pricing is all over the map. >> Yeah, it really is all over the map, from a vendor perspective and also from an end-user perspective. In all the interviews and panels that I host, pricing is a real concern. It always is, but in this particular field it's a real concern. I actually just did a panel yesterday with four of these survey takers to dig a little bit deeper, so I'm going to remark on what they taught me yesterday. One of them said ingestion pricing might be preferable, but because it's so unpredictable, that's why we're seeing the results skew away from it. Another one went so far as to say that ingestion-based pricing is a nightmare that keeps him up at night, because he's so afraid he's going to wake up the next day and see what the bill is. So really, what they're looking for here, and the reason the pricing is skewing that way in this survey, is that they need predictability. It's about their budget and it's about their planning. Even though they would prefer an ingestion-based model, the fact that they have to plan for their budgets and concern themselves with spending means it's moving more to host-based. >> Yeah, so it is complicated. For example, I just took a quick snapshot of some of the pricing models. Dynatrace, AppDynamics, Datadog, AWS, and others tout their host-based pricing. New Relic has a splash page up around its user-based pricing and the tiers. Datadog talks about its ingestion-based pricing for security monitoring. AWS prices by ingestion for CloudWatch Logs. Splunk prices on indexed data and calculates a per-gigabyte-per-day metric. So metrics, dashboards, alarms, alerts, events: they could all be priced differently. >> Yeah, that's true. A few got called out, and I'm sure we're going to get into them later, so I don't want to kill all of our fodder right now, but when we were talking about this slide, one person in particular decided to call out New Relic, specifically for their flexibility around pricing. He said they have the ability to rapidly scale up but also contract as needed, and even though he's a user of Splunk, a user of Dynatrace, a user of Elastic, he really wanted to call out the flexibility of New Relic in this area. So to your point, there are a lot of different ways to price this. It's a complex problem, but I think the key takeaway for vendors is that flexibility is the key. You really need to give people the ability to be flexible in what they want.
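As a rough illustration of why the panelists favor predictability, here is a toy comparison of the three pricing models just discussed. All of the rates and volumes below are made-up assumptions for the sake of the example; none of them come from any vendor's actual price list.

```python
# Toy comparison of host-based, user-based, and ingestion-based pricing.
# Rates and volumes are hypothetical.
HOST_RATE = 20.0        # $ per monitored host per month (assumed)
USER_RATE = 50.0        # $ per seat per month (assumed)
INGEST_RATE = 0.10      # $ per GB of ingested telemetry (assumed)

def host_based(hosts):            return hosts * HOST_RATE
def user_based(users):            return users * USER_RATE
def ingestion_based(gb_ingested): return gb_ingested * INGEST_RATE

# A quiet month vs. an incident-heavy month: host and user bills stay flat,
# while the ingestion bill tracks whatever volume the systems happened to emit.
for label, gb in [("quiet month", 8_000), ("incident-heavy month", 45_000)]:
    print(f"{label}: hosts=${host_based(500):,.0f}  "
          f"users=${user_based(120):,.0f}  "
          f"ingestion=${ingestion_based(gb):,.0f}")
```

The point of the sketch is simply that the first two models produce the same bill every month, while the third swings with data volume, which is the unpredictability the survey respondents describe.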
>> All right, let's drill into the functionality and explore the usage and adoption of the different features by the respondents. This next chart shows module adoption for application performance monitoring (APM), database, and digital experience down to the user. Eric, I underlined APM, the blue bar, because it stands out, especially for AWS; you can see Dynatrace, but also Azure, New Relic, and Splunk. And I underlined digital experience, the gray bar, because despite all the chatter in the market and the marketing around digital transformation and customer experience, other than a slightly higher response percentage for AWS, there's not a lot of adoption on that front. So the vendor marketing again doesn't match the user behavior, does it, Eric? >> No, it doesn't. There are a couple of things to point out here, but let's stick with digital experience. I was surprised that it was so low on this slide and overall in our survey. I did expect it to be more, and not just from the vendor marketing perspective: you and I both know that at the end of the day, the whole point of this is to actually get to that 360 view of what your customer is doing. So I was a little bit surprised to see it that low. When we spoke to the panel yesterday, a couple of people said, no, listen, it's not that we aren't doing that, it's just that it's not the vendors you put on this survey. And they called out two particular names: one is Catchpoint, and the other is ThousandEyes. And I think you're aware of ThousandEyes, so I'm going to transition that off to you. >> Yeah, so ThousandEyes is now part of Cisco, and we're going to talk about that a little bit later. But essentially, as I was saying up front, they've got gaps in their product line, so they've got to do M&A and then package that up. We'll get into that a little bit down the road. I want to bring up the next graphic, because that looks at incident management, infrastructure monitoring, and log management. What I did here is call out infrastructure monitoring, the gray bar, and log management, the light blue, because AWS and Azure stand out in these categories, and Splunk, of course, Eric, for log management. What do you take away from this data? >> Yeah, on the previous slide and this slide you really have to call out AWS CloudWatch and Microsoft Azure Monitor. They are very pervasive in this survey, and we could probably do an entire show just on that, the cloud versus the independents. But a couple of things I do want to point out: even though these numbers are so high for these cloud tools, the panelists and the people I spoke to in more detail all said, listen, I'm going to look at my cloud tools first. I'm on their infrastructure, they're handing it to me, I'm going to look at it and use it for what it's good for. However, we're in a multi-cloud world, and they're not good at things that aren't in their ecosystem. So even though these numbers are high, I do not believe that AWS or Azure is going to go and take over all the independents. In a multi-cloud world, people want an independent vendor, whether it's a Datadog or a New Relic, and we can talk about all of those later. But really, I was surprised that AWS in particular was so high and so pervasive in here, across the board. >> And Splunk, what can you say? >> I mean, they are the most pervasive vendor; they're everywhere. We had people in the panel call them a Swiss Army knife, and that's a good and a bad: they have a lot of breadth of coverage, which is great, but because there's that breadth of coverage, not all of it is great. Log management, without a doubt, is what they are great at; they're specialized at it. But the panelists were saying, listen, if you go away from their core and you try to use some of the other things they claim they can do, it requires a lot of heavy lifting. And then we can get into their cloud SaaS integration a little bit later; we had some issues with that in the survey as well. >> And great points about multi-cloud. You're probably not going to trust that to your public cloud vendor, and so there's a lot of white space available for the traditional on-prem guys.
>> Okay, next: the ETR survey drilled into network monitoring and security monitoring, and then other security functions. Eric, there are a couple of things that stood out to me in this chart. I highlighted security monitoring, the blue bar, because you can again see the adoption from AWS and Azure, and of course Splunk, and we also called out SolarWinds because of the large adoption in network monitoring. So let me ask you: what are you seeing in the data since the SolarWinds breach, and is there anything else in this chart you want to call out? >> I could go on for a while about SolarWinds. The data since it broke, I guess around 12 months ago, even though the breach was even prior to that: the headlines were big. I think you remember, you and I did a quick drill-down survey last year just on SolarWinds and the impact we thought it would have, and there's a very real impact happening. With that said, they're not easy to move away from. We asked whether there is any one vendor that could take this entire space, and the answer was that SolarWinds was best positioned to do that, but it's too late now. Then I drilled down a little bit and asked the panel, well, what can they do to reinvent themselves? What can they do to change the reputational damage from this breach? And the panelists all said: nothing. The reputational damage is done. The best way for them to reinvent themselves would be to do an M&A, consolidate with somebody else, change their name. They truly believe that right now the only reason people are still using SolarWinds is that it's not easy to lift and shift away from, but there will be no net-new workloads going to these people, at least according to the ones who took our survey. That's on SolarWinds, and we can get into it more if you want, but I think that's the crux of the matter. On Splunk, again, what can you say? On the security side, on the SIEM side, people don't want to use multiple vendors. On the other side, what we were talking about with full stack, some might be better at APM, some might be better at infrastructure monitoring, but when you're talking about security, you truly do want one vendor to rule them all, and Splunk does seem to be the one that's most well entrenched on the security side. And as long as the policy is consistent across security, you really can't say much about them. So what they do well, their core, the data shows that people still trust them. >> Great, thank you for that. Okay, now the last set of data we want to show. We consolidated some things; if you want the detail, there's the drill-down, and you had several drill-down questions, but what we tried to do is consolidate them into a single chart, which we had to stare at for a while. For each of the 11 companies, ETR asked respondents whether the features across the top that you see here were strengths, weaknesses, or neutral. And what we've done is consolidate the chart, showing the strengths in green, which we subjectively defined as more than 40 percent of the respondents identifying the feature as a strength; the weaknesses in yellow, meaning more than 20 percent of the respondents cited the feature as a weakness; and the neutrals in gray, where neither of those conditions was met but the neutral response was high. And we added four stars for standout features, where 60 percent or more of the respondents cited the feature as a strength, and we threw in two stars if they were close to 60, high 50s, even mid 50s, with only a single-digit weakness for that feature. So we were able to visualize a lot of data.
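For readers who want to reproduce the color-and-star scheme just described, a small sketch follows. The thresholds come from the description above; the precedence of green over yellow and the exact two-star condition are one reading of it, and the sample percentages are hypothetical, not actual ETR survey results.

```python
# Sketch of the chart's scoring rule. Threshold values follow the description
# above; the ordering of the checks and the sample data are assumptions.
def classify_feature(strength_pct, weakness_pct):
    """Return (color, stars) for one vendor/feature cell."""
    color = "gray"                       # neutral unless a condition is met
    if strength_pct > 40:
        color = "green"                  # strength: >40% cited it as a strength
    elif weakness_pct > 20:
        color = "yellow"                 # weakness: >20% cited it as a weakness
    stars = 0
    if strength_pct >= 60:
        stars = 4                        # standout feature
    elif strength_pct >= 55 and weakness_pct < 10:
        stars = 2                        # close to 60%, only single-digit weakness
    return color, stars

# Hypothetical cells, not survey data.
cells = {
    ("Vendor A", "log management"): (63, 8),
    ("Vendor B", "APM"): (44, 25),
    ("Vendor C", "pricing"): (18, 35),
}
for key, (s, w) in cells.items():
    print(key, classify_feature(s, w))
```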
>> So Eric, just a quick scan of this chart shows that the two big cloud players, AWS in particular but also Azure, have a relatively strong showing. And I say relatively because, as you know, Eric, there wasn't a single category of feature for any vendor where more than 70 percent of the respondents cited a strength for that single feature. Not one. And there was a lot of gray. You can see pricing is a sore point for many customers, including those evaluating SolarWinds, New Relic, Elastic, Datadog, Dynatrace, AppDynamics, and Splunk; only AWS and Grafana were not hit hard on pricing. And I guess the other thing that stands out to me here is that New Relic, Eric, showed some relative strength. The last thing I'll mention before you dive in: look at what Cisco is doing. We talked about this before a little bit. The drill-down focused on AppDynamics, but as I mentioned earlier, companies that have mature stacks are filling the gaps. So if you look at what Cisco is doing in this space, they've put an interface layer over AppDynamics, Intersight, and ThousandEyes. Even though they're separate products, and they're historically priced separately (I think they're still trying to figure out the pricing), they are definitely going to market with a strategy that bolts together these three separate products. And that's not necessarily a bad strategy, because combined they can claim even more depth and breadth. Eric, what do you make of this data? >> Yeah, just like this chart, there is a lot there. On a macro level, the obvious situation here is that this is a crowded, crowded marketplace, and consolidation is needed. I had one panelist say to me yesterday, "I can't wait for this to consolidate," like, this is just crazy; there needs to be consolidation. Now, to your point about Cisco: Cisco is taking the same playbook they did with security. They're going out and buying great tools, and now we have to make sure they figure out a way to integrate them better. The security side took them a little while to do that, but they're getting there; hopefully they can do it a little bit quicker here. What we see here is that AppDynamics is actually very strong on the application monitoring side, for the core APM, maybe not so much on these others, and that's why they go out and do what you were just describing. So hopefully they will get there. Talking across the board, pricing was a problem for all of them. It just seems to me that the end users, the buyers, feel like, hey, I shouldn't be paying this much for this; we've got a lot of choices. Maybe there's some collusion on the pricing side, but we have to figure it out, because they do not want to pay this much for it. It was the number-one concern across almost every single vendor. Another aspect that I really want to call out, something our research team found really interesting, is really about digital transformation. As digital transformation continues, the workloads are moving toward the cloud, and we're clearly seeing in this data that that's benefiting the newer players, the Datadogs and the New Relics, versus some of the others, like a Dynatrace and a Splunk. And when you go and actually look at the cloud SaaS integration answer option specifically, it becomes very, very obvious.
Splunk had a 38 on that number, whereas Datadog had 61 and New Relic 58. So it's just very clear: as digital transformation increases workloads on observability, it is lifting all boats, but it's lifting some faster than others. >> Great points. All right, as we said at the top, you've got a set of incumbents jockeying for position. You've got companies like Datadog, which, as Eric just mentioned, has a strong cloud model. Elastic has the open-source mojo, and they're going after Splunk's install base, as is Datadog. Then you see startups like ChaosSearch; they're out now talking about how to do log analytics, and they do more than that, but that's their starter use case, and they're going after Elastic and the ELK Stack, which got dinged a bit in the survey on simplicity, ease of standing it up, and so forth. That's not a weakness if you're comfortable with a full open-source model, but maybe it's not as well understood as some of the other solution-oriented plays. And then you've got other new entrants which are not covered in the drill-down; they're not as pervasive in the marketplace, guys like Honeycomb and Observe, and Eric, you mentioned some others that came out in the panel. Even VMware is getting into the act: they're positioning Tanzu around observability, with a really strong Kubernetes emphasis. And there are dozens of other players in the space which we haven't talked about. So Eric, this is a jump ball, and I'll give you the final word. Give us your last thoughts. >> Yeah, again, there's a lot there. It's such an interesting space. Even IBM, right, they go out and buy Turbonomic. Everyone seems to be playing, and not only that, the ones that are already playing are expanding: Datadog comes out and says, hey, we do security now. So I don't really know where this is going to end, but there's too much happening; there needs to be some sort of order out of the chaos. To your point about some of the emerging names: we just launched our Emerging Technology Survey this week, David, and those are the names where we're going to see data, so stay tuned for that. We don't track them in the core TSIS, which covers the more mature public vendors, but we will be getting some data on those. But to your point, I really do believe this space is rapidly expanding, and I just want to leave everyone with this: there's a lot of growth still left. In the panel yesterday, I basically asked people how much of their infrastructure they are monitoring today versus how much they want to, and the answer was around 65 to 70 percent being monitored now, and without a doubt they all want to get to 100. So there is still a lot of room to grow in this space, but I just don't know if there's enough room for all of these players that are basically going after the same percentage points. So what we're seeing from a vendor strategy now is bundling. They're trying to bundle, because that's how they're actually going to gain that market share. And just one last point from me, on Elastic: a lot of people still view Elastic as search functionality. So even though they have use cases in observability, I still think there are a lot of people for whom Elastic, and the ELK Stack in general, got into their enterprise for search. That is still kind of where they are, and maybe they're not moving as fast as a Datadog or a New Relic in pure full stack observability. >> Eric, so great to have you on; you guys cover so much space. We're going to leave it there for now. We really appreciate our friends at ETR for the work that they do,
and thank you, Eric, for joining us today and sharing your insights. Great stuff. >> You're welcome, Dave. I always enjoy talking to you, you know that. And to everyone else: we'll be back in a couple of months with our predictions as well. >> Yeah, that's right, look for those. All right, remember, these episodes are all available as podcasts wherever you listen; all you've got to do is search "Breaking Analysis podcast." Check out ETR's website, etr.plus; they've got a whole new packaging and pricing model, so check that out. We also publish a full report every week on wikibon.com and siliconangle.com, and you can get in touch with me at david.vellante@siliconangle.com or @dvellante on Twitter; I'm on LinkedIn all the time. This is Dave Vellante for theCUBE Insights, powered by ETR. Have a great week, everybody. Stay safe, be well, and we'll see you next time.

Published Date : Nov 5 2021

Real Time Emotion Detection Using EEG With Real Time Noise Reduction


 

>> Hello, nice to meet you. My name is [inaudible]; I'm a professor at a university in Japan. Today I want to introduce my research. The title is "Real-Time Emotion Detection Using EEG With Real-Time Noise Reduction." First of all, I want to introduce myself. My major is system identification and signal processing for noise removal, and biomedical signal processing. A common technique is mathematical modeling using the system identification method. So today's topic is EEG modeling of kansei under heavy noise; we call this technique kansei modeling. Now, what is kansei? Kansei is a Japanese word, because these studies started first in Japan. Kansei is similar to emotion and sensibility, but quite different: sensibility is an innate ability, while kansei is acquired after birth, so kansei is closer to a learned disposition. So we focus on this kansei using brain signals. As for the brain signal, there are many ways to observe the brain: for example, X-ray CT, MRI, MEG, EEG, optical topography, and functional MRI. By using these devices, we have three areas of research: the neural engineering area, for applications including neuromarketing; the neuroscience area, for understanding mechanisms; and the medical area, for treatment. So it is very important to choose the device depending on the purpose. What data can be obtained? In the case of EEG, we can see the activity of neurons at the scalp. In the case of NIRS, we can obtain the level of oxygen in the blood. In the case of the magnetoencephalogram, we can see the activity of neurons without contact. In the case of positron emission tomography, we can get the activity of receptors, contactless. If we use MRI, we can measure the amount of blood, also contactless. These devices are shown in these figures. So our motivation is to obtain the kansei equation using a model built by system identification with noise removal. The second motivation is to realize simple and small kansei extraction using EEG information: when we use fMRI, it is large-scale, expensive, and binding, so it is not useful here. We focus on EEG because EEG is small, inexpensive, non-binding, and easy to use. EEG is a potential measured from the scalp, and the detected data is translated to the frequency domain. In the frequency domain, the lowest band, up to about 4 Hz, we call the delta wave; 4 to 6 Hz we call the theta wave; the next band, up to about 14 Hz, the alpha wave; and 14 to 26 Hz the beta wave. In the conventional method, if we want to detect deep sleep we use the delta wave, and in the case of light sleep we use the theta wave. But this is only the conventional method, so we cannot use it for everything; the actual accuracy is under 20%. So we need to define the equation originally, ourselves. We call this technique kansei modeling. This is the block diagram of the kansei model: this part is for the noise, and this part is for the mathematical model. We calculate the transfer function like this; this is a discrete-time model, and this is a continuous-time model. Then we rewrite this part as a discrete-time model, and we calculate this part like this. The first part and the second part are calculated by this approximation, so we can get the augmented model.
Then we rewrite this part by using the transfer-function transformation, so we write the augmented model like this. The coefficients of the inverse are defined as in this equation, so each coefficient is calculated from this equation, and then we estimate the parameters by a least-squares algorithm. We call this identification method the self-tuning identification method. This is an example of drowsiness modeling. First of all, we gather the data: the subject performs a simple task, moving small beads, for about one hour, and the last 10 minutes we use as the drowsiness data. We measure this for the subjects, we record the EEG, and we measure 8,000 data points; in total that is 17 years of data. As for simple EEG devices, there are many such devices in the world, like these. We calculate the signal-to-noise ratio, the "signal" meaning a medical EEG system, for each device, and we investigated 58 kinds of devices; almost all of them are noisy devices. I am also often asked which device is best. My answer is: anything. Our skill is the signal processing, and if the raw data can be obtained from the device, then no matter what device you use, the result is the same. Our novelty is the signal processing, and our system is built on 17 years of data for one situation. So my answer is: anything. We applied this system to a real product. We call this product the Kansei Analyzer. In the Kansei Analyzer you can see the kansei in real time: interest, stress, sleepiness, concentration, like, and so on. We combined this Kansei Analyzer with a camera system and made a neuro-camera; let me show it. This is the EEG system, and we can get the kansei by using the iPhone. We combine it with the iPhone camera, and if the kansei is higher than 60%, the moment is automatically recorded, like this. So whenever we wear the EEG device, we can capture these moments without awareness, continuously. Finally, we combine each kansei, and as in this movie, we can see one day's kansei as a movie. The next example is neuromarketing using the Kansei Analyzer. Here we do not know what the number-one point is, so we analyze this commercial with the Kansei Analyzer. We can get the real-time kansei, and we can see each moment one by one, like this. This is the interest level, and we can see the high-interest moments like this, recorded automatically. The next one is a real application: product design. >> A Japanese professor has come up with a new technology she claims can read minds. She says the brainwave analysis system will help businesses better understand their customers' needs. Workers at a major restaurant chain are testing a menu item that is being developed. This device measures brain waves from the frontal lobes of people who try the product. An application analyzes five feelings: how much they like something, and their interest, concentration, stress, and sleepiness. >> The new menu item is a cheese souffle topped with kiwi, orange, and other fruit. The app checks the reaction of a person who sees the souffle for the first time. Please open your eyes.
When she sees the souffle, the "like" and "interest" feelings surge on the graph. This proves the dessert is visually appealing. Now, please try it. After the first bite, the "like" level goes up to 60; that shows she likes how the dessert tastes. After another bite, the "like" level reaches 80. She really enjoys the taste of the souffle. It scores high in terms of both looks and taste, but there's an unexpected problem. When she tries to scoop up the fruit, the stress level soars to 90. "I didn't know where to put the spoon. I felt it was a little difficult to eat." It turned out it was difficult to scoop up the fruit with a small spoon, so people at the restaurant chain are thinking of serving this flavor with a fork instead. [inaudible] "With this device, we can measure emotional changes in minute detail, in real time." This is a printing and design firm in Tokyo. It designs direct mail and credit card application forms. The company is using the brainwave analyzing system to improve the layout of its products; the idea is to make them easier to read. During this test, the subject wears an eye-tracking device to record where she's looking, in addition to the brainwave analyzing device. Her eye movements are shown by the red dots on the screen; stress levels are indicated on the graph on the left. Please fill out the form. This is a credit card application form. Right after she turns her eyes to this section, her stress level shoots up. "It was difficult to read, as each line contained 60 characters." So they decided to divide the section in two, cutting the length of the lines by half. [inaudible] "This system is very useful for us. We can offer differentiated service to our clients by providing science-based solutions." The brainwave analyzer... >> Okay, now we construct kansei detection like this: concentration, interest, sleepiness, drowsiness, like, comfortable, uncomfortable, and at present also emotion, [inaudible], comfort, satisfaction, and achievement. So finally, we conclude the presentation. In this presentation we introduced our research: we constructed the kansei equation, we demonstrated the signal processing, and we applied the proposed method to a real product, which we named the Kansei Analyzer. This is the first in the world. That's all; thank you so much.
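As a rough companion to the frequency bands described in the talk, here is a hedged Python sketch that splits an EEG trace into delta, theta, alpha, and beta power using Welch's method. The band edges follow the presentation as closely as the transcript allows, the sampling rate is an assumption, and the input signal here is synthetic rather than real EEG.

```python
# Band-power sketch for the delta/theta/alpha/beta split described in the talk.
# Band edges and sampling rate are assumptions; the signal is synthetic.
import numpy as np
from scipy.signal import welch

FS = 256                       # sampling rate in Hz (assumed)
BANDS = {                      # Hz ranges, following the presentation roughly
    "delta": (0.5, 4),         # lower edge approximate; the talk's figure is garbled
    "theta": (4, 6),
    "alpha": (6, 14),
    "beta":  (14, 26),
}

def band_powers(eeg, fs=FS):
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])   # integrate PSD over band
    return powers

# Synthetic 10-second trace: a 10 Hz (alpha-range) rhythm plus noise.
t = np.arange(0, 10, 1 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(band_powers(eeg))
```

The talk's actual method goes further than this, building a kansei model by system identification on top of the raw signal, but the band split above is the conventional starting point the professor contrasts it with.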

Published Date : Sep 21 2020

Richard Hummel, NETSCOUT | CUBE Conversation


 

(melodic music) >> Welcome to this CUBE Conversation. I'm Lisa Martin. Richard Hummel joins me next, manager of threat intelligence at NETSCOUT. Richard, welcome back to theCUBE. >> Thanks, Lisa, it's nice to be back. Thank you for having me. >> We have a lot to talk about in the next 15 to 20 minutes. We're going to be talking about the NETSCOUT Threat Intelligence Report. The report covers the first half of 2021, January 1 to June 30th. Unprecedented events of 2020, Richard, spilling into 2021. How have the events of 2020 impacted the threat landscape? What are you seeing? >> I would say that it's significantly impacted it. The COVID pandemic and all that happened with remote work, and education moving to remote, all of that had a hand in exponentially increasing the threat landscape that adversaries have at their disposal to compromise unknowing victims and to launch attacks. There's so much more that adversaries are able to really hook into. Just in the first half of 2021, we saw almost 5.4 million DDoS attacks. And if you go back to last year, we broke a record at 10 million, just over 10 million, and we're well on track to hit 11 million at the end of this year. So you can see how it's impacted. And even as some things start to taper off as things get back to normal, as we resume travel and resume going to the office, there's still that tail end. We're still seeing this kind of heightened attack landscape, and there are lots of different phenomena happening as a result, which we'll talk about throughout this interview. >> Yeah, we'll dissect that. You said on pace for a record-breaking 11 million DDoS attacks by the end of 2021. One of the things I want to talk about is speed. I noticed in the report that there were seven attack vectors in seven months, which means that threat actors exploited, or weaponized, at least seven of the new DDoS vectors in just seven months' time. Why is that significant? >> You know, I'll even raise the ante a little bit: just after the threat report, there's an eighth vector. And so this is the nature that we're in. This is really the age of innovation, and we've been in kind of an innovative space in the crime world for a couple of years now, where we continue to see this domino effect, for lack of a better way of describing it, where it's just one after the next after the next. Then you add in this compounding thing where you have more devices than ever before connected to the internet, and you have all that much more exposure for these things to take advantage of. And so we see adversaries innovating, and one of the ways in which we see that is that they operate like a business enterprise. They have functional components for different things, and as you fragment that business structure in the crime world, you get specialized areas for certain things. And so you have adversaries that are niche in a certain area, whether it's distribution of malware, launching a DDoS attack, or maybe it's just finding reflectors and amplifiers to launch those DDoS attacks. You have all of these kind of niche areas, and the more you can consolidate or collapse those different skill sets into different components, you're going to find it iterates much more rapidly. It's the same thing that happens with entrepreneurs in the business enterprise: do you outsource what you're not the expert at?
And you outsource it to somebody who is an expert, and we see the same phenomenon happening in the cyber-crime world. >> So the rate of discovery to weaponization is getting shorter. >> Super fast. And we've seen things weaponized as short as one to two days from the time a proof of concept comes online to when an adversary adopts it into their tools or their toolkits. And most often, the way we see this adopted is that a bot picks it up. So you have your Mirais, your Satoris, and the like, all these different IoT-related bots out there that have capabilities, but then you also have these platforms called booters and stressers, and adversaries just continue to add vectors there. There's no reason to remove them, because they're still effective. And so we see this continual addition of new ways to compromise and new ways to attack somebody, and it just always goes up and to the right. >> Up and to the right, in some cases, can be good; in this case, it's obviously a sign of distress. One of the things the report showed, Richard, was the development of adaptive DDoS. Just the name "adaptive" leads me to think of evasive tactics that threat actors are employing. Talk to us about adaptive DDoS and what the report showed for the first half of 2021. >> Sure. So the biggest thing we saw with adaptive DDoS, and I have to preface this with one of the changes we saw over the first half of 2021: going into the first half of the year, DNS reflection/amplification was kind of the predominant, preferred method of adversaries. There are so many DNS servers out there, so it's something they're able to do. Well, we saw a different type of attack, called TCP ACK floods, actually surpass that. And TCP ACK floods are a little bit different because they use a different internet protocol. What's significant about TCP-based connections is that they're connection-oriented, so they require what we would call a three-way handshake. So there are packets going to the target and coming back, and in most cases they're spoofing the IP addresses, so the traffic never really goes back to the actual adversary, but to somebody else, right? And so it's much more process-intensive or network-intensive. You can basically launch these TCP floods, these SYN attacks, these ACK floods, whatever they might be, and you're creating a bunch of different connections on that targeted entity while spoofing the source. So in other words, let's just say I am victim one, and there's an adversary out there that wants to target me. They're going to actually spoof my IP address, and they're going to send a bunch of these SYN floods, or SYNs, you know, ACKs, or TCP floods, whatever they might be, to all these DNS servers around the world. And so those servers are all going to reply to the supposed source of those packets, which is in fact spoofed, right? And so now you're getting all of these flood attacks. So what we're seeing here is a switch. We're moving from the connectionless, UDP-based stuff, the DNS reflection/amplification, to more niche things such as TCP ACK floods. And it's the first time we've ever seen TCP ACK floods take first place. What's notable about that is that there are certain types of DDoS mitigation that are susceptible to this kind of attack. And so what we see adversaries do is they'll watch that attack and monitor it: did my victim go down? If they didn't go down, they'll pivot, they'll try something else. Maybe they'll try a typical volumetric attack.
If that succeeds, okay, we took one layer of the defense down. So is there anything else preventing us from taking our target offline? Well, maybe there's a second layer of defense, so now let's try this other thing and see if that works. And we actually saw this be successful against commercial banks and payment card processors, where they used TCP ACK floods to bypass one layer, then they used volumetric attacks to bypass the second, and then, on a completely different target, we saw it in reverse. And so we see adversaries adapting to how we put our security posture in place, what we're doing to defend our organizations and networks, and adversaries are very quickly iterating and pivoting to follow what we're doing and overcome that. >> And when you say quickly, how quickly are we talking? Is this a matter of days? >> Well, in the case of the attacks we're talking about, we're talking about seconds or minutes, because they're actually launching the attack and they're sitting there watching to see if the target goes down, and if it doesn't go down, they can pivot really quickly and launch a secondary attack. And so in these cases it's really rapid and really fast. >> Wow. Another thing that I read in the report, and that you sort of intimated a minute ago, was that the amount of collateral damage seems to also be expanding with what you're seeing in the threat landscape. Talk to us about the risks there and the collateral damage, and give us some examples of that actually happening. >> So I think the biggest example of this, and this isn't actually DDoS-related, is if you look at the Colonial Pipeline incident that happened, right? They didn't actually go after Colonial Pipeline; they went after a vendor that provides some sort of service to them, and that resulted in Colonial saying, "We've got to shut down our pipeline, because now we can't bill our customers." So that's one aspect of collateral damage. Well, let's translate that to the DDoS world. What happens when a DNS server goes offline that services 1,000 different websites? Now you have all of these other websites that can't be accessed. What happens if an adversary goes after a VPN for a prominent enterprise? They successfully take down that VPN concentrator, and now all of their remote workforce can no longer access those resources. In fact, there's something we're calling the connectivity supply chain, which is what adversaries are moving to, both in the corporate world as well as commercial. VPNs are increasingly used by gamers, for instance, to mask their IPs, because DDoS attacks predominantly target gamers; 80 to 85% of all attacks are against gamers. And so they're using VPNs to mask their source. Well, an adversary says, hey, I can't go after the individual because I don't know their IP, but I know what VPN they're using. So maybe if I target all the VPN nodes that are publicly available for that VPN concentrator or VPN service provider, now I can take them offline. But as a consequence, you're not just taking out your individual target, you're taking out every single person that's using that VPN. >> Right. >> This is the collateral damage impact we're talking about. It can be very, very far-reaching. >> You mentioned the connectivity supply chain. Let's go ahead and dissect that.
Because that was something else the report showed: that vital components of what NETSCOUT calls the connectivity supply chain, which you helped define, are under increasing attack. Define the connectivity supply chain and tell us what the report is showing. >> So, supply chain comes in many forms and fashions. You have your physical supply chain, you have your vendors that provide software, you have actual movers such as semis and trains, and you have pipelines to get crude oil to places. All of these things are supply chains, but what's the underlying foundation behind them? How do all of these operate? More and more, in today's day and age, you rely on internet connectivity. You rely on that backbone to be able to operate your systems across a remote space, whether that's internationally, in different countries, or just in different states. You have to have some way of connecting all those things, and we're not often doing things physically in person, right? We do this by remote access, we do this by having certain websites or controllers, and all of these things rely on a few critical pieces that, if you were to take them offline, would prevent you from doing this kind of management: DNS servers; VPNs, which I already talked about, whether commercial or corporate, to access your company's assets; and then you have internet exchanges. If any one of these things went down from a DDoS attack, you're talking about massive collateral damage. And so what we're calling the connectivity supply chain is really just that: what connects all of us together? That's the internet, and what makes the internet tick. And here at NETSCOUT we call ourselves guardians of the connected world, and though that might seem a little bit weird to say it that way, it's absolutely true, because our primary goal here at NETSCOUT is to make sure that organizations maintain that connection that allows them to really just live, breathe, survive, and do their business. Without that, you can't conduct business. >> Right. And we saw that rapid pivot last year, when so many businesses in every industry had to rapidly pivot and shift to digital. But as the innovation of technology for good continues, so does its innovation and use for adversarial things. Another thing the report showed: triple extortion. Talk about that, what you saw, and what it means for businesses. >> So the triple extortion is a three-pronged attack. And everybody here is going to know exactly what I'm talking about when I say ransomware, because ransomware is the biggest threat to the cyber world; really, not even just the cyber world, just anybody that has a computer or device or anything, right? Whether it's a business, a user, a school, hospitals, everybody is at risk for this, and adversaries see the success that ransomware is having, so more and more operators get involved. Well, what we're seeing here is that they are not satisfied with just encrypting your files and getting a one-time payment. No, they've got to take it a step further. And in fact, the double extortion has been ongoing since as far back as 2013, when a popular Gameover Zeus variant was distributing CryptoLocker ransomware. So you have the initial compromise and data theft, and wire transfers of bank funds, followed by ransomware: I already stole your money from your bank, and now you're going to pay me a ransom to decrypt your files.
Well, let's move forward to today's day and age. Over the past year, one of the things we've seen is that adversaries are now adding a third tactic to this: DDoS. So they will encrypt your files, and they'll demand, hey, you're going to pay us this amount of Bitcoin in order to decrypt your files. But, you know, we're already in your system, so let's just steal your data, and then after you pay us for the decryption, we're going to hold your data hostage until you pay us again. Or maybe we're going to use that data as a lever to get you to pay that initial ransom. Well, that's still not enough, because more and more security researchers, like myself, say don't pay. And I'm saying that right here, in plain English: do not pay the ransom, because it has detrimental effects. You don't even know if they're going to decrypt your files, and you don't know if they're going to come back. Maybe you pay them and they never send you a decryption key. Maybe you pay them and, lo and behold, they're part of some terrorist organization, so now you're actually complicit in funding these guys. And the more success these ransom operators have, the more they're going to do it. So it has a lot of really negative consequences. Well, let's add another lever: let's add DDoS to this. So it's not enough that we encrypted your files, it's not enough that we stole your data, let's knock your network offline. Now you have no recourse whatsoever, except to pay us in order to resume services. And we're seeing at least four or five different ransomware groups or gangs actually use this triple extortion to go after their victims. So it's something we expect to see more of down the road, as more and more operators continue to adopt this. >> Yeah. The report showed that there was a ransomware group that, in the first half of 2021 alone, netted a hundred million dollars. So ransomware as a service: this is a big business. You say don't pay; what can organizations do to defend themselves against triple extortion, or even single or double? >> Yeah. So the thing is, preparation is key for a lot of this, and not just for the ransomware piece and triple extortion, but for DDoS in general; preparation goes a long way toward mitigating this potential threat. And one of the things we like to say here is that 80% of the things you can do to defend against ransomware also work for defending against DDoS. The key word here is preparation: making sure you've done your initial observations of your network, that you understand what is in your network, every device, not just the core critical systems, because there could be that IoT device sitting on the fringe somewhere that has, for whatever reason, access to a system that, if encrypted, would cause detrimental harm to your company. So not only do you want to inventory your systems, you also want to figure out: are they properly up to date? Do we allow unauthenticated logins? Are they using default usernames and passwords? In fact, for the vast majority of ransomware today, the initial infection vector is either going to be some sort of spam messaging or brute-forcing RDP, SSH, and Telnet, the tried-and-true methods they've been using for five, six, seven years. They are still successful at using those to get into organizations. And so, make sure that you're sufficiently locking those down.
Specifically on the ransomware side, if you want to prevent those, not only are you going to do this preparation, but you're going to make sure that you isolate your critical systems. You shouldn't have everything connected to one spot. If somebody compromises one device, they should not be able to encrypt your entire network, and they absolutely should never be able to encrypt your backup files. And have backup files, right? So there are a lot of different things you can do here. And by practicing a lot of this preparation, this isolation, the segmenting of your networks, you're also helping in the DDoS space, because if they go after one network asset, you'll have all of this to fall back on. There is one significant difference between ransomware and DDoS. With ransomware, after you've been infected, unless you have backups or you pay the ransom, your files are pretty much gone, unless there's some decryptor that can be had, or the government has some sort of campaign that gets you the decryption keys and helps you with the decryption. So in those cases, if you get encrypted, there's often not a whole lot of recourse unless you have prepared ahead of time. With DDoS, however, the vast majority, 99% of all DDoS attacks, can be prevented if you have a mitigation and protection solution in place. And even if you do get DDoSed, oftentimes the attacks are short-lived; in fact, the vast majority of DDoS attacks last less than 15 minutes. So it's not like your stuff is going to be encrypted for days or weeks on end. You're going to get hit, you might go down for a period of time, but you can recover services, and during that recovery period you can go and seek mitigation and protection services. So there's a big difference between DDoS and ransomware in that regard. >> That's a great way of describing that. And we've talked a lot about how ransomware has been on the increase over the last year and a half. We've talked about how it's not a matter of if we get attacked, it's a matter of when. But your distinction between ransomware and DDoS attacks shows that both, with preparation and the right tools, are preventable and recoverable, provided organizations have put the proper tools and mechanisms in place to do that. And given how quickly we're seeing the adaptation of the threat actors, organizations, if they're not already on that preparation train, need to catch up. >> Absolutely. They need to get busy right away. There's really no delay. Like you said, it's not if, it's when. And so every single person, every organization, and I would take it a step further, not even just organizations, every single individual that has a computer or some sort of internet connection at home, needs to realize that they absolutely can be, and are, the target of these attacks. We've said it now for the past year and a half that within five minutes of an IoT device going online, you're getting brute-force attempts, and that's any IoT device: something you connect that maybe you never even realize you can log into and change the password on. Well, if it's online, then chances are somebody is trying to brute-force it to access it and use it in various ways. >> And as we all sort of anticipate, we're going to be in this hybrid work environment, this work-from-anywhere environment, for quite a while longer. One last question I want to ask you: when you talk about all the proliferation of IoT devices, and we're still in this work-from-anywhere situation, botnets?
What are some of the things that the report showed, and how can organizations protect a, you know, growing number of vulnerable IoT devices from botnets? >> So I think the biggest thing to protect against an IoT compromise is simply patching and updating your passwords. Mirai has been out there for a long time, since 2016, you know, we saw the Dyn attacks, but it's still using the same usernames and passwords. Sure, they add more to the list, but the predominant ones that are successful in compromising devices have been around for many years, and they're still successful at compromising these IoT devices. In fact, in the report, one of the things we wanted to show is actually: where are these botnets? How are they being used, specifically in a DDoS nature? And so we actually took all of the IP addresses that we're seeing from bots, either coming back into our honeypots or things that we scan for, and what we determined is that roughly 208,000 of those IP addresses (IP addresses that we collected, as well as ones from a new partner of ours called GreyNoise, who agreed to partner with us on this report, and you'll see that in the report if you actually read it), we took these lists of nodes and compared them to what we're seeing in the DDoS attack landscape. And it turns out that approximately 200,000 of these contributed to more than 2.8 million DDoS attacks in the first half of 2021. Now, there were 5.4 million attacks total, so more than half of those had some form of DDoS botnet, IoT representation. And so that should tell you that these botnets are huge, they're everywhere, and they're active. The report actually walks you through where these are, where the density zones and clusters of these botnets are, as well as what botnets in those high-density zones are using to compromise other IoT devices. So it's definitely a very informative read, and I think you'll figure out that this isn't something we talk about in the abstract, right? This is a botnet in my backyard, and I should absolutely be concerned about any IoT device in my home. >> Right. And the NetScout threat intelligence report, which Richard has just walked us through, is not only available online, it's interactive. It's a great report. I've looked at the PDF, but Richard, where can folks go to actually interact with the document and glean even more information about how they can prepare and defend? >> Yeah. So netscout.com/starreport. And as Lisa said, it is interactive, so you will need to sign up for the site, and you can do both: you can either view the interactive webpage or you can download the PDF, whatever your reading preference is. But I do encourage the interactive portion because, for instance, with this botnet density map that I talked about, you can actually page through month over month to see where those density clusters are, and there are other animations and other maps in there, so there's definitely a lot more value in perusing the interactive version. >> A lot of granularity. Richard, thank you so much for joining me today, talking about what the first half of 2021 showed. And I can't wait to talk to you next year, when we're going to be looking at the second half of the year and where we are with respect to that record-breaking 11 million DDoS attacks. Thank you for taking your time to explain the top trends in the report and for showing folks where they can go to interact with it.
>> Well, thank you, Lisa. And thank you to theCUBE for hosting the interview. Definitely appreciate it. >> Our pleasure. For Richard Hummel, I'm Lisa Martin. You're watching a CUBE conversation. (melodic music)
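(One practical takeaway from the botnet discussion above: the overlap analysis Hummel describes, matching known botnet node IPs against the source IPs seen in DDoS attacks, is straightforward to prototype once you have both lists. The sketch below is purely illustrative; the file names and formats are assumptions, not NetScout's or GreyNoise's actual pipeline.)

```python
# Hypothetical reproduction of the "botnet nodes vs. DDoS sources" overlap:
# given one file of known botnet node IPs (honeypots, scans, partner feeds)
# and one file of source IPs observed in DDoS attacks, report the overlap.
def load_ips(path: str) -> set[str]:
    """Read one IP address per line, ignoring blanks and comments."""
    with open(path) as f:
        return {line.strip() for line in f
                if line.strip() and not line.startswith("#")}

botnet_nodes = load_ips("botnet_nodes.txt")   # assumed export, one IP per line
ddos_sources = load_ips("ddos_sources.txt")   # assumed export, one IP per line

overlap = botnet_nodes & ddos_sources
print(f"{len(botnet_nodes):,} known botnet nodes")
print(f"{len(ddos_sources):,} distinct DDoS source IPs")
print(f"{len(overlap):,} botnet nodes also seen launching DDoS attacks "
      f"({len(overlap) / max(len(ddos_sources), 1):.1%} of DDoS sources)")
```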

Published Date : Sep 20 2021

Neuromorphic in Silico Simulator For the Coherent Ising Machine


 

>> Hi everyone. I am a fellow at the University of Tokyo. Before I start, I would like to thank the organizers and all the staff of NTT for the invitation and the organization of this online meeting, and also to say that it has been very exciting to see the growth of this new PHI Lab. I am happy to share with you today some of the recent work that has been done either by me or by collaborators in the group. As indicated in the title, my talk is about a neuromorphic in silico simulator for the Coherent Ising Machine (CIM). Here is the outline: I would like to make the case that the simulation of the CIM in digital electronics can be useful for better understanding or improving its functioning principles by introducing some ideas from neural networks; this is what I will discuss in the first part. Then I will show some proof of concept of the gain in performance that can be obtained using this simulation in the second part, and projections of the performance that can be achieved using a very large-scale simulator in the third part. Finally, I will talk about future plans. So first, let me start by comparing recently proposed Ising machines using this table, which is adapted from a recent Nature Electronics paper. This comparison shows that there is always a trade-off between energy efficiency, speed and scalability that depends on the physical implementation. In red here are the limitations of each hardware. Interestingly, the FPGA-based systems, such as the Fujitsu Digital Annealer, the Toshiba bifurcation machine, or a restricted Boltzmann machine on FPGA recently proposed by a group in Berkeley, offer a good compromise between speed and scalability. This is why, despite the unique advantages that some of the other hardware have, such as the quantum superposition in flux qubits or the energy efficiency of memristors, FPGAs are still an attractive platform for building large Ising machines in the near future. The reason for the good performance of FPGAs is not so much that they operate at high frequency, nor that they are particularly energy efficient, but rather that the physical wiring of their elements can be reconfigured in a way that limits the von Neumann bottleneck, large fan-in and fan-out, and the long propagation of information within the system. In this respect, FPGAs are interesting from the perspective of the physics of complex systems rather than the physics of interacting photons. To put the performance of these various hardware in perspective, we can look at the computation performed by the brain: the brain computes using billions of neurons, using only 20 watts of power, and it operates at a relatively slow frequency. These impressive characteristics motivate us to investigate what kind of neuro-inspired principles could be useful for designing better Ising machines. The idea of this research project, and of the future collaboration, is to temporarily alleviate the limitations that are intrinsic to the realization of an optical Coherent Ising Machine, shown in the top panel here, by designing a large-scale simulator in silicon, shown at the bottom, that can be used for suggesting better organization principles for the CIM. In this talk, I will discuss three neuro-inspired principles: the asymmetry of connections, the chaotic neural dynamics that this asymmetry induces, and the structure of the interconnectivity.
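(For readers who want something concrete to hold on to: the amplitude-control dynamics described in the next part of the talk can be sketched as a toy numerical simulation. The parameter values, variable names and integration scheme below are illustrative assumptions, not the speaker's FPGA implementation; only the overall structure, analog spin amplitudes x_i with parametric gain, Ising couplings J_ij, and error variables e_i pushing each amplitude toward a target a, follows the description that follows.)

```python
# Toy mean-field Coherent Ising Machine with amplitude-control error variables.
# Structure follows the talk (x_i: in-phase DOPO amplitude, J_ij: Ising
# couplings, e_i: error variable forcing x_i^2 toward target a); all parameter
# values below are illustrative guesses, not the FPGA settings.
import numpy as np

rng = np.random.default_rng(0)
N = 100                                    # number of analog spins
J = rng.choice([-1.0, 1.0], size=(N, N))   # random +/-1 couplings (SK-like)
J = np.triu(J, 1); J = J + J.T             # symmetric, zero diagonal

p, eps, beta, a = 0.9, 0.05, 0.1, 1.0      # pump, coupling, feedback, target
dt, steps = 0.02, 20000

x = 0.01 * rng.standard_normal(N)          # analog spin amplitudes
e = np.ones(N)                             # error variables
best = np.inf

def ising_energy(spins: np.ndarray) -> float:
    """Ising energy -1/2 * s^T J s for s = sign(x)."""
    return -0.5 * spins @ J @ spins

for _ in range(steps):
    inj = J @ x                                   # coupling (feedback) term
    dx = (p - 1.0 - x**2) * x + eps * e * inj     # gain/saturation + injection
    de = -beta * e * (x**2 - a)                   # push amplitudes toward 'a'
    x += dt * dx
    e += dt * de
    best = min(best, ising_energy(np.sign(x)))

print("best Ising energy found:", best)
```

With the e_i frozen at 1 this reduces to the gradient-descent picture discussed below; letting the e_i evolve makes the effective coupling asymmetric, which is exactly the source of the chaotic search behavior the speaker emphasizes.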
Neural networks are not composed of the repetition of always the same type of neuron; rather, there is a local structure that is repeated. Here is a schematic of the micro-column in the cortex. And lastly, there is the hierarchical organization of connectivity: connectivity is organized in a tree structure in the brain, and here you see a representation of the hierarchical organization of the monkey cerebral cortex. So how can these principles be used to improve the performance of Ising machines and their in silico simulation?

First, about the two principles of asymmetry and micro-structure. We know that the classical approximation of the Coherent Ising Machine is analogous to rate-based neural networks. In the case of the Ising machines, this classical approximation can be obtained using the truncated Wigner approximation, for example, so that the dynamics of the system can be described by the following ordinary differential equations, in which, in the case of the CIM, x_i represents the in-phase component of one DOPO, f represents the nonlinear optical part, the degenerate optical parametric amplification, and the sum of the J_ij x_j terms represents the coupling, which in the case of the measurement-feedback CIM is done using homodyne detection and an FPGA, followed by injection of the coupling term. These dynamics, both for the CIM and for neural networks, can be written as gradient descent of a potential function V, written here, and this potential function includes the Ising Hamiltonian. This is why it is natural to use this type of dynamics to solve the Ising problem, in which the omega_ij are the Ising couplings and h is the external field of the Ising Hamiltonian. Note that this potential function can only be defined if the omega_ij are symmetric. The well-known problem of this approach is that the potential function V that we obtain is very non-convex at low temperature, and one strategy is to gradually deform this landscape using an annealing process; but unfortunately there is no theorem that guarantees convergence to the global minimum of the Ising Hamiltonian using this approach.

This is why we propose to introduce a micro-structure in the system, where one analog spin, or one DOPO, is replaced by a pair of one analog spin and one error-correcting variable. The addition of this micro-structure introduces an asymmetry in the system, which in turn induces chaotic dynamics: a chaotic search, rather than a learning process, for the ground state of the Ising Hamiltonian. Within this micro-structure, the role of the error variable is to control the amplitude of the analog spins, to force the amplitude of the spins to become equal to a certain target amplitude a. This is done by modulating the strength of the Ising coupling: the error variable e_i multiplies the Ising coupling here in the dynamics of the DOPO. The whole dynamics is described by these coupled equations, and because the e_i do not necessarily take the same value for the different i, this introduces an asymmetry in the system, which in turn creates chaotic dynamics. I am showing this here for a certain problem size of an SK problem, in which the x_i are shown here, the e_i here, and the value of the Ising energy in the bottom plot. You can see this chaotic search that visits various local minima of the Hamiltonian and eventually finds the ground state. It can be shown that this modulation of the target amplitude can be used to destabilize all the local minima of the Ising Hamiltonian, so that we do not get stuck in any of them. Moreover, the other types of attractors that can eventually appear, such as limit cycles or chaotic attractors, can also be destabilized using a modulation of the target amplitude. We have proposed in the past two different modulations of the target amplitude: the first one is a modulation that ensures the entropy production rate of the system remains positive, which forbids the creation of any non-trivial attractors; but in this work I will talk about another, heuristic modulation, given here, that works as well as the first one but is easier to implement on an FPGA.

These coupled equations, which represent the simulation of the Coherent Ising Machine with some error correction, can be implemented especially efficiently on an FPGA. Here I show the time that it takes to simulate the system: in red you see the time that it takes to compute the x_i term, the e_i term, the dot product and the Ising energy, for a system with 500 analog spins, equivalent to 500 DOPOs. On the FPGA, the nonlinear dynamics, corresponding to the degenerate optical parametric amplification, the OPA, of the CIM, can be computed in only 13 clock cycles at 300 MHz, which corresponds to about 0.1 microseconds. This is to be compared with what can be achieved in the measurement-feedback CIM, in which, if we want to run 500 time-multiplexed pulses at a 1 GHz repetition rate through the optical cavity, we would require 0.5 microseconds; so the simulation on the FPGA can be at least as fast as a measurement-feedback CIM with a 1 GHz repetition rate. Then the dot product that appears in this differential equation can be computed in 43 clock cycles, that is to say, about 0.14 microseconds. So for problem sizes larger than 500 spins, the dot product clearly becomes the bottleneck, and this can be seen by looking at the scaling of the number of clock cycles it takes to compute either the nonlinear optical part or the dot product with respect to the problem size. If we had an infinite amount of resources on the FPGA to simulate the dynamics, then the nonlinear optical part could be done in O(1), and the matrix-vector product could be done in O(log N), that is, it would scale logarithmically with the size of the system, because computing the dot product involves summing all the terms in the product, which is done on an FPGA by an adder tree whose height scales logarithmically with the size of the system. But that is only the case if we had infinite resources on the FPGA; for larger problems of more than 100 spins or so, we usually need to decompose the matrix into smaller blocks, with a block size that I denote u here, and then the scaling becomes, for the nonlinear part, linear in N/u, and for the matrix-vector product, (N/u)^2. Typically, for a low-end FPGA chip, the block size u of this matrix is about 100.

So clearly we want to make u as large as possible, in order to maintain the O(log N) scaling for the number of clock cycles needed to compute the product, rather than the (N/u)^2 scaling that occurs if we decompose the matrix into smaller blocks. But the difficulty in having these larger blocks is that a very large adder tree introduces large fan-in and fan-out and long-distance data paths within the FPGA. So the solution for getting higher performance from a simulator of the Coherent Ising Machine is to get rid of this bottleneck for the dot product by increasing the size of this adder tree, and this can be done by organizing the components within the FPGA hierarchically, as shown in this right panel here, in order to minimize the fan-in and fan-out of the system and to minimize the long-distance data paths in the FPGA. I am not going into the details of how this is implemented on the FPGA; this is just to give you an idea of why the hierarchical organization of the system becomes extremely important for getting good performance from a simulated Ising machine.

Instead of getting into the details of the FPGA implementation, I would like to give a few benchmark results for this simulator, which was used as a proof of concept for this idea and which can be found in this arXiv paper here. I show results for solving SK problems, fully connected, random ±1 spin-glass problems, and we use as a metric the number of matrix-vector products, since it is the bottleneck of the computation, needed to reach the optimal solution of the SK problem with 99% success probability, plotted against the problem size here. In red here is the proposed FPGA implementation; in blue is the number of matrix-vector products necessary for the CIM without error correction to solve these SK problems; and in green is noisy mean-field annealing, whose behavior is similar to that of the quantum annealing machine. You can see that the number of matrix-vector products necessary to solve this problem scales with a better exponent than the other approaches, so that is an interesting feature of the system. Next we can look at the real time to solution for solving these SK instances. On this slide, the time to solution in seconds to find the ground state of SK instances with 99% success probability is shown for different state-of-the-art hardware. In red is the FPGA implementation proposed in this paper, and the other curves represent, for example, breakout local search in orange and simulated annealing in purple. You see that the scaling of the proposed simulator is rather good, and that for larger problem sizes we can be orders of magnitude faster than the other state-of-the-art approaches. Moreover, the relatively good scaling of the time to solution with respect to problem size indicates that the FPGA implementation should be faster than other recently proposed Ising machines, such as the Hopfield network implemented on memristors, shown in blue here, which is very fast for small problem sizes but whose scaling is not good, and the same for the restricted Boltzmann machine implemented on an FPGA recently proposed by a group in Brooklyn, which again is very fast for small problem sizes but whose scaling is worse than that of the proposed approach, so that we can expect that for problem sizes larger than, let's say, 1000 spins, the proposed approach would be the faster one.

Let me jump to this other slide for another confirmation that the scheme scales well: we can find maximum-cut values on the G-set benchmark that are better than the cut values previously found by any other algorithm, so they are the best known cut values to the best of our knowledge, which is shown in this table in the paper. In particular, for instances 14 and 15 of the G-set we can find better cuts than previously known, and we can find these cut values 100 times faster than the state-of-the-art algorithm on CPU used to do this. Note that getting these good results on the G-set does not require any particularly hard tuning of the parameters; the tuning used here is very simple, and it just depends on the degree of connectivity within each graph. These good results on the G-set indicate that the proposed approach would be good not only at solving SK problems, but at all types of graph Ising problems, for example max-cut or community detection.

Given that the performance of the design depends on the height of this adder tree, we can try to maximize the height of the adder tree on a large FPGA by carefully routing the critical components within the FPGA, and we can draw some projections of what type of performance we can achieve in the near future based on the implementation that we are currently working on. Here you see the projection for the time to solution for solving these SK problems with respect to the problem size, compared to different simulated Ising machines, in particular the Digital Annealer, which is shown as the green line, and with two different hypotheses for these projections: either that the time to solution scales as an exponential of N, or that it scales as an exponential of the square root of N. It seems, according to the data, that the time to solution scales more like an exponential of the square root of N, and this projection shows that we could probably solve SK problems of size 2000 spins, finding the real ground state of the problem with 99% success probability, in about 10 seconds, which is much faster than all the other proposed approaches.

As for the future plans for this Coherent Ising Machine simulator: the first thing is that we would like to make the simulation closer to the real DOPO optical system, in particular, as a first step, closer to the measurement-feedback CIM. To do this, what can be simulated on the FPGA is the quantum Gaussian model that is described in this paper and proposed by people in the NTT group. The idea of this model is that, instead of the very simple ODEs that I have shown previously, it includes paired ODEs that take into account not only the mean of the in-phase component but also its variance, so that we can take into account more quantum effects of the DOPO, such as squeezing. Then we plan to make the simulator open access for the members to run their instances on the system. There will be a first version in September that will just be based on simple command-line access to the simulator, and which will have just the classical approximation of the system, with binary weights and no Zeeman term; but then we will propose a second version that will extend the current Ising machine to a rack of eight FPGAs, in which we will add the more refined models, the truncated Wigner and the quantum Gaussian model I just talked about, with support for real-valued weights for the Ising problems and support for the Zeeman term. We will announce later when this is available, and Farah is working hard to get the first version available sometime in September. Thank you all, and we will be happy to answer any questions that you have.

Published Date : Sep 24 2020

Gary Conway, Automation Anywhere | CUBEConversation, August 2019


 

(upbeat music) >> From our studios in the heart of Silicon Valley, Palo Alto, California, this is a CUBEConversation. >> Hello everyone, welcome to Palo Alto's CUBE studios. I'm John Furrier, host of theCUBE. We're here for a special CUBEConversation as part of our new brand of tech leader series as well as Extracting the Signal From the Noise. We're here with Gary Conway, the CMO of Automation Anywhere, a hot startup, heavily funded, attacking a whole new market segment, that's kind of changing the game of value in digital, obviously, RPA, robotic process automation, is the buzz word. It's actually real, it's happening, we're seeing a lot of success in companies there. It's changing the way business is operated, business is structured, and value is created. Gary, thanks for joining me. >> My pleasure. >> So we covered your event, Automation Anywhere. You guys are essentially doing very very well, heavily funded, growing like crazy. RPA is one of the fastest growing segments in this next generation digital culture. You're seeing a lot of companies coming out attacking this. What's your perspective, why is RPA so important, why is it so hot? >> It's a pretty simple reason, actually. You know, the truth of the matter is that companies are now, because of RPA, able to automate parts of their business processes, or entire processes that they were never able to automate before, and they can do it with RPA at a relatively little cost compared to a lot of other technologies out there, especially from the big ERP vendors. We say that, and we really believe this, and we're finding this to be true, that since the onset of automation about 30 years ago, from the big technology companies, only about 20% of the processes that businesses manage now are actually automated. The 80% of them that are not automated are pretty much done by human beings, you know. Millions of human beings employed to manage those back office processes. RPA is enabling companies to actually automate more of those processes than ever before. >> Before we get started, just quickly define, what is RPA for the folks that are learning for the first time, because we're now seeing the concept really penetrating mainstream right now. It's becoming, frankly, a topic that's being discussed across most of the largest enterprises and small business, what is RPA? >> So RPA means robotic process automation. So think of them as robots that are built as predesigned software bots that you can plug into any business process, and it'll automate a part of that process, or the entire process, by just plugging it in. It actually is capable of observing what human beings do, remembering what human beings do, and then repeating that again and again and again, only in a fraction of a second. That's the easiest way to think of it. >> So when I think of robots, I think of like a machine, you know, moving things around, from, like, manufacturing, whatnot. It's beyond that, it's not just robots, it's software as well, and this is the key in all this. >> It is software, I mean. >> It is software, it's not robots. >> RPA is only software. >> It's only software. >> I think most people, when they think of robotics, do think of, you know, mechanical robots used in manufacturing. That's not what RPA is. RPA is robotics that is only constructed with preconfigured software. 
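(As a toy illustration of the "observe a repetitive back-office step and replay it again and again" idea described above, and nothing like Automation Anywhere's actual product, here is a minimal sketch in Python: a script that watches a drop folder for new invoice files and appends their rows to a running ledger, the kind of manual copy-paste step a software bot takes over. All folder, file and column choices are assumptions for illustration.)

```python
# Toy "digital worker": watch a drop folder for new invoice CSVs and append
# their rows to a ledger CSV, then archive the processed file.
import csv
import time
from pathlib import Path

INBOX = Path("invoices_in")         # assumed drop folder for new invoices
LEDGER = Path("ledger.csv")         # assumed running ledger
PROCESSED = Path("invoices_done")   # processed files are moved here

def process_invoice(path: Path) -> None:
    """Append every row of one invoice file to the ledger, then archive it."""
    with path.open(newline="") as src, LEDGER.open("a", newline="") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src):
            writer.writerow(row + [path.name])   # keep a pointer to the source
    PROCESSED.mkdir(exist_ok=True)
    path.rename(PROCESSED / path.name)

if __name__ == "__main__":
    INBOX.mkdir(exist_ok=True)
    while True:                                  # the "again and again" part
        for invoice in sorted(INBOX.glob("*.csv")):
            process_invoice(invoice)
            print(f"processed {invoice.name}")
        time.sleep(10)                           # poll every 10 seconds
```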
>> I want to get your take on the impact to business and how leaders are adapting to this, but first I want to get to the mainstream topic that is trying to be figured out, and the classic one is technology's going to automate my jobs away, and the example that I use is retail. Most people go to retail, and they think, you know, whether it's a person out of college, or someone working in retail, that oh my god, a robot's going to show up and move stuff on the shelves, and eliminate those jobs. It's not so much robots, per se, it's Amazon that's going to impact in retail. We know what Amazon and Walmart has done to commerce. So that's already happening, retail's impacted. It's not so much that jobs are going away, they're just changing. That's our opinion. Can you share your opinion on the impact of software automation to jobs? >> We agree that jobs are not going away. They will change, but I always tell people when I'm asked this question that there's not been one technology that's ever been introduced that has actually done anything but create more jobs, and I always use the example of the PC. You know, I'm old enough to remember when the PC was introduced, the headlines were what will people do with all this additional time? You know, people were predicting a three day work week because of all the efficiencies that would be created by the PCs, and in fact the opposite has happened. Technology actually makes people more productive, and when they're more productive they're capable of doing more things. So with the automation of certain things that people happen to be doing now, those people are being upskilled, they are being redeployed to other jobs, as we've seen in the past, and actually, more jobs are being created. >> You know, we cover a lot of the Big Data space going back to 2010 when we first started theCUBE, at Hadoop World, which that kind of had its course, but ultimately Big Data, which became AI, you know the bank teller example, you know the ATM was going to kill the branches, when in reality there's been even more branch offices-- >> That's what we're seeing, yeah. >> than ever before. So again, I think the argument is pretty clear from the data and the trend, technology is actually helping create new jobs, but not the jobs maybe that there were once there. That seems to be the big debate, so we agree with you on that. Now we applied some of our, not RPA, but we had some technology that applied to all of our videos that we did with you at your event, and a couple things came out of the entity extraction. I want to share with you, I want to get your reaction. Business hubs, human versus machines, complex problems, digital colleagues, digital worker, new potential applications, digital native companies, supply chain, system integrators, labor platforms, AI assistance, inefficiencies, and machine learning. These are key words that really kind of point to the next generation. This is essentially the language of your company. What's your reaction to that? >> Well, I'm not sure it's the language of our company as much as it's the language that people are using to determine what role they will play in the future, and what role, how they will impact their businesses going into the future. So these are not our terms, these are terms that exist in the space right now as people try to determine for themselves the role they will play in defining the future and how they will use technology to make their businesses more efficient. 
>> And companies are using cloud, for instance, to kind of reshape. We had a big conversation yesterday around, you know, do I want to be in the business of managing data centers, or be in the business of managing my business with technology. These concepts are interesting from an industry standpoint. Business hubs. Good concept, I get that. Digital worker. This is the impact that you guys are enabling. What's the managerial leadership role as an executive or a worker in these new cultural shifts? Because, as this is being enabled, new value is being created. Digital is enabling that. How does someone manage all this? What do you guys see, how do you see that playing out? >> Look, I think that whenever things are changing, and things are changing dramatically in business today, the only way to manage it is a day at a time. You can't project yourself so far into the future that you trip over the things that are immediately facing you now. So my suggestion would always be to evaluate options every day, every week, and make decisions when it's the right time to make decisions for your business. But let's go back to one of the terms you described, digital worker. So a digital worker in our view is actually available in what we call our bot store, which is a bot that is actually preconfigured to have skillsets that you would require. So let's just say you need an order-to-cash person, person who understands that, and it's a part of an automated process. The idea is that you would be able to download a digital worker with similar skills, and plug that bot into your process, and it would begin to work with, I would say, the skillsets of somebody who understands the order-to-cash process. That's really what a digital worker is. Now imagine that, in the future, and that future is not that far away, where every human being will be working side by side with a digital worker, so that the human being can offload the repetitive things that a digital something could actually do for them, and that digital worker would take on the task-based stuff, freeing up the individual to use their creativity to create higher order value for the business. That's really what we mean by digital worker and the importance of a digital colleague, for example. >> I think that it's a profound statement, and I think this is one of the cultural shifts that I see that this next generation workforce and leaders have to get their arms around, and in watching folks in Washington, D.C., we've been covering a lot of the procurement changes going on in government and businesses. There's a leveling up going on in the IQ of organizations, because that is a profound statement. Now we saw that with DevOps in cloud. You know, you talk to tech people, if you're doing the repetitive task more than three times, automate it. You're getting at something a little bit different. You're not just automating, you're adding intelligence to it, so what I like about the process automation area, is it's not just an undifferentiated, heavy lifting, mundane task. Yes it is, but there's an era of machine learning, you're seeing intelligence being applied to it, so it's truly becoming an augmentation to a human. That's kind of what I hear you saying. Do you agree with that, and is that something that you guys see happening, and what does that actually mean for the enterprise? >> No, I do agree with it, and we are at various stages of that evolution. 
But like anything else in business, and in life, you don't just flip a switch and all of a sudden people migrate to that new model, that's not how life really works. We evolve to those things, and I think what we're seeing is a very fast evolution to exactly what you just described. >> I want to get your thoughts on operationalizing new technology. You know, obviously, being an entrepreneur, I've done a bunch of startups, and the startup ethos is come on a narrow entry, get a landing area, and then sequence to the broader market opportunity. There's a lot of entrepreneurial ethos involved in how to operationalize something new like RPA, because you can't just, you know, shut down the old and bring in the new, there's a method there. This is a challenge in any new technology. How do you guys see this playing out? Because you guys are on the front end, bringing real value to the table, but people might want to get more aspirational and then get the reality. How do you get into the point of going into someone and saying I love what you guys do, what's the playbook, what do I do next? This is the challenge, can you share your thoughts on how an executive or a business can operationalize these benefits? >> So we have a lot of customers, 1800 customers, unique customers, and 2800 entities around the world that are using the software now. And I think that each of them had one thing in common. They started in bite-sized chunks. They said we're going to try this, and what's happening with RPA, which is one of the reasons it's growing so fast, is that once you try it, once you implement a few bots to automate the things that you weren't able to automate before, it starts ramping like this, right? It has a very very fast ramp-up. So you realize some successes in the processes that you begin to automate that you've never automated before. And the more you do it, the more you learn from it. The more you learn from it, the more you want to do it, the more processes you identify that could be automated, and should be automated, and what starts happening in most companies is they start adopting much much faster once they understand the benefits of it. And the benefits to business is driving higher levels of efficiencies, and reducing costs dramatically. >> So the tie to value is fast. >> Right, the value is very fast, compared to-- >> And that's driving the ramp-up, to your point. >> And that's driving the map. >> The flywheel kicks in, you start with a process that's known, and you automate it, wow, that's good, do it again, do it again. >> Correct. Well, do it again, and do it with more processes, right? And the other unique thing about this technology is human beings, once they understand the advantages of automating things that other human beings may have to do manually, most of those people who have been doing them manually will say I want more of that. We should be automating this, we should be automating that, and it actually makes them much more productive, and it makes them feel as if they are delivering higher value to the business themselves, and what an amazing human dynamic that is. >> You know, I was talking to Dave Vellante about this, we were talking about the TAM, the total adjustment market, for RPA, we're like, I think it's just in the trillions because with digital, everything is connected, so you can measure everything. 
Everything is ultimately a supply chain, whether it's network effect for internet, whether it's, you know, some process with cryptocurrency, whether it's blockchain or a process with cybersecurity, digital is pretty much connected, it's pretty much a supply chain. Some of them are more formed than others. This seems to be the entry point that most people would go to. Do they go to the supply chains first, or, better yet, what's the use cases that you see as the low hanging fruit that people come in on and automate? Is it simple supply chain stuff that's known, or are they applying it as they grow to other areas? >> It's very broad, but the fastest adoption, especially beginning about two years ago, were from the companies in industries like banking, other financial services, insurance, healthcare, manufacturing, which is supply chain, as you rightly point out. Those businesses that tend to be earlier adopters of technology have also become earlier adopters of RPA. But what we're finding now is it's now, because of the results that these businesses have demonstrated, and because digital native competitors are actually coming into the space and threatening what are sometimes referred to as legacy businesses, businesses are not delaying the investments they're making so that they can actually become more competitive, and when you think about that, it's not just the efficiencies that these technologies like RPA drive, but it's the ability to make businesses acutely more competitive than they've ever been before. >> That's a great angle, competitive strategy has always been one of those things where, you know, the cloud native world or digital native world was like oh yeah, pick one feature, innovate, and you can go beat an incumbent. The incumbent now has leverage in the marketplace, whether it's physical presence or other assets. Using RPA gives them a way to level up, so to speak. >> Level up, for sure. So let's just take something we're all familiar with, right? You can now go on your phone, and you can have a car at your house to take you somewhere in about four minutes in most cities, right? If you have an issue, you can solve that issue on your phone as well. You don't have to call anybody, you just solve it on your phone. These ride share companies have made it so simple, it's almost as if there's no such thing anymore as a front office or a back office. Digital native companies have brought those things together, and now there's one office. So that immediacy is what legacy companies are actually competing against, and if those companies don't adopt this kind of automation to make more efficient those processes and narrow the gap between customer facing and back office, they won't be able to compete. >> Yeah, they can turn a liability into an advantage, with software. Big big bullish on the software, I think the competitive landscape also is interesting, I'd like your thoughts on. There seems to be a battlefield, at least from my perspective, my opinion is that, okay, RPA software is out there, it's going to grow really fast. The competitive battle will be around intelligence. How do you guys view the competitive levers? How do you guys compete, what's the advantage? Is it intelligence, is it being more intelligent, is it more operational, what's the advantage you guys see vis-a-vis the competition? 
>> Yeah, so we're actually seeing a sort of a bringing together of technology, what we have considered to be strictly technology, and what's being described broadly now as artificial intelligence. Artificial intelligence is still evolving. Everybody has his own definition of what it really is, but what we're seeing, and I think in other sectors we're seeing the same thing, is now the merging of things that have truly been technology with things that are perceived to be artificial intelligence, and they're beginning to come together. What that will look like five years from now, nobody knows. What it'll look like 10 years from now, no one can even conceive of, but we're seeing that dynamic in place now, and this is the beginning. >> It's a great wave, excited to have you on and share your insights, Gary. It's great stuff you guys are doing over there at Automation Anywhere, love the, we love this wave, I think it's going to be relevant. My final question for you, though, is little bit different. You know, you're at a cocktail party, you're at a friend's house, you're at a confab, and you see people that aren't in the business, and they're like Gary, I need to get, I need to be more competitive. What do I do, what is this RPA thing, how do I change my culture, how do I get my people and my process aligned with software, what's the playbook, what's your advice? >> So what I would say is, get started as quickly as possible, because if you delay too long, you will be left behind. So that's would be my first bit of advice. The other, it would be to start slowly. Learn as quickly as you can. Don't worry about automating things that are hard to automate, go to the things that are easy to automate. Companies find that when they address those things first, they're actually able to drive more success faster, and then they will look for more and more opportunities based on what they've learned and the success that they've derived, and that's what happens to create this ramp effect, where it becomes almost viral-like. Where you have one process that works great, you automate that, you automate another one, you automate five more, 10 more, and before you know it, believe it or not, we have customers that are implementing more than 3000 bots over the last year and a half, and that's how they started. >> Get rid of the mundane work, you've got happy people, HR is happy, you've got more revenue coming in, you're more competitive as a business, this is a good value proposition. It's an easy sale. >> Nothing's easy, but it has a huge appeal. >> Gary, thanks so much for coming on and sharing your insights around RPA, appreciate it and congratulations on your success. >> Thank you. >> This is CUBEConversation, and I'm John Furrier here in Palo Alto, thanks for watching. (upbeat music)

Published Date : Aug 1 2019

Day 2 Product Keynote Analysis | Google Cloud Next 2019


 

>> fly from San Francisco. It's the Cube covering Google Cloud. Next nineteen, right Tio by Google Cloud and its ecosystem partners. >> Welcome back to the cues live coverage Here in San Francisco, this is day two of Google Cloud. Next twenty nineteen cubes. Exclusive coverage. We're in the middle of the show floor. All the action Aquino's are still going on a little bit over. I'm John for David Law student and kicking off, breaking down the keynote analysis. Also breaking down Post Day one. All the action in the evening, where all the parties are all the action on alway conversations. Dave's to picking off day to day one was setting the table. New CEO on stage Date date. You gets into the into the products really about data data. I machine learning's all aboutthe data cloud data, and we're seeing a machine learning data management. Smart analytics say Aye and machine learning and collaborations. The four themes of Today Google. Clearly using data has a key value proposition. Big table, Big Queary machine learning the G A support for auto ml for tables, big announcements, your thoughts >> Yes. Oh, John, I think answering some of the things that we brought up yesterday is when When Google puts out their vision of why they should be your partner of choice, like customers choose way thought that data and I and M l would be let read upfront. So they kind of buried the lead a little bit. And, you know, question we had coming this week is and they reclaim that really thought leadership that, you know, a couple years ago, You know, data. You know, they really that G technical science stuff is what Google was really good at. So I thought they laid out some really good things. I think everybody was, you know, impressed. To see there was good diversity of customers as well as all the Google me. There were a lot of the women of Google that you've written about John here showing their sewing their chops here. So a lot of pieces to go through and everything from the G sweetened the chromebooks and sick security and privacy is something I like to talk a little bit about when we get into it here. But quite quite a lot of use that day. Today I at the center of it >> and one of the power Women dipped to use the big table you see and think we're all that stuff, Dave with >> big steam Us on the Kino also was B I with a II B. I think we've covered that do space going back to our ten years of doing the tube. It's the promise of Do Remember those days. Do came from Google about Eric. The emergent Borden works and do this kind of small little sliver of the ecosystem into Google's now showing what was once the promise. Big data. They're giving demos democratizing. Bring in for the masses. Wait stories on silicon engels dot com outlining this, But the reality is there. Now remember hitting the road with promise of big data? Now, with Cloud really changed the game? Your bosses, you've been covering this from Day one? >> Well, I think that there's no question that this is a date, a game, WeII said early on John on the Cube. That big data war was going to be one in the cloud. Data was going to reside in the cloud. And having now machine intelligence applied >> to that data is what's giving companies competitive >> advantage at scale and economics I was struck by the stats that Google gave >> at the beginning of the Kino today. Google in the last three years has spent forty seven billion dollars >> capital expenditures. This year to date alone, they've spent thirteen billion dollars in Cap Xidan Data Centers. 
Thirteen billion. It would take IBM three and a half years to spend that much in cap back there would take Oracle six years. So from an economic standpoint, in the scale standpoint, Google, Microsoft, Amazon are gonna win that game. There's no question in my mind. So, John, you know it is a game of scale and data and I What do you think? First >> of all, Google, they got the Cuban aunties two of the white paper. They wrote that they did commercialized communities in a way that I thought was really excellent, well executed. I like a Jew where they left out on the side of the road. You got picked up by a Cloudera Michaels and memorable Jeff. I'm a Wagner. We saw what happened do communities. It is true that up. They basically put it out there in the open source system, the way they get behind Ciencia really positive there. On the data front, Google's got so much in the tool shed all across Google from day one. Their legacy is data data driven, large scale. They built software and systems to manage data at scale at a hole on president. Well, I think that they have their well ahead of the marketplace on the technology that our inside Google proper Google Cloud will be proper alphabet, whatever you wanna call it. Self driving cars question for Google is, Can they bring it to get there? They >> need to hire a team of people, just >> go out and just get it all >> together, pull the jewels together and put it into a coherent platform. That's kind of the tea leaves that I see that we're reading here. Is that Curry and pointed down the keynote. We got tons of technology. The question is, can they pull it together in a package and make a consumable addressable programmable programing, FBI's? We've seen that movie that's happening right now. The next level of innovation for Google is, can they make data programmable? This is going to be a ten year opportunity. If they get that right, they will win. Big move the ball down the field to see Amazon going big on stage maker. It's all about data data, analytics at scale, auto machine learning. These are the tell signs do data program ability. They got all the things. Can >> they bring it to bear? >> Yeah, Well, John, one of the things I saw it got a lot of people excited is if I have, You know, I'm a G sweet. Customers were geese sweet customers, and I'm using spreadsheets. Now I can use Big Query with that. So the power of analytics and big data be able to plug that right in, make it really easy. And what's interesting is trying to squint through. You know what was kind of the Google consumer side of the house that many of us know. And if used for for lots of years versus the Enterprise G sweet chromebooks and mobile? Well, you know, under Diane Green, it was Google Enterprise, and now it's all part of Google Cloud. Just when we talk about Microsoft, it's like, Well, is it azure or is it au three sixty five? Well, it was a G sweet words. Is it Google and one that I want to, you know, get get your guys comment on is they talk about privacy way. No, Google as a whole alphabet is You know what, ninety five percent plus ad revenue and they were very strong out here is that we do not own your data. We will not sell it to a third party. Privacy, privacy, privacy. And it's great to hear them say that. But way all interacted work with Google. We know all the cloud providers. The data is an important thing. When I do Aye aye and ml type activities. I need to be able to anonymous isat and leverage it train on it. 
So the data privacy issue is still something where I heard what they said, but there have got to be some concerns. >> There is another angle here that I'd like to talk about, and that's the database. Google, Amazon, Microsoft, Oracle, IBM, Alibaba, all the big cloud guys, they want your data. That's why Amazon is spending so much effort on the database market. That's why you see Oracle having such a dominant position in database. Look at Google's announcement yesterday: they were basically doing a backhanded slap at Amazon, saying we're more open, and they did the deal with Mongo. There's a lot of discussion in the software community about how Amazon obviously Bogarts open source. If you look at Amazon, they've basically taken a lot of open source products and built their own databases. But if you look at Google, Google's got relational databases, non-relational databases, operational databases. So I wonder out loud, is this a Trojan horse strategy? Because they need to own your data; the database is so important now. I talked to someone yesterday who was an executive VP at Oracle, and he said to me that the cloud providers basically looked at the database as just another application to run on top of servers and virtual machines,
Data migrations to Melrose projects are like could be months. So smart money is saying Okay, how dowe I make money on this. It's not the old way. So this classic you know what side his treaty on old way or new way that's going to define who wins and who loses >> weight. By the way, I mean it. Sue Ellen >> license selling database license, for instance, is an old way. Well, essentially, it was Ramadan. Amazon does databases of service. What is the license by as you go? But you don't have, You >> know, the Oracle sells a zit buys you go to mean they play that same game. To me, it's more about when it comes to database. It's more about workloads. How much of the world needs acid property databases? Because that's oracles game versus how much of the world needs you no less database data store for for Lex structure data. And that's really I think, what Google and to a certain extent, Amazon are betting on. Although both companies, especially Amazon, is making a bet on both transactional data bases and non relationship, I >> mean in the ideal world database would be free from the margin get shifted to another spot. That's not clear yet, but still it can make money on database but lower caught in lower price. So Google makes money at scale, so with clouds scale, they can lower the price of the database like this, whether it's it's a service or some fee. But it's the people implementing, like the integrators and the people that are building applications as they build that agility. And how are they going to monetize? How does a company out in this floor make money? >> I just remember data stacks and probably like twenty twelve. I was talking to Billy Bob's worth the CEO about the merits of being in the US marketplace, and he said, You know, I'm a little nervous about that. What do you think, Dave? Do you think? Do you think they're gonna like, own me at some point in time and compete with me? So And that's what Google's announcement yesterday said is, You know, you're our friends, we're not going. They don't really come out and say, We're not going to compete with you They just basically said We are more open than aided us without mentioning a W S >> s. So it's interesting, you know, I've only had a little bit of a chance to walk around, but it's a different ecosystem, then Amazon. I remember six years ago, when we first went to Amazon. It was like game developers and all these weird start ups that I couldn't understand what they do. And now it's like, you know, like VM world, but bigger with just that. A broad ecosystem here, you know, there's a big section on collaboration. I went toe Enterprise connect a couple of weeks ago, talking about contact centers and see a lot of the same companies here heard five nines mentioned on stage zooms. Here, you know howto they plug into Google Cloud hurt sales force talking very devout Contact center. So it's a diverse ecosystem, but it's different than than Amazon, and there's not and Amazon. There's always that underlying, you know thing. Oh, is Amazon going to take over this business here? You know, I haven't heard that concern at this show. Well, >> I mean, the bottom line is that there's a shift in the economics and his model technology back in the database. Question. The fact that Mongo D. B. Was once forecast to go out of business. Oh, Amazon's going kill Mongo Devi that dynamo d B. Google's got databases. The fact the matter is, there's no one database anymore. Every application at some level has a database. 
So if you think about that, then you're going to have a new model where everything has a database, and the database is going to be characterized by the workload and the application. So I do agree with that point. The question is, it's not mutually exclusive: one database license for all versus databases everywhere. And if databases are everywhere, then the connective tissue becomes the opportunity. That's where I think you see some of these data-plane technologies with cloud being very compelling, because I can move data around very quickly, and that's where the machine learning really shines. That's going to be a latency question; that's going to be a data integrity question. This is the new model. This is what horizontal scalability means in the cloud, not 'buy an Oracle database and we're good.' That game is slowly moving into oblivion. >> Well, I think Amazon would say, hey, if you're a database vendor, you've got to innovate, because we're not going to stop innovating. Whereas I think Google's message to the database vendors is somewhat different: we want to partner with you, and maybe that's because they're not coming from a position of enterprise strength. But I'm sensing two apparently different strategies. I just don't know what the end game is, and I believe the end game is the data. >> The tell sign on the database is the developer, right? If I want to run a document store because that's best for my JSON or my feeds — say I'm using a lot of JavaScript — I'll use a document store. If I want to use a relational database, I'll use a relational database. So in the ideal world the developer is not forced into a tooling and database decision. >> Mongo changed its licensing policy as a direct result of what Amazon was doing. They made their community edition license terms more restrictive, if you follow that. They said any cloud service provider that distributes our community edition has to open source their entire software stack associated with distributing it, or they've got to pay us. So basically saying you have to pay an open source tax, or you're going to pay us. It'll be a very interesting change in the database market. >> One of the announcements here on day two was the Cloud Data Fusion thing, which is a tell sign as well: fusing data, moving data, integrating data is a critical thing for AI and machine learning, because AI is only as good as the data it's working with. If data is missing, say a retail transaction, you're potentially missing out on an opportunity for a better user experience. So addressability of data, having it accessible, is a critical feature for machine learning and AI, and again, it's garbage in, garbage out relative to the data equation. High-quality data gets high-quality machine learning. High-quality machine learning means high-quality AI. And that's what the cloud offers, with large compute and large horizontal scalability. >> Well, I said yesterday I was kind of disappointed there wasn't more talk about AI. Well, Google certainly made up for that today, didn't they? Stu — >> Yeah, sorry, what was the question? >> What was your favorite keynote moment today?
>> Look, it was good when they actually let a couple of customers go up there and talk. I was a little bit disappointed that some of the sessions felt a little too scripted for my taste, but they laid out a lot of pieces there. It takes a little while to squint through all of the adjustments and all the changes that they have. I'm still digging through, like on Anthos. We talked about it quite a bit yesterday, but I had some good conversations afterwards. They've got the Cloud Run announcement that's coming out this afternoon. But digging into that open source discussion you were just talking about on the database side is something I have a lot of interest in. I'm glad we actually have Red Hat on today; we'll get their opinion, since they know a thing or two about open source and communities. And how does something like OpenShift fit with Anthos? They can work together, but whether I'm PKS, or OpenShift, or the GKE-based Anthos, everything works back and forth, yet it's not seamless, and it sure ain't free. >> And note the customers: we heard from UPS, Scotiabank, Baker Hughes, McCasland, and we heard from Kohl's yesterday. So it's pretty high-level, senior people from the customer side speaking on stage, which is progress. >> And the CEO of UPS, I thought, was great. He really laid out the scale of their business and how they grow. >> All right guys, we've got day two kicking off here on the show floor in San Francisco for Google Cloud Next 2019. We've got coverage all day, every day — day two of three days of live coverage. Stay with us as we kick off a full day of great interviews: executives, entrepreneurs, and ecosystem partners here at Google Cloud Next. Stay with us for more after this short break.
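As a rough, hypothetical illustration of the spreadsheets-plus-BigQuery point Stu raises above — running standard SQL against a BigQuery dataset from a few lines of Python — here is a minimal sketch. The project ID is a placeholder, the public dataset is just a convenient example, and this is not code shown at the keynote.

```python
# Minimal sketch: querying BigQuery from Python. Assumes the
# google-cloud-bigquery package is installed and application-default
# credentials are configured; "my-demo-project" is a placeholder.
from google.cloud import bigquery

client = bigquery.Client(project="my-demo-project")

sql = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""

for row in client.query(sql).result():  # runs the query and waits for rows
    print(row.name, row.total)
```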

Published Date : Apr 10 2019


Meagen Eisenberg, TripActions | CUBEConversation, March 2019


 

from our studios in the heart of Silicon Valley Palo Alto California this is a cube conversation hello and welcome to this special cube conversation here in Palo Alto California cube headquarters I'm John Furrier host of the cube our guest here is Meagen Eisenberg CMO of a new hot company called trip actions formerly the CMO at MongoDB before that DocuSign we've known each other from some advisory boards great to see you yes great to see you as well so exciting new opportunity for you at trip actions just transitioned from MongoDB which by the way had great earnings they did what was the big secret to MongoDB's earnings tell us well it's fresh and I think they're executing and their growth is amazing they're bringing their costs down I mean they've got product market fit their developers love them and so I'm proud and not surprised you were there for four years yeah transformed their go-to-market so that fruit's coming off the tree yes yeah it's exciting to see the process people and technology all coming together and seeing them scale and do so well in the markets yes you know being here 20 years living in California Palo Alto you see the rocket ships the ones that flame out the ones that make it and there's a pattern right when you start to see companies that are attracting talent ones that have pedigree VCs involved yeah raising the kind of rounds in a smart way where there's traction product market fit you kind of take special notice and one of the companies that you're now working for trip actions yes seems to have the parameters so it's off the pad it's going up it's in orbit or taking off you guys are really growing you got a new round of funding one hundred fifty million dollars yes unique application in a market that is waiting to be disrupted yes travel tell us about the company you work for trip actions trip actions is a fast growing business travel platform we service customers like WeWork Slack Zoom and Box and we're growing we're adding 200 customers a month and it's amazing just to see these fast-growing companies right when they hit product market fit I think the keys are they've got a massive addressable market which we have 800 billion in online travel they're solving a pain and they're disrupting the legacy providers that are out there we're three and a half years old and we are really focused on the customer experience giving you the choice that you want when you book making it easy down to six minutes not an hour to book something and we've got 24/7 support which not many can compete with you know it's interesting I look at these different waves of innovation especially SaaS and mobile apps you know chapter one of this wave great economics yeah and once you get that unit economics visibility you say great SaaS-ification happened but now we're kind of in a chapter two I think you guys kind of fit into this chapter two where it's not just SaaS cuz you know we've seen travel sites get out there you book travel chapter two of SaaS is about personalization you see machine learning you got cloud economics new ventures are coming out of the woodwork where you could take a unique idea innovate on it and disrupt a category that seems to be what you guys are doing talk about this new dynamic because this is not just another travel app what you guys are doing gets a unique angle on this applying some tech talk about that this chapter two kind of SaaS business I think when I think about chapter 2 I think about all the data that's out there I think
about the machine learning I think about how we understand the user and personalize everything to them to make it frictionless and these apps that I love on my phone are because they they know what I want before I want it and I just took a trip to Dallas this week and the app knew I needed to check in it was one click told me my flight was delayed gave me options checked me in for my hotel I mean it was just amazing experience that I haven't seen before and it's really if you think about that that business travel trip there's 40 steps you have to do along the way there's got to be a way to make it easier because all we want to do is get to the business meeting and get back we don't want to deal with weather we don't want to deal with Hotel issues or flight changes and our app is specific to when you look at it you've got a chat 24/7 and someone's taking care of you that concierge service and we can do that because the amount of data we're looking at we're learning from it and we make it easier for travel manager half the people go rogue and don't even book through their travel solution it's because it's not tailored to them so this is the thing I want to get it so you guys aren't like a consumer app per se you have a specific unique target audience on this opportunity its travel management I'm I'm gonna date myself but back when I broke into the business they would have comes like Thomas Cook would handle all the travel for youlet Packard when I worked there in the 80s and you had these companies I had these contracts and they would do all the travel for the employees yes today it's hard to find that those solutions out there yes I would say it's hard to find one that you love and trip Actions has designed something that our travelers love and it is it's for business travel it's for your business trips it's taking care of your air your hotel your car your rail whatever you need and making sure that you can focus on the trip focus on getting there and not just the horrible experience we've all had it you travel a lot I traveled certainly back and forth to the East Coast and to take those problems away so I can focus on my business is what it's so just just look at this right so you guys are off to unicorn the funding great valuation growing like crazy got employees so people looking for jobs because they're hiring probably yeah but you're targeting not consumers to download the app it's for businesses that want to have company policies and take all that pressure off yes of the low so as a user can't buy myself can't just use the app or get I know you can Nano that's the the the whole thing is that as a user there's three things we're providing to one inventory and choice so you go and you know all the options you get the flight you want it's very clear and art we have a new storefront where it shows you what's in policy what's not so we've got that its ease of use it's booking quickly nobody wants to waste time dealing with this stuff right you want to go in booked quickly and then when you're on the trip you need 24/7 support because things go wrong airline travel gets cancelled weather happens you need to change something in your trip and so yes the user has the app on their phone can book it can you do it fast and can get support if they need it so stand alone usually can just use it as a consumer app but when you combine with business that's the magic that you guys see is that the opportunity yes I should say as a consumer as a business traveler so you're doing it through your company 
so I'm getting reimbursed for the companies the company is your customer yes the company's our customer is the traveler yes okay got it so if we want to have a travel desk in our company which we don't have yet yes it would we would sign up as a company and then all your employees would have the ease of use to book travel so what happens what's the sum of the numbers in terms of customers you have said 200 month-over-month yes we're over 1500 customers we're adding 200 a month we've got some significant growth it's amazing to see product market and the cost of the solution tell people $25 a booking and there's no add-on costs after that if you need to make as many changes as you need because of the trip calls on it you do it so basically per transaction yes well Little Feat one of our dollars yes okay so how do you guys see this growing for the company what's the some of the initiatives you guys are doing a new app yes mo what's what's the plan it's a massive market 800 billion right and we've only just started we've got a lot of customers but we've got many more to go after we are international so we have offices around the world we have an Amsterdam office we've got customers travelling all over so we're you know continuing to deliver on that experience and bringing on more customers we just on-boarded we were ten thousand travelers and will continue to onboard more and more so as head of marketing what's the current staff you have openings you mentioned yet some some some open recs yes yes hi are you gonna build out I've got 20 open Rex on the website so I'm hiring in all functions we're growing that fast and what's the marketing strategy what's your plan can you give it a little teaser on yes thinking core positioning go to market what are some of the things you're thinking about building out marketing CloudStack kind of thing what's what's going on all of these things my three top focuses are one marketing sales systems making sure we have that mark tech stack and that partnership with the sales tech stack second thing is marketing sales alignment that closed-loop we're building we're building pipeline making sure when people come in there's a perfect partnership to service what they need and then our our brand and messaging and it's the phase I love in these companies it's really building and it's the people process and technology to do that in the core positioning is what customer service being the most user-friendly what's the core position we're definitely focused on the traveler I would say we're we're balancing customer experience in making sure we get that adoption but also for the travel managers making sure that they can administer the solution and they get the adoption and we align the ascent in the incentives between the traveler and the travel manager and customer profile what small munis I business to large enterprise we have SMB and we're going all the way up to enterprise yes has it been much of a challenge out there in the business travel side I'm just don't know that's why I'm asking is like because we don't have one I can see our r-cube team having travel challenge we always do no centralizing that making that available but it'd have to be easier is it hard to get is there a lot of business travel firms out there is what are some of the challenges that you guys are going after there well I I think what matters is one picking the solution and being able to implement it quickly we have customers implementing in a week right it's understanding how we load your policies 
get you on board get your cut you're you're really your employees traveling and so it's pretty fast onboarding and we're able to tailor solutions to what people need what are some of the policies that are typical that might be out there that people like yeah so maybe for hotels you may have New York and your your policy is $500 a night what the I would say a normal typical behavior would someone would book it at $4.99 they go all the way up to the limit we've actually aligned our incentives with the travel managers and the employees and that if you save your company money you save and get rewards back so let's say you book it for 400 that $100 savings $30 goes back to the employee and rewards they can get an Amazon card donate to Cherry charity whatever they'd like to kind of act like an owner cuz they get a kickback yes that's the dot so that's how you an interest adoption yes what other adoption concerns you guys building around with the software and or programs to make it easy to use and we're constantly thinking about the experience we want to make sure just I mean I think about what I used to drive somewhere I'd pull out a map and map it out and then I got lucky and you could do MapQuest and now you have ways we are that ways experience when you're traveling we're thinking about everything you need to do that customer when they leave their front door all the way to the trip all the things that can hang them up along the way we're trying to remove that friction that's a very example I mean Waze is a great service yes these Google Maps or even Apple Maps ways everyone goes to backed away yes yeah I don't I mean ways did cause a lot of Street congestion the back streets of Palo Alto we're gonna expedite our travelers well it's a great utility new company what what attracted you to the opportunity when was some of the because you had a kid going over there MongoDB what it was the yeah motivation to come over to the hot startup yeah you know I love disruptive companies I love massive addressable markets good investors and a awesome mission that I can get behind you know I'm a mom of three kids and I did a lot of travel I'm your typical road warrior and I wanted to get rid of the pain of travel and the booking systems that existed before trip actions and so I was drawn to the team the market and the product that's awesome well you've been a great CMO your career has been phenomenal of great success as a CPM mother of three you know the challenges of juggling all this life is short you got to be using these apps to make sure you get on the right plane I mean I know I'm always getting back for my son's lacrosse game or yes event at school this is these are like it's like ways it's not necessary in the travel portfolio but it's a dynamic that the users care about this is the kind of thing that you guys are thinking about is that right yeah definitely I mean I always think about my mom when she worked in having three daughters and I work and have three daughters I feel like I can do so much more I've got door - I've got urban sitter I've got ways I've got Google Calendar I've got trip actions right I've got all these technologies that allow me to do more and not focus on things that are not that productive and I have no value add on it just makes me more efficient and productive how about some of the tech before we get in some of the industry questions I want to talk about some of the advantages on the tech side is there any machine learning involved what's some what's not what's some of the 
secret sauce and the app yeah definitely we're constantly learning our users preferences so when you go in we start to learn what you what hotels you're gonna select what where do you like to be near the office do you like to be near downtown we're looking at your flights do aisle window nobody wants middle yes but we're we're learning about your behaviors and we can predict pretty closely one if you're gonna book and two what you're gonna book and as we continue learning you that's why we make you more efficient that's why we can do it in six minutes instead of an hour that's awesome so Megan a lot of things going on you've been a progressive marker you love Terry's tech savvy you've done a lot of implementations but we're in a sea change now where you know people that think differently they gonna think okay I need to be on an app for your case with with business travel it's real policies there so you want to also make it good for the user experience again people centric this personalization has been kind of a cutting edge concept now in this chapter to a lot of CMOS are either they're they're not are trying to get there what are you finding in the industry these days that's a best practice to help people cross that bridge as they think they cracked the code on one side then realize wow it's a whole another chapter to go you know I think traditionally a lot of times we think we need we're aligning very much with sales and that matters that go to market marketing sales aligned but when it comes to products and a customer experience it's that alignment with marketing and the product and engineering team and really understanding the customer and what they want and listening and hearing and testing and and making sure we're partnering in those functions in terms of distribution getting the earned concept what's your thoughts on her and media yeah I mean I definitely think it's the direction right there's a ton of noise out there so you've got to be on topic you've got to understand what people care about you've got to hit them in the channel that they care about and very quick right is you don't have time nobody's gonna watch something that's 30 minutes long you get seconds and so part of the earned is making sure you're relevant you what they care about and they can find you and content big part of that for you guys huge part of it yes and understanding the influencers in the market who's talking about travel who's who is out there leading ahead you know leading in these areas that travel managers go and look to you know making sure we're in front of them and they get to see what we're delivering I like how you got the incentives of the employees to get kind of a line with the business I mean having that kind of the perks yes if you align with the company policies the reward could be a Starbucks card or vacation one more time oh whatever they the company want this is kind of the idea right yeah they kind of align the incentives and make the user experience both during travel and post travel successful that's right yes making sure that they are incented to go but they have a great experience okay if you explain the culture of the company to someone watching then maybe interested in using the app or buying you guys as a team what's the trip actions culture like if you had to describe it yeah I would say one we love travel too we are fast growing scaling and we're always raising the bar and so it's learning and it's moving fast but learning from it and continually to improve it's certainly 
about the user all of the users so not just the travel manager but our travelers themselves we love dogs if you ever come to the Palo Alto office we've got a lot of dogs we love our pups and just you know building something amazing and it's hard to be the employees gonna know that's a rocket ship so it's great get a hold on you got a run hard yes that's the right personality to handle the pace because you're hiring a lot of people and I think that's a part of the learning we need continual learning because we are scaling so fast you have to reinvent what we need to do next and not a lot of people have seen that type of scale and in order to do it you have to learn and help others learn and move fast well great to see you thanks for coming in and sharing the opportunity to give you the final plug for the company share what who you what positions you're hiring for what's your key hires what are you guys trying to do give a quick plug to the company yeah so I mean we've grown 5x and employees so we're hiring across the board from a marketing standpoint I'm hiring in content and product marketing I'm hiring designers I'm hiring technical I you know I love my marketing technology so we're building out our tech stack our website pretty much any function all right you heard it here trip actions so when you get the product visibility those unit economics as they say in the VC world they've got a rocket ship so congratulations keep it up yeah now you're in palo alto you can come visit us here anytime yes love to Meagen Eisenberg CMO trip access here inside the cube I'm John Ferrier thanks for watching you [Music]
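As a purely illustrative sketch of the "will they book, and what will they book" idea Meagen describes above — learning a traveler's preferences and scoring a candidate offer — here is a toy propensity model. The features, data, and thresholds are invented for illustration; this is not TripActions' actual system.

```python
# Toy booking-propensity sketch: train a classifier on hypothetical features
# (price vs. policy cap, distance to the office, familiarity with the chain)
# and score one candidate hotel offer. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 500 hypothetical past offers: price ratio vs. policy cap (0-2),
# distance to office in miles (0-10), booked this chain before (0-1).
X = rng.random((500, 3)) * np.array([2.0, 10.0, 1.0])
# Toy label: cheaper, closer, or familiar options are more likely to be booked.
y = (((X[:, 0] < 1.0) & (X[:, 1] < 5.0)) | (X[:, 2] > 0.5)).astype(int)

model = LogisticRegression().fit(X, y)

candidate = np.array([[0.8, 2.5, 1.0]])  # one candidate hotel offer
print("booking probability:", model.predict_proba(candidate)[0, 1])
```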

Published Date : Mar 15 2019


Data Science for All: It's a Whole New Game


 

>> There's a movement that's sweeping across businesses everywhere here in this country and around the world. And it's all about data. Today businesses are being inundated with data. To the tune of over two and a half million gigabytes that'll be generated in the next 60 seconds alone. What do you do with all that data? To extract insights you typically turn to a data scientist. But not necessarily anymore. At least not exclusively. Today the ability to extract value from data is becoming a shared mission. A team effort that spans the organization extending far more widely than ever before. Today, data science is being democratized. >> Data Sciences for All: It's a Whole New Game. >> Welcome everyone, I'm Katie Linendoll. I'm a technology expert writer and I love reporting on all things tech. My fascination with tech started very young. I began coding when I was 12. Received my networking certs by 18 and a degree in IT and new media from Rochester Institute of Technology. So as you can tell, technology has always been a sure passion of mine. Having grown up in the digital age, I love having a career that keeps me at the forefront of science and technology innovations. I spend equal time in the field being hands on as I do on my laptop conducting in depth research. Whether I'm diving underwater with NASA astronauts, witnessing the new ways which mobile technology can help rebuild the Philippine's economy in the wake of super typhoons, or sharing a first look at the newest iPhones on The Today Show, yesterday, I'm always on the hunt for the latest and greatest tech stories. And that's what brought me here. I'll be your host for the next hour and as we explore the new phenomenon that is taking businesses around the world by storm. And data science continues to become democratized and extends beyond the domain of the data scientist. And why there's also a mandate for all of us to become data literate. Now that data science for all drives our AI culture. And we're going to be able to take to the streets and go behind the scenes as we uncover the factors that are fueling this phenomenon and giving rise to a movement that is reshaping how businesses leverage data. And putting organizations on the road to AI. So coming up, I'll be doing interviews with data scientists. We'll see real world demos and take a look at how IBM is changing the game with an open data science platform. We'll also be joined by legendary statistician Nate Silver, founder and editor-in-chief of FiveThirtyEight. Who will shed light on how a data driven mindset is changing everything from business to our culture. We also have a few people who are joining us in our studio, so thank you guys for joining us. Come on, I can do better than that, right? Live studio audience, the fun stuff. And for all of you during the program, I want to remind you to join that conversation on social media using the hashtag DSforAll, it's data science for all. Share your thoughts on what data science and AI means to you and your business. And, let's dive into a whole new game of data science. Now I'd like to welcome my co-host General Manager IBM Analytics, Rob Thomas. >> Hello, Katie. >> Come on guys. >> Yeah, seriously. >> No one's allowed to be quiet during this show, okay? >> Right. >> Or, I'll start calling people out. So Rob, thank you so much. I think you know this conversation, we're calling it a data explosion happening right now. And it's nothing new. And when you and I chatted about it. You've been talking about this for years. 
You have to ask, is this old news at this point? >> Yeah, I mean, well first of all, the data explosion is not coming, it's here. And everybody's in the middle of it right now. What is different is the economics have changed. And the scale and complexity of the data that organizations are having to deal with has changed. And to this day, 80% of the data in the world still sits behind corporate firewalls. So, that's becoming a problem. It's becoming unmanageable. IT struggles to manage it. The business can't get everything they need. Consumers can't consume it when they want. So we have a challenge here. >> It's challenging in the world of unmanageable. Crazy complexity. If I'm sitting here as an IT manager of my business, I'm probably thinking to myself, this is incredibly frustrating. How in the world am I going to get control of all this data? And probably not just me thinking it. Many individuals here as well. >> Yeah, indeed. Everybody's thinking about how am I going to put data to work in my organization in a way I haven't done before. Look, you've got to have the right expertise, the right tools. The other thing that's happening in the market right now is clients are dealing with multi cloud environments. So data behind the firewall in private cloud, multiple public clouds. And they have to find a way. How am I going to pull meaning out of this data? And that brings us to data science and AI. That's how you get there. >> I understand the data science part but I think we're all starting to hear more about AI. And it's incredible that this buzz word is happening. How do businesses adopt to this AI growth and boom and trend that's happening in this world right now? >> Well, let me define it this way. Data science is a discipline. And machine learning is one technique. And then AI puts both machine learning into practice and applies it to the business. So this is really about how getting your business where it needs to go. And to get to an AI future, you have to lay a data foundation today. I love the phrase, "there's no AI without IA." That means you're not going to get to AI unless you have the right information architecture to start with. >> Can you elaborate though in terms of how businesses can really adopt AI and get started. >> Look, I think there's four things you have to do if you're serious about AI. One is you need a strategy for data acquisition. Two is you need a modern data architecture. Three is you need pervasive automation. And four is you got to expand job roles in the organization. >> Data acquisition. First pillar in this you just discussed. Can we start there and explain why it's so critical in this process? >> Yeah, so let's think about how data acquisition has evolved through the years. 15 years ago, data acquisition was about how do I get data in and out of my ERP system? And that was pretty much solved. Then the mobile revolution happens. And suddenly you've got structured and non-structured data. More than you've ever dealt with. And now you get to where we are today. You're talking terabytes, petabytes of data. >> [Katie] Yottabytes, I heard that word the other day. >> I heard that too. >> Didn't even know what it meant. >> You know how many zeros that is? >> I thought we were in Star Wars. >> Yeah, I think it's a lot of zeroes. >> Yodabytes, it's new. >> So, it's becoming more and more complex in terms of how you acquire data. So that's the new data landscape that every client is dealing with. 
And if you don't have a strategy for how you acquire that and manage it, you're not going to get to that AI future. >> So a natural segue, if you are one of these businesses, how do you build for the data landscape? >> Yeah, so the question I always hear from customers is we need to evolve our data architecture to be ready for AI. And the way I think about that is it's really about moving from static data repositories to more of a fluid data layer. >> And we continue with the architecture. New data architecture is an interesting buzz word to hear. But it's also one of the four pillars. So if you could dive in there. >> Yeah, I mean it's a new twist on what I would call some core data science concepts. For example, you have to leverage tools with a modern, centralized data warehouse. But your data warehouse can't be stagnant to just what's right there. So you need a way to federate data across different environments. You need to be able to bring your analytics to the data because it's most efficient that way. And ultimately, it's about building an optimized data platform that is designed for data science and AI. Which means it has to be a lot more flexible than what clients have had in the past. >> All right. So we've laid out what you need for driving automation. But where does the machine learning kick in? >> Machine learning is what gives you the ability to automate tasks. And I think about machine learning. It's about predicting and automating. And this will really change the roles of data professionals and IT professionals. For example, a data scientist cannot possibly know every algorithm or every model that they could use. So we can automate the process of algorithm selection. Another example is things like automated data matching. Or metadata creation. Some of these things may not be exciting but they're hugely practical. And so when you think about the real use cases that are driving return on investment today, it's things like that. It's automating the mundane tasks. >> Let's go ahead and come back to something that you mentioned earlier because it's fascinating to be talking about this AI journey, but also significant is the new job roles. And what are those other participants in the analytics pipeline? >> Yeah I think we're just at the start of this idea of new job roles. We have data scientists. We have data engineers. Now you see machine learning engineers. Application developers. What's really happening is that data scientists are no longer allowed to work in their own silo. And so the new job roles is about how does everybody have data first in their mind? And then they're using tools to automate data science, to automate building machine learning into applications. So roles are going to change dramatically in organizations. >> I think that's confusing though because we have several organizations who saying is that highly specialized roles, just for data science? Or is it applicable to everybody across the board? >> Yeah, and that's the big question, right? Cause everybody's thinking how will this apply? Do I want this to be just a small set of people in the organization that will do this? But, our view is data science has to for everybody. It's about bring data science to everybody as a shared mission across the organization. Everybody in the company has to be data literate. And participate in this journey. >> So overall, group effort, has to be a common goal, and we all need to be data literate across the board. >> Absolutely. >> Done deal. 
But at the end of the day, it's kind of not an easy task. >> It's not. It's not easy but it's maybe not as big of a shift as you would think. Because you have to put data in the hands of people that can do something with it. So, it's very basic. Give access to data. Data's often locked up in a lot of organizations today. Give people the right tools. Embrace the idea of choice or diversity in terms of those tools. That gets you started on this path. >> It's interesting to hear you say essentially you need to train everyone though across the board when it comes to data literacy. And I think people that are coming into the work force don't necessarily have a background or a degree in data science. So how do you manage? >> Yeah, so in many cases that's true. I will tell you some universities are doing amazing work here. One example, University of California Berkeley. They offer a course for all majors. So no matter what you're majoring in, you have a course on foundations of data science. How do you bring data science to every role? So it's starting to happen. We at IBM provide data science courses through CognitiveClass.ai. It's for everybody. It's free. And look, if you want to get your hands on code and just dive right in, you go to datascience.ibm.com. The key point is this though. It's more about attitude than it is aptitude. I think anybody can figure this out. But it's about the attitude to say we're putting data first and we're going to figure out how to make this real in our organization. >> I also have to give a shout out to my alma mater because I have heard that there is an offering in MS in data analytics. And they are always on the forefront of new technologies and new majors and on trend. And I've heard that the placement behind those jobs, people graduating with the MS is high. >> I'm sure it's very high. >> So go Tigers. All right, tangential. Let me get back to something else you touched on earlier because you mentioned that a number of customers ask you how in the world do I get started with AI? It's an overwhelming question. Where do you even begin? What do you tell them? >> Yeah, well things are moving really fast. But the good thing is most organizations I see, they're already on the path, even if they don't know it. They might have a BI practice in place. They've got data warehouses. They've got data lakes. Let me give you an example. AMC Networks. They produce a lot of the shows that I'm sure you watch Katie. >> [Katie] Yes, Breaking Bad, Walking Dead, any fans? >> [Rob] Yeah, we've got a few. >> [Katie] Well you taught me something I didn't even know. Because it's amazing how we have all these different industries, but yet media in itself is impacted too. And this is a good example. >> Absolutely. So, AMC Networks, think about it. They've got ads to place. They want to track viewer behavior. What do people like? What do they dislike? So they have to optimize every aspect of their business from marketing campaigns to promotions to scheduling to ads. And their goal was transform data into business insights and really take the burden off of their IT team that was heavily burdened by obviously a huge increase in data. So their VP of BI took the approach of using machine learning to process large volumes of data. They used a platform that was designed for AI and data processing. It's the IBM analytics system where it's a data warehouse, data science tools are built in. It has in memory data processing. And just like that, they were ready for AI. 
And they're already seeing that impact in their business. >> Do you think a movement of that nature kind of presses other media conglomerates and organizations to say we need to be doing this too? >> I think it's inevitable that everybody, you're either going to be playing, you're either going to be leading, or you'll be playing catch up. And so, as we talk to clients we think about how do you start down this path now, even if you have to iterate over time? Because otherwise you're going to wake up and you're going to be behind. >> One thing worth noting is we've talked about analytics to the data. It's analytics first to the data, not the other way around. >> Right. So, look. We as a practice, we say you want to bring data to where the data sits. Because it's a lot more efficient that way. It gets you better outcomes in terms of how you train models and it's more efficient. And we think that leads to better outcomes. Other organization will say, "Hey move the data around." And everything becomes a big data movement exercise. But once an organization has started down this path, they're starting to get predictions, they want to do it where it's really easy. And that means analytics applied right where the data sits. >> And worth talking about the role of the data scientist in all of this. It's been called the hot job of the decade. And a Harvard Business Review even dubbed it the sexiest job of the 21st century. >> Yes. >> I want to see this on the cover of Vogue. Like I want to see the first data scientist. Female preferred, on the cover of Vogue. That would be amazing. >> Perhaps you can. >> People agree. So what changes for them? Is this challenging in terms of we talk data science for all. Where do all the data science, is it data science for everyone? And how does it change everything? >> Well, I think of it this way. AI gives software super powers. It really does. It changes the nature of software. And at the center of that is data scientists. So, a data scientist has a set of powers that they've never had before in any organization. And that's why it's a hot profession. Now, on one hand, this has been around for a while. We've had actuaries. We've had statisticians that have really transformed industries. But there are a few things that are new now. We have new tools. New languages. Broader recognition of this need. And while it's important to recognize this critical skill set, you can't just limit it to a few people. This is about scaling it across the organization. And truly making it accessible to all. >> So then do we need more data scientists? Or is this something you train like you said, across the board? >> Well, I think you want to do a little bit of both. We want more. But, we can also train more and make the ones we have more productive. The way I think about it is there's kind of two markets here. And we call it clickers and coders. >> [Katie] I like that. That's good. >> So, let's talk about what that means. So clickers are basically somebody that wants to use tools. Create models visually. It's drag and drop. Something that's very intuitive. Those are the clickers. Nothing wrong with that. It's been valuable for years. There's a new crop of data scientists. They want to code. They want to build with the latest open source tools. They want to write in Python or R. These are the coders. And both approaches are viable. Both approaches are critical. Organizations have to have a way to meet the needs of both of those types. 
And there's not a lot of things available today that do that. >> Well let's keep going on that. Because I hear you talking about the data scientists role and how it's critical to success, but with the new tools, data science and analytics skills can extend beyond the domain of just the data scientist. >> That's right. So look, we're unifying coders and clickers into a single platform, which we call IBM Data Science Experience. And as the demand for data science expertise grows, so does the need for these kind of tools. To bring them into the same environment. And my view is if you have the right platform, it enables the organization to collaborate. And suddenly you've changed the nature of data science from an individual sport to a team sport. >> So as somebody that, my background is in IT, the question is really is this an additional piece of what IT needs to do in 2017 and beyond? Or is it just another line item to the budget? >> So I'm afraid that some people might view it that way. As just another line item. But, I would challenge that and say data science is going to reinvent IT. It's going to change the nature of IT. And every organization needs to think about what are the skills that are critical? How do we engage a broader team to do this? Because once they get there, this is the chance to reinvent how they're performing IT. >> [Katie] Challenging or not? >> Look it's all a big challenge. Think about everything IT organizations have been through. Some of them were late to things like mobile, but then they caught up. Some were late to cloud, but then they caught up. I would just urge people, don't be late to data science. Use this as your chance to reinvent IT. Start with this notion of clickers and coders. This is a seminal moment. Much like mobile and cloud was. So don't be late. >> And I think it's critical because it could be so costly to wait. And Rob and I were even chatting earlier how data analytics is just moving into all different kinds of industries. And I can tell you even personally being effected by how important the analysis is in working in pediatric cancer for the last seven years. I personally implement virtual reality headsets to pediatric cancer hospitals across the country. And it's great. And it's working phenomenally. And the kids are amazed. And the staff is amazed. But the phase two of this project is putting in little metrics in the hardware that gather the breathing, the heart rate to show that we have data. Proof that we can hand over to the hospitals to continue making this program a success. So just in-- >> That's a great example. >> An interesting example. >> Saving lives? >> Yes. >> That's also applying a lot of what we talked about. >> Exciting stuff in the world of data science. >> Yes. Look, I just add this is an existential moment for every organization. Because what you do in this area is probably going to define how competitive you are going forward. And think about if you don't do something. What if one of your competitors goes and creates an application that's more engaging with clients? So my recommendation is start small. Experiment. Learn. Iterate on projects. Define the business outcomes. Then scale up. It's very doable. But you've got to take the first step. >> First step always critical. And now we're going to get to the fun hands on part of our story. Because in just a moment we're going to take a closer look at what data science can deliver. And where organizations are trying to get to. All right. 
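Before the demo, here is a rough sketch of what the "automating algorithm selection" idea Rob described earlier can look like in practice: cross-validate a few candidate models and keep the one that scores best. This is a generic illustration under assumed tooling (scikit-learn and a built-in sample dataset), not IBM's implementation.

```python
# Generic sketch of automated algorithm selection: evaluate several candidate
# models with cross-validation and select the best performer.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Mean 5-fold cross-validation accuracy for each candidate.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}

best_name = max(scores, key=scores.get)
print(scores)
print("selected model:", best_name)

best_model = candidates[best_name].fit(X, y)  # refit the winner on all data
```

In a production system the candidate list, search over hyperparameters, and data preparation would all be automated as well; the point here is only the selection loop itself.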
Thank you Rob and now we've been joined by Siva Anne who is going to help us navigate this demo. First, welcome Siva. Give him a big round of applause. Yeah. All right, Rob, break down what we're going to be looking at. You take over this demo. >> All right. So this is going to be pretty interesting. So Siva is going to take us through. So he's going to play the role of a financial adviser who wants to help better serve clients through recommendations. And I'm going to really illustrate three things. One is how do you federate data from multiple data sources? Inside the firewall, outside the firewall. How do you apply machine learning to predict and to automate? And then how do you move analytics closer to your data? So, what you're seeing here is a custom application for an investment firm. So, Siva, our financial adviser, welcome. So you can see at the top, we've got market data. We pulled that from an external source. And then we've got Siva's calendar in the middle. He's got clients on the right side. So page down, what else do you see down there Siva? >> [Siva] I can see the recent market news. And in here I can see that JP Morgan is calling for a US dollar rebound in the second half of the year. And I have an upcoming meeting with Leo Rakes. I can get-- >> [Rob] So let's go in there. Why don't you click on Leo Rakes. So, you're sitting at your desk, you're deciding how you're going to spend the day. You know you have a meeting with Leo. So you click on it. You immediately see, all right, so what do we know about him? We've got data governance implemented. So we know his age, we know his degree. We can see he's not that aggressive of a trader. Only six trades in the last few years. But then where it gets interesting is you go to the bottom. You start to see predicted industry affinity. Where did that come from? How do we have that? >> [Siva] So these green lines and red arrows here indicate the trending affinity of Leo Rakes for particular industry stocks. What we've done here is we've built machine learning models using the customer's demographic data, his stock portfolios, and browsing behavior to build a model which can predict his affinity for a particular industry. >> [Rob] Interesting. So, I like to think of this, we call it celebrity experiences. So how do you treat every customer like they're a celebrity? So to some extent, we're reading his mind. Because without asking him, we know that he's going to have an affinity for auto stocks. So we go down. Now we look at his portfolio. You can see okay, he's got some different holdings. He's got Amazon, Google, Apple, and then he's got RACE, which is the ticker for Ferrari. You can see that's done incredibly well. And so, as a financial adviser, you look at this and you say, all right, we know he loves auto stocks. Ferrari's done very well. Let's create a hedge. Like what kind of security would interest him as a hedge against his position for Ferrari? Could we go figure that out? >> [Siva] Yes. Given I know that he's got an affinity for auto stocks, and I also see that Ferrari has got some tremendous gains, I want to lock in these gains by hedging. And I want to do that by picking an auto stock which has got negative correlation with Ferrari. >> [Rob] So this is where we get to the idea of in-database analytics. Cause you start clicking that and immediately we're getting instant answers of what's happening. So what did we find here? We're going to compare Ferrari and Honda. >> [Siva] I'm going to compare Ferrari with Honda.
And what I see here instantly is that Honda has got a negative correlation with Ferrari, which makes it a perfect mix for his stock portfolio. Given he has an affinity for auto stocks and it correlates negatively with Ferrari. >> [Rob] These are very powerful tools in the hands of a financial adviser. You think about it. As a financial adviser, you wouldn't think about federating data, machine learning, pretty powerful. >> [Siva] Yes. So what we have seen here is that using the common SQL engine, we've been able to federate queries across multiple data sources. Db2 Warehouse on Cloud, IBM's Integrated Analytic System, and the Hortonworks powered Hadoop platform for the news feeds. We've been able to use machine learning to derive innovative insights about his stock affinities. And drive the machine learning into the appliance. Closer to where the data resides to deliver high performance analytics. >> [Rob] At scale? >> [Siva] We're able to run millions of these correlations across stocks, currency, other factors. And even score hundreds of customers for their affinities on a daily basis. >> That's great. Siva, thank you for playing the role of financial adviser. So I just want to recap briefly. Cause this is really powerful technology that's really simple. So we federated, we aggregated multiple data sources from all over the web and internal systems. And public cloud systems. Machine learning models were built that predicted Leo's affinity for a certain industry. In this case, automotive. And then you see when you deploy analytics next to your data, even a financial adviser, just with the click of a button, is getting instant answers so they can go be more productive in their next meeting. This whole idea of celebrity experiences for your customer, that's available for everybody, if you take advantage of these types of capabilities. Katie, I'll hand it back to you. >> Good stuff. Thank you Rob. Thank you Siva. Powerful demonstration on what we've been talking about all afternoon. And thank you again to Siva for helping us navigate. Should we give him one more round of applause? We're going to be back in just a moment to look at how we operationalize all of this data. But first, here's a message from me. If you're a part of a line of business, your main fear is disruption. You know data is the new gold that can create huge amounts of value. So does your competition. And they may be beating you to it. You're convinced there are new business models and revenue sources hidden in all the data. You just need to figure out how to leverage it. But with the scarcity of data scientists, you really can't rely solely on them. You may need more people throughout the organization that have the ability to extract value from data. And as a data science leader or data scientist, you have a lot of the same concerns. You spend way too much time looking for, prepping, and interpreting data and waiting for models to train. You know you need to operationalize the work you do to provide business value faster. What you want is an easier way to do data prep. And rapidly build models that can be easily deployed, monitored and automatically updated. So whether you're a data scientist, data science leader, or in a line of business, what's the solution? What'll it take to transform the way you work? That's what we're going to explore next. All right, now it's time to delve deeper into the nuts and bolts. The nitty gritty of operationalizing data science and creating a data driven culture. How do you actually do that?
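A minimal sketch of the negative-correlation check Siva describes, assuming pandas and hypothetical daily prices for the two tickers; the real demo federates this query across Db2 Warehouse, the Integrated Analytics System, and Hadoop, which is not reproduced here.

```python
# Hypothetical daily closing prices for two tickers; in the demo this data
# lives in federated warehouses, here it is just a local DataFrame.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
days = pd.date_range("2017-01-02", periods=250, freq="B")
base = rng.normal(0, 0.01, 250)  # a common factor the two stocks load on with opposite signs
race = 100 * np.cumprod(1 + base + rng.normal(0.001, 0.01, 250))  # "RACE"
hmc = 30 * np.cumprod(1 - base + rng.normal(0.000, 0.01, 250))    # "HMC"
prices = pd.DataFrame({"RACE": race, "HMC": hmc}, index=days)

# Correlate daily returns, not raw prices, to judge a hedge.
returns = prices.pct_change().dropna()
corr = returns["RACE"].corr(returns["HMC"])
print(f"return correlation: {corr:.2f}")  # negative => candidate hedge
```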
Well that's what these experts are here to share with us. I'm joined by Nir Kaldero, who's head of data science at Galvanize, which is an education and training organization. Tricia Wang, who is co-founder of Sudden Compass, a consultancy that helps companies understand people with data. And last, but certainly not least, Michael Li, founder and CEO of Data Incubator, which is a data science train company. All right guys. Shall we get right to it? >> All right. >> So data explosion happening right now. And we are seeing it across the board. I just shared an example of how it's impacting my philanthropic work in pediatric cancer. But you guys each have so many unique roles in your business life. How are you seeing it just blow up in your fields? Nir, your thing? >> Yeah, for example like in Galvanize we train many Fortune 500 companies. And just by looking at the demand of companies that wants us to help them go through this digital transformation is mind-blowing. Data point by itself. >> Okay. Well what we're seeing what's going on is that data science like as a theme, is that it's actually for everyone now. But what's happening is that it's actually meeting non technical people. But what we're seeing is that when non technical people are implementing these tools or coming at these tools without a base line of data literacy, they're often times using it in ways that distance themselves from the customer. Because they're implementing data science tools without a clear purpose, without a clear problem. And so what we do at Sudden Compass is that we work with companies to help them embrace and understand the complexity of their customers. Because often times they are misusing data science to try and flatten their understanding of the customer. As if you can just do more traditional marketing. Where you're putting people into boxes. And I think the whole ROI of data is that you can now understand people's relationships at a much more complex level at a greater scale before. But we have to do this with basic data literacy. And this has to involve technical and non technical people. >> Well you can have all the data in the world, and I think it speaks to, if you're not doing the proper movement with it, forget it. It means nothing at the same time. >> No absolutely. I mean, I think that when you look at the huge explosion in data, that comes with it a huge explosion in data experts. Right, we call them data scientists, data analysts. And sometimes they're people who are very, very talented, like the people here. But sometimes you have people who are maybe re-branding themselves, right? Trying to move up their title one notch to try to attract that higher salary. And I think that that's one of the things that customers are coming to us for, right? They're saying, hey look, there are a lot of people that call themselves data scientists, but we can't really distinguish. So, we have sort of run a fellowship where you help companies hire from a really talented group of folks, who are also truly data scientists and who know all those kind of really important data science tools. And we also help companies internally. Fortune 500 companies who are looking to grow that data science practice that they have. And we help clients like McKinsey, BCG, Bain, train up their customers, also their clients, also their workers to be more data talented. And to build up that data science capabilities. >> And Nir, this is something you work with a lot. A lot of Fortune 500 companies. 
And when we were speaking earlier, you were saying many of these companies can be in a panic. >> Yeah. >> Explain that. >> Yeah, so you know, not all Fortune 500 companies are fully data driven. And we know that the winners in this fourth industrial revolution, which I like to call the machine intelligence revolution, will be companies who navigate and transform their organization to unlock the power of data science and machine learning. And the companies that are not like that. Or not utilize data science and predictive power well, will pretty much get shredded. So they are in a panic. >> Tricia, companies have to deal with data behind the firewall and in the new multi cloud world. How do organizations start to become driven right to the core? >> I think the most urgent question to become data driven that companies should be asking is how do I bring the complex reality that our customers are experiencing on the ground in to a corporate office? Into the data models. So that question is critical because that's how you actually prevent any big data disasters. And that's how you leverage big data. Because when your data models are really far from your human models, that's when you're going to do things that are really far off from how, it's going to not feel right. That's when Tesco had their terrible big data disaster that they're still recovering from. And so that's why I think it's really important to understand that when you implement big data, you have to further embrace thick data. The qualitative, the emotional stuff, that is difficult to quantify. But then comes the difficult art and science that I think is the next level of data science. Which is that getting non technical and technical people together to ask how do we find those unknown nuggets of insights that are difficult to quantify? Then, how do we do the next step of figuring out how do you mathematically scale those insights into a data model? So that actually is reflective of human understanding? And then we can start making decisions at scale. But you have to have that first. >> That's absolutely right. And I think that when we think about what it means to be a data scientist, right? I always think about it in these sort of three pillars. You have the math side. You have to have that kind of stats, hardcore machine learning background. You have the programming side. You don't work with small amounts of data. You work with large amounts of data. You've got to be able to type the code to make those computers run. But then the last part is that human element. You have to understand the domain expertise. You have to understand what it is that I'm actually analyzing. What's the business proposition? And how are the clients, how are the users actually interacting with the system? That human element that you were talking about. And I think having somebody who understands all of those and not just in isolation, but is able to marry that understanding across those different topics, that's what makes a data scientist. >> But I find that we don't have people with those skill sets. And right now the way I see teams being set up inside companies is that they're creating these isolated data unicorns. These data scientists that have graduated from your programs, which are great. But, they don't involve the people who are the domain experts. They don't involve the designers, the consumer insight people, the people, the salespeople. The people who spend time with the customers day in and day out. Somehow they're left out of the room. 
They're consulted, but they're not a stakeholder. >> Can I actually >> Yeah, yeah please. >> Can I actually give a quick example? So for example, we at Galvanize train the executives and the managers. And then the technical people, the data scientists and the analysts. But in order to actually see all of the ROI behind the data, you also have to have a creative, fluid conversation between non technical and technical people. And this is a major trend now. And there's a major gap. And we need to increase awareness and kind of like create a new kind of environment where technical people also talk seamlessly with non technical ones. >> [Tricia] We call-- >> That's one of the things that we see a lot. Is one of the trends in-- >> A major trend. >> data science training is it's not just for the data science technical experts. It's not just for one type of person. So a lot of the training we do is sort of data engineers. People who are more on the software engineering side learning more about the stats and the math. And then people who are sort of traditionally on the stat side learning more about the engineering. And then managers and people who are data analysts learning about both. >> Michael, I think you said something that was of interest too because I think we can look at IBM Watson as an example. And working in healthcare. The human component. Because often times we talk about machine learning and AI, and data, and you get worried that you still need that human component. Especially in the world of healthcare. And I think that's a very strong point when it comes to the data analysis side. Is there any particular example you can speak to of that? >> So I think that there was this really excellent paper a while ago talking about all the neural net stuff trained on textual data. So looking at sort of different corpuses. And they found that these models were highly, highly sexist. They would read these corpuses and it's not because neural nets themselves are sexist. It's because they're reading the things that we write. And it turns out that we write kind of sexist things. And they would sort of find all these patterns in there that were sort of latent, that had a lot of things that maybe we would cringe at if we saw them. And I think that's one of the really important aspects of the human element, right? It's being able to come in and sort of say like, okay, I know what the biases of the system are, I know what the biases of the tools are. I need to figure out how to use that to make the tools, make the world a better place. And like another area where this comes up all the time is lending, right? So the federal government has said, and we have a lot of clients in the financial services space, so they're constantly under these kind of rules that they can't engage in discriminatory lending practices based on a whole set of protected categories. Race, sex, gender, things like that. But it's very easy when you train a model on credit scores to pick that up. And then to have a model that's inadvertently sexist or racist. And that's where you need the human element to come back in and say okay, look, the classic example would be zip code, you're using zip code as a variable. But when you look at it, zip code is actually highly correlated with race. And you can't do that.
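A minimal sketch of the proxy-variable problem Michael describes: before letting a model use a seemingly neutral feature like zip code, check how much it reveals about a protected attribute. The data, feature encoding, and threshold here are hypothetical.

```python
# Hypothetical proxy check: how well does zip code alone predict a
# protected attribute? If it predicts it well, a model trained on zip code
# can discriminate even though the protected attribute was never a feature.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 5000
zip_code = rng.choice([f"100{i:02d}" for i in range(20)], size=n)
# Synthetic protected attribute that is unevenly distributed across zips.
p_by_zip = {z: rng.uniform(0.1, 0.9) for z in np.unique(zip_code)}
protected = rng.binomial(1, [p_by_zip[z] for z in zip_code])

X = pd.get_dummies(pd.Series(zip_code), prefix="zip")
auc = cross_val_score(LogisticRegression(max_iter=1000), X, protected,
                      scoring="roc_auc", cv=5).mean()
print(f"zip code predicts the protected attribute with AUC {auc:.2f}")
# An AUC well above 0.5 is a red flag: zip code is acting as a proxy.
```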
So you may, by sort of following the math and being a little naive about the problem, inadvertently introduce something really horrible into a model and that's where you need a human element to sort of step in and say, okay hold on. Slow things down. This isn't the right way to go. >> And the people who have -- >> I feel like, I can feel her ready to respond. >> Yes, I'm ready. >> She's like let me have at it. >> And here it is. The people who are really great at providing that human intelligence are social scientists. We are trained to look for bias and to understand bias in data. Whether it's quantitative or qualitative. And I really think that we're going to have fewer of these kinds of problems if we had more integrated teams. If it was a mandate from leadership to say no data science team should be without a social scientist, ethnographer, or qualitative researcher of some kind, to be able to help see these biases. >> The talent piece is actually the most crucial-- >> Yeah. >> one here. If you look at how to enable machine intelligence in an organization, there are three pillars that I have in my head, which are the culture, the talent, and the technology infrastructure. And I believe, and I've seen in working very closely with Fortune 100 and 200 companies, that the talent piece is actually the most important, the most crucial, and the hardest to get. >> [Tricia] I totally agree. >> It's absolutely true. Yeah, no I mean I think that's sort of like how we came up with our business model. Companies were basically saying hey, I can't hire data scientists. And so we have a fellowship where we get 2,000 applicants each quarter. We take the top 2% and then we sort of train them up. And we work with hiring companies who then want to hire from that population. And so we're sort of helping them solve that problem. And the other half of it is really around training. Cause with a lot of industries, especially if you're sort of in a more regulated industry, there's a lot of nuances to what you're doing. And the fastest way to develop that data science or AI talent may not necessarily be to hire folks who are coming out of a PhD program. It may be to take folks internally who have a lot of that domain knowledge that you have and get them trained up on those data science techniques. So we've had large insurance companies come to us and say hey look, we hire three or four folks from you a quarter. That doesn't move the needle for us. What we really need is to take the thousand actuaries and statisticians that we have and get all of them trained up to become data scientists and become data literate in this new open source world. >> [Katie] Go ahead. >> All right, ladies first. >> Go ahead. >> Are you sure? >> No please, fight first. >> Go ahead. >> Go ahead Nir. >> So this is actually a trend that we have been seeing in the past year or so that companies kind of like start to look at how to upskill and look for talent within the organization. So they can actually move them to become more literate and navigate 'em from analyst to data scientist. And from data scientist to machine learner. So this is actually a trend that is happening already for a year or so. >> Yeah, but I also find that after they've gone through that training in getting people skilled up in data science, the next problem that I get is executives coming to say we've invested in all of this. We're still not moving the needle. We've already invested in the right tools. We've gotten the right skills.
We have enough scale of people who have these skills. Why are we not moving the needle? And what I explain to them is look, you're still making decisions in the same way. And you're still not involving enough of the non technical people. Especially from marketing, which is now, the CMO's are much more responsible for driving growth in their companies now. But often times it's so hard to change the old way of marketing, which is still like very segmentation. You know, demographic variable based, and we're trying to move people to say no, you have to understand the complexity of customers and not put them in boxes. >> And I think underlying a lot of this discussion is this question of culture, right? >> Yes. >> Absolutely. >> How do you build a data driven culture? And I think that that culture question, one of the ways that comes up quite often in especially in large, Fortune 500 enterprises, is that they are very, they're not very comfortable with sort of example, open source architecture. Open source tools. And there is some sort of residual bias that that's somehow dangerous. So security vulnerability. And I think that that's part of the cultural challenge that they often have in terms of how do I build a more data driven organization? Well a lot of the talent really wants to use these kind of tools. And I mean, just to give you an example, we are partnering with one of the major cloud providers to sort of help make open source tools more user friendly on their platform. So trying to help them attract the best technologists to use their platform because they want and they understand the value of having that kind of open source technology work seamlessly on their platforms. So I think that just sort of goes to show you how important open source is in this movement. And how much large companies and Fortune 500 companies and a lot of the ones we work with have to embrace that. >> Yeah, and I'm seeing it in our work. Even when we're working with Fortune 500 companies, is that they've already gone through the first phase of data science work. Where I explain it was all about the tools and getting the right tools and architecture in place. And then companies started moving into getting the right skill set in place. Getting the right talent. And what you're talking about with culture is really where I think we're talking about the third phase of data science, which is looking at communication of these technical frameworks so that we can get non technical people really comfortable in the same room with data scientists. That is going to be the phase, that's really where I see the pain point. And that's why at Sudden Compass, we're really dedicated to working with each other to figure out how do we solve this problem now? >> And I think that communication between the technical stakeholders and management and leadership. That's a very critical piece of this. You can't have a successful data science organization without that. >> Absolutely. >> And I think that actually some of the most popular trainings we've had recently are from managers and executives who are looking to say, how do I become more data savvy? How do I figure out what is this data science thing and how do I communicate with my data scientists? >> You guys made this way too easy. I was just going to get some popcorn and watch it play out. >> Nir, last 30 seconds. I want to leave you with an opportunity to, anything you want to add to this conversation? 
>> I think one thing to conclude is to say that for companies that are not data driven, it's about time to hit refresh and figure out how they transition the organization to become data driven. To become agile and nimble so they can actually seize the opportunities from this important industrial revolution. Otherwise, unfortunately, they will have a hard time surviving. >> [Katie] All agreed? >> [Tricia] Absolutely, you're right. >> Michael, Trish, Nir, thank you so much. Fascinating discussion. And thank you guys again for joining us. We will be right back with another great demo. Right after this. >> Thank you Katie. >> Once again, thank you for an excellent discussion. Weren't they great guys? And thank you for everyone who's tuning in on the live webcast. As you can hear, we have an amazing studio audience here. And we're going to keep things moving. I'm now joined by Daniel Hernandez and Siva Anne. And we're going to turn our attention to how you can deliver on what they're talking about using the Data Science Experience to do data science faster. >> Thank you Katie. Siva and I are going to spend the next 10 minutes showing you how you can deliver on what they were saying using the IBM Data Science Experience to do data science faster. We'll demonstrate through new features we introduced this week how teams can work together more effectively across the entire analytics life cycle. How you can take advantage of any and all data no matter where it is and what it is. How you could use your favorite tools from open source. And finally how you could build models anywhere and deploy them close to where your data is. Remember the financial adviser app Rob showed you? To build an app like that, we needed a team of data scientists, developers, data engineers, and IT staff to collaborate. We do this in the Data Science Experience through a concept we call projects. When I create a new project, I can now use the new Github integration feature. We're doing for data science what we've been doing for developers for years. Distributed teams can work together on analytics projects. And take advantage of Github's version management and change management features. This is a huge deal. Let's explore the project we created for the financial adviser app. As you can see, our data engineer Joane, our developer Rob, and others are collaborating on this project. Joane got things started by bringing together the trusted data sources we need to build the app. Taking a closer look at the data, we see that our customer and profile data is stored on our recently announced IBM Integrated Analytics System, which runs safely behind our firewall. We also needed macro economic data, which she was able to find from the Federal Reserve. And she stored it in our Db2 Warehouse on Cloud. And finally, she selected stock news data from NASDAQ.com and landed that in a Hadoop cluster, which happens to be powered by Hortonworks. We added a new feature to the Data Science Experience so that when it's installed with Hortonworks, it automatically uses the native security and governance controls within the cluster so your data is always secure and safe. Now we want to show you the news data we stored in the Hortonworks cluster. This is the main administrative console. It's powered by an open source project called Ambari. And here's the news data. It's in parquet files stored in HDFS, which happens to be a distributed file system.
To get the data from NASDAQ into our cluster, we used IBM's BigIntegrate and BigQuality to create automatic data pipelines that acquire, cleanse, and ingest that news data. Once the data's available, we use IBM's Big SQL to query that data using SQL statements that are much like the ones we would use for any relational data, including the data that we have in the Integrated Analytics System and Db2 Warehouse on Cloud. This and the federation capabilities that Big SQL offers dramatically simplify data acquisition. Now we want to show you how we support a brand new tool that we're excited about. Since we launched last summer, the Data Science Experience has supported Jupyter and R for data analysis and visualization. In this week's update, we deeply integrated another great open source project called Apache Zeppelin. It's known for having great visualization support, advanced collaboration features, and is growing in popularity amongst the data science community. This is an example of Apache Zeppelin and the notebook we created through it to explore some of our data. Notice how wonderful and easy the data visualizations are. Now we want to walk you through the Jupyter notebook we created to explore our customer preference for stocks. We use notebooks to understand and explore data. To identify the features that have some predictive power. Ultimately, we're trying to assess what is driving customer stock preference. Here we did the analysis to identify the attributes of customers that are likely to purchase auto stocks. We used this understanding to build our machine learning model. For building machine learning models, we've always had tools integrated into the Data Science Experience. But sometimes you need to use tools you already invested in. Like our very own SPSS as well as SAS. Through a new import feature, you can easily import models created with those tools. This helps you avoid vendor lock-in, and simplifies the development, training, deployment, and management of all your models. To build the models we used in the app, we could have coded, but we preferred a visual experience. We used our customer profile data in the Integrated Analytic System. Used the Auto Data Preparation to cleanse our data. Chose the binary classification algorithms. Let the Data Science Experience evaluate between logistic regression and gradient boosted tree. It's doing the heavy work for us. As you can see here, the Data Science Experience generated performance metrics that show us that the gradient boosted tree is the best performing algorithm for the data we gave it. Once we save this model, it's automatically deployed and available for developers to use. Any application developer can take this endpoint and consume it like they would any other API inside of the apps they built. We've made training and creating machine learning models super simple. But what about the operations? A lot of companies are struggling to ensure their model performance remains high over time. In our financial adviser app, we know that customer data changes constantly, so we need to always monitor model performance and ensure that our models are retrained as necessary. This is a dashboard that shows the performance of our models and lets our teams monitor and retrain those models so that they're always performing to our standards. So far we've been showing you the Data Science Experience available behind the firewall that we're using to build and train models.
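The model-selection step Daniel walks through, evaluating logistic regression against a gradient boosted tree on a binary target, can be sketched with open source tools; the synthetic features below stand in for customer demographics, holdings, and browsing behavior and are not the Data Science Experience's own Auto Data Preparation.

```python
# A hypothetical version of the comparison Daniel describes: score a
# logistic regression against a gradient boosted tree on a binary target
# ("will this customer prefer auto stocks?") and keep the better model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic customer features standing in for demographics, portfolio
# holdings, and browsing behavior.
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                           random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "gradient_boosted_tree": GradientBoostingClassifier(random_state=0),
}
scores = {name: cross_val_score(model, X, y, scoring="roc_auc", cv=5).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print("best performing algorithm:", best)
```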
Through a new publish feature, you can build models and deploy them anywhere. In another environment, private, public, or anywhere else with just a few clicks. So here we're publishing our model to the Watson machine learning service. It happens to be in the IBM cloud. And also deeply integrated with our Data Science Experience. After publishing and switching to the Watson machine learning service, you can see that our stock affinity model that we just published is there and ready for use. So this is incredibly important. I just want to say it again. The Data Science Experience allows you to train models behind your own firewall, take advantage of your proprietary and sensitive data, and then deploy those models wherever you want with ease. So to summarize what we just showed you. First, IBM's Data Science Experience supports all teams. You saw how our data engineer populated our project with trusted data sets. Our data scientists developed, trained, and tested a machine learning model. Our developers used APIs to integrate machine learning into their apps. And how IT can use our Integrated Model Management dashboard to monitor and manage model performance. Second, we support all data. On premises, in the cloud, structured, unstructured, inside of your firewall, and outside of it. We help you bring analytics and governance to where your data is. Third, we support all tools. The data science tools that you depend on are readily available and deeply integrated. This includes capabilities from great partners like Hortonworks. And powerful tools like our very own IBM SPSS. And fourth, and finally, we support all deployments. You can build your models anywhere, and deploy them right next to where your data is. Whether that's in the public cloud, private cloud, or even on the world's most reliable transaction platform, IBM z. So see for yourself. Go to the Data Science Experience website, take us for a spin. And if you happen to be ready right now, our recently created Data Science Elite Team can help you get started and run experiments alongside you at no charge. Thank you very much. >> Thank you very much Daniel. It seems like a great time to get started. And thanks to Siva for taking us through it. Rob and I will be back in just a moment to add some perspective right after this. All right, once again joined by Rob Thomas. And Rob obviously we got a lot of information here. >> Yes, we've covered a lot of ground. >> This is intense. You've got to break it down for me cause I think we need to zoom out and see the big picture. What can better data science deliver to a business? Why is this so important? I mean we've heard it through and through. >> Yeah, well, I heard it a couple times. But it starts with businesses having to embrace a data driven culture. And it is a change. And we need to make data accessible with the right tools in a collaborative culture because we've got diverse skill sets in every organization. But data driven companies succeed when data science tools are in the hands of everyone. And I think that's a new thought. I think most companies think just get your data scientists some tools, you'll be fine. This is about tools in the hands of everyone. I think the panel did a great job of describing how we get to data science for all.
Building a data culture, making it a part of your everyday operations, and the highlights of what Daniel just showed us, that's some pretty cool features for how organizations can get to this, which is you can see IBM's Data Science Experience, how that supports all teams. You saw data analysts, data scientists, application developer, IT staff, all working together. Second, you saw how we support all tools. And your choice of tools. So the most popular data science libraries integrated into one platform. And we saw some new capabilities that help companies avoid lock-in, where you can import existing models created from specialist tools like SPSS or others. And then deploy them and manage them inside of Data Science Experience. That's pretty interesting. And lastly, you see we continue to build on this best of open tools. Partnering with companies like H2O, Hortonworks, and others. Third, you can see how you use all data no matter where it lives. That's a key challenge every organization's going to face. Private, public, federating all data sources. We announced new integration with the Hortonworks data platform where we deploy machine learning models where your data resides. That's been a key theme. Analytics where the data is. And lastly, supporting all types of deployments. Deploy them in your Hadoop cluster. Deploy them in your Integrated Analytic System. Or deploy them in z, just to name a few. A lot of different options here. But look, don't believe anything I say. Go try it for yourself. Data Science Experience, anybody can use it. Go to datascience.ibm.com and look, if you want to start right now, we just created a team that we call Data Science Elite. These are the best data scientists in the world that will come sit down with you and co-create solutions, models, and prove out a proof of concept. >> Good stuff. Thank you Rob. So you might be asking what does an organization look like that embraces data science for all? And how could it transform your role? I'm going to head back to the office and check it out. Let's start with the perspective of the line of business. What's changed? Well, now you're starting to explore new business models. You've uncovered opportunities for new revenue sources and all that hidden data. And being disrupted is no longer keeping you up at night. As a data science leader, you're beginning to collaborate with a line of business to better understand and translate the objectives into the models that are being built. Your data scientists are also starting to collaborate with the less technical team members and analysts who are working closest to the business problem. And as a data scientist, you stop feeling like you're falling behind. Open source tools are keeping you current. You're also starting to operationalize the work that you do. And you get to do more of what you love. Explore data, build models, put your models into production, and create business impact. All in all, it's not a bad scenario. Thanks. All right. We are back and coming up next, oh this is a special time right now. Cause we got a great guest speaker. New York Magazine called him the spreadsheet psychic and number crunching prodigy who went from correctly forecasting baseball games to correctly forecasting presidential elections. He even invented a proprietary algorithm called PECOTA for predicting future performance by baseball players and teams. And his New York Times bestselling book, The Signal and the Noise was named by Amazon.com as the number one best non-fiction book of 2012. 
He's currently the Editor in Chief of the award winning website, FiveThirtyEight and appears on ESPN as an on air commentator. Big round of applause. My pleasure to welcome Nate Silver. >> Thank you. We met backstage. >> Yes. >> It feels weird to re-shake your hand, but you know, for the audience. >> I had to give the intense firm grip. >> Definitely. >> The ninja grip. So you and I have crossed paths kind of digitally in the past, which it really interesting, is I started my career at ESPN. And I started as a production assistant, then later back on air for sports technology. And I go to you to talk about sports because-- >> Yeah. >> Wow, has ESPN upped their game in terms of understanding the importance of data and analytics. And what it brings. Not just to MLB, but across the board. >> No, it's really infused into the way they present the broadcast. You'll have win probability on the bottom line. And they'll incorporate FiveThirtyEight metrics into how they cover college football for example. So, ESPN ... Sports is maybe the perfect, if you're a data scientist, like the perfect kind of test case. And the reason being that sports consists of problems that have rules. And have structure. And when problems have rules and structure, then it's a lot easier to work with. So it's a great way to kind of improve your skills as a data scientist. Of course, there are also important real world problems that are more open ended, and those present different types of challenges. But it's such a natural fit. The teams. Think about the teams playing the World Series tonight. The Dodgers and the Astros are both like very data driven, especially Houston. Golden State Warriors, the NBA Champions, extremely data driven. New England Patriots, relative to an NFL team, it's shifted a little bit, the NFL bar is lower. But the Patriots are certainly very analytical in how they make decisions. So, you can't talk about sports without talking about analytics. >> And I was going to save the baseball question for later. Cause we are moments away from game seven. >> Yeah. >> Is everyone else watching game seven? It's been an incredible series. Probably one of the best of all time. >> Yeah, I mean-- >> You have a prediction here? >> You can mention that too. So I don't have a prediction. FiveThirtyEight has the Dodgers with a 60% chance of winning. >> [Katie] LA Fans. >> So you have two teams that are about equal. But the Dodgers pitching staff is in better shape at the moment. The end of a seven game series. And they're at home. >> But the statistics behind the two teams is pretty incredible. >> Yeah. It's like the first World Series in I think 56 years or something where you have two 100 win teams facing one another. There have been a lot of parity in baseball for a lot of years. Not that many offensive overall juggernauts. But this year, and last year with the Cubs and the Indians too really. But this year, you have really spectacular teams in the World Series. It kind of is a showcase of modern baseball. Lots of home runs. Lots of strikeouts. >> [Katie] Lots of extra innings. >> Lots of extra innings. Good defense. Lots of pitching changes. So if you love the modern baseball game, it's been about the best example that you've had. If you like a little bit more contact, and fewer strikeouts, maybe not so much. But it's been a spectacular and very exciting World Series. It's amazing to talk. MLB is huge with analysis. I mean, hands down. But across the board, if you can provide a few examples. 
Because there's so many teams in front offices putting such an, just a heavy intensity on the analysis side. And where the teams are going. And if you could provide any specific examples of teams that have really blown your mind. Especially over the last year or two. Because every year it gets more exciting if you will. I mean, so a big thing in baseball is defensive shifts. So if you watch tonight, you'll probably see a couple of plays where if you're used to watching baseball, a guy makes really solid contact. And there's a fielder there that you don't think should be there. But that's really very data driven where you analyze where's this guy hit the ball. That part's not so hard. But also there's game theory involved. Because you have to adjust for the fact that he knows where you're positioning the defenders. He's trying therefore to make adjustments to his own swing and so that's been a major innovation in how baseball is played. You know, how bullpens are used too. Where teams have realized that actually having a guy, across all sports pretty much, realizing the importance of rest. And of fatigue. And that you can be the best pitcher in the world, but guess what? After four or five innings, you're probably not as good as a guy who has a fresh arm necessarily. So I mean, it really is like, these are not subtle things anymore. It's not just oh, on base percentage is valuable. It really effects kind of every strategic decision in baseball. The NBA, if you watch an NBA game tonight, see how many three point shots are taken. That's in part because of data. And teams realizing hey, three points is worth more than two, once you're more than about five feet from the basket, the shooting percentage gets really flat. And so it's revolutionary, right? Like teams that will shoot almost half their shots from the three point range nowadays. Larry Bird, who wound up being one of the greatest three point shooters of all time, took only eight three pointers his first year in the NBA. It's quite noticeable if you watch baseball or basketball in particular. >> Not to focus too much on sports. One final question. In terms of Major League Soccer, and now in NFL, we're having the analysis and having wearables where it can now showcase if they wanted to on screen, heart rate and breathing and how much exertion. How much data is too much data? And when does it ruin the sport? >> So, I don't think, I mean, again, it goes sport by sport a little bit. I think in basketball you actually have a more exciting game. I think the game is more open now. You have more three pointers. You have guys getting higher assist totals. But you know, I don't know. I'm not one of those people who thinks look, if you love baseball or basketball, and you go in to work for the Astros, the Yankees or the Knicks, they probably need some help, right? You really have to be passionate about that sport. Because it's all based on what questions am I asking? As I'm a fan or I guess an employee of the team. Or a player watching the game. And there isn't really any substitute I don't think for the insight and intuition that a curious human has to kind of ask the right questions. So we can talk at great length about what tools do you then apply when you have those questions, but that still comes from people. I don't think machine learning could help with what questions do I want to ask of the data. It might help you get the answers. 
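The three-point logic Nate mentions comes down to expected points per shot; here is a back-of-the-envelope version with purely illustrative shooting percentages, not real league numbers.

```python
# Illustrative shooting percentages (not real league numbers).
long_two_pct = 0.40    # long two-point attempt
three_pct = 0.36       # three-point attempt

expected_long_two = 2 * long_two_pct   # 0.80 expected points
expected_three = 3 * three_pct         # 1.08 expected points
print(f"long two: {expected_long_two:.2f} points per shot")
print(f"three:    {expected_three:.2f} points per shot")
# Even with a lower make rate, the three is worth more per attempt,
# which is why data driven teams shoot so many of them.
```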
>> If you have a mid-fielder in a soccer game though, not exerting, only 80%, and you're seeing that on a screen as a fan, and you're saying could that person get fired at the end of the day? One day, with the data? >> So we found that actually some in soccer in particular, some of the better players are actually more still. So Leo Messi, maybe the best player in the world, doesn't move as much as other soccer players do. And the reason being that A) he kind of knows how to position himself in the first place. B) he realizes that you make a run, and you're out of position. That's quite fatiguing. And particularly soccer, like basketball, is a sport where it's incredibly fatiguing. And so, sometimes the guys who conserve their energy, that kind of old school mentality, you have to hustle at every moment. That is not helpful to the team if you're hustling on an irrelevant play. And therefore, on a critical play, can't get back on defense, for example. >> Sports, but also data is moving exponentially as we're just speaking about today. Tech, healthcare, every different industry. Is there any particular that's a favorite of yours to cover? And I imagine they're all different as well. >> I mean, I do like sports. We cover a lot of politics too. Which is different. I mean in politics I think people aren't intuitively as data driven as they might be in sports for example. It's impressive to follow the breakthroughs in artificial intelligence. It started out just as kind of playing games and playing chess and poker and Go and things like that. But you really have seen a lot of breakthroughs in the last couple of years. But yeah, it's kind of infused into everything really. >> You're known for your work in politics though. Especially presidential campaigns. >> Yeah. >> This year, in particular. Was it insanely challenging? What was the most notable thing that came out of any of your predictions? >> I mean, in some ways, looking at the polling was the easiest lens to look at it. So I think there's kind of a myth that last year's result was a big shock and it wasn't really. If you did the modeling in the right way, then you realized that number one, polls have a margin of error. And so when a candidate has a three point lead, that's not particularly safe. Number two, the outcome between different states is correlated. Meaning that it's not that much of a surprise that Clinton lost Wisconsin and Michigan and Pennsylvania and Ohio. You know I'm from Michigan. Have friends from all those states. Kind of the same types of people in those states. Those outcomes are all correlated. So what people thought was a big upset for the polls I think was an example of how data science done carefully and correctly where you understand probabilities, understand correlations. Our model gave Trump a 30% chance of winning. Others models gave him a 1% chance. And so that was interesting in that it showed that number one, that modeling strategies and skill do matter quite a lot. When you have someone saying 30% versus 1%. I mean, that's a very very big spread. And number two, that these aren't like solved problems necessarily. Although again, the problem with elections is that you only have one election every four years. So I can be very confident that I have a better model. Even one year of data doesn't really prove very much. Even five or 10 years doesn't really prove very much. 
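A minimal sketch of the correlated-outcomes point Nate makes about states, assuming numpy; the margins, error sizes, and correlation structure below are invented for illustration and are not FiveThirtyEight's model.

```python
# Toy election simulation: three states, each with the trailing candidate
# down 3 points. Compare independent polling errors with correlated ones.
import numpy as np

rng = np.random.default_rng(3)
margin = np.array([-3.0, -3.0, -3.0])   # trailing by 3 in each state
n_sims, sd = 100_000, 4.0               # polling error std dev (illustrative)

# Independent errors: each state misses on its own.
indep = margin + rng.normal(0, sd, size=(n_sims, 3))
p_indep = np.mean((indep > 0).all(axis=1))

# Correlated errors: one shared national miss plus a smaller state-level miss.
shared = rng.normal(0, sd * 0.8, size=(n_sims, 1))
local = rng.normal(0, sd * 0.6, size=(n_sims, 3))
corr = margin + shared + local
p_corr = np.mean((corr > 0).all(axis=1))

print(f"sweep all three states, independent errors: {p_indep:.1%}")
print(f"sweep all three states, correlated errors:  {p_corr:.1%}")
# Correlated errors make an across-the-board upset far more likely.
```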
And so, being aware of the limitations to some extent intrinsically in elections when you only get one kind of new training example every four years, there's not really any way around that. There are ways to be more robust to sparce data environments. But if you're identifying different types of business problems to solve, figuring out what's a solvable problem where I can add value with data science is a really key part of what you're doing. >> You're such a leader in this space. In data and analysis. It would be interesting to kind of peek back the curtain, understand how you operate but also how large is your team? How you're putting together information. How quickly you're putting it out. Cause I think in this right now world where everybody wants things instantly-- >> Yeah. >> There's also, you want to be first too in the world of journalism. But you don't want to be inaccurate because that's your credibility. >> We talked about this before, right? I think on average, speed is a little bit overrated in journalism. >> [Katie] I think it's a big problem in journalism. >> Yeah. >> Especially in the tech world. You have to be first. You have to be first. And it's just pumping out, pumping out. And there's got to be more time spent on stories if I can speak subjectively. >> Yeah, for sure. But at the same time, we are reacting to the news. And so we have people that come in, we hire most of our people actually from journalism. >> [Katie] How many people do you have on your team? >> About 35. But, if you get someone who comes in from an academic track for example, they might be surprised at how fast journalism is. That even though we might be slower than the average website, the fact that there's a tragic event in New York, are there things we have to say about that? A candidate drops out of the presidential race, are things we have to say about that. In periods ranging from minutes to days as opposed to kind of weeks to months to years in the academic world. The corporate world moves faster. What is a little different about journalism is that you are expected to have more precision where people notice when you make a mistake. In corporations, you have maybe less transparency. If you make 10 investments and seven of them turn out well, then you'll get a lot of profit from that, right? In journalism, it's a little different. If you make kind of seven predictions or say seven things, and seven of them are very accurate and three of them aren't, you'll still get criticized a lot for the three. Just because that's kind of the way that journalism is. And so the kind of combination of needing, not having that much tolerance for mistakes, but also needing to be fast. That is tricky. And I criticize other journalists sometimes including for not being data driven enough, but the best excuse any journalist has, this is happening really fast and it's my job to kind of figure out in real time what's going on and provide useful information to the readers. And that's really difficult. Especially in a world where literally, I'll probably get off the stage and check my phone and who knows what President Trump will have tweeted or what things will have happened. But it really is a kind of 24/7. >> Well because it's 24/7 with FiveThirtyEight, one of the most well known sites for data, are you feeling micromanagey on your people? Because you do have to hit this balance. You can't have something come out four or five days later. >> Yeah, I'm not -- >> Are you overseeing everything? 
>> I'm not by nature a micromanager. And so you try to hire well. You try and let people make mistakes. And the flip side of this is that if a news organization that never had any mistakes, never had any corrections, that's raw, right? You have to have some tolerance for error because you are trying to decide things in real time. And figure things out. I think transparency's a big part of that. Say here's what we think, and here's why we think it. If we have a model to say it's not just the final number, here's a lot of detail about how that's calculated. In some case we release the code and the raw data. Sometimes we don't because there's a proprietary advantage. But quite often we're saying we want you to trust us and it's so important that you trust us, here's the model. Go play around with it yourself. Here's the data. And that's also I think an important value. >> That speaks to open source. And your perspective on that in general. >> Yeah, I mean, look, I'm a big fan of open source. I worry that I think sometimes the trends are a little bit away from open source. But by the way, one thing that happens when you share your data or you share your thinking at least in lieu of the data, and you can definitely do both is that readers will catch embarrassing mistakes that you made. By the way, even having open sourceness within your team, I mean we have editors and copy editors who often save you from really embarrassing mistakes. And by the way, it's not necessarily people who have a training in data science. I would guess that of our 35 people, maybe only five to 10 have a kind of formal background in what you would call data science. >> [Katie] I think that speaks to the theme here. >> Yeah. >> [Katie] That everybody's kind of got to be data literate. >> But yeah, it is like you have a good intuition. You have a good BS detector basically. And you have a good intuition for hey, this looks a little bit out of line to me. And sometimes that can be based on domain knowledge, right? We have one of our copy editors, she's a big college football fan. And we had an algorithm we released that tries to predict what the human being selection committee will do, and she was like, why is LSU rated so high? Cause I know that LSU sucks this year. And we looked at it, and she was right. There was a bug where it had forgotten to account for their last game where they lost to Troy or something and so -- >> That also speaks to the human element as well. >> It does. In general as a rule, if you're designing a kind of regression based model, it's different in machine learning where you have more, when you kind of build in the tolerance for error. But if you're trying to do something more precise, then so much of it is just debugging. It's saying that looks wrong to me. And I'm going to investigate that. And sometimes it's not wrong. Sometimes your model actually has an insight that you didn't have yourself. But fairly often, it is. And I think kind of what you learn is like, hey if there's something that bothers me, I want to go investigate that now and debug that now. Because the last thing you want is where all of a sudden, the answer you're putting out there in the world hinges on a mistake that you made. Cause you never know if you have so to speak, 1,000 lines of code and they all perform something differently. You never know when you get in a weird edge case where this one decision you made winds up being the difference between your having a good forecast and a bad one. 
In a defensible position and a indefensible one. So we definitely are quite diligent and careful. But it's also kind of knowing like, hey, where is an approximation good enough and where do I need more precision? Cause you could also drive yourself crazy in the other direction where you know, it doesn't matter if the answer is 91.2 versus 90. And so you can kind of go 91.2, three, four and it's like kind of A) false precision and B) not a good use of your time. So that's where I do still spend a lot of time is thinking about which problems are "solvable" or approachable with data and which ones aren't. And when they're not by the way, you're still allowed to report on them. We are a news organization so we do traditional reporting as well. And then kind of figuring out when do you need precision versus when is being pointed in the right direction good enough? >> I would love to get inside your brain and see how you operate on just like an everyday walking to Walgreens movement. It's like oh, if I cross the street in .2-- >> It's not, I mean-- >> Is it like maddening in there? >> No, not really. I mean, I'm like-- >> This is an honest question. >> If I'm looking for airfares, I'm a little more careful. But no, part of it's like you don't want to waste time on unimportant decisions, right? I will sometimes, if I can't decide what to eat at a restaurant, I'll flip a coin. If the chicken and the pasta both sound really good-- >> That's not high tech Nate. We want better. >> But that's the point, right? It's like both the chicken and the pasta are going to be really darn good, right? So I'm not going to waste my time trying to figure it out. I'm just going to have an arbitrary way to decide. >> Serious and business, how organizations in the last three to five years have just evolved with this data boom. How are you seeing it as from a consultant point of view? Do you think it's an exciting time? Do you think it's a you must act now time? >> I mean, we do know that you definitely see a lot of talent among the younger generation now. That so FiveThirtyEight has been at ESPN for four years now. And man, the quality of the interns we get has improved so much in four years. The quality of the kind of young hires that we make straight out of college has improved so much in four years. So you definitely do see a younger generation for which this is just part of their bloodstream and part of their DNA. And also, particular fields that we're interested in. So we're interested in people who have both a data and a journalism background. We're interested in people who have a visualization and a coding background. A lot of what we do is very much interactive graphics and so forth. And so we do see those skill sets coming into play a lot more. And so the kind of shortage of talent that had I think frankly been a problem for a long time, I'm optimistic based on the young people in our office, it's a little anecdotal but you can tell that there are so many more programs that are kind of teaching students the right set of skills that maybe weren't taught as much a few years ago. >> But when you're seeing these big organizations, ESPN as perfect example, moving more towards data and analytics than ever before. >> Yeah. >> You would say that's obviously true. >> Oh for sure. >> If you're not moving that direction, you're going to fall behind quickly. >> Yeah and the thing is, if you read my book or I guess people have a copy of the book. 
In some ways it's saying hey, there are lot of ways to screw up when you're using data. And we've built bad models. We've had models that were bad and got good results. Good models that got bad results and everything else. But the point is that the reason to be out in front of the problem is so you give yourself more runway to make errors and mistakes. And to learn kind of what works and what doesn't and which people to put on the problem. I sometimes do worry that a company says oh we need data. And everyone kind of agrees on that now. We need data science. Then they have some big test case. And they have a failure. And they maybe have a failure because they didn't know really how to use it well enough. But learning from that and iterating on that. And so by the time that you're on the third generation of kind of a problem that you're trying to solve, and you're watching everyone else make the mistake that you made five years ago, I mean, that's really powerful. But that doesn't mean that getting invested in it now, getting invested both in technology and the human capital side is important. >> Final question for you as we run out of time. 2018 beyond, what is your biggest project in terms of data gathering that you're working on? >> There's a midterm election coming up. That's a big thing for us. We're also doing a lot of work with NBA data. So for four years now, the NBA has been collecting player tracking data. So they have 3D cameras in every arena. So they can actually kind of quantify for example how fast a fast break is, for example. Or literally where a player is and where the ball is. For every NBA game now for the past four or five years. And there hasn't really been an overall metric of player value that's taken advantage of that. The teams do it. But in the NBA, the teams are a little bit ahead of journalists and analysts. So we're trying to have a really truly next generation stat. It's a lot of data. Sometimes I now more oversee things than I once did myself. And so you're parsing through many, many, many lines of code. But yeah, so we hope to have that out at some point in the next few months. >> Anything you've personally been passionate about that you've wanted to work on and kind of solve? >> I mean, the NBA thing, I am a pretty big basketball fan. >> You can do better than that. Come on, I want something real personal that you're like I got to crunch the numbers. >> You know, we tried to figure out where the best burrito in America was a few years ago. >> I'm going to end it there. >> Okay. >> Nate, thank you so much for joining us. It's been an absolute pleasure. Thank you. >> Cool, thank you. >> I thought we were going to chat World Series, you know. Burritos, important. I want to thank everybody here in our audience. Let's give him a big round of applause. >> [Nate] Thank you everyone. >> Perfect way to end the day. And for a replay of today's program, just head on over to ibm.com/dsforall. I'm Katie Linendoll. And this has been Data Science for All: It's a Whole New Game. Test one, two. One, two, three. Hi guys, I just want to quickly let you know as you're exiting. A few heads up. Downstairs right now there's going to be a meet and greet with Nate. And we're going to be doing that with clients and customers who are interested. So I would recommend before the game starts, and you lose Nate, head on downstairs. And also the gallery is open until eight p.m. with demos and activations. And tomorrow, make sure to come back too. Because we have exciting stuff. 
I'll be joining you as your host. And we're kicking off at nine a.m. So bye everybody, thank you so much. >> [Announcer] Ladies and gentlemen, thank you for attending this evening's webcast. If you are not attending all cloud and cognitive summit tomorrow, we ask that you recycle your name badge at the registration desk. Thank you. Also, please note there are two exits on the back of the room on either side of the room. Have a good evening. Ladies and gentlemen, the meet and greet will be on stage. Thank you.

Published Date : Nov 1 2017

SUMMARY :

IBM's "Data Science for All: It's a Whole New Game" broadcast, hosted by Katie Linendoll, covers IBM's view that extracting value from data is a shared, team effort: Rob Thomas lays out the pillars of an enterprise data strategy on the road to AI, a live demo shows analytics and machine learning deployed next to the data, a practitioner panel discusses data-driven culture, skills, and talent, and the program closes with FiveThirtyEight founder Nate Silver on building and debugging models, hiring, and upcoming projects such as NBA player-tracking metrics and the midterm elections.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Tricia Wang | PERSON | 0.99+
Katie | PERSON | 0.99+
Katie Linendoll | PERSON | 0.99+
Rob | PERSON | 0.99+
Google | ORGANIZATION | 0.99+
Joane | PERSON | 0.99+
Daniel | PERSON | 0.99+
Michael Li | PERSON | 0.99+
Nate Silver | PERSON | 0.99+
Apple | ORGANIZATION | 0.99+
Hortonworks | ORGANIZATION | 0.99+
Trump | PERSON | 0.99+
Nate | PERSON | 0.99+
Honda | ORGANIZATION | 0.99+
Siva | PERSON | 0.99+
McKinsey | ORGANIZATION | 0.99+
Amazon | ORGANIZATION | 0.99+
Larry Bird | PERSON | 0.99+
2017 | DATE | 0.99+
Rob Thomas | PERSON | 0.99+
Michigan | LOCATION | 0.99+
Yankees | ORGANIZATION | 0.99+
New York | LOCATION | 0.99+
Clinton | PERSON | 0.99+
IBM | ORGANIZATION | 0.99+
Tesco | ORGANIZATION | 0.99+
Michael | PERSON | 0.99+
America | LOCATION | 0.99+
Leo | PERSON | 0.99+
four years | QUANTITY | 0.99+
five | QUANTITY | 0.99+
30% | QUANTITY | 0.99+
Astros | ORGANIZATION | 0.99+
Trish | PERSON | 0.99+
Sudden Compass | ORGANIZATION | 0.99+
Leo Messi | PERSON | 0.99+
two teams | QUANTITY | 0.99+
1,000 lines | QUANTITY | 0.99+
one year | QUANTITY | 0.99+
10 investments | QUANTITY | 0.99+
NASDAQ | ORGANIZATION | 0.99+
The Signal and the Noise | TITLE | 0.99+
Tricia | PERSON | 0.99+
Nir Kaldero | PERSON | 0.99+
80% | QUANTITY | 0.99+
BCG | ORGANIZATION | 0.99+
Daniel Hernandez | PERSON | 0.99+
ESPN | ORGANIZATION | 0.99+
H2O | ORGANIZATION | 0.99+
Ferrari | ORGANIZATION | 0.99+
last year | DATE | 0.99+
18 | QUANTITY | 0.99+
three | QUANTITY | 0.99+
Data Incubator | ORGANIZATION | 0.99+
Patriots | ORGANIZATION | 0.99+

Mel Kirk, Ryder - Informatica World 2017 - #INFA17 - #theCUBE


 

>> Announcer: Live from San Francisco, it's theCUBE covering Informatica World 2017. Brought to you by Informatica. (light techno music) >> Welcome back to Informatica World 2017. I'm Peter Burris, and once again theCUBE is broadcasting morning to night two days in a row to bring you The Signal from the Noise, this very very important conference. There's a lot going on here as we talk about the increasing role that data's playing in the world. Now, to get a user perspective, and not just any user perspective, a leading user perspective, on some of these issues, we've asked Mel Kirk to come on board. Mel, welcome to theCUBE . >> Thank you sir. Glad to be here. >> Mel is the senior vice president chief information officer for Ryder Systems. For those of you who don't know Ryder, it's a trucking company, a trucking and leasing company. >> Mel: Absolutely. >> And my background is I used to actually do a lot of research around transportation-related things, and I always found the ability to use queuing theory, >> Mel: Ah. in both technology and in transportation, >> Yes. to be very fascinating. So again, Mel, welcome here, but tell us a little bit about what you're here at Informatica World for, and what's your interest in all this? >> You know it's interesting, this was one of the conferences that I set out this year that I wanted to come to because I wanted to learn more about where Informatica is going in terms of leveraging data. Transportation company, we generate a lot of data. We have three business units, we have a fleet management company, a 3PL traditional transportation, supply chain company, and a dedicated transportation company. All three of those businesses generate a lot of data, and we're on a journey to try to figure out how, what's the best way of using that data to improve business outcomes. So that's what I'm here for this week, is to learn more about the tools that are here, the applications that are here, that we can use to do just that. >> So one of the things that I'm fascinated, often the new branding of Informatica, which we think is good: enterprise, Cloud, data management, leader. We know what enterprise is, we know what Cloud is, we know what leader is. One of the dynamics is, what is the new data management? We've talked to a couple of people about it. From your perspective, all this data coming in, what is the new data management function at Ryder, or the new requirements and capabilities? >> I think the biggest thing for us, from a data management standpoint, is mastering our data. Like I said, we generate a lot of data. We've got two really important domains in which that data revolves around. It's a customer and it's a vehicle. And so our objective this year is to master both the customer and the vehicle, the information around those, so that our marketing team can create better solutions by understanding all of the ways that a particular customer may interact with our business. It's also our operating team is leveraging that same data to win at the local level on a day-to-day basis. When a driver comes to one of our facilities, and he wants work done on his truck, our account people and our service people at that location will be able to pull up specific information about that customer and perform the work that they need based on the contract they have with us. That's a win for the customer and a win for our local team. >> So key, handle the customers, handle the crucial assets. 
That seems to be a general trend in the industry, is you look across both the conversations that you're having here at Informatica World, but also beyond. Where do you think the industry is going, from a trend standpoint, with some of these questions around data? >> I think we're all on a journey to try to figure out the best ways to leverage the data, treat data as an enterprise asset, right? A real enterprise asset that may have more value to it than some of the physical assets that sit in our business. And as I've talked to people during the week here, it's really about that journey of trying to figure out how do you get better value out of the investment that you make, and understanding, cleansing, liberating your data. And for us, again that's creating products, new products, from the data that we have, and it's improving productivity and efficiency in our operations with that data. >> So you must be excited about some of the new capabilities Informatica's announcing about being able to discover, you know, inventory, and then use metadata in new and different ways. What do you think about some of the metadata issues that Informatica's talking about here? >> Yeah, I think, you know, both metadata and Cloud for me is very important. The metadata is important because, again, we've got multiple business units, right, that are operating with elements of data that are not associated across the enterprise. And so, you know, getting more deliberate about understanding the data at the metadata level will help us as we try to bridge everything together across our enterprise. The Cloud's important because more and more of our customers are moving from a batch world to a near real time world. And what's happening there is we need the ability to spin up operations in a very quick way, receive data in large swaths. So having burst capacity is what the Cloud is going to give us. The immediacy of capacity is important to us, so the Cloud-based applications that I've seen here, even the enterprise information catalog is important because as we go through and we cleanse and harness our data, having it in a structured, governed pattern is important to us as well. >> So you had been in the business. You're ex-GE before you came to Ryder, a couple iterations before, you know, Master Black Belt, Six Sigma, that kind of stuff. You're an operations guy. >> I'm an operations guy. >> So as you think about going from an operations guy, and great operations guys are very focused on data, into the CIO, how was that transition? >> It was more than what I thought. You know it's interesting, I've said that as an operator, I'm not sure that I would've been effective in this role five, ten years ago, because it was a different type of role. >> Peter: Right. >> Today I don't know how you'd not do this role, how you could do this type of role, the CIO role, without having an operational background because the technology is integral to everything we do now. So, you know, where before, companies differentiated themselves on, you know, operational rigor and process, which is what I live in, >> Yep. >> Now it's about data. Now it's about data and the technology tools that can free up capacity, create productivity, and again, generate products. And so, this has been a great exercise for me, a great learning experience for me getting involved in technology at a time when it's moving so fast, right? 
Every day is a different day from a technology standpoint, and bridging that with my operating background, I think it's been a great experiment for both me and Ryder. >> Well a lot of CIOs that have great job satisfaction at heart are operations people who have figured out how to be operations people as opposed to people who often, CIOs who often don't have that satisfaction are spending their days putting out fires, and they never get into that groove. But think about as the role of the CIO changes at Ryder, but just in general, how do you see yourself organizing your groups around data assets, because it used to be that the key assets were, you know, the hardware. >> Right. >> Or the network. How is that catalyzing a new way of thinking about getting your talent mobilized to do what Ryder needs your function to do? >> You know, the big shift is away from keeping the lights on and keeping the phones working to delivering outcomes for the business. So that's that operational view, right? It's really whether there's an application development team or a talent on our, employee on our infrastructure team, it's about delivering outcomes for the operating team, for the business team. And so an example of that is in our fleet management business, right, we run 850 shops around the US and Canada, repair centers, and our core application in that business, our technicians in those shops say, "Mel, if you can do one thing for us, "make the application faster." That's both an application problem and an infrastructure problem. >> Peter: Sure. >> Right? In terms of trying to find the right solve. What I've been able to do and what I've been focusing on is translating that ask, of give me more speed, to the infrastructure team and the application team in a way that they understand that that incremental speed means better customer service, better outcomes for the business as well as our customer. That driver that comes to that repair center, he or she is on the clock. >> Peter: Right. And they want to get out as fast as, they are more, of more value to the customer when they're on the road doing their job. >> And a truck is typically not a cheap thing. >> Mel: It's not a cheap thing. >> So a truck's on the clock too. >> Mel: Absolutely. >> So as you think about the new, these new disciplines, and then acculturating the application team to, at least in this case, speed, the infrastructure team to speed, are there any new skills or any new disciplines that you are finding need to be filled within your shop? >> You know, the thing that's been interesting, and I'm going to go back to my Six Sigma background, the thing that's really been interesting, and when I take into consideration the pace of change of technology, it's been change management, right? I mean, the application team can come up with the best, the absolute best solution. I'm going to add two, it's change management and the UI, the user interface is important to that journey, right? >> Peter: Absolutely. >> And so they can come up with the greatest application, it could be the best solution ever, but you've got to get people, like in our organization it's nothing to see employees that have been with the company for 20 years. And getting them to fundamentally change how they do work, that's a challenge. And so we, what we've been focusing on is educating both the IT organization as well as the business team on how to drive change, especially in an organization with such a long, rich heritage. 
>> So as these changes start to manifest themselves, your relationship with the executive staff, how's that evolving? >> Yes, so when I went over to, when I came over into this role, you know, I'd left the operating role as a peer, and I came over to the IT role, and I think they felt sorry for me because of all of the challenges. But what's evolved is that as I've learned more about the technology and how to deploy, I've been able to actually balance between communicating with the technology team on the needs of the operating side of the business, and then translating the technical challenges to the operating team so that they've got a better sense of if we're going to launch a new product, or if we're going to onboard a new account, right, there's some lead time, there's some pre-thinking that needs to happen to get the technology right for you to be successful when you deploy for that customer. So I think bridging the gap between the two sides of the company has been very important for us, especially now given that, again, the pace of change with technology. >> Peter: So does Ryder have a COO? >> Ryder actually doesn't have a COO at the corporate level. We have a COO in our fleet management business, but I'm playing kind of a hybrid role I'd say. >> Peter: Yes. >> You know, kind of a CIO/COO because I can blend the two. >> Excellent! And how's that, how's that going? >> It's actually good. When I first moved into the CIO role, I was very deliberate about not encroaching on the role of the operating teams, right, even though my heritage and all of the things I had done in the company was around operations, I didn't want to make operating decisions from the CIO role. What I'm realizing now is the best value, the best benefit for Ryder and the customers is for me to bring all of the skills that I have, right, plus the talents of the team, to bear on a problem for the company. So I've thought less about boundaries and more about delivering outcomes. And if that means I have to put a, you know, a little bit of an operating perspective on a technical challenge, so be it. >> Which is really quite frankly what any real great Chief anything does. >> Yes. >> How do I take shareholder capital and translate it into an outcome through my purview. >> Mel: Right. >> So, Mel, let's pretend we got five CIOs sitting here, >> Mel: Okay. >> All about ready to start the journey that you're quite a ways along. What is the one thing you want to say to them? Say, here's how you're going to get started, and here's the pothole that you have to look out for. >> You know, I think one of the most important things that I would advise is to divide, especially if you're like me coming from a different purview and even folks that have been in technology for a while, establish a board of directors, right, your own personal board of directors. For me that was, I had to identify, you know, a couple of folks that had been in this role before that I could call and reach out to and get unfiltered advice, right? It was also identifying, the second one was identifying a short list of vendor partners that I could go to for technical questions in their domain, plus beyond their domain where I felt comfortable with the autonomy of the answer. >> Good ideas. >> Right, just good ideas. No sale, just good ideas. Then I had to reach inside of my team and figure out who are the one or two people in the organization that I'd go bounce ideas across for the sake of the change management that I talked about, right? 
Some for technology but also from a change management standpoint. And then build a couple of key partners at the leadership level within the organization, again to help with some of the concepts and the ideas. A lot of what a CIO is going to bring to bear now is going to be disruptive to the way a business, a company does business today, and so they're going to need constituents or partners from the executive leadership team. >> Yeah, none of it happens if the CIO doesn't recognize the change management that they have to drive. >> Absolutely. >> About their role within the business. >> Absolutely. So I used my board of directors, this board of directors, as a way of getting smarter about the job, you know, secondly, to help facilitate the change that we need, and three, just to bounce ideas. For sanity. >> Awesome. Fantastic. Mel Kirk is senior vice president, chief information officer of Ryder Systems Inc. Mel, great conversation. Thank you very much for being here in theCUBE . >> Okay, thank you for your time. >> Once again, Peter Burris, Informatica World 2017, we'll be back with more in a moment. (light techno music)

Published Date : May 17 2017

SUMMARY :

Peter Burris interviews Mel Kirk, senior vice president and CIO of Ryder Systems, at Informatica World 2017. Kirk explains why Ryder is mastering its two core data domains, the customer and the vehicle, why metadata and cloud capabilities matter as customers move from batch to near-real-time operations, how his operations background shapes the CIO role and his relationship with the executive team, and advises new CIOs to build a personal board of directors, a short list of trusted vendor partners, and executive allies to drive change management.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Peter | PERSON | 0.99+
Informatica | ORGANIZATION | 0.99+
one | QUANTITY | 0.99+
Peter Burris | PERSON | 0.99+
Mel Kirk | PERSON | 0.99+
US | LOCATION | 0.99+
Ryder Systems Inc. | ORGANIZATION | 0.99+
San Francisco | LOCATION | 0.99+
850 shops | QUANTITY | 0.99+
Mel | PERSON | 0.99+
20 years | QUANTITY | 0.99+
Canada | LOCATION | 0.99+
Ryder Systems | ORGANIZATION | 0.99+
two sides | QUANTITY | 0.99+
two | QUANTITY | 0.99+
two days | QUANTITY | 0.99+
both | QUANTITY | 0.99+
three | QUANTITY | 0.99+
two people | QUANTITY | 0.99+
GE | ORGANIZATION | 0.99+
Informatica World | ORGANIZATION | 0.99+
Ryder | ORGANIZATION | 0.98+
first | QUANTITY | 0.98+
One | QUANTITY | 0.98+
#INFA17 | EVENT | 0.98+
theCUBE | ORGANIZATION | 0.97+
five CIOs | QUANTITY | 0.97+
Ryder | PERSON | 0.97+
this year | DATE | 0.97+
ten years ago | DATE | 0.97+
one thing | QUANTITY | 0.96+
Today | DATE | 0.95+
Informatica World 2017 | EVENT | 0.95+
this week | DATE | 0.94+
two really important domains | QUANTITY | 0.93+
Six Sigma | ORGANIZATION | 0.93+
secondly | QUANTITY | 0.91+
three business units | QUANTITY | 0.91+
second one | QUANTITY | 0.89+
five | QUANTITY | 0.87+
Cloud | TITLE | 0.79+
today | DATE | 0.77+
couple | QUANTITY | 0.76+
#theCUBE | EVENT | 0.65+
Noise | ORGANIZATION | 0.49+

Mercedes De Luca, Basecamp | Catalyst Conference 2016


 

>> From Phoenix, Arizona, the Cube at Catalyst Conference. Here's your host, Jeff Frick. (upbeat music) >> Hey, welcome back everybody, Jeff Frick here with the Cube. We are in Phoenix, Arizona, at the Girls in Tech Catalyst Conference, about 400 people, a great show, they're fourth year in existence. Back in the Bay Area next year, wanted to come down and check it out. So we're really excited to be here, and our next guest Mercedes De Luca, the Chief Operating Officer from Base Camp, welcome. >> Thank you. >> Base Camp, everybody knows Base Camp. >> Everybody knows Base Camp, it's been around for a long time. >> Absolutely, we use it and a lot of people use it, just one of those kind of tools that's ubiquitous, it's all over the place. >> Yeah, we just introduced our Base Camp 3 version, and now it's something we operate the business on. >> Excellent, so we talked a little bit off camera about your session, which is really about career pivots, and there's probably no place more important to be able to execute a successful career pivot than Silicon Valley. We hear about it often with companies, and usually it's associated when things aren't going so well that you have to do some type of a business model pivot or design pivot. But from a career perspective, super important. So what are some of the lessons that you shared here in your talk? >> Yeah, so one of the things that we did, was how do you sort of take the risk out of pivots, and what vectors do you move along. Basically recommending that people sort of take one vector at a time. I think getting the industry right is really important, and when I first started I had an opportunity to work in financial service or high-tech. I chose high-tech and that formed my career. And so I think getting the industry right's important. I think when you want to move to different functions, there's ways to do that inside companies, there's ways to do that when you move to a different company. >> It's interesting, there's so much pressure with kids and young people trying to figure out, you know, what's the right decision. I got to make the right decision. You don't really need to make the right decision. You just need to make a decision and get on your path, right? >> Exactly, you just want to make that next move. That's really where you want to focus your energy because as long as you're moving toward your strengths, you're beginning to amplify those, it's just about making that next step. And it's really important to talk to other people and verify that what you think you're going to be moving to is actually what's going to be happening. >> Right, so when you define some of these vectors, what are some of the vectors that are consistent or adjacent that make some of these moves easier or more successful? >> So one would be industry vector, so if you want to get out of the industry you're in, but you may still do the same function in that industry. There's the function vector, which says, I'm in a function of engineering and I want to get into marketing, or I'm in project management and I want to do engineering. And then the third has more to do with how you contribute the level you're at. Vice president, director, size of company, individual contributor versus line management. So there's a lot of different vectors, there's three basically, is how I think about it. And it's just a recommendation of how to think about making moves. >> Now, we had Jim McCarthy on earlier, who was a speaker, and he talked about making the big shift, you know. 
You have a life changing event, and you just decide this is not what I want to do, I want to do something different. How does that play into what you're trying to help people do, to make it successful? So you don't just drop everything and change buildings. You have to kind of work your way over I would imagine. >> Right, I think the most important thing though is focusing on your strengths, really figuring out what is it in your career. For me it's been emerging technology, it's been consumer, and it's been leadership. And culture, so when I look at those things together, it's always making sure that that next step is moving you even closer and closer to that ultimate place. Base Camp is known for its culture. So one of the things that was really important to me in this last move, was to make sure that I wound up in a company that really walked the walk. That was important to me. >> So what tips do you give to people when they're thinking about that, to figure out culture? It's hard to figure out culture. You go through an interview process, and you get to meet the person across the table, and you do a little investigative work, but a lot of times you don't really know what you've got into until you got into it. So how do you coach people to try to figure out some of that culture fit, and again what are the vectors of culture that are the big ones that you should be aligning to? >> Well, we're lucky today because there's Twitter, and there's Facebook, and there's all sorts of social media that allows us to really learn a lot more about the company and the culture, check out what the people in the company are saying about the company. In my case, super lucky, because both of the founders blog a lot on our Signal Versus Noise. They do a lot of writing, so I almost felt like I knew the culture going into it. They've written books, et cetera. But for companies that haven't written books and haven't blogged, I think you can absolutely get that by also talking to people inside the company and being clear about what you're looking for. I think that's a big part of it. >> Okay, well Mercedes, I'll give you the last word. What is your kind of parting tip to people who are looking to make a move, or just concerned, oh my gosh, I'm just locked up cause I think I have to get it right the first time? >> Don't let others define you. (laughing) >> Short and sweet. I should have asked you the bumper sticker question, give me the bumper sticker for it. Don't let others define you, that's perfect. Well Mercedes, thanks for taking a few minutes to stop by. >> Thank you Jeff. I appreciate it. >> Absolutely. >> Nice to meet you. >> So Jeff Frick here, at the Girls in Tech Catalyst Conference in Phoenix, Arizona. You're watching the Cube. Thanks for watching. (upbeat music)

Published Date : Apr 22 2016

SUMMARY :

Jeff Frick interviews Mercedes De Luca, chief operating officer of Basecamp, at the Girls in Tech Catalyst Conference in Phoenix, Arizona. She outlines how to take the risk out of career pivots by moving along one vector at a time, industry, function, or level, how to evaluate a company's culture through social media, founder writing, and conversations with insiders before joining, and closes with the advice: don't let others define you.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Jeff | PERSON | 0.99+
Jim McCarthy | PERSON | 0.99+
Jeff Frick | PERSON | 0.99+
Mercedes De Luca | PERSON | 0.99+
three | QUANTITY | 0.99+
Bay Area | LOCATION | 0.99+
third | QUANTITY | 0.99+
one | QUANTITY | 0.99+
both | QUANTITY | 0.99+
Facebook | ORGANIZATION | 0.99+
Phoenix, Arizona | LOCATION | 0.99+
fourth year | QUANTITY | 0.99+
Twitter | ORGANIZATION | 0.98+
next year | DATE | 0.98+
about 400 people | QUANTITY | 0.97+
Base Camp 3 | TITLE | 0.97+
today | DATE | 0.97+
Cube | ORGANIZATION | 0.96+
first | QUANTITY | 0.96+
Silicon Valley | LOCATION | 0.96+
first time | QUANTITY | 0.94+
Mercedes | ORGANIZATION | 0.94+
Base Camp | LOCATION | 0.88+
Basecamp | ORGANIZATION | 0.86+
Catalyst Conference 2016 | EVENT | 0.85+
Signal Versus Noise | TITLE | 0.78+
one vector | QUANTITY | 0.78+
Camp | LOCATION | 0.77+
Girls in Tech Catalyst Conference | EVENT | 0.73+
Tech Catalyst Conference | EVENT | 0.71+
Base | ORGANIZATION | 0.63+
Conference | LOCATION | 0.5+
Catalyst | ORGANIZATION | 0.5+
Girls | ORGANIZATION | 0.44+
Base | TITLE | 0.43+
Cube | TITLE | 0.28+

Dimitrios Stiliadis - OpenStack Summit 2013 - theCUBE


 

>> Okay, we're back live here at the OpenStack Summit in Portland, Oregon. I'm John Furrier, the founder of SiliconANGLE.com, with my co-host Dave Vellante from Wikibon.org. This is SiliconANGLE's theCUBE, our flagship program. We go out to the events and extract the signal from the noise, and certainly here at OpenStack there's not a lot of noise but a lot of signal, a lot of developers, a lot of use cases, really the alpha geeks, the practitioners, putting new technology into place to power this modern era of computing: cloud, mobile, and social. David Floyer, we're here with Dimitrios Stiliadis from Nuage Networks in Mountain View. Welcome to theCUBE. >> Thank you. >> David, I want to get your take on this before we set up this interview, because honestly we've heard from RightScale just previously on the management side, we've had Rackspace on earlier on the provider side, we had Big Switch on software-defined networking, and now Dimitrios' company. The software is eating the world. What's your take on the SDN market right now, relative to OpenStack, relative to open source? >> Well, what you're clearly wanting to do in every part of it is separate out all of the different layers. You ought to be able to separate out the physical and the logical, and software is the way that that's going to be done. So instead of having a switch which is a piece of hardware and the software together, you want to separate the two out so that you have the logical function and the physical function as two pieces. That's very important: to be able to contribute to every layer, take new technologies along with you, and then define the software element as the piece that you keep constant as the technologies themselves adjust. So durable code that you can manage and build on, and that can take advantage of new technologies as they come along. >> And obviously, coming back to you, what are you contributing, what do you think needs to be contributed? What's the white space in that area that you're going after? >> Right. So when people started thinking about the cloud and OpenStack, they quickly realized that the network is a fundamental piece, right? You have to start with the network, you have to interconnect your components and so on. The angle that we are taking is, yes, within your data center, within your cloud, you have to create these network services, interconnect applications and so on, but much more importantly you need to be able to dynamically connect these applications with your existing network services, right? You have a large amount of enterprise VPN services, you have hybrid clouds coming out, so you need to be able, the moment you activate a network service in the data center, to seamlessly interconnect it with your enterprise side, with other network services in other data centers, in other clouds, and so on. The network is always a network of networks, and we have to bring everything together. We cannot just restrict ourselves to the confinements of a single administrative model. That's a fundamental part of what we are trying to bring here together. >> Okay, and so how are you fitting in with the network layer? >> Right, so our view is that first of all we need to talk both languages. Think of it as a translation thing, right? We need to understand the language of the cloud, the language of the application developers in the cloud. They want to use some abstract mechanism to define their network services and install them, if you want, in the hypervisors, and OpenStack Quantum seems to be the prevalent way to do that. So that's language number one. But then we have all these thousands of networks out there where the language is BGP. So what we are doing is marrying the two. We allow you to go and define services in OpenStack, and we allow you to define the mechanisms to interconnect these services automatically with all the other networks that are out there. So I call it sometimes, we are just translating between languages. >> All right, a language translator. From an application point of view, they want to consume resources, and previously computing and storage were the main things they consumed, but now it seems that networks themselves have to play a much bigger role in providing a quality of service to those places. You've got quality of service down in the nanoseconds when you get to the server level, and what used to be milliseconds on the storage side is now coming down to microseconds. What are you doing to make sure of that quality of service, and it is not just the bandwidth but also the latency? Are you planning to marry that? >> See, the way data center networks are evolving, people are quickly realizing that the same principles, if you want, that we used to build the Internet itself can be used inside the data center. If you think about the Internet, there are voice services, video services, all these other services running, and they are actually running by assuming you have a well-engineered IP network and then you run the services at the edges, if you want. You push all the intelligence to the edges. It's the same place the network in the data center is going. The data center network becomes a very scalable IP fabric, very well managed, very well traffic engineered, and you push the edges to the hypervisors. You push essentially the services to the hypervisors, where traffic is differentiated. So if you see, for example, a tenant misbehaving, you are going to block him at the hypervisor layer. If you're going to prioritize or map different tenants to different classes of traffic, it's happening at the hypervisor. So the center of the network behaves like a scalable IP fabric, and all the intelligence is pushed around the edges. And the reason you want to do that is because it allows you the ultimate scalability, right? The network core doesn't need to know about every flow that goes through the core of the network. You don't need to know the IP addresses of virtual machines, you don't need to know what individual virtual machines want to do there, you just need to worry about aggregates. So you can engineer and scale the core and make it very cheap, and because you make it very cheap you can increase the capacity at the core, and you can distribute all the intelligence at the edges of the network. >> But, so you said that you can do the hypervisor, and that's obviously on the compute side of it, but what about the data network? Don't you need to regulate the priorities and flex all the data through, and isn't that a very big part of it? >> Yes, but it is still happening at the hypervisor, right? The first touch of an application with the network is not anymore the top-of-rack switch, let's say, in the data center; it is actually the hypervisor virtual switch. That's the first time that you see a packet. When a packet comes out of a virtual machine, the first time you see it is at the hypervisor itself, and this layer, where you first see the packet, is where you apply all your policies. In other words, the edge of the network is not the hardware, it is not the switch at the top of the rack; the edge of the network is inside the server now. >> Okay, excellent. So I want to ask you, we have a couple minutes left here, I want to get your perspective on the state of the business around OpenStack. What is your view? Because as chief architect you're looking at the tech, but you have to intersect the business objectives. What are you seeing as the core business drivers that are causing you to make your technology in a certain way? >> Right, so it's clear that what people want to do is provide the ability for their end users to consume services rapidly. That is what is driving this whole OpenStack development, and more important, the community came together in order to unify, if you want, the core engine and the core APIs, in order to make this consumption of services very easy and to allow application developers to move from one cloud to the other and so on. What we try to do, in addition, is expand this model, making the network as consumable as the storage and compute facilities. And I'm not talking just about the network in the data center; I'm talking also about the way the service in the data center of a cloud provider will interconnect with the enterprise. If you see, the next holy grail, if you want, that everybody is talking about is the hybrid cloud, and the hybrid cloud is only possible if you can connect the network and the services in the service provider cloud with the network and services in the enterprise itself. What links the two together is the network, so we have to make this network consumable. >> Final question for you. Actually, DevOps is a mindset; we heard from RightScale that that adoption is reaching mainstream enterprises and service providers, but the phrase infrastructure as code is becoming more popular outside of the geeks, the architects, the coders. In your mind, how would you describe infrastructure as code to the folks out there? Give it a try, it's okay, there's no right answer. >> It's a moving target, that's what it is. The reality is that applications and code are a living organism, constantly changing, and you cannot assume at any point that it's static, right? It's not the good old days, if you want, and that's what it really means. It's a living organism; it will constantly adapt to the new requirements out there. With switches in the old days, you knew exactly the ports and you knew what was going on; now it's all kinds of weird stuff happening. You have to accept change, if you want. >> Actually, there's an Isaac Asimov quote right there, from the author of science fiction: the only constant is change. We should do a whole project just on the network genome here, software-defined networking. Dimitrios Stiliadis, thanks for jumping inside theCUBE. You're here with a lot of the chief architects making things happen, congratulations, thanks for joining us. >> Thank you. >> We'll be right back with more analysis from David Floyer after the break, with a breakdown of day one and day two here in more depth from the analysts at OpenStack. SiliconANGLE and Wikibon's exclusive coverage of the OpenStack Summit, be right back.
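As a concrete illustration of "the language of the cloud" Stiliadis describes, the sketch below defines a tenant network, subnet, and router through OpenStack's networking API. It is a minimal, hedged example using today's openstacksdk Python client rather than the 2013-era Quantum CLI discussed in the interview; the cloud name, network names, and addressing are illustrative assumptions, not details from the conversation, and the BGP/VPN stitching he describes would happen inside whatever SDN plugin sits behind these calls.

    # Minimal sketch: declaring an abstract network service in OpenStack.
    # Assumes a cloud named "my-openstack" in clouds.yaml and an existing
    # external network called "public"; both names are illustrative.
    import openstack

    # Credentials and region are read from clouds.yaml for the named cloud.
    conn = openstack.connect(cloud="my-openstack")

    # Declare the abstract network; the networking plugin decides how it is realized.
    net = conn.network.create_network(name="web-tier")

    # Attach an IPv4 subnet so workloads on this network can be addressed.
    subnet = conn.network.create_subnet(
        network_id=net.id,
        name="web-tier-v4",
        ip_version=4,
        cidr="10.10.1.0/24",
    )

    # A router with an external gateway is the hand-off point where an SDN
    # controller could stitch this tenant network into VPN or BGP domains
    # outside the data center, the "translation" role described above.
    router = conn.network.create_router(
        name="web-tier-router",
        external_gateway_info={"network_id": conn.network.find_network("public").id},
    )
    conn.network.add_interface_to_router(router, subnet_id=subnet.id)

    print(f"Created network {net.id} with subnet {subnet.cidr}")

In the architecture described in the interview, everything beneath this abstraction, enforcing per-tenant policy at the hypervisor virtual switch and advertising reachability to networks outside the data center, is the job of the edge, not of the application developer issuing these calls.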

Published Date : Apr 16 2013

**Summary and Sentiment Analysis are not shown because of an improper transcript**

ENTITIES

Entity | Category | Confidence
David | PERSON | 0.99+
Isaac Asimov | PERSON | 0.99+
Dimitrios Stiliadis | PERSON | 0.99+
David Floria | PERSON | 0.99+
two pieces | QUANTITY | 0.99+
Dec 2 | DATE | 0.99+
first time | QUANTITY | 0.99+
two | QUANTITY | 0.99+
both languages | QUANTITY | 0.99+
both | QUANTITY | 0.98+
Dmitry stylianos | PERSON | 0.98+
first touch | QUANTITY | 0.98+
Dimitri | PERSON | 0.98+
Dave | PERSON | 0.97+
OpenStack | TITLE | 0.97+
today | DATE | 0.97+
Wikibon | ORGANIZATION | 0.95+
thousands of networks | QUANTITY | 0.95+
OpenStack Summit 2013 | EVENT | 0.95+
first | QUANTITY | 0.93+
OpenStack | EVENT | 0.92+
single | QUANTITY | 0.92+
OpenStack summit | EVENT | 0.92+
a couple minutes | QUANTITY | 0.87+
Demetri stilly | PERSON | 0.86+
two minutes less | QUANTITY | 0.84+
one cloud | QUANTITY | 0.83+
every layer | QUANTITY | 0.82+
Portland Oregon | LOCATION | 0.81+
a lot of signal | QUANTITY | 0.77+
a lot of developers | QUANTITY | 0.77+
SiliconANGLE | ORGANIZATION | 0.76+
John furry | PERSON | 0.76+
Rackspace | ORGANIZATION | 0.72+
micro second | QUANTITY | 0.7+
Gibbons | PERSON | 0.66+
nano seconds | QUANTITY | 0.66+
OpenStack | ORGANIZATION | 0.65+
day 1 | QUANTITY | 0.61+
Omni | ORGANIZATION | 0.59+
noise | QUANTITY | 0.59+
theCUBE | ORGANIZATION | 0.57+
a lot | QUANTITY | 0.54+
part | QUANTITY | 0.52+
one | QUANTITY | 0.48+