Image Title

Search Results for PAL:

Dr. Sumon Pal, Thync - Zuora Subscribed 2017 (old)


 

(clicks) >> Hey welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco with Zuora Subscribe. About 2,000 people all focused on the subscription economy. And we're looking at some really cool products. We've had GE on, we're going to have Caterpillar on, but this is something new. You know, kind of these medical devices. Fitbit on steroids. I don't know how you describe it. Dr Sumon Pal. He is the cofounder and Chief Scientific Officer for Thync. Welcome. >> Thank you, thank you for having me. >> Absolutely. So give us a little bit of background on Thync, and then we'll jump into the device. >> Absolutely. So we're the first subscription service for wellness and better mental health. >> Okay. >> And the way it works is that there's hardware which is a neuromodulator, and they interface with your skin which is some pads and basically you put this on the back of your neck. There's software, there are programs that come along in the app and what these are are algorithms that have been developed to stimulate certain nerves in the right way. Those nerves in turn connect with your brain stem and that is the center for stress, for sleep cycles, for mood in general. And over the last five years we've developed a way to safely stimulate those nerves, such that you can sleep better, your mood is improved and you can de-stress. >> Okay, so let's back, back way up. You covered like a, you went the whole enchilada there. So you basically did some research. You guys figured out that nerve can stimulation can give better wellness. >> Right. >> And is that just during sleeping hours, during waking hours, all the above, kind of? >> Yeah, so it's both. I mean a session lasts about 10 to 15 minutes. >> Okay. >> In that time, what's happening is that it's dampening the stress response in your body. >> Jeff: Okay >> So if you do this on a daily basis or you do this in the evenings when you come home from work, you are kind of detaching from that stress that's built up during the day. >> Jeff: Without drinking a glass of wine or a bottle of beer. >> Absolutely. Without really any toxicity, without any side effects, without any addiction. Without any of the issues that come along with pills and substances. >> All those other things. Okay so then you put this thing on. >> That's right. >> So you put it on like right after you get home from work, or? >> Sure. >> Or when you go to sleep? Does it make a difference? >> Or if you just had a bad meeting. You had a rough morning. If there's kind of an acute occasion where you're anxious or highly stressed then you can use it then, too. >> Okay, so it's kind of yoga in a box. If I would be so presumptuous. >> Without any effort, right. >> No damage to the knees. >> Right, right. >> All right, super. So, a little bit in terms of the history of the company, so you said this is version two that you just came out with >> Right. Yeah, we've been developing the product, the technology in general for about five years. We've done three published studies. We've tested thousands of subjects. The first product has over two million minutes of use without any adverse side effects or, you know, we know that it's a really safe and powerful method to help people. >> Okay, and what does it retail for? >> So the hardware costs 149. >> Okay. >> And then there's the subscription. And the subscription is because there's a consumable involved, which are these pads. Which are actually a proprietary formulation so that this is absolutely painless, absolutely comfortable. And we have algorithms, so you're actually streaming these programs and those programs are highly complex, changing over time and constantly being updated. So for the software, service, and for the pads you pay either $29.99 a month or you pay $19.99 a month depending on a longer commitment. >> Okay. And when you decided to go with the subscription pricing, versus just selling it and if I need more pads, I order a 12-pack of pads or whatever. What were some of the things you thought about and then what are some of the outcomes that you have found? Both kind of expected and unexpected in having a subscription relationship with your customers. >> Yeah, it's a great question. So, one of the things that's really important about, so stress leads to a huge number of health issues. Everything from cardiovascular issues to being linked with diabetes, to being linked with premature aging. And so it's important to chronically reduce your stress levels. And you want to have all the components around when you need it. It's not one of those things where you've had a terrible day, you're extremely anxious. You know, you want everything to be there. You don't want to go and then order some pads online, order what you need online. >> Right, right. >> So that's one aspect, and the second is that you want access to the programs that are being updated all of the time. And what we find is that when people are on a subscription service, that kind of constant use which is so critical for your health, mental health, general well-being, is maintained in a better way than if you're kind of having to reorder these things or buy them. So really it's about supporting and promoting this kind of continuous regular use and routines. >> And I would presume that then you also get the benefit too 'cause you're getting all those data. >> Absolutely. >> Points that are feeding your algorithms. >> Absolutely. >> So you can make changes to the application, changes to the algorithm. >> That's right and also we have a library, about a thousand programs. And it's also about, we can, for any customer switch out the programs that they have if it's not working for whatever reason. So to kind of rescue people, it's also important to get that data of, what is happening month to month. >> So is a program the sequence of, of, I don't want to say charges but stimulations or whatever. >> Yeah, that's right. >> That set a different pattern, a different frequency and that creates like a program. >> That's right. >> And you experiment to find out what works best for you? >> That's right, it's a lot like music. It's a stimulation pattern that's built in blocks and those blocks change over time. And that is one of the things that we figured out how to do, that no one really had done before. >> Alright, well, pretty exciting stuff. >> Thank you. >> I look forward to watching you guys grow and see how things continue to progress. >> Absolutely, thank you very much. >> Alright, thanks for stopping by theCUBE. Alright, he's Dr. Sumon Pal, I'm Jeff Rick. You're watching theCUBE from Zuora Subscribe. Thanks for watching. (clicks)

Published Date : Jun 8 2017

SUMMARY :

I don't know how you describe it. and then we'll jump into the device. So we're the first subscription service and basically you put this on the back of your neck. So you basically did some research. I mean a session lasts about 10 to 15 minutes. it's dampening the stress response in your body. So if you do this on a daily basis Jeff: Without drinking a glass of wine Without any of the issues that come along with Okay so then you put this thing on. or highly stressed then you can use it then, too. Okay, so it's kind of yoga in a box. that you just came out with you know, we know that it's a really safe or you pay $19.99 a month depending on a longer commitment. and then what are some of the outcomes that you have found? And you want to have all the components around and the second is that you want access And I would presume that then you also get the benefit too Points that are feeding So you can make changes So to kind of rescue people, So is a program the sequence of, of, that creates like a program. And that is one of the things that we figured out watching you guys grow Thanks for watching.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Jeff FrickPERSON

0.99+

Jeff RickPERSON

0.99+

12-packQUANTITY

0.99+

Sumon PalPERSON

0.99+

2017DATE

0.99+

JeffPERSON

0.99+

three published studiesQUANTITY

0.99+

ZuoraPERSON

0.99+

ThyncORGANIZATION

0.99+

GEORGANIZATION

0.99+

Thync - ZuoraPERSON

0.99+

BothQUANTITY

0.98+

first productQUANTITY

0.98+

secondQUANTITY

0.98+

one aspectQUANTITY

0.98+

Dr.PERSON

0.98+

bothQUANTITY

0.98+

oneQUANTITY

0.98+

about five yearsQUANTITY

0.97+

About 2,000 peopleQUANTITY

0.97+

$19.99 a monthQUANTITY

0.97+

about a thousand programsQUANTITY

0.96+

15 minutesQUANTITY

0.96+

over two million minutesQUANTITY

0.95+

$29.99 a monthQUANTITY

0.94+

about 10QUANTITY

0.92+

149QUANTITY

0.89+

San FranciscoLOCATION

0.89+

first subscriptionQUANTITY

0.88+

ZuoraORGANIZATION

0.86+

thousands of subjectsQUANTITY

0.83+

CaterpillarCOMMERCIAL_ITEM

0.81+

theCUBEORGANIZATION

0.79+

last five yearsDATE

0.78+

one of the thingsQUANTITY

0.74+

DrPERSON

0.74+

version twoOTHER

0.72+

FitbitORGANIZATION

0.36+

Chris Lilley, Grant Thornton | Inforum DC 2018


 

(upbeat techno music) >> Live, from Washington D.C., it's theCUBE. Covering Inforum DC 2018 bought to you by Infor. >> Well Welcome back here on theCUBE as we continue our coverage here at Inforum 2018. We are in DC, nation's capitol. Kind of sandwiched between the Capitol Hill and the White House, where there is never a dull moment these days. (laughing) >> John Walls with Dave Vellante and we are joined by Chris Lilley, who is the national managing principle of tech solutions at Grant Thornton. Chris, good to see you, thanks for joining us. >> Good to see you, thank you. >> Yeah, so first off, let's just talk about the relationship, Grant Thornton and Infor. Still fairly new? >> Yes. >> It's been about a year, a year and a half, in the making. >> It's been slightly over a year. >> Yeah, let's talk about how that began and then kind of a status update, where you are right now? >> Sure. Well, it began about a year ago, around that time that Coke made an investment into Infor and Grant Thornton was looking at expanding our technology footprint, looking at other vendors who were providing solutions to the clients that, you know, the we serve. We also saw that Infor has a very, very common client base with Grant Thornton and we spent a few days with Gardner, we spend a few days with Forrester; learned about their products, learned where they were, were very impressed and decided to make a commitment to the relationship. It's been a terrific first year with Infor. >> I talked to one of the principles last year of Coke, PAL, and he said to me that one of the benefits that we're going to bring to Infor is that we have relationships with guys like Grant Thornton. We're not going to get him in a headlock, but we're going to expose them to Infor and say, "Hey look, look for opportunities," because we think they exist and that that's what you found, right? >> 100%. To elaborate a little on the story, we spent a few days with Coke out in Wichita, understood what they saw in Infor and obviously we were aware of Infor, aware of their product base, but what they have done with the product over the past four or five years? Frankly, news to us. And where they've taken the product, the investments they've made, the other products that they've acquired around their core, the kind of edge products, if you will, absolutely tremendous and decided to make that investment. So it wasn't so much of an arm twist. >> Right >> It was some awareness that they created for us and we decided to jump in. >> What was your, all be, you know, you ah-ha that you said because you spent a little bit of time? >> Mm-hmm. >> Doing your due diligence and working, again, with the Coke folks, so, what was it that got your attention you think? I tell ya, there's really something here. >> Yeah, I think what put us over the top, is we we brought our leadership team up to New York for a few days, spent a little bit of time with Charles Phillips, who is incredibly impressive and can probably sell anything to anybody. But we really spent time with their hook and loop folks and their developers. And when we saw kind of the brainchild of hook and loop, which I don't know if you're familiar with what this? >> The in-house agency, sure. >> Yeah, the in-house-agency and what they are doing to make the product more user-friendly, to make it more engaging. When you look at the world that we live in right now, you know, I see a phone here, everything's easy to use and intuitive. Business applications are not. Now, it's a lot harder issue we're dealing with, but what they've done with the interface, what they've done with the usability kind of, that was our ah-ha moment. They showed us a couple other things that they have done for specific clients with their analytics tool set and how they've integrated that in some dashboarding and we were committed at that point. >> So talk about Grant Thornton's unique approach in terms of how you're applying Infor with clients. What's hot? You know, any specific industries and trends that you're seeing. >> Sure. What we wanted to do is we wanted to make sure that when we made the commitment, we followed through on that commitment. We very narrowly focused our initial relationship with with Infor. Our industry focus is healthcare, public sector. Our product focus is the cloud suite products along with the enterprise asset management product. By focusing on the enterprise asset management product, that allows us to get into the asset intensive industries. So, utilities, anything with large fleets, public sector munies that are managing infrastructure. So we made that commitment very narrowly so that we weren't trying to be too many things to too many people and we could really commit to them, make the investment that we needed to make. We obviously had a technology practice so we know how to do this work and the way I think about technology practices today is they're really there to transform businesses, right? We used to spend a lot of time making technology work. Technology works. Now we've got to make sure that our clients step back from what they do today, leverage the best practice in the technology, or the leading practice in the technology, and transform their business around it. That's how we've approached the relationship with Infor. >> Well that's interesting because we heard Charles' keynote day one, and he talked on theCUBE about the disparity between the number of jobs that are out there and the number of candidates that are qualified, so there's a disparity there and then he showed productivity numbers and I remember back in, I don't know what it was, 80's or 90's, whatever it was, before the PC kicked in. >> Mm-hmm. >> In a big way, in terms of productivity impact. The spending was going through the roof, but you couldn't see it in the productivity and you're sort of seeing the same thing today. The tech market's booming, but the productivity numbers are relatively flat, so the promise is that, okay, we're going to have efficiencies out of cloud, you know, all this data that we've been collecting for all this time applying machine intelligence is going to drive, we've predicted, productivity. >> Right >> The next sort of big wave. It's kind of your job to make that all happen. >> Yeah, and so, I'm guilty. I've been in this industry a long time. I've seen the waves from the Y2K to the ERPs, to when we went to distributed internet, so I've seen all that. Absolutely agree, the productivity gains haven't been there but I would say that foundation is now laid. If you think about what we did during that time frame, we got our clients onto fairly common platform, somewhat consistent practices, right? They did a lot of custom work still, but we also cleaned up a lot of data, but what we did at that point, is we did it in silos. And enterprises don't run in silos. They have to run at the enterprise level. We've got the foundation laid now, we're now to the next generation. The next generation says your basic transaction processing systems? Use 'em as they come. Let's look at what's available to us. Let's look at the partner ecosystem that's out there. Let's look at the connectivity that's out there. Let's look at how we can better engage our client base and better run our operations and that's where I think we're going to start to see the productivity and that's what Infor is doing with their last mile functionality, they're taking the need to spin any customization away from the client, they're givin' it to 'em but they're letting us think about how to transform the business and drive value. >> You talked about utilities, which is a unique animal unto itself, right? From the regulatory environment, from their various services, what they provide and the scale they provide it at? Where can Infor come in and play in that space in terms of people being receptive to new ideas, being receptive to new mousetraps when, you know, sometimes they're bound too. >> Right. >> By what they can and can't do. >> Right, that's a good question. So utilities an interesting industry, right? Everybody says utilities are behind, they are slow to adapt. But if you think about the utility and fundamentally what they do, they're one of the most complex advanced engineering businesses that you can find in the world, right? From the generation to the distribution of power is a highly complex activity that they do extremely well. So they've made a ton of investment to make sure they keep doing that extremely well, deliver power safely. We got to renew the infrastructure so they got to spend money there and that's where we see Infor coming in. If you think about what's out there right now, all the sensors that we can put in to the generation facilities, all the devices that we can use. We can use drones to look at the solar farms, figure out where the maintenance needs to be done. I think what you're going to see is Infor product being adapted into how they operate the business. Analytics being applied to how they manage their maintenance facility, which is critical in utility. Analytics being brought in to how they prepare for storms. If you think about the recovery, what we just went through in the south. You know, 800,000 people out? Relatively quick recovery there. Now it's painful, and everybody's not back, I'm not saying it's easy but the utilities down there used a lot of information to better position crews for recovery. I think that's how you're going to see it on the operational side. On the customer side, you're going to see utilities do more and more what everybody else is doing. How do you want to interact with me? When do you want to interact with me? Where do you want to interact with me? Utilities will start putting all that out there and they are putting it out there. The websites are good, they're starting to go to mobility. So I think Infor products will play across that entire space. >> You're right about the utilities, I mean the instrumentation of the homes through smart meters, I mean what a transformation in the last 10 years? Five to 10 years, even. >> Yep. >> And it's all about the data. It always come back to data. (laughing) Healthcare and public sector, utilities as well, highly regulated industries. >> Yes. >> That you chose. By design, I presume. >> Yes. >> Talk about that in terms of Grant Thornton's wheelhouse. >> Yeah, we chose healthcare and public sector because we have good existing practices. Specific in healthcare space, we were doing a lot of epic cerner work, which is their ERM systems >> Yeah. >> That are out there. Lawson is by far the leading product in their ERP back office. So it made a natural fit for us to jump into that. Grant Thornton also has a very large public sector practice, both at the federal and state local level, so again, it gave us an avenue to get in, bring Infor into some of our existing clients. But back to your point about being regulated environments, Grant Thornton is basically a public accounting firm so we're used to dealing in regulatory environment, that's part of our culture. Quality is what we focus on as a firm. We understand how to interact with the regulators. Personally, I think, things are moving so quickly that the regulators, in some cases, are still catching up. But the one piece of advice I would have to all of clients out there that operate in the regulated world, rely on your partners. Rely on your software provider, your internal audit, your external audit, your systems integrator to help you keep current with the regulatory changes. On the tail of that is all the exposure on the cyber side. If you think about what's going on, you've mentioned in home devices, smart meters, those are all access points so we've got to really harden the access and the infrastructure to make sure that people aren't using those to gain control of these systems. >> Yeah the threat matrix is expanding. >> The matrix is huge. >> And then, you know, securing the data. (laughing) Security, in many ways, is do over, right? (laughing) In this new world. >> And just looking forward, and briefly if you will, before we let you go? >> Yep. >> Where do you see the relationship going then? Because you've established your verticals, you know where you're working, you know what's going on. What's next step then? Because there's always something else down the road, right? >> Yeah, so in our industry, we've got some terrific competitors out there who have also engaged with Infor. There's some other products out there. So I think what we need to focus on now, we've got the relationship, Infor is an incredible company, they're incredibly collaborative. They're agile. We recently were working with a healthcare provider who was dealing with some of the personnel issues you were talking about, resource shortages. How do I optimize scheduling? Who do I need? Where do I need 'em? Infor was all over it. They brought in their chief nursing officer, she helped us think through how to better manage that, used their workforce management product. So, where we want to go with them is we want to innovate with them. We want to bring the innovation that we're applying, whether it's robotics in terms of bots, whether it's digital transformation which are all buzzwords, and leverage all that. But the other thing I think we're starting to really get our arms around is the broader ecosystem. They're all cloud enabled. There are a significant number of niche players out there that can bring us point solutions. You know, you mentioned the data? The data's the key to all that so we want to help them understand, architect that. Use the technology to solve our client's business problems. >> And you know these buzzwords are actually, there's substance behind them. I mean, every company is trying to get digital, right? >> Yes. >> Every company has, or should have, a digital strategy, is tying to figure out and seize pathways to, maybe not monetizing data directly but figuring out how data contributes to monetization. Software robots are real. They work. >> Right. >> Not perfect, chat bots aren't perfect but they're getting better, and better and better. You look at things like fraud detection, how far that's come just in the last five or six years? You pointed out earlier, Chris, the technology is there, it works. It's not a mystery anymore, right? I've been around a long time, too And technology used to be so mysterious and nobody knew how it worked. The Wall Street analysts, it was like, how's this tech work? Today, it's ubiquitous. >> Yes, agree, absolutely. >> It's the process, it's the people, it's the collaboration, that's the hard part. >> Yeah, I mean you said it earlier, it's getting businesses to adopt what they do, right? To really focus on where they can add value and get the people to come along. >> Chris, thank you. >> Yeah, thank you. >> Appreciate the time. >> Sure. >> And enjoy the rest of the show and again, we do thank you for the time here today. >> Okay, take care. >> Good deal, alright. Back with more here, you're watching theCUBE from Washington D.C. (upbeat techno music)

Published Date : Sep 27 2018

SUMMARY :

bought to you by Infor. and the White House, where there and we are joined by Chris Lilley, about the relationship, Grant Thornton and Infor. we spend a few days with Forrester; that one of the benefits that we're going to bring To elaborate a little on the story, we spent a few days that they created for us and we decided to jump in. so, what was it that got your attention you think? and can probably sell anything to anybody. Yeah, the in-house-agency and trends that you're seeing. make the investment that we needed to make. and the number of candidates that are qualified, are relatively flat, so the promise is that, It's kind of your job to make that all happen. from the client, they're givin' it to 'em and the scale they provide it at? From the generation to the distribution of power I mean the instrumentation of the homes And it's all about the data. That you chose. Specific in healthcare space, we were doing and the infrastructure to make sure securing the data. Where do you see the relationship going then? The data's the key to all that And you know these buzzwords are actually, but figuring out how data contributes to monetization. how far that's come just in the last five or six years? it's the collaboration, that's the hard part. and get the people to come along. and again, we do thank you for the time here today. Back with more here, you're watching theCUBE

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
ChrisPERSON

0.99+

Chris LilleyPERSON

0.99+

Dave VellantePERSON

0.99+

ForresterORGANIZATION

0.99+

DCLOCATION

0.99+

InforORGANIZATION

0.99+

John WallsPERSON

0.99+

WichitaLOCATION

0.99+

FiveQUANTITY

0.99+

Charles'PERSON

0.99+

New YorkLOCATION

0.99+

Washington D.C.LOCATION

0.99+

CokeORGANIZATION

0.99+

Grant ThorntonPERSON

0.99+

Capitol HillLOCATION

0.99+

todayDATE

0.99+

Charles PhillipsPERSON

0.99+

last yearDATE

0.99+

over a yearQUANTITY

0.99+

bothQUANTITY

0.98+

800,000 peopleQUANTITY

0.98+

TodayDATE

0.98+

10 yearsQUANTITY

0.98+

one pieceQUANTITY

0.98+

oneQUANTITY

0.98+

first yearQUANTITY

0.98+

about a yearQUANTITY

0.98+

White HouseLOCATION

0.98+

Y2KEVENT

0.97+

a year and a halfQUANTITY

0.96+

firstQUANTITY

0.95+

PALORGANIZATION

0.93+

90'sDATE

0.92+

wavesEVENT

0.91+

Grant ThorntonORGANIZATION

0.9+

LawsonORGANIZATION

0.89+

six yearsQUANTITY

0.86+

day oneQUANTITY

0.85+

2018DATE

0.84+

ThorntonPERSON

0.78+

about a year agoDATE

0.78+

GrantORGANIZATION

0.75+

100%QUANTITY

0.75+

last 10 yearsDATE

0.73+

five yearsQUANTITY

0.72+

GardnerORGANIZATION

0.71+

Inforum 2018EVENT

0.71+

pastDATE

0.7+

Wall StreetORGANIZATION

0.69+

80'sDATE

0.68+

waveEVENT

0.65+

2018EVENT

0.62+

InforumEVENT

0.56+

bigEVENT

0.53+

fourQUANTITY

0.51+

fiveQUANTITY

0.5+

tonQUANTITY

0.49+

lastDATE

0.47+

InforumORGANIZATION

0.31+

Bina Hallman & Steven Eliuk, IBM | IBM Think 2018


 

>> Announcer: Live, from Las Vegas, it's theCUBE. Covering IBM Think 2018. Brought to you by IBM. >> Welcome back to IBM Think 2018. This is theCUBE, the leader in live tech coverage. My name is Dave Vellante and I'm here with Peter Burress. Our wall-to-wall coverage, this is day two. Everything AI, Blockchain, cognitive, quantum computing, smart ledger, storage, data. Bina Hallman is here, she's the Vice President of Offering Management for Storage and Software Defined. Welcome back to theCUBE, Bina. >> Bina: Thanks for having me back. >> Steve Elliot is here. He's the Vice President of Deep Learning in the Global Chief Data Office at IBM. >> Thank you sir. >> Dave: Welcome to the Cube, Steve. Thanks, you guys, for coming on. >> Pleasure to be here. >> That was a great introduction, Dave. >> Thank you, appreciate that. Yeah, so this has been quite an event, consolidating all of your events, bringing your customers together. 30,000 40,000, too many people to count. >> Very large event, yes. >> Standing room only at all the sessions. It's been unbelievable, your thoughts? >> It's been fantastic. Lots of participation, lots of sessions. We brought, as you said, all of our conferences together and it's a great event. >> So, Steve, tell us more about your role. We were talking off the camera, we've had here Paul Bhandari on before, Chief Data Officer at IBM. You're in that office, but you've got other roles around Deep Learning, so explain that. >> Absolutely. >> Sort of multi-tool star here. >> For sure, so, roles and responsibility at IBM and the Chief Data Office, kind of two pillars. We focus in the Deep Learning group on foundation platform components. So, how to accelerate the infrastructure and platform behind the scenes, to accelerate the ideation or product phase. We want data scientists to be very effective, and for us to ensure our projects very very quickly. That said, I mentioned projects, so on the applied side, we have a number of internal use cases across IBM. And it's not just hand vault, it's in the orders of hundreds and those applied use cases are part of the cognitive plan, per se, and each one of those is part of the transformation of IBM into our cognitive. >> Okay, now, we were talking to Ed Walsh this morning, Bina, about how you collaborate with colleagues in the storage business. We know you guys have been growing, >> Bina: That's right. >> It's the fourth quarter straight, and that doesn't event count, some of the stuff that you guys ship on the cloud in storage, >> That's right, that's right. >> Dave: So talk about the collaboration across company. >> Yeah, we've had some tremendous collaboration, you know, the broader IBM and bringing all of that together, and that's one of the things that, you know, we're talking about here today with Steve and team is really as they built out their cognitive architecture to be able to then leverage some of our capabilities and the strengths that we bring to the table as part of that overall architecture. And it's been a great story, yeah. >> So what would you add to that, Steve? >> Yeah, absolutely refreshing. You know I've built up super computers in the past, and, specifically for deep learning, and coming on board at IBM about a year ago, seeing the elastic storage solution, or server. >> Bina: Yeah, elastic storage server, yep. >> It handles a number of different aspects of my pipeline, very uniquely, so for starters, I don't want to worry about rolling out new infrastructure all the time. I want to be able to grow my team, to grow my projects, and that's what nice about ESS is it's distensible, I'm able to roll out more projects, more people, multi-tenancy et cetera, and it supports us effectively. Especially, you know, it has very unique attributes like the read only performance feed, and random access of data, is very unique to the offering. >> Okay, so, if you're a customer of Bina's, right? >> I am, 100%. >> What do you need for infrastructure for Deep Learning, AI, what is it, you mentioned some attributes before, but, take it down a little bit. >> Well, the reality is, there's many different aspects and if anything kind of breaks down, then the data science experience breaks down. So, we want to make sure that everything from the interconnect of the pipelines is effective, that you heard Jensen earlier today from Nvidia, we've got to make sure that we have compute devices that, you know, are effective for the computation that we're rolling out on them. But that said, if those GPUs are starved by data, that we don't have the data available which we're drawing from ESS, then we're not making effective use of those GPUs. It means we have to roll out more of them, et cetera, et cetera. And more importantly, the time for experimentation is elongated, so that whole idea, so product timeline that I talked about is elongated. If anything breaks down, so, we've got to make sure that the storage doesn't break down, and that's why this is awesome for us. >> So let me um, especially from a deep learning standpoint, let me throw, kind of a little bit of history, and tell me if you think, let me hear your thoughts. So, years ago, the data was put as close to the application as possible, about 10, 15 years ago, we started breaking the data from the application, the storage from the application, and now we're moving the algorithm down as close to the data as possible. >> Steve: Yeah. >> At what point in time do we stop calling this storage, and start acknowledging that we're talking about a fabric that's actually quite different, because we put a lot more processing power as close to the data as possible. We're not just storing. We're really doing truly, deeply distributing computing. What do you think? >> There's a number of different areas where that's coming from. Everything from switches, to storage, to memory that's doing computing very close to where the data actually residents. Still, I think that, you know, this is, you can look all the way back to Google file system. Moving computation to where the data is, as close as possible, so you don't have to transfer that data. I think that as time goes on, we're going to get closer and closer to that, but still, we're limited by the capacity of very fast storage. NVMe, very interesting technology, still limited. You know, how much memory do we have on the GPUs? 16 gigs, 24 is interesting, 48 is interesting, the models that I want to train is in the 100s of gigabytes. >> Peter: But you can still parallelize that. >> You can parallelize it, but there's not really anything that's true model parallelism out there right now. There's some hacks and things that people are doing, but. I think we're getting there, it's still some time, but moving it closer and closer means we don't have to spend the power, the latency, et cetera, to move the data. >> So, does that mean that the rate of increase of data and the size of the objects we're going to be looking at, is still going to exceed the rate of our ability to bring algorithms and storage, or algorithms and data together? What do you think? >> I think it's getting closer, but I can always just look at the bigger problem. I'm dealing with 30 terabytes of data for one of the problems that I'm solving. I would like to be using 60 terabytes of data. If I could, if I could do it in the same amount of time, and I wasn't having to transfer it. With that said, if you gave me 60, I'd say, "I really wanted 120." So, it doesn't stop. >> David: (laughing) You're one of those kind of guys. >> I'm definitely one of those guys. I'm curious, what would it look like? Because what I see right now is it would be advantageous, and I would like to do it, but I ran 40,000 experiments with 30 terabytes of data. It would be four times the amount of transfer if I had to run that many experiments of 120. >> Bina, what do you think? What is the fundamental, especially from a software defined side, what does the fundamental value proposition of storage become, as we start pushing more of the intelligence close to the data? >> Yeah, but you know the storage layer fundamentally is software defined, you still need that setup, protocols, and the file system, the NFS, right? And, so, some of that still becomes relevant, even as you kind of separate some of the physical storage or flash from the actual compute. I think there's still a relevance when you talk about software defined storage there, yeah. >> So you don't expect that there's going to be any particular architectural change? I mean, NVMe is going to have a real impact. >> NVMe will have a real impact, and there will be this notion of composable systems and we will see some level of advancement there, of course, and that's around the corner, actually, right? So I do see it progressing from that perspective. >> So what's underneath it all, what actually, what products? >> Yeah, let me share a little bit about the product. So, what Steve and team are using is our elastic storage server. So, I talked about software defined storage. As you know, we have a very complete set of software defined storage offerings, and within that, our strategy has always been allow the clients to consume the capabilities the way they want. A software only on their own hardware, or as a service, or as an integrated solution. And so what Steve and team are using is an integrated solution with our spectrum scale software, along with our flash and power nine server power systems. And on the software side from spectrum scale, this is a very rich offering that we've had in our portfolio. Highly scalable file system, it's one of the solutions that powers a lot of our supercomputers. A project that we are still in the process and have delivered on around Whirl, our national labs. So same file system combined with a set of servers and flash system, right? Highly scalable, erasure coding, high availability as well as throughput, right? 40 gigabytes per second, so that's the solution, that's the storage and system underneath what Steve and team are leveraging. >> Steve, you talk about, "you want more," what else is on Bina's to-do-list from your standpoint? >> Specifically targeted at storage, or? >> Dave: Yeah, what do you want from the products? >> Well, I think long stretch goals are multi-tenancy and the wide array of dimensions that, especially in the chief data office, that we're dealing with. We have so many different business units, so many different of those enterprise problems in the orders of hundreds how do you effectively use that storage medium driving so many different users? I think it's still hard, I think we're doing it a hell of a lot better than we ever have, but it's still, it's an open research area. How do you do that? And especially, there's unique attributes towards deep learning, like, most of the data is read only to a certain degree. When data changes there's some consistency checks that could be done, but really, for my experiment that's running right now, it doesn't really matter that it's changed. So there's a lot of nuances specific to deep learning that I would like exploited if I could, and that's some of the interactions that we're working on to kind of alleviate those pains. >> I was at a CDO conference in Boston last October, and Indra Pal was there and he presented this enterprise data architecture, and there were probably about three or four hundred CDOs, chief data officers, in the room, to sort of explain that. Can you, sort of summarize what that is, and how it relates to sort of what you do on a day to day basis, and how customers are using it? >> Yeah, for sure, so the architecture is kind of like the backbone and rules that kind of govern how we work with the data, right? So, the realities are, there's no sort of blueprint out there. What works at Google, or works at Microsoft, what works at Amazon, that's very unique to what they're doing. Now, IBM has a very unique offering as well. We have so many, we're a composition of many, many different businesses put together. And now, with the Chief Data Office that's come to light across many organizations like you said, at the conference, three to 400 people, the requirements are different across the orders. So, bringing the data together is kind of one of the big attributes of it, decreasing the number of silos, making a monolithic kind of reliable, accessible entity that various business units can trust, and that it's governed behind the scenes to make sure that it's adhering to everyone's policies, that their own specific business unit has deemed to be their policy. We have to adhere to that, or the data won't come. And the beauty of the data is, we've moved into this cognitive era, data is valuable but only if we can link it. If the data is there, but there's no linkages there, what do I do with it? I can't really draw new insights. I can't draw, all those hundreds of enterprise use cases, I can't build new value in them, because I don't have any more data. It's all about linking the data, and then looking for alternative data sources, or additional data sources, and bringing that data together, and then looking at the new insights that come from it. So, in a nutshell, we're doing that internally at IBM to help our transformation. But at the same time creating a blueprint that we're making accessible to CDOs around the world, and our enterprise customers around the world, so they can follow us on this new adventure. New adventure being, you know, two years old, but. >> Yeah, sure, but it seems like, if you're going to apply AI, you've got to have your data house in order to do that. So this sounds like a logical first step, is that right? >> Absolutely, 100%. And, the realities are, there's a lot of people that are kicking the tires and trying to figure out the right way to do that, and it's a big investment. Drawing out large sums of money to kind of build this hypothetical better area for data, you need to have a reference design, and once you have that you can actually approach the C-level suite and say, "Hey, this is what we've seen, this is the potential, "and we have an architecture now, "and they've already gone down all the hard paths, "so now we don't have to go down as many hard paths." So, it's incredibly empowering for them to have that reference design and learning from our mistakes. >> Already proven internally now, bringing it to our enterprise alliance. >> Well, and so we heard Jenny this morning talk about incumbent disruptors, so I'm kind of curious as to what, any learnings you have there? It's early days, I realize that, but when you think about, the discussions, are banks going to lose control of the payment systems? Are retail stores going to go away? Is owning and driving your own vehicle going to be the exception, not the norm? Et cetera, et cetera, et cetera, you know, big questions, how far can we take machine intelligence? Have you seen your clients begin to apply this in their businesses, incumbents, we saw three examples today, good examples, I thought. I don't think it's widespread yet, but what are you guys seeing? What are you learning, and how are you applying that to clients? >> Yeah, so, I mean certainly for us, from these new AI workloads, we have a number of clients and a number of different types of solutions. Whether it's in genomics, or it's AI deep learning in analyzing financial data, you know, a variety of different types of use cases where we do see clients leveraging the capabilities, like spectrum scale, ESS, and other flash system solutions, to address some of those problems. We're seeing it now. Autonomous driving as well, right, to analyze data. >> How about a little road map, to end this segment? Where do you want to take this initiative? What should we be looking for as observers from the outside looking in? >> Well, I think drawing from the endeavors that we have within the CDO, what we want to do is take some of those ideas and look at some of the derivative products that we can take out of there, and how do we kind of move those in to products? Because we want to make it as simple as possible for the enterprise customer. Because although, you see these big scale companies, and all the wonderful things that they're doing, what we've had the feedback from, which is similar to our own experiences, is that those use cases aren't directly applicable for most of the enterprise customers. Some of them are, right, some of the stuff in vision and brand targeting and speech recognition and all that type of stuff are, but at the same time the majority and the 90% area are not. So we have to be able to bring down sorry, just the echoes, very distracting. >> It gets loud here sometimes, big party going on. >> Exactly, so, we have to be able to bring that technology to them in a simpler form so they can make it more accessible to their internal data scientists, and get better outcomes for themselves. And we find that they're on a wide spectrum. Some of them are quite advanced. It doesn't mean just because you have a big name you're quite advanced, some of the smaller players have a smaller name, but quite advanced, right? So, there's a wide array, so we want to make that accessible to these various enterprises. So I think that's what you can expect, you know, the reference architecture for the cognitive enterprise data architecture, and you can expect to see some of the products from those internal use cases come out to some of our offerings, like, maybe IGC or information analyzer, things like that, or maybe the Watson studio, things like that. You'll see it trickle out there. >> Okay, alright Bina, we'll give you the final word. You guys, business is good, four straight quarters of growth, you've got some tailwinds, currency is actually a tailwind for a change. Customers seem to be happy here, final word. >> Yeah, no, we've got great momentum, and I think 2018 we've got a great set of roadmap items, and new capabilities coming out, so, we feel like we've got a real strong set of future for our IBM storage here. >> Great, well, Bina, Steve, thanks for coming on theCUBE. We appreciate your time. >> Thank you. >> Nice meeting you. >> Alright, keep it right there everybody. We'll be back with our next guest right after this. This is day two, IBM Think 2018. You're watching theCUBE. (techno jingle)

Published Date : Mar 21 2018

SUMMARY :

Brought to you by IBM. Bina Hallman is here, she's the Vice President He's the Vice President of Deep Learning Dave: Welcome to the Cube, Steve. Yeah, so this has been quite an event, Standing room only at all the sessions. We brought, as you said, all of our conferences together You're in that office, but you've got other roles behind the scenes, to accelerate the ideation in the storage business. and that's one of the things that, you know, seeing the elastic storage solution, or server. like the read only performance feed, AI, what is it, you mentioned some attributes before, that the storage doesn't break down, and tell me if you think, let me hear your thoughts. and start acknowledging that we're talking about a fabric the models that I want to train is in the 100s of gigabytes. to move the data. for one of the problems that I'm solving. and I would like to do it, protocols, and the file system, the NFS, right? So you don't expect that there's going to be and that's around the corner, actually, right? allow the clients to consume the capabilities and that's some of the interactions that we're working on and how it relates to sort of what you do on a and that it's governed behind the scenes you've got to have your data house in order to do that. that are kicking the tires and trying to figure out bringing it to our enterprise alliance. and how are you applying that to clients? leveraging the capabilities, like spectrum scale, ESS, and all the wonderful things that they're doing, So I think that's what you can expect, you know, Okay, alright Bina, we'll give you the final word. and new capabilities coming out, so, we feel We appreciate your time. This is day two, IBM Think 2018.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
StevePERSON

0.99+

Steve ElliotPERSON

0.99+

DavidPERSON

0.99+

Peter BurressPERSON

0.99+

Dave VellantePERSON

0.99+

IBMORGANIZATION

0.99+

AmazonORGANIZATION

0.99+

MicrosoftORGANIZATION

0.99+

DavePERSON

0.99+

Paul BhandariPERSON

0.99+

GoogleORGANIZATION

0.99+

BostonLOCATION

0.99+

Bina HallmanPERSON

0.99+

Indra PalPERSON

0.99+

60 terabytesQUANTITY

0.99+

90%QUANTITY

0.99+

16 gigsQUANTITY

0.99+

PeterPERSON

0.99+

100%QUANTITY

0.99+

2018DATE

0.99+

Ed WalshPERSON

0.99+

NvidiaORGANIZATION

0.99+

30 terabytesQUANTITY

0.99+

JennyPERSON

0.99+

threeQUANTITY

0.99+

60QUANTITY

0.99+

40,000 experimentsQUANTITY

0.99+

Steven EliukPERSON

0.99+

Las VegasLOCATION

0.99+

24QUANTITY

0.99+

BinaPERSON

0.99+

two yearsQUANTITY

0.99+

120QUANTITY

0.99+

48QUANTITY

0.99+

last OctoberDATE

0.99+

oneQUANTITY

0.98+

40 gigabytesQUANTITY

0.98+

first stepQUANTITY

0.98+

hundredsQUANTITY

0.97+

three examplesQUANTITY

0.97+

30,000 40,000QUANTITY

0.97+

todayDATE

0.97+

400 peopleQUANTITY

0.97+

four hundred CDOsQUANTITY

0.96+

WhirlORGANIZATION

0.95+

about 10, 15 years agoDATE

0.94+

this morningDATE

0.94+

about threeQUANTITY

0.92+

four timesQUANTITY

0.91+

years agoDATE

0.91+

100s of gigabytesQUANTITY

0.89+

fourth quarterDATE

0.89+

a year agoDATE

0.88+

four straight quartersQUANTITY

0.88+

Watson studioORGANIZATION

0.85+

day twoQUANTITY

0.84+

ESSORGANIZATION

0.83+

nine server power systemsQUANTITY

0.82+

Vice PresidentPERSON

0.78+

Wikibon Predictions Webinar with Slides


 

(upbeat music) >> Hi, welcome to this year's Annual Wikibon Predictions. This is our 2018 version. Last year, we had a very successful webinar describing what we thought was going to happen in 2017 and beyond and we've assembled a team to do the same thing again this year. I'm very excited to be joined by the folks listed here on the screen. My name is Peter Burris. But with me is David Floyer, Jim Kobielus is remote. George Gilbert's here in our Pal Alto studio with me. Neil Raden is remote. David Vellante is here in the studio with me. And Stuart Miniman is back in our Marlboro office. So thank you analysts for attending and we look forward to a great teleconference today. Now what we're going to do over the course of the next 45 minutes or so is we're going to hit about 13 of the 22 predictions that we have for the coming year. So if you have additional questions, I want to reinforce this, if you have additional questions or things that don't get answered, if you're a client, give us a call. Reach out to us. We'll leave you with the contact information at the end of the session. But to start things off we just want to make sure that everybody understands where we're coming from. And let you know who is Wikibon. So Wikibon is a company that starts with the idea of what's important as to research communities. Communities are where the action is. Community is where the change is happening. And community is where the trends are being established. And so we use digital technologies like theCUbE, CrowdChat and others to really ensure that we are surfacing the best ideas that are in a community and making them available to our clients so that they can succeed successfully, they can be more successful in their endeavors. When we do that, our focus has always been on a very simple premise. And that is that we're moving to an era of digital business. For many people, digital business can mean virtually anything. For us it means something very specific. To us, the difference between business and digital business is data. A digital business uses data to differentially create and keep a customer. So borrowing from what Peter Drucker said if the goal of business is to create customers and keep and sustain customers, the goal of digital business is to use data to do that. And that's going to inform an enormous number of conversations and an enormous number of decisions and strategies over the next few years. We specifically believe that all businesses are going to have establish what we regard as the five core digital business capabilities. First, they're going to have to put in place concrete approaches to turning more data into work. It's not enough to just accrete data, to capture data or to move data around. You have to be very purposeful and planful in how you establish the means by which you turn that data into work so that you can create and keep more customers. Secondly, it's absolutely essential that we build kind of the three core technology issues here, technology capabilities of effectively doing a better job of capturing data and IoT and people, or internet of things and people, mobile computing for example, is going to be a crucial feature of that. You have to then once you capture that data, turn it into value. And we think this is the essence of what big data and in many respects AI is going to be all about. And then once you have the possibility, kind of the potential energy of that data in place, then you have to turn it into kinetic energy and generate work in your business through what we call systems of agency. Now, all of this is made possible by this significant transformation that happens to be conterminous with this transition to digital business. And that is the emergence of the cloud. The technology industry has always been defined by the problems it was able to solve, catalyzed by the characteristics of the technology that made it possible to solve them. And cloud is crucial to almost all of the new types of problems that we're going to solve. So these are the five digital business capabilities that we're going to talk about, where we're going to have our predictions. Let's start first and foremost with this notion of turn more data into work. So our first prediction relates to how data governance is likely to change in a global basis. If we believe that we need to turn more data into work well, businesses haven't generally adopted many of the principles associated with those practices. They haven't optimized to do that better. They haven't elevated those concepts within the business as broadly and successfully as they have or as they should. We think that's going to change in part by the emergence of GDPR or the General Data Protection Regulation. It's going to go in full effect in May 2018. A lot has been written about it. A lot has been talked about. But our core issues ultimately are is that the dictates associated with GDPR are going to elevate the conversation on a global basis. And it mandates something that's now called the data protection officer. We're going to talk about that in a second David Vellante. But if is going to have real teeth. So we were talking with one chief privacy officer not too long ago who suggested that had the Equifax breach occurred under the rules of GDPR that the actual finds that would have been levied would have been in excess of 160 billion dollars which is a little bit more than the zero dollars that has been fined thus far. Now we've seen new bills introduced in Congress but ultimately our observation and our conversations with a lot of data chief privacy officers or data protection officers is that in the B2B world, GDPR is going to strongly influence not just our businesses behavior regarding data in Europe but on a global basis. Now that has an enormous implication David Vellante because it certainly suggest this notion of a data protection officer is something now we've got another potential chief here. How do we think that's going to organize itself over the course of the next few years? >> Well thank you Peter. There are a lot of chiefs (laughs) in the house and sometimes it gets confusing as the CIO, there's the CDO and that's either chief digital officer or chief data officer. There's the CSO, could be strategy, sometimes that could be security. There's the CPO, is that privacy or product. As he says, it gets confusing sometimes. On theCUbE we talked to all of these roles so we wanted to try to add some clarity to that. First thing we want to say is that the CIO, the chief information officer, that role is not going away. A lot of people predict that, we think that's nonsense. They will continue to have a critical role. Digital transformations are the priority in organizations. And so the chief digital officer is evolving from more than just a strategy role to much more of an operation role. Generally speaking, these chiefs tend to report in our observation to the chief operating officer, president COO. And we see the chief digital officer as increasing operational responsibility aligning with the COO and getting incremental responsibility that's more operational in nature. So the prediction really is that the chief digital officer is going to emerge as a charismatic leader amongst these chiefs. And by 2022, nearly 50% of organizations will position the chief digital officer in a more prominent role than the CIO, the CISO, the CDO and the CPO. Those will still be critical roles. The CIO will be an enabler. The chief information security officer has a huge role obviously to play especially in terms of making security a teams sport and not just falling on IT's shoulders or the security team's shoulders. The chief data officer who really emerged from a records and data management role in many cases, particularly within regulated industries will still be responsible for that data architecture and data access working very closely with the emerging chief privacy officer and maybe even the chief data protection officer. Those roles will be pretty closely aligned. So again, these roles remain critical but the chief digital officer we see as increasing in prominence. >> Great, thank you very much David. So when we think about these two activities, what we're really describing is over the course of the next few years, we strongly believe that data will be regarded more as an asset within business and we'll see resources devoted to it and we'll see certainly management devoted to it. Now, that leads to the next set of questions as data becomes an asset, the pressure to acquire data becomes that much more acute. We believe strongly that IoT has an enormous implication longer term as a basis for thinking about how data gets acquired. Now, operational technology has been in place for a long time. We're not limiting ourselves just operational technology when we talk about this. We're really talking about the full range of devices that are going to provide and extend information and digital services out to consumers, out to the Edge, out to a number of other places. So let's start here. Over the course of the next few years, the Edge analytics are going to be an increasingly important feature overall of how technology decisions get made, how technology or digital business gets conceived and even ultimately how business gets defined. Now David Floyer's done a significant amount of work in this domain and we've provided that key finding on the right hand side. And what it shows is that if you take a look at an Edge based application, a stylized Edge based application and you presume that all the data moves back to an centralized cloud, you're going to increase your costs dramatically over a three year period. Now that moderates the idea or moderates the need ultimately for providing an approach to bringing greater autonomy, greater intelligence down to the Edge itself and we think that ultimately IoT and Edge analytics become increasingly synonymous. The challenge though is that as we evolve, while this has a pressure to keep more of the data at the Edge, that ultimately a lot of the data exhaust can someday become regarded as valuable data. And so as a consequence of that, there's still a countervailing impression to try to still move all data not at the moment of automation but for modeling and integration purposes, back to some other location. The thing that's going to determine that is going to be rate at which the cost of moving the data around go down. And our expectation is over the next few years when we think about the implications of some of the big cloud suppliers, Amazon, Google, others, that are building out significant networks to facilitate their business services may in fact have a greater impact on the common carriers or as great an impact on the common carriers as they have on any server or other infrastructure company. So our prediction over the next few years is watch what Amazon, watch what Google do as they try to drive costs down inside their networks because that will have an impact how much data moves from the Edge back to the cloud. It won't have an impact necessarily on the need for automation at the Edge because latency doesn't change but it will have a cost impact. Now that leads to a second consideration and the second consideration is ultimately that when we talk about greater autonomy at the Edge we need to think about how that's going to play out. Jim Kobielus. >> Jim: Hey thanks a lot Peter. Yeah, so what we're seeing at Wikibon is that more and more of the AI applications, more of the AI application development involves AI and more and more of the AI involves deployment of those models, deep learning machine learning and so forth to the Edges of the internet of things and people. And much of that AI will be operating autonomously with little or no round-tripping back to the cloud. What that's causing, in fact, we're seeing really about a quarter of the AI development projects (static interference with web-conference) as Edge deployment. What that involves is that more and more of that AI will be, those applications will be bespoke. They'll be one of a kind, or unique or an unprecedented application and what that means is that, you know, there's a lot of different deployment scenarios within which organizations will need to use new forms of learning to be able to ready that data, those AI applications to do their jobs effectively albeit to predictions of real time, guiding of an autonomous vehicle and so forth. Reinforcement learning is the core of what many of these kinds of projects, especially those that involve robotics. So really software is hitting the world and you know the biggest parts are being taken out of the Edge, much of that is AI, much of that autonomous, where there is no need or less need for real time latency in need of adaptive components, AI infused components where as they can learn by doing. From environmental variables, they can adapt their own algorithms to take the right actions. So, they'll have far reaching impacts on application development in 2018. For the developer, the new developer really is a data scientist at heart. They're going to have to tap into a new range of sources of data especially Edge sourced data from the senors on those devices. They're going to need to do commitment training and testing especially reinforcement learning which doesn't involve trained data so much as it involves being able to build an algorithm that can learn to maximum what's called accumulative reward function and if you do the training there adaptly in real time at the Edge and so forth and so on. So really, much of this will be bespoke in the sense that every Edge device increasingly will have its own set of parameters and its own set of objective functions which will need to be optimized. So that's one of the leading edge forces, trends, in development that we see in the coming year. Back to you Peter. >> Excellent Jim, thank you very much. The next question here how are you going to create value from data? So once you've, we've gone through a couple trends and we have multiple others about what's going to happen at the Edge. But as we think about how we're going to create value from data, Neil Raden. >> Neil: You know, the problem is that data science emerged rapidly out of sort of a perfect storm of big data and cloud computing and so forth. And people who had been involved in quantitative methods you know rapidly glommed onto the title because it was, lets face it, it was very glamorous and paid very well. But there weren't really good best practices. So what we have in data science is a pretty wide field of things that are called data science. My opinion is that the true data scientists are people who are scientists and are involved in developing new or improving algorithms as opposed to prepping data and applying models. So the whole field really kind of generated very quickly, in really, just in a few years. To me I called it generation zero which is more like data prep and model management all done manually. And it wasn't really sustainable in most organizations because for obvious reasons. So generation one, then some vendors stepped up with tool kits or benchmarks or whatever for data scientists and made it a little better. And generation two is what we're going to see in 2018, is the need for data scientists to no longer prep data or at least not spend very much time with it. And not to do model management because the software will not only manage the progression of the models but even recommend them and generate them and select the data and so forth. So it's in for a very big change and I think what you're going to see is that the ranks of data scientists are going to sort of bifurcate to old style, let me sit down and write some spaghetti code in R or Java or something and those that use these advanced tool kits to really get the work done. >> That's great Neil and of course, when we start talking about getting the work done, we are becoming increasingly dependent upon tools, aren't we George? But the tool marketplace for data science, for big data, has been somewhat fragmented and fractured. And hasn't necessarily focused on solving the problems of the data scientists. But in many respects focusing the problems that the tools themselves have. What's going to happen in the coming year when we start thinking about Neil's prescription that as the tools improve what's going to happen to the tools. >> Okay so, the big thing that we see supporting what Neil's talking about, what Neil was talking about is partly a symptom of a product issue and a go to market issue where the produce issue was we had a lot of best of breed products that were all designed to fit together. That in the broader big data space, that's the same issue that we faced with more narrowly with ArpiM Hadoop where you know, where we were trying to fit together a bunch of open source packages that had an admin and developer burden. More broadly, what Neil is talking about is sort of a richer end to end tools that handle both everything from the ingest all to the way to the operationalization and feedback of the models. But part of what has to go on here is that with open source, these open source tools the price point and the functional footprints that many of the vendors are supporting right now can't feed an enterprise sales force. Everyone talks with their open source business models about land and expand and inside sales. But the problem is once you want to go to wide deployment in an enterprise, you still need someone negotiating commercial terms at a senior level. You still need the technical people fitting the tools into a broader architecture. And most of the vendors that we have who are open source vendors today, don't have either the product breadth or the deal size to support traditional enterprise software. An account team would typically a million and a half to two million quota every year so we see consolidation and the consolidation again driven by the need for simplicity for the admins and the developers and for business model reasons to support enterprise sales force. >> All right, so what we're going to see happen in the course of the coming year is a lot of specialization and recognition of what is data science, what are the practices, how is it going to work, supported by an increasing quality of tools and a lot of tool vendors are going to be left behind. Now the third kind of notion here for those core technology capabilities is we still have to enact based on data. The good new is that big data is starting to show some returns in part because of some of the things that AI and other technologies are capable of doing. But we have to move beyond just creating the potential for, we have to turn that into work and that's what we mean ultimately by this notion of systems of agency. The idea that data driven applications will increasingly be act on behalf of a brand, on behalf of a company and building those systems out is going to be crucial. It's going to have a whole new set of disciplines and expertise required. So when we think about what's going to be required, it always starts with this notion of AI. A lot of folks are presuming however, that AI is going to be relatively easy to build or relatively easy to put together. We have a different opinion George. What do we think is going to happen as these next few years unfold related to AI adoption in large enterprises? >> Okay so, let's go back to the lessons we learned from sort of the big data, the raw, you know, let's put a data link in place which was sort of the top of everyone's agenda for several years. The expectation was it was going to cure cancer, taste like chocolate and cost a dollar. And uh. (laughing) It didn't quite work out that way. Partly because we had a burden on the administrator again of so many tools that weren't all designed to fit together, even though they were distributed together. And then the data scientists, the guys who had to take all this data that wasn't carefully curated yet. And turn that into advanced analytics and machine learning models. We have many of the same problems now with tool sets that are becoming more integrated but at lower levels. This is partly what Neil Raden was just talking about. What we have to recognize is something that we see all along, I mean since the beginning of (laughs) corporate computing. We have different levels of extraction and you know at the very bottom, when you're dealing with things like Tensorflow or MXNet, that's not for mainstream enterprises. That's for you know, the big sophisticated tech companies who are building new algorithms on those frameworks. There's a level above that where you're using like a spark cluster in the machine learning built into that. That's slightly more accessible but when we talk about mainstream enterprises taking advantage of AI, the low hanging fruit is for them to use the pre-trained models that the public cloud vendors have created with all the consumer data on speech, image recognition, natural language processing. And then some of those capabilities can be further combined into applications like managing a contact center and we'll see more from like Amazon, like recommendation engines, fulfillment optimization, pricing optimization. >> So our expectation ultimately George is that we're going to see a lot of this, a lot of AI adoption happen through existing applications because the vendors that are capable of acquiring a talent, taking or experimenting, creating value, software vendors are going to be where a lot of the talent ends up. So Neil, we have an example of that. Give us an example of what we think is going to happen in 2018 when we start thinking about exploiting AI and applications. >> Neil: I think that it's fairly clear to be the application of what's called advanced analytics and data science and even machine learning. But really, it's rapidly becoming a commonplace in organizations not just at the bottom of the triangle here. But I like the example of SalesForce.com. What they've done with Einstein, is they've made machine learning and I guess you can say, AI applications available to their customer base and why is that a good thing? Because their customer base already has a giant database of clean data that they can use. So you're going to see a huge number of applications being built with Einstein against Salesforce.com data. But there's another thing to consider and that is a long time ago Salesforce.com built connectors to a zillion times of external data. So, if you're a SalesForce.com customer using Einstein, you're going to be able to use those advanced tools without knowing anything about how to train a machine learning model and start to build those things. And I think that they're going to lead the industry in that sense. That's going to push their revenue next year to, I don't know, 11 billion dollars or 12 billion dollars. >> Great, thanks Neil. All right so when we think about further evidence of this and further impacts, we ultimately have to consider some of the challenges associated with how we're going to create application value continually from these tools. And that leads to the idea that one of the cobblers children, it's going to gain or benefit from AI will in fact be the developer organization. Jim, what's our prediction for how auto-programming impacts development? >> Jim: Thank you very much Peter. Yeah, automation, wow. Auto-programming like I said is the epitome of enterprise application development for us going forward. People know it as co-generation but that really understates the control of auto-programming as it's evolving. Within 2018, what we're going to see is that machine learning driven co-generation approach of becoming the forefront of innovation. We're seeing a lot of activity in the industry in which applications use ML to drive the productivity of developers for all kinds of applications. We're also seeing a fair amount of what's called RPA, robotic process automation. And really, how they differ is that ML will deliver or will drive co-generation, from what I call the inside out meaning, creating reams of code that are geared to optimize a particular application scenario. This is RPA which really takes over the outside in approach which is essentially, it's the evolution of screen scraping that it's able to infer the underlined code needed for applications of various sorts from the external artifacts, the screens and from sort of the flow of interactions and clips and so forth for a given application. We're going to see that ML and RPA will compliment each other in the next generation of auto-programming capabilities. And so, you know, really application development tedium is really the enemy of, one of the enemies of productivity (static interference with web-conference). This is a lot of work, very detailed painstaking work. And what they need is to be better, more nuanced and more adaptive auto-programming tools to be able to build the code at the pace that's absolutely necessary for this new environment of cloud computing. So really AI-related technologies can be applied and are being applied to application development productivity challenges of all sorts. AI is fundamental to RPA as well. We're seeing a fair number of the vendors in that stage incorporate ML driven OCR and natural language processing and screen scraping and so forth into their core tools to be able to quickly build up the logic albeit to drive sort of the verbiage outside in automation of fairly complex orchestration scenario. In 2018, we'll see more of these technologies come together. But you know, they're not a silver bullet. 'Cause fundamentally and for organizations that are considering going deeply down into auto-programming they're going to have to factor AI into their overall plans. They need to get knowledgeable about AI. They're going to need to bring more AI specialists into their core development teams to be able to select from the growing range of tools that are out there, RPA and ML driven auto-programming. Overall, really what we're seeing is that the AI, the data scientists, who's been the fundamental developer of AI, they're coming into the core of development tools and skills in organizations. And they're going to be fundamental to this whole trend in 2018 and beyond. If AI gets proven out in auto-programming, these developers will then be able to evangelize the core utility of the this technology, AI. In a variety of other backend but critically important investments that organizations will be making in 2018 and beyond. Especially in IT operations and in management, AI is big in that area as well. Back to you there, Peter. >> Yeah, we'll come to that a little bit later in the presentation Jim, that's a crucial point but the other thing we want to note here regarding ultimately how folks will create value out of these technologies is to consider the simple question of okay, how much will developers need to know about infrastructure? And one of the big things we see happening is this notion of serverless. And here we've called it serverless, developer more. Jim, why don't you take us through why we think serverless is going to have a significant impact on the industry, at least certainly from a developer perspective and developer productivity perspective. >> Jim: Yeah, thanks. Serverless is really having an impact already and has for the last several years now. Now, everybody, many are familiar in the developer world, AWS Lambda which is really the ground breaking public cloud service that incorporates the serverless capabilities which essentially is an extraction layer that enables developers to build stateless code that executes in a cloud environment without having to worry about and to build microservices, we don't have to worry about underlined management of containers and virtual machines and so forth. So in many ways, you know, serverless is a simplification strategy for developers. They don't have to worry about the underlying plumbing. They can worry, they need to worry about the code, of course. What are called Lambda functions or functional methods and so forth. Now functional programming has been around for quite a while but now it's coming to the form in this new era of serverless environment. What we'll see in 2018 is that we're predicting is that more than 50% of lean microservices employees, in the public cloud will be deployed in serverless environments. There's AWS and Microsoft has the Azure function. IMB has their own. Google has their own. There's a variety of private, there's a variety of multiple service cloud code bases for private deployment of serverless environments that we're seeing evolving and beginning to deploy in 2018. They all involve functional programming which really, along, you know, when coupled with serverless clouds, enables greater scale and speed in terms of development. And it's very agile friendly in the sense that you can quickly Github a functionally programmed serverless microservice in a hurry without having to manage state and so forth. It's very DevOps friendly. In the very real sense it's a lot faster than having to build and manage and tune. You know, containers and DM's and so forth. So it can enable a more real time and rapid and iterative development pipeline going forward in cloud computing. And really fundamentally what serverless is doing is it's pushing more of these Lamba functions to the Edge, to the Edges. If you're at an AWS Green event last week or the week before, but you notice AWS is putting a big push on putting Lambda functions at the Edge and devices for the IoT as we're going to see in 2018. Pretty much the entire cloud arena. Everybody will push more of the serverless, functional programming to the Edge devices. It's just a simplification strategy. And that actually is a powerful tool for speeding up some of the development metabolism. >> All right, so Jim let me jump in here and say that we've now introduced the, some of these benefits and really highlighted the role that the cloud is going to play. So, let's turn our attention to this question of cloud optimization. And Stu, I'm going to ask you to start us off by talking about what we mean by true private cloud and ultimately our prediction for private cloud. Do we have, why don't you take us through what we think is going to happen in this world of true private cloud? >> Stuart: Sure Peter, thanks a lot. So when Wikibon, when we launched the true private cloud terminology which was about two weeks ago next week, two years ago next week, it was in some ways coming together of a lot of trends similar to things that you know, George, Neil and James have been talking about. So, it is nothing new to say that we needed to simplify the IT stack. We all know, you know the tried and true discussion of you know, way too much of the budget is spent kind of keeping lights on. What we'd like to say is kind of running the business. If you squint through this beautiful chart that we have on here, a big piece of this is operational staffing is where we need to be able to make a significant change. And what we've been really excited and what led us to this initial market segment and what we're continuing to see good growth on is the move from traditional, really siloed infrastructure to you want to have, you know, infrastructure where it is software based. You want IT to really be able to focus on the application services that they're running. And what our focus for the this for the 2018 is of course it's the central point, it's the data that matters here. The whole reason we've infrastructured this to be able to run applications and one of the things that is a key determiner as to where and what I use is the data and how can I not only store that data but actually gain value from that data. Something we've talked about time and again and that is a major determining factor as to am I building this in a public cloud or am I doing it in you know my core. Is it something that is going to live on the Edge. So that's what we were saying here with the true private cloud is not only are we going to simplify our environment and therefore it's really the operational model that we talked about. So we often say the line, cloud is not a destination. But it's an operational model. So a true private cloud giving me some of the you know, feel and management type of capability that I had had in the public cloud. It's, as I said, not just virtualization. It's much more than that. But how can I start getting services and one of the extensions is true private cloud does not live in isolation. When we have kind of a core public cloud and Edge deployments, I need to think about the operational models. Where data lives, what processing happens need to be as environments, and what data we'll need to move between them and of course there's fundamental laws of physics that we need to consider in that. So, the prediction of course is that we know how much gear and focus has been on the traditional data center. And true private cloud helps that transformation to modernization and the big focus is many of these applications we've been talking about and uses of data sets are starting to come into these true private cloud environments. So, you know, we've had discussions. There's Spark, there's modern databases. Many of these, there's going to be many reasons why they might live in the private cloud environment. And therefore that's something that we're going to see tremendous growth and a lot of focus. And we're seeing a new wave of companies that are focusing on this to deliver solutions that will do more than just a step function for infrastructure or get us outside of our silos. But really helps us deliver on those cloud native applications where we pull in things like what Jim was talking about with serverless and the like. >> All right, so Stu, what that suggests ultimately is that data is going to dictate that everything's not going to end up in the private or in the public cloud or centralized public clouds because of latency costs, data governance and IP protection reasons. And there will be some others. At bare minimum, that means that we're going to have it in most large enterprises as least a couple of clouds. Talk to us about what this impact of multi cloud is going to look like over the course of the next few years. >> Stuart: Yeah, critical point there Peter. Because, right, unfortunately, we don't have one solution. There's nobody that we run into that say, oh, you know, I just do a single you know, one environment. You know it would be great if we only had one application to worry about. But as you've done this lovely diagram here, we all use lots of SaaS and increasingly, you know, Oracle, Microsoft, SalesForce, you know, all pushing everybody to multiple SaaS environments that has major impacts on my security and where my data lives. Public clouds, no doubt is growing at leaps and bounds. And many customers are choosing applications to live in different places. So just as in data centers, I would kind of look at it from an application standpoint and build up what I need. Often, there's you know, Amazon doing phenomenal. But you know, maybe there's things that I'm doing with Azure. Maybe there's things that's I'm doing with Google or others as well as my service providers for locality, for you know, specialized services, that there's reasons why people are doing it. And what customers would love is an operational model that can actually span between those. So we are very early in trying to attack this multi cloud environment. There's everything from licensing to security to you know, just operationally how do I manage those. And a piece of them that we were touching on in this prediction year, is Kubernetes actually can be a key enabler for that cloud native environment. As Jim talked about the serverless, what we'd really like is our developer to be able to focus on building their application and not think as much about the underlined infrastructure whether that be you know, racket servers that I built myself or public cloud infrastructures. So we really want to think more it's at the data and application level. It's SaaS and pass is the model and Kubernetes holds the promise to solve a piece of this puzzle. Now Kubernetes is not by no means a silver bullet for everything that we need. But it absolutely, it is doing very well. Our team was at the Linux, the CNCF show at KubeCon last week and there is you know, broad adoption from over 40 of the leading providers including Amazon is now a piece. Even SalesForce signed up to the CNCF. So Kubernetes is allowing me to be able to manage multi cloud workflows and therefore the prediction we have here Peter is that 50% of developing teams will be building, sustaining multi cloud with Kubernetes as a foundational component of that. >> That's excellent Stu. But when we think about it, the hardware of technology especially because of the opportunities associated with true private cloud, the hardware technologies are also going to evolve. There will be enough money here to sustain that investment. David Floyer, we do see another architecture on the horizon where for certain classes of workloads, we will be able to collapse and replicate many of these things in an economical, practical way on premise. We call that UniGrid, NVME is, over fabric is a crucial feature of UniGrid. >> Absolutely. So, NVMe takes, sorry NVMe over fabric or NVMe-oF takes NVMe which is out there as storage and turns it into a system framework. It's a major change in system architecture. We call this UniGrid. And it's going to be a focus of our research in 2018. Vendors are already out there. This is the fastest movement from early standards into products themselves. You can see on the chart that IMB have come out with NVMe over fabrics with the 900 storage connected to the power. Nine systems. NetApp have the EF750. A lot of other companies are there. Meta-Lox is out there looking for networks, for high speed networks. Acceler has a major part of the storage software. So and it's going to be used in particular with things like AI. So what are the drivers and benefits of this architecture? The key is that data is the bottleneck for application. We've talked about data. The amount of data is key to making applications more effective and higher value. So NVMe and NVMe over fabrics allows data to be accessed in microseconds as opposed to milliseconds. And it allows gigabytes of data per second as opposed to megabytes of data per second. And it also allows thousands of processes to access all of the data in very very low latencies. And that gives us amazing parallelism. So what's is about is disaggregation of storage and network and processes. There are some huge benefits from that. Not least of which is you save about 50% of the processor you get back because you don't have to do storage and networking on it. And you save from stranded storage. You save from stranded processor and networking capabilities. So it's overall, it's going to be cheaper. But more importantly, it makes it a basis for delivering systems of intelligence. And systems of intelligence are bringing together systems of record, the traditional systems, not rewriting them but attaching them to real time analytics, real time AI and being able to blend those two systems together because you've got all of that additional data you can bring to bare on a particular problem. So systems themselves have reached pretty well the limit of human management. So, one of the great benefits of UniGrid is to have a single metadata lab from all of that data, all of those processes. >> Peter: All those infrastructure elements. >> All those infrastructure elements. >> Peter: And application. >> And applications themselves. So what that leads to is a huge potential to improve automation of the data center and the application of AI to operations, operational AI. >> So George, it sounds like it's going to be one of the key potential areas where we'll see AI be practically adopted within business. What do we think is going to happen here as we think about the role that AI is going to play in IT operations management? >> Well if we go back to the analogy with big data that we thought was going to you know, cure cancer, taste like chocolate, cost a dollar, and it turned out that the application, the most wide spread application of big data was to offload ETL from expensive data warehouses. And what we expect is the first widespread application of AI embedded in applications for horizontal use where Neil mentioned SalesForce and the ability to use Einstein as SalesForce data and connected data. Now because the applications we're building are so complex that as Stu mentioned you know, we have this operational model with a true private cloud. It's actually not just the legacy stuff that's sucking up all the admin overhead. It's the complexity of the new applications and the stringency of the SLA's, means that we would have to turn millions of people into admins, the old you know, when the telephone networks started, everyone's going to have to be an operator. The only way we can get past this is if we sort of apply machine learning to IT Ops and application performance management. The key here is that the models can learn how the infrastructure is laid out and how it operates. And it can also learn about how all the application services and middleware works, behaving independently and with each other and how they tie with the infrastructure. The reason that's important is because all of a sudden you can get very high fidelity root cause analysis. In the old management technology, if you had an underlined problem, you'd have a whole sort of storm of alerts, because there was no reliable way to really triangulate on the or triage the root cause. Now, what's critical is if you have high fidelity root cause analysis, you can have really precise recommendations for remediation or automated remediation which is something that people will get comfortable with over time, that's not going to happen right away. But this is critical. And this is also the first large scale application of not just machine learning but machine data and so this topology of collecting widely desperate machine data and then applying models and then reconfiguring the software, it's training wheels for IoT apps where you're going to have it far more distributed and actuating devices instead of software. >> That's great, George. So let me sum up and then we'll take some questions. So very quickly, the action items that we have out of this overall session and again, we have another 15 or so predictions that we didn't get to today. But one is, as we said, digital business is the use of data assets to compete. And so ultimately, this notion is starting to diffuse rapidly. We're seeing it on theCUbE. We're seeing it on the the CrowdChats. We're seeing it in the increase of our customers. Ultimately, we believe that the users need to start preparing for even more business scrutiny over their technology management. For example, something very simple and David Floyer, you and I have talked about this extensively in our weekly action item research meeting, the idea of backing up and restoring a system is no longer in a digital business world. It's not just backing up and restoring a system or an application, we're talking about restoring the entire business. That's going to require greater business scrutiny over technology management. It's going to lead to new organizational structures. New challenges of adopting systems, et cetera. But, ultimately, our observations is that data is going to indicate technology directions across the board whether we talk about how businesses evolve or the roles that technology takes in business or we talk about the key business capability, digital business capabilities, of capturing data, turning it into value and then turning into work. Or whether we talk about how we think about cloud architecture and which organizations of cloud resources we're going to utilize. It all comes back to the role that data's going to play in helping us drive decisions. The last action item we want to put here before we get to the questions is clients, if we don't get to your question right now, contact us. Send us an inquiry. Support@silicongangle.freshdesk.com. And we'll respond to you as fast as we can over the course of the next day, two days, to try to answer your question. All right, David Vellante, you've been collecting some questions here. Why don't we see if we can take a couple of them before we close out. >> Yeah, we got about five or six minutes in the chat room, Jim Kobielus has been awesome helping out and so there's a lot of detailed answer there. The first, there's some questions and comments. The first one was, are there too many chiefs? And I guess, yeah. There's some title inflation. I guess my comment there would be titles are cheap, results aren't. So if you're creating chief X officers just for the, to check a box, you're probably wasting money. So you've got to give them clear roles. But I think each of these chiefs has clear roles to the extent that they are you know empowered. Another comment came up which is we don't want you know, Hadoop spaghetti soup all over again. Well true that. Are we at risk of having Hadoop spaghetti soup as the centricity of big data moves from Hadoop to AI and ML and deep learning? >> Well, my answer is we are at risk of that but that there's customer pressure and vendor economic pressure to start consolidating. And we'll also see, what we didn't see in the ArpiM big data era, with cloud vendors, they're just going to start making it easier to use some of the key services together. That's just natural. >> And I'll speak for Neil on this one too, very quickly, that the idea ultimately is as the discipline starts to mature, we won't have people that probably aren't really capable of doing some of this data science stuff, running around and buying a tool to try to supplement their knowledge and their experience. So, that's going to be another factor that I think ultimately leads to clarity in how we utilize these tools as we move into an AI oriented world. >> Okay, Jim is on mute so if you wouldn't mind unmuting him. There was a question, is ML a more informative way of describing AI? Jim, when you and I were in our Boston studio, I sort of asked a similar question. AI is sort of the uber category. Machine learning is math. Deep learning is a more sophisticated math. You have a detailed answer in the chat. But maybe you can give a brief summary. >> Jim: Sure, sure. I don't want too pedantic here but deep learning is essentially, it's a lot more hierarchical deeper stacks of neural network of layers to be able to infer high level extractions from data, you know face recognitions, sentiment analysis and so forth. Machine learning is the broader phenomenon. That's simply along a different and part various approaches for distilling patterns, correlations and algorithms from the data itself. What we've seen in the last week, five, six tenure, let's say, is that all of the neural network approaches for AI have come to the forefront. And in fact, the core often market place and the state of the art. AI is an ancient paradigm that's older than probably you or me that began and for the longest time was rules based system, expert systems. Those haven't gone away. The new era of AI we see as a combination of both statical approaches as well as rules based approaches, and possibly even orchestration based approaches like graph models or building broader context or AI for a variety of applications especially distributed Edge application. >> Okay, thank you and then another question slash comment, AI like graphics in 1985, we move from a separate category to a core part of all apps. AI infused apps, again, Jim, you have a very detailed answer in the chat room but maybe you can give the summary version. >> Jim: Well quickly now, the most disruptive applications we see across the world, enterprise, consumer and so forth, the advantage involves AI. You know at the heart of its machine learning, that's neural networking. I wouldn't say that every single application is doing AI. But the ones that are really blazing the trail in terms of changing the fabric of our lives very much, most of them have AI at their heart. That will continue as the state of the art of AI continues to advance. So really, one of the things we've been saying in our research at Wikibon `is that the data scientists or those skills and tools are the nucleus of the next generation application developer, really in every sphere of our lives. >> Great, quick comment is we will be sending out these slides to all participants. We'll be posting these slides. So thank you Kip for that question. >> And very importantly Dave, over the course of the next few days, most of our predictions docs will be posted up on Wikibon and we'll do a summary of everything that we've talked about here. >> So now the questions are coming through fast and furious. But let me just try to rapid fire here 'cause we only got about a minute left. True private cloud definition. Just say this, we have a detailed definition that we can share but essentially it's substantially mimicking the public cloud experience on PRIM. The way we like to say it is, bringing the cloud operating model to your data versus trying to force fit your business into the cloud. So we've got detailed definitions there that frankly are evolving. about PaaS, there's a question about PaaS. I think we have a prediction in one of our, you know, appendices predictions but maybe a quick word on PaaS. >> Yeah, very quick word on PaaS is that there's been an enormous amount of effort put on the idea of the PaaS marketplace. Cloud Foundry, others suggested that there would be a PaaS market that would evolve because you want to be able to effectively have mobility and migration and portability for this large cloud application. We're not seeing that happen necessarily but what we are seeing is that developers are increasingly becoming a force in dictating and driving cloud decision making and developers will start biasing their choices to the platforms that demonstrate that they have the best developer experience. So whether we call it PaaS, whether we call it something else. Providing the best developer experience is going to be really important to the future of the cloud market place. >> Okay great and then George, George O, George Gilbert, you'll follow up with George O with that other question we need some clarification on. There's a question, really David, I think it's for you. Will persistent dims emerge first on public clouds? >> Almost certainly. But public clouds are where everything is going first. And when we talked about UniGrid, that's where it's going first. And then, the NVMe over fabrics, that architecture is going to be in public clouds. And it has the same sort of benefits there. And NV dims will again develop pretty rapidly as a part of the NVMe over fabrics. >> Okay, we're out of time. We'll look through the chat and follow up with any other questions. Peter, back to you. >> Great, thanks very much Dave. So once again, we want to thank you everybody here that has participated in the webinar today. I apologize for, I feel like Hans Solo and saying it wasn't my fault. But having said that, none the less, I apologize Neil Raden and everybody who had to deal with us finding and unmuting people but we hope you got a lot out of today's conversation. Look for those additional pieces of research on Wikibon, that pertain to the specific predictions on each of these different things that we're talking about. And by all means, Support@silicongangle.freshdesk.com, if you have an additional question but we will follow up with as many as we can from those significant list that's starting to queue up. So thank you very much. This closes out our webinar. We appreciate your time. We look forward to working with you more in 2018. (upbeat music)

Published Date : Dec 16 2017

SUMMARY :

And that is the emergence of the cloud. but the chief digital officer we see how much data moves from the Edge back to the cloud. and more and more of the AI involves deployment and we have multiple others that the ranks of data scientists are going to sort Neil's prescription that as the tools improve And most of the vendors that we have that AI is going to be relatively easy to build the low hanging fruit is for them to use of the talent ends up. of the triangle here. And that leads to the idea the logic albeit to drive sort of the verbiage And one of the big things we see happening is in the sense that you can quickly the role that the cloud is going to play. Is it something that is going to live on the Edge. is that data is going to dictate that and Kubernetes holds the promise to solve the hardware technologies are also going to evolve. of the processor you get back and the application of AI to So George, it sounds like it's going to be one of the key and the stringency of the SLA's, over the course of the next day, two days, to the extent that they are you know empowered. in the ArpiM big data era, with cloud vendors, as the discipline starts to mature, AI is sort of the uber category. and the state of the art. in the chat room but maybe you can give the summary version. at Wikibon `is that the data scientists these slides to all participants. over the course of the next few days, bringing the cloud operating model to your data Providing the best developer experience is going to be with that other question we need some clarification on. that architecture is going to be in public clouds. Peter, back to you. on Wikibon, that pertain to the specific predictions

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
David FloyerPERSON

0.99+

David VellantePERSON

0.99+

JimPERSON

0.99+

NeilPERSON

0.99+

DavidPERSON

0.99+

StuartPERSON

0.99+

Jim KobielusPERSON

0.99+

Neil RadenPERSON

0.99+

EuropeLOCATION

0.99+

AmazonORGANIZATION

0.99+

2018DATE

0.99+

AWSORGANIZATION

0.99+

Peter BurrisPERSON

0.99+

GeorgePERSON

0.99+

WikibonORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

2017DATE

0.99+

Stuart MinimanPERSON

0.99+

George GilbertPERSON

0.99+

Peter DruckerPERSON

0.99+

May 2018DATE

0.99+

PeterPERSON

0.99+

MicrosoftORGANIZATION

0.99+

General Data Protection RegulationTITLE

0.99+

DavePERSON

0.99+

1985DATE

0.99+

50%QUANTITY

0.99+

Last yearDATE

0.99+

George OPERSON

0.99+

OracleORGANIZATION

0.99+

Hans SoloPERSON

0.99+

Support@silicongangle.freshdesk.comOTHER

0.99+

12 billion dollarsQUANTITY

0.99+

second considerationQUANTITY

0.99+

11 billion dollarsQUANTITY

0.99+

Nine systemsQUANTITY

0.99+

#SiliconValley Friday Show with John Furrier - Feb. 10th, 2017


 

>> We're here, about to go live, here in a selfie on the pre Silicon Valley Friday Show, about to go live for our show, for some live Friday. We've got a great lineup, it's on my Twitter. Donald Trump and all his viral tweets and now there's an algorithm out there that creates a shorting stock called Trump and Dump, we're going to be talking to the inventor of that new app. Bunch of other great stuff, controversy around Silicon Valley and Intel, controversy on Google, and we'll be watching a great show, well, hopefully you'll be watching. >> Male Announcer: Live, from Cube headquarters in Palo Alto, California it's the Silicon Valley Friday Show, with John Furrier. (serene techno music) >> Hello, everyone, and welcome to the Silicon Valley Friday Show, I'm John Furrier, we are here live in Palo Alto, California for the Silicon Valley Friday Show every Friday morning we broadcast what's going on in Silicon Valley, what's going on in the streets, we call up people and find out what's going on, this show we've got a great lineup. We're going to talk about, I'll say, the news, Twitter, but we've got this fun segment where we have an algorithm, a bot, an AI bot that goes out there and takes all of Donald Trump's tweets and creates a shorting of the stock and creates making money, apparently, Donald Trump's tweets do move the market. We're going to talk about Snapchat, Snap Inc's IPO, and a refiling and some controversy going around that. Also, controversy around Intel Corporation that just announced a fab plant in Arizona and the CEO is in the White House making the announcement, giving the impression that Donald Trump was all behind this, turns out the CEO is a Republican and supports Donald Trump, when apparently this has been in the works for multiple years, so, not sure that's going to be a game changer for Trump but certainly Intel's taking advantage of the schmooze factor and the PR stunt that has people in Silicon Valley up in arms. Obviously, Intel is pro-immigration, bringing people in, obviously, Andy Grove was an immigrant, legend of Intel. And we have also tons of stuff going on, we're going to preview Mobile World Congress the big show in Barcelona at the end of the month. We're doing a two day special here, live in Pal Alto, we're going to do a special, new Silicon Valley version of Mobile World Congress. We'll give you a preview, we're going to talk to some analysts. And also, the fake news, fake accuracy, and all the stuff that's going on, what is fake news? What is inaccurate news? Is there a difference? Does it matter? It certainly does, we have an opinion on that so, great show lineup. First, is actually Twitter earnings are out and they kind of missed and hit their up on the monthly active uniques by two million people. A total of I think 300 million people are using the number here, just on my notes here says, that there are up to 319 million active, monthly active users. And of course, Trump has been taking advantage of Twitter and the Trump bump did not happen for Twitter, although some say Trump kept it alive. But Trump is using Twitter. And he's been actively on Twitter and is causing a lot of people, we've talked about it many times on the show, but the funniest thing that we've seen, and probably the coolest thing that's interesting is that there's an entrepreneur out there, an agency guy named Brian, Ben Gaddis, I'm sorry, president of T3. He's a branding guy, created viral videos on NPR, all over the news, went viral, he created an AI chatbot that essentially takes Donald Trump's tweets, analyzes any company mentioned and then instantly shorts the stock of that company. And apparently it's working, so we're going to take a look at that. We're also going to talk to him and find out what's going on. We're going to have Ben Rosenbaum on, we're going to have someone from Intel on, we have a lot of great guests, so let's take a look at this clip of the Trump and Dump and then we're going to talk to Ben right after. >> Announcer: T3 noticed something interesting about Twitter lately, particularly when this guy gets hold of it. Anytime a company mentions moving to Mexico or overseas or just doing something bad, he's on it, he tweets, the stock tanks. Tweet, tank. Tweet, tank. Tweet, tank. Everyone's talking about how to make sense of all this. T3 thought the unpredictability of it created a real opportunity. Meet the Trump and Dump automated trading platform. Trump and Dump is a bot powered by a complex algorithm that helps us short stocks ahead of the market. Here's how. Every time he tweets, the bot analyzes the tweet to see if a publicly traded company is mentioned. Then, the algorithm runs an instant sentiment analysis of the tweet in less than 20 milliseconds. It figures, positive or negative. A negative tweet triggers the bot to short the stock. Like earlier this month, his Toyota tweet immediately tanked the stock. But the Trump and Dump bot was out ahead of the market. It shorted the second after his tweet. As the stock tanked, we closed our short and we made a profit, huge profit. Oh, and we donated our profits here. So now, when President Trump tweets, we save a puppy. It's the Trump and Dump automated trading platform. Twitter monitoring, sentiment analysis, complex algorithms, real time stock trades. All fully automated, all in milliseconds. And all for a good cause. From your friends at T3. >> Okay, we're back here in Silicon Valley Friday Show, I'm John Furrier and you just saw the Trump and Dump, Trump and Dump video and the creator, that is Ben Gaddis on the phone, president of T3, a privately owned think tank focused on branding. Ben, thanks for joining us today. >> Thanks for having me, John. Excited to talk with you. >> So, big news NPR had on their page, which had the embed on there and it went viral. Great video, but first talk about the motivation, what's going on behind this video? This is very cool, explain to the folks out there what this Trump and Dump video is about, why did you create it, and how does it work? >> So, we had just like, I think, almost everyone in the United States, we were having a conversation about what do you do with the fact that President Trump is tweeting and tweeting about these companies, and in many cases negatively. So we saw articles talking about it and actually one day a guy in our New York office came up with this idea that we ought to follow those tweets in real time and if he mentions a publicly traded company negatively, short the stock. And so, we kicked that idea around over slack and in about 30 minutes we had an idea for the platform. And about two days later one of our engineers had actually built it. And so what the platform does is it's really actually simple yet complex. It listens to every tweet that the president puts out and then it does two things: it determines if there's a publicly traded company mentioned and if there is, and it actually does sentiment analysis in real time, so, in about 20 milliseconds, it can tell if the tweet is positive or negative. If it's negative, we've seen the stocks typically go down and we short sell that stock. And so, the profit that we develop from that, then we donate it to the ASPCA and then hopefully we save a puppy or two in the process. >> Yeah, and that's key, I think that's one thing I liked about this was you weren't arbitraging, you weren't like a real time seller like these finance guys on Wall Street, which by the way, have all these complex trading algorithms. Yours is very specific, the variables are basically Donald Trump, public company, and he tends to be kind of a negative Tweeter so, mostly to do with moving to Mexico or some sort of you know, slam or bullying kind of Tweet he does. And which moves the market, and this is interesting though, because you're teasing out something clever and cool on the AI kind of side of life and you know, some sort of semantic bot that essentially looks at some context and looks at the impact. But this is kind of the real world we're living in now, these kinds of statements from a president of the United States, or anyone who's in a position of authority, literally moves the market, so you're not doing it to make money you're doing it to prove a point which is that the responsibility here is all about getting exposed in the sense that you got to be careful of what you say on Twitter when you're the president of the United States. I mean, if it was me saying it, I mean, I'm not going to move the market but certainly, you know, the press who impact large groups of people and certainly the president does that so, did you guys have that in mind when you were thinking about this? >> Well, we did. I mean, I think, you know, our goal was, this is what we do for a living, we help big brands monitor all their digital presences and build digital strategy. So, we're already monitoring sentiment around Twitter and around social platforms so, it's pretty core to what we do. But we're also looking at things that are happening in pop culture and societally, what kind of impact social might have on business. And so, the fact that we're able to take an action and deliver a social action, and deliver a real business outcome is pretty core to what we do. What's different here and what's so unique is the fact that we've never really seen things like, policy, whether it's monetary policy, or just general policy be distributed through one platform like Twitter and have such a big impact. So, we think it's kind of a societal shift that is sort of the new norm. That, I don't know that if everyone has figured out what to do with yet and so our goal is to experiment and decide one, can we consume the information fast enough to take an action? And then how do we build through AI platforms that allow us to be smarter in the world that we're living in today that is very, very unpredictable. >> We have Ben Gaddis, as president of T3 also part of the group that did the Trump and Dump video but he brings out a great point about using data and looking at the collective impact of information in real time. And this interesting, I was looking at some of the impact last night in this and Nordstrom's had a tweet about Ivanka Trump and apparently Nordstrom's stock is up so, is there a flaw in the algorithm here? What's the take on that? Because in a way, that's the reverse of the bullying, he's defensive on that one so, is there a sentiment of him being more offensive or defensive? >> It's pretty standard. So, we're starting to see a pattern. So, what happens is that actually, the Nordstrom stock actually did go down right after the tweet. And so, we saw that that's a pattern that's typical when the president tweets negatively. When he tweets positively, we don't see that much of a bump. When he tweets negatively, typically the stock drops anywhere between one and four percent, sometimes even greater than that. But it rebounds very quickly. So, a big part of what we're trying to do with the bot and the algorithm is understand how long do we hold, and what is that timeframe before people actually come back to more of a rational state and start to buy back a stock that's valuable. Now what's really interesting, you mentioned, you know, the algorithm and whether there's a flaw in it, we learned something very interesting yesterday about Nordstrom's. So, the president tweeted and in that tweet he talked negatively about Nordstrom's, but he also talked very positively about his daughter, Ivanka. And so, the algorithm actually picked up that tweet and registered it as 61.5% positive. So, it didn't trade. So, we actually got kind of lucky on that one. >> You bring up a good point, and this is something that I want to get your thoughts on. You know, we live in an era of fake news, and it's just Snapchat just filed IPO filing to make a change in their filing to show that Amazon is going to be a billion dollar partner as well, which wasn't in the filing. So, there's a line between pure, fake news, which is essentially just made up stuff, and inaccurate news, so what you're kind of pointing out is a new mechanism to take advantage of the collective intelligence of real time information. And so this is kind of a new concept in the media business. And brands, who used to advertise with big media companies, are now involved in this so, as someone who's, you know, an architect for brand and understanding data, how are brands becoming more data driven? >> Well, I think what brands are realizing is that they live in this world that is more real time, that's such a buzzword. But more real time than I think they even thought would ever be possible, the fact that someone like the president can tweet and have literally cut off billions of dollars in market cap value in a moment's time is something that they have to figure out. So, I think the first thing is having the tools in place to actually monitor and understand, and then having a plan in place to react to things that are really quite unpredictable. So, not only, I don't think that you can have a plan for everything but you have to at least have a plan for understanding how you get legal approval on a response. Who would be responsible for that. You know, who do you work with, either through partners or inside of your organization to, you know, to be able to respond to something when you need to get back in promoting, you know, minutes versus hours. The thing that we don't hear people talk near as much about is, our goal was to see how close we can get to the information so we can zoom the data from Twitter's fire hose, so we get it hopefully when everyone else does. And then our goal is to take an action on that quicker than anybody else, and that delta is where we'll make a profit. What's really interesting to me is that the only person closer to that information than the president is Twitter. >> Ben, great to have you on, appreciate it, love to get you back on as a guest. We love to talk about is our model here, it's looking angle, it's extracting the signal from the noise. And certainly the game is changing, you're working with brands and the old model of ad agencies, this is a topic we love to cover here, the old ad agency model's certainly becoming much more platform oriented with data, these real time tools really super valuable, having a listening engine, having some actionable mechanisms to go out there and be part of and influence the conversation with information. Seems to be a good trend that you guys are really riding. Love to have you back on. >> We'd love to be back on, and thanks for the time, we enjoyed it. >> That was Ben Gaddis, who's the president of T3, the firm behind the Trump and Dump, but more importantly highlighting a really big megatrend which is the use of data, understanding its impact, having some analysis, and trying to figure out what that means for people. Be right back with more after this short break. >> [Female Announcer] Why wait for the future? The next evolution in IT infrastructure is happening now. And Cisco's Unified Computing System is ready to power your data center in the internet of everything. Urgent data center needs went addressed for years, so Cisco wiped the slate clean and built a new fabric-centric computing architecture that addresses the application delivery challenges faced by IT in the dynamic environments of virtualization, Cloud, and big data. Cisco UCS represents true innovation with revolutionary integration. It improves performance, while dramatically driving down complexity and cost. Far lower than alternatives from the past. Cisco's groundbreaking solution is producing real results for a growing list of satisfied customers now moving to unified computing, transforming how IT can perform. Pushing out the boundaries of performance and scale and changing the face of business from the inside out. Right now, the industry is witnessing the next wave of computing. So, why should your business wait for the future? Unify your data center with Cisco UCS. >> Male Announcer: You're listening to Cube Fridays, brought to you by Silicon Angle Media. Now, here's John Furrier. >> Okay, welcome back to the Silicon Valley Friday Show, I'm John Furrier, great show today. Our next guest is Dan Rosenbaum, who is the editor of Wearable Tech Insider, Media Probe, been around the industry for years, been a journalist, reporter, editor, variety through his career, knows the tech business certainly on the infrastructure level with the device. Okay, welcome to the show, great to have you, thanks for being available, he's in New York so, Palo Alto, New York connection here. >> Yeah, we got about maybe an hour or so of snow left. But you know, it's February, it does this in New York. >> Great to have you on, we were just talking on our earlier segment before the break about the guy who created the Trump and Dump video which is a chat bot that goes out, looks at Donald Trump's tweets, and then identifies if there's a public company, shorts the stock, and donates to save puppies. So, they're not doing it for profit but they're, you know, they have their intelligence and listening, and we were just riffing on the concept of that there's been fake news and inaccuracy and a new dynamic that's impacting the media business, which is real time information, data, and certainly the world that you're in with Wearables, this new internet of things, which is hard to understand for most common people but it's really the AI new connected network. It's really impacting things, certainly how people get information, how fast they create data, and it's changing the industry landscape certainly from a media standpoint. You get on TV and the mainstream... >> It really is. When the press secretary stood up and said that that the administration sees the media as the adversary, you know, everyone got sort of upset about it but you know, in a lot of ways it's true. That's a fitting way that the media and any administration, any power structure should be facing each other. There's been such a hop in the media to report the truth as best as it can determine and as accurately as it can. Now, there are differing impacts depending on which sphere you're in, and in politics there's always going to be sort of the tension, well, we think, we look at these facts and we think that and we look at those facts and think the other. >> I think ultimately this new formats that are developing really comes back down to I would add to that as trust. This is a collision course of a complete re-transformation of the media landscape and technology's at the heart of it and, you know, you're in the middle of it. With Wearables, you're seeing that at the edge of the network, these are new phenomenons. What's your take on this new trend of, you know, of computing? And I'm not saying singularity, as Ray Kurzweil would say, but you know, ultimately, it is going down to the point now where it's on your body, potentially in your body, but this is a new form of connection. What's your thoughts on this? >> 12 years ago, I was at the party where they launched MSNBC, and I ran into Andrew Lack, who's the CEO of MSNBC at the time, and asked him, why NBC was cutting this collaboration deal with Microsoft, because remember that's how it was started, when there wasn't any means for the news to go upwards. There was no way for citizen news gathering to be represented on this Microsoft-NBC co-venture. And Andrew actually looked down his nose at me, sneered, and goes, "Who in the world would want "people to be contributing to the news?" Well, now we're 10 or 12 years later and as you say, Snapchat and Skype, and all these mobile technologies have just transformed how people get their information, because they're now witnesses, and there are witnesses everywhere. One of the big transformations in, or about wearable technology is that computing infrastructure has moved from islands of stand-alone, massive computers, to networks of massive computers to stand-alone PCs, to networks to PCs, and now the model for computing and communication is the personal area network, the idea of sensor-based technologies is going to change, or already has changed the world of news, it's in the process of changing the world of medicine, it's in the process of changing the way we build houses, the construction business, with the smartphone, the way that we build and relate to cities. >> So, we're here with Dan Rosenbaum, he's the editor of Wearable Tech Insider, but more importantly he's been a tech insider in media going way back, he's seen the cycles of innovation. Love your point about the flowing conversations coming out of the MSNBC kind of executive in the old broadcast models. I mean, I have four kids, my oldest is 21, they don't use, they don't really care about cable TV anymore so, you know, this is now a new narrative so, those executives that are making those comments are either retired or will be dinosaurs. You now have Amazon, you have Netflix, you have, you know, folks, trying to look at this internet TV model where it's fully synchronous so, now you have collective intelligence of vertical markets that have real time ability to surface information up to bigger outlets. So, this collective media intelligence is happening, and it's all being driven by mobile technology. And with that being said, you know, you're in the business, we've got Mobile World Congress coming up, what is that show turning into? Because it's not about the mobile device anymore, the iPhone's 10 years old, that's a game changer. It's growing up. The impact of mobile is now beyond the device. >> Mobile World Congress is all about wireless infrastructure. It goes from everything from a one millimeter square sensor to the national grade wireless network. But what's really cool about Mobile World is that it's the place where communications or telecom ministers get together with infrastructure carriers, get together with the hardware manufacturers, and they hash out the problems that won't resolve five, 10, 15 years down the road in new products and new services. This is the place where everyone comes together. The back rooms at Mobile World Congress are the hottest place, and the back rooms are the places that you can't get into. >> We're here with Dan Rosenbaum, who's an industry veteran, also in the media frontlines in wireless technology, I mean, wearable technology and among other things, good view of the landscape. Final point, I want to just get a quick comment from ya, I was watching on Facebook, you had a great post around Facebook is feeding you an ad for a $19 million staid-in, let's feel Connecticut. And then you said, "One of us as the wrong idea, so you must be really loaded." This retargeting bullshit on Facebook is just ridiculous, I mean, come on, this bad, big data, isn't it? >> (laughing) Yeah, I mean, the boast of Google is that they want to make, you know, ads so relevant that they look like content. Well, in the process to getting there, there's going to be misses. You know, if this real estate agent decides that they want to hit everyone in my zip code, or everyone in my county, or whatever, and they wanted pay the five dollars so that I'd see that video, god bless 'em, let 'em do it, it's not going to make me, it's not going to overcome any kind of sales resistance. I don't know that I wanted to move up to Litchfield, Connecticut anyway, but if I did, sure, a $19 million house would be really nice. >> You could take a chopper into Manhattan, you know, just drop into Manhattan with a helicopter. >> They would want to take it. >> Alright, we can always take the helicopter in from Litchfield, you know, right at the top of your building. Dan, thanks so much for spending the time, really appreciate it, and we'll have to bring, circle back with you on our two day Mobile World Congress special in Palo Alto we'll be doing, so appreciate the time. Thanks a lot. >> Love to do it, thanks for having me. >> Okay, that was Dan Rosenbaum, really talking about, going down in the weeds a little bit but really more importantly, this Mobile World Congress, what's going on with this new trend, digital transformation really is about the impact to the consumer. And what's going on Silicon Valley right now is there's some hardcore tech that is changing the game from what we used to know as a device. The iPhone's only 10 years old, yet 10 years old, before the iPhone, essentially it was a phone, you made phone calls, maybe surf the Web through some bad browser and do text messages. That's now completely transforming, not just the device, it's the platform, so what we're going to see is new things that are happening and the tell signs are there. Self driving cars, autonomous vehicles, drones delivering packages from Amazon, a completely new, digitized world is coming. This is the real trend and we're going to have an executive from Intel on next to tell us kind of what's going on because Intel is at the ground zero of the innovation with Moore's Law and the integrated circuit. But they're bringing their entire Intel inside as a global platform, and this is really going to be driven through a ton of 5G, a new technology so, we're going to dig in on that, and we're going to have a call-in from her, she's going to be coming in from Oregon and again, we're going to get down to the engineers, the people making the chips under the hood and bringing that to you here on the Silicon Valley Friday Show, I'm John Furrier, we'll be right back after this short break. >> My name is Dave Vellante, and I'm a long-time industry analyst. So, when you're as old as I am you've seen a lot of transitions. Everybody talks about industry cycles and waves, I've seen many, many waves. I've seen a lot of industry executives and I'm a little bit of an industry historian. When you interview many thousands of people, probably five or six thousand people as I have over the last half of the decade, you get to interact with a lot of people's knowledge. And you begin to develop patterns so, that's sort of what I bring is an ability to catalyze a conversation and, you know, share that knowledge with others in the community. Our philosophy is everybody is an expert at something, everybody's passionate about something and has real deep knowledge about that something. Well, we want to focus in on that area and extract that knowledge and share with our communities. This is Dave Vellante, and thanks for watching the Cube. (serene techno music) >> Male Announcer: You're listening to the Silicon Valley Friday Show with John Furrier. >> Okay, welcome back to the Silicon Valley Friday Show, I'm John Furrier, we're here in Palo Alto for this Friday Show, we're going to go under the hood and get into some technology impact around what's going on in the industry, specifically kind of as a teaser for Mobile World Congress at the end of the month, it's a big show in Barcelona, Spain where the whole mobile and infrastructure industry comes together, it's kind of like CES, Consumer Electronics Show, in the mobile world but it's evolved in a big way and it's certainly impacting everyone in the industry and all consumers and businesses. This is Intel's Lynn Comp and this is Intel who, we know about Moore's Law, we know all about the chips that make everything happen, Intel has been the engine of innovation of the PC revolutions, it's been the engine of innovation now in the Cloud and as Intel looks at the next generation, they are the key player in this transformation that we are seeing with AI, wearable computers, internet of things, self driving cars, AI, this is all happening, new stuff's going on. Lynn, welcome to the program. >> Thank you so much, it's great to be here. >> So, you're up in Oregon, thanks for taking the time to allow us to talk via phone, appreciate it. Obviously, Intel, we've been following you guys, and I've been a big fan since 1987, when I almost worked there right out of college. Went to Hewlett Packard instead, but that's a different story but, great, great innovation over the years, Intel has been the bell weather in the tech industry, been a big part of the massive change. But now, as you look at the next generation, I mean, I have four kids and they don't watch cable TV, they don't like, they don't do the things that we used to do, they're on the mobile phone all the time. And the iPhone is now 10 years old as of this year, this early winter part of this, Steve Jobs announced it 10 years ago. And what a change has it been, it's moved from telephone calls to a computer that happens to have software that makes telephone calls. This is a game changer. But now it seems that Mobile World Congress has changed from being a telephone centric, voice centric, phone device centric show to a software show, it seems to be that software is eating the world just like CES is turning into an automotive show. What is Mobile World Congress turning into? What's the preview from Intel's perspective? >> You know, it's a really fascinating question because many years ago, you would only see a bunch of very, very intense base station design, you know, it was very, very oriented around wireless, wireless technology, and radios, and those are really important because they're an engine of fabric that you can build capabilities onto. But last year, just as a reference point for how much it's changed, we have Facebook giving one of the main keynotes. And they're known for their software, they're known for social media, and so you'll see Facebook and Google with an exhibitor there last year as well, so you're not just seeing suppliers into the traditional wireless industry for equipment and the operators who are the purchaser, you're seeing many, many different players show up very much like how you said CES has a lot of automotives there now. >> Yeah, we've seen a lot of revolutions in the computer industry, Intel created a revolution called the Computer Revolution, the PC Revolution, and then it became kind of an evolution, that seems to be the big trends you see, that cycle. But it seems now that we are, kind of been doing the evolution of mobile computing, and my phone gets better, 10 years down to the iPhone, 3G, 4G, LT, okay, I want more bandwidth, of course, but is there a revolution? Where can you point to? Where is the revolution, versus just standard evolutionary kind of trends? Is there something coming out of this that we're going to see? >> That is such a great question because when you look at the first digital wireless technologies that came out and then you had 2G, and 3G, and 4G, those really were evolutionary. And what we're finding with 5G that I believe is going to be a huge theme at Mobile World Congress this year is it is a completely different ballgame, I would say it's more of an inflection point or very revolutionary. And there's a couple reasons for that, both tie up in how ITU is specifying the use cases, it's licensed and unlicensed spectrum which is kind of unusual for how it's been done if you will get 2, 3, and 4G. The other thing that's really interesting about 5G, that it's an inflection point is there's a lot more intelligence assumed in the network and it helps address some of the challenges I think that the industry is seeing a different industry with some of the IoT promise we'll roll out where some of the macro design networks that we'd seen in the past, the ability to have the right latency, the right bandwidth, and the right cost matched to the needs of a specific IoT use case was much more limited in the past and I think we'll see a lot more opportunities moving forward. >> Great, great stuff, we're with Lynn Comp with the Network Platforms Group at Intel. You know, you bring up some, I like the way you're going with this, there's so much like, impact to society going on with these big, big trends. But also I was just having a conversation with some young folks here in Palo Alto, high school kids and some college kids and they're all jazzed up about AI, you can almost see the... I don't want to say addiction but fascination and intoxication with technology. And there's some real hardcore good tech going on here, could you just share your thoughts on, you know, what are some of those things that are going to, 'cause I mean, 5G to wireless, I get that, but I mean, you know, these kids that we talked to and folks that are in the next generation, they love the autonomous vehicles. But sometimes I can't get a phone signal, how are cars going to talk to each other? I mean, how does this, I mean, you've got to pull this together. And these kids are like, and it's into these new careers. What's your thoughts on what are some of the game changing tech challenges that are coming out of this? >> Let's just start with something that was a great example this year 'cause I think I have kids a similar age. And I had been skeptical of things like even just virtual reality, a augmented or virtual reality. And then we had this phenomena last summer that really was just a hint, it wasn't really augmented reality, but it was a hint of the demand that could be met by it and it's Pokemon Go. And so, an example with that, I mean, it really wasn't asking a significantly higher amount of data off the network, but it did change the use profile for many of the coms service providers and many of the networks where they realized I actually have to change the architecture, not just of what's at the edge but in my core network, to be more responsive and flexible, you are going to see something even more so with autonomous driving, even if it's just driver assist. And similar to how the auto pilot evolution happened, you're still going to have these usage patterns where people have too many demands, too much information coming at them, they do want that assistance, or they do want that augmented experience to interact with a brand, and it's going to really stress the network and there's going to have to be a lot of innovation about where some of these capabilities are placed and how much intelligence is close to the user as opposed to just a radio, probably going to need a lot more analytics and a lot more machine learning capabilities there as well. >> We had a segment earlier in the show, it was the entrepreneur who created the Trump and Dump chat bot that would go out and read Donald Trump's tweets and then short all public companies that were mentioned because the trend is, they would do that, but this is an example of some of these chat bots and some of this automation that's going on and it kind of brings the question up to some of the technology challenges that we're looking out at the landscape that we're discussing is the role of data really is a big deal and software and data now have an interaction play where you got to move data around the networks, networks are now ubiquitous, networks are now on people, networks are now in cars, networks are now part of all this, I won't say unstructured networks, but omni-connected fabric. So, data can really change what looks like an optimal architecture to a failed one, if you don't think about it properly. So, how do you guys at Intel think about the role of data? I mean, how do you build the new chips and how do you look at the landscape? And it must be a big consideration, what's your thoughts about the role of data? Because it can happen at any time, a tsunami of data could hit anything. >> Right, the tsunami of data. So for us, it's any challenge, and this is just in Intel's DNA, historically, we'll get challenges as opportunities because we love to solve these really big problems. And so, when you're talking about data moving around a network you're talking about transformation of the network. We've been having a lot of discussions with operators where they see the data tsunami, they're already seeing it, and they realized, I have got to reconfigure the architecture of my network to leverage these technologies and these capabilities in a way that's relevant for the regulatory environment I'm in. But I still have to be flexible, I have to be agile, I have to be leveraging programmability instead of having to rewrite software every generation or every time a new app comes out. >> Lynn, thanks so much for coming on. Like we always say, you know, engine room more power, you can never have enough compute power available in network bandwidth, as far as I'm concerned. You know, we'd love to increase the power, Moore's Law's been just a great thing, keeps on chugging along. Thanks for your time and joining us on the Silicon Valley Friday Show, appreciate it. Thanks so much. >> Thank you. >> Alright, take care. Okay, this is Silicon Valley Friday Show, I'm John Furrier, thanks so much for listening. I had Ben Gaddis on, Dan Rosenbaum, and Lynn Comp from Intel really breaking it down and bringing you all the best stories of the week here on the Silicon Valley, thanks for watching. (techno music) (bright instrumental music)

Published Date : Feb 10 2017

SUMMARY :

here in a selfie on the pre Silicon Valley Friday Show, it's the Silicon Valley Friday Show, and all the stuff that's going on, what is fake news? As the stock tanked, we closed our short that is Ben Gaddis on the phone, president of T3, Excited to talk with you. why did you create it, and how does it work? And so, the profit that we develop from that, and looks at the impact. And so, the fact that we're able to take and looking at the collective impact of And so, the algorithm actually picked up the collective intelligence of real time information. the only person closer to that information and influence the conversation with information. and thanks for the time, we enjoyed it. the firm behind the Trump and Dump, and changing the face of business from the inside out. brought to you by Silicon Angle Media. certainly on the infrastructure level with the device. But you know, it's February, it does this in New York. and certainly the world that you're in the adversary, you know, everyone got sort of upset about it technology's at the heart of it and, you know, and goes, "Who in the world would want is now beyond the device. and the back rooms are the places that you can't get into. And then you said, the boast of Google is that they want to make, you know, you know, just drop into Manhattan with a helicopter. and we'll have to bring, circle back with you and bringing that to you here as I have over the last half of the decade, the Silicon Valley Friday Show with John Furrier. and it's certainly impacting everyone in the industry thanks for taking the time to and the operators who are the purchaser, that seems to be the big trends you see, that cycle. and it helps address some of the challenges and folks that are in the next generation, and there's going to have to be a lot of innovation and it kind of brings the question up to the architecture of my network to leverage on the Silicon Valley Friday Show, appreciate it. and bringing you all the best stories of the week here

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Dan RosenbaumPERSON

0.99+

BrianPERSON

0.99+

AmazonORGANIZATION

0.99+

Andy GrovePERSON

0.99+

Dave VellantePERSON

0.99+

Ben GaddisPERSON

0.99+

NBCORGANIZATION

0.99+

OregonLOCATION

0.99+

Andrew LackPERSON

0.99+

MicrosoftORGANIZATION

0.99+

MexicoLOCATION

0.99+

ManhattanLOCATION

0.99+

TrumpPERSON

0.99+

CiscoORGANIZATION

0.99+

Ray KurzweilPERSON

0.99+

New YorkLOCATION

0.99+

ArizonaLOCATION

0.99+

fiveQUANTITY

0.99+

Palo AltoLOCATION

0.99+

NordstromORGANIZATION

0.99+

Ben RosenbaumPERSON

0.99+

MSNBCORGANIZATION

0.99+

Feb. 10th, 2017DATE

0.99+

Donald TrumpPERSON

0.99+

ASPCAORGANIZATION

0.99+

Pal AltoLOCATION

0.99+

IvankaPERSON

0.99+

John FurrierPERSON

0.99+

T3ORGANIZATION

0.99+

AndrewPERSON

0.99+

DanPERSON

0.99+

Silicon Angle MediaORGANIZATION

0.99+

IntelORGANIZATION

0.99+

SnapchatORGANIZATION

0.99+

BarcelonaLOCATION

0.99+

10QUANTITY

0.99+

Silicon ValleyLOCATION

0.99+

21QUANTITY

0.99+

four kidsQUANTITY

0.99+

Steve JobsPERSON

0.99+

Lynn CompPERSON

0.99+

two dayQUANTITY

0.99+

last yearDATE

0.99+

$19 millionQUANTITY

0.99+

NPRORGANIZATION

0.99+

LynnPERSON

0.99+

GoogleORGANIZATION

0.99+

Snap IncORGANIZATION

0.99+

10 yearsQUANTITY

0.99+

United StatesLOCATION

0.99+

five dollarsQUANTITY

0.99+

Pokemon GoTITLE

0.99+

JohnPERSON

0.99+

BenPERSON

0.99+

ToyotaORGANIZATION

0.99+

Intel CorporationORGANIZATION

0.99+

TwitterORGANIZATION

0.99+