Breaking Analysis: AWS Growth Slows but Remains Amazon’s Profit Engine
>> From the SiliconANGLE media office in Boston, Massachusetts, it's theCUBE. Now here's your host Dave Vellante. (techy music)
>> Hello everybody. Welcome to this episode of CUBE Insights powered by ETR. My name is Dave Vellante, and in this breaking analysis we're going to take a look at AWS. Today's October 25th; last night Amazon announced its earnings. It missed and it lowered guidance, particularly for the all-important Q4 holiday season, but I want to drill into the AWS portion of Amazon's business. When we do these breaking analyses we like to provide context, data from our friends at ETR, things that we learn on theCUBE, and input from our community. If you look at AWS in the quarter, they came in at just around $9 billion. That was about 35% growth, and people were concerned that that's lower growth than a year ago; on a year-on-year comparison they were 46% last year in Q3. Operating margins were also down. I'll talk a little bit about that. They were 25%, which is still pretty strong, but they were down from 31% last year in Q3. The ETR spending data shows that AWS is still strong but spending is not as robust; that's why they have a positive to neutral rating on Amazon. But Amazon's still a share gainer. When you look at the AWS customers inside of the ETR survey base, they're spending more on Amazon and less on Oracle. They're spending less on IBM, they're spending less on SAP. We talked about Cloudera last week. They're spending less on Teradata, so they're shifting spend from those legacy platforms, these are AWS customers now, into AWS. The other piece is Microsoft's moving. Microsoft has been consistently growing faster than AWS, and I'm going to talk a little bit about that and what it means. So you're seeing AWS revenue slow, business is strong, but Microsoft is gaining, so let's dig into it. Alex, bring up the first slide. What I'm showing here is a little history on AWS, the quarterly revenue and year-on-year growth rates, and what you can see here on the left-hand side is the revenue in millions, the blue bars; they did just under $9 billion this past quarter. On the right-hand axis is the growth rate, and you can see it spiked up there in '15 and then sort of slowly came down, spiked back up in '18. You can see Q3 '18 was 46%, as I said, and then it's sort of down around 35% in Q3 of 2019. Now compare that to Microsoft. I think Azure grew 59% in the most recent quarter. It's been consistently up in the 60%s growth each quarter. Now AWS will say, "Look, yeah, that's true, but they're growing from a much smaller base." While that's true, the other thing that AWS will say is that every year AWS's growth is about the entire size of Azure; in other words, what AWS adds each year is about the size of Azure. That's changing, and I'll share some data that will show you that, but this year AWS will probably add about $10 billion in new revenue, and Microsoft, if you strip out Office 365 and Skype and LinkedIn and all the other SaaS stuff and just focus on the infrastructure as a service, so you try to make an apples-to-apples comparison between AWS and Azure, Azure will be quite a bit larger than that. We think probably in the $14 to $15 billion range, maybe even $16-plus billion, so that narrative is starting to change.
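To make the growth math above concrete, here is a minimal sketch of the year-on-year calculation; the revenue figures are approximations of the numbers cited in the episode, not exact reported results.

```python
# Year-on-year growth: current quarter over the same quarter a year earlier.
# Figures are approximations of the numbers cited above ($ billions).
aws_q3_revenue = {2018: 6.68, 2019: 9.0}

yoy_growth = aws_q3_revenue[2019] / aws_q3_revenue[2018] - 1
print(f"AWS Q3 2019 year-on-year growth: {yoy_growth:.0%}")  # ~35%, vs. 46% a year earlier
```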
Now the next slide that I want to show you is our same quarterly revenue, so you can see that bar chart and the $9 billion phenomenal growth, but also, the red line on the right-hand axis is operating margin, (clears throat) and you can see the operating margin moderated here at 25% in Q3, the announcement this week, and you can see a year ago it was 31%. So this has the street a little bit concerned. Now this is still a very strong operating margin. Remember AWS, (chuckles) they started selling Compute. If you think about Dell and HPE's operating margins, they're thrilled if they're in the 10% range. They're oftentimes much lower than that, in the single digits, sometimes low single digits. So AWS is much, much more profitable. Compare AWS to some of the other leaders. Cisco, which has 60% of the networking market, its operating margins have been in the 20% to 27% range over the last, you know, several years. Intel, which essentially has or had a monopoly, 28% to 33%. Microsoft, a pure software play, or you know, largely a software play, low 30%s, and again, a company that had or has, you know, monopolistic-like, you know, cashflow and profitability. So AWS at that 25% to 30% operating margin is very, very strong. Okay, now I want to shift gears and show you some of the ETR data. What this next slide shows is the net scores from the cloud sector, so what I've done is pulled from the dataset just the cloud sector and done some comparisons. Now what you can see is in the October survey the N of the entire survey is 1,336, so out of the 4,500 CIOs and IT practitioners that ETR surveys each quarter, roughly 1,300 answered this question, and of those there were 611 Azure accounts, 546 AWS, 215 Google Cloud Platform, 157 Oracle, and 107 IBM. So you can obviously see there are multiple clouds per respondent. What the net score does is take the green, which is basically we're spending more, and subtract the red, which says we're spending less, and it comes up with a net score that you can see on the right-hand side. And this is just for the cloud sector. Now look at Azure's net score, 71%, that's very, very high. I mean it's up there with some of the hottest sectors in the industry and some of the pure plays like UiPath. We've talked about UiPath and, like Snowflake, the companies that are demonstrating the highest net scores in the ETR dataset. AWS is still very strong at 63%, but not quite as strong as Azure. As we said last week, there are a lot of ways to spend money with Microsoft. Google Cloud Platform is strong at 50%, but as we know, not nearly the market share of those other two, and I put in Oracle and IBM just for the sake of comparison. You can see their net scores are much, much lower, so you see the cloud continues to do really well, particularly those two cloud leaders maintaining their dominance, but you know, Azure from a growth rate and a spending momentum standpoint is really picking up. Now the next slide that I want to show you underscores the sentiment from ETR. Remember, when ETR released its October survey results last week they said, "We have a positive to neutral rating on AWS." Why were they neutral? Well, as I was saying before, some of that spending momentum is slowing. Why are they positive? This slide underscores that.
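A minimal sketch of the net score arithmetic described above; ETR's actual methodology has more response buckets (adoption, flat, replacement, and so on), and the respondent counts below are illustrative, not ETR's raw data.

```python
def net_score(spending_more: int, spending_less: int, total: int) -> float:
    """Net score = % of respondents spending more minus % spending less."""
    return (spending_more - spending_less) / total

# Illustrative counts only -- not ETR's raw survey data.
# Respondent Ns per the October survey: Azure 611, AWS 546.
print(f"Azure: {net_score(spending_more=458, spending_less=24, total=611):.0%}")  # ~71%
print(f"AWS:   {net_score(spending_more=371, spending_less=27, total=546):.0%}")  # ~63%
```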
If you take all sectors, so everything for AWS, not just the cloud stuff, I mean it's all cloud, but put in AI, the machine learning, all the database activity, et cetera, et cetera, et cetera, everything you can buy from AWS, and then look at the Fortune 500. They're a great indicator, obviously, of spend, so the big companies. Notice the net score, which is that top line, that top blue line: very consistent. It's elevated, it's in line, and it's up, you know, in the high 60s, so that's very, very strong. The yellow line is market share. ETR defines market share in relative terms, so how much people are spending on AWS relative to other sectors, and you can see it's a steady, steady rise. So this is why ETR and we remain positive on AWS: because, in addition to Microsoft, there are a lot of ways to spend money with AWS, and that keeps growing and growing and growing. So you're seeing nice continued market share gains. They continue to be a gainer. Okay, so let me wrap here. As I say, positive to neutral, because what's happening is you're seeing that share gain beyond Compute. AI, machine learning, analytics very hot. Amazon cited SageMaker as a tailwind for their business. Database at AWS is now a multibillion-dollar business. Amazon cited Aurora as another tailwind. Here's the thing: the reason why we're somewhat neutral on Amazon and AWS specifically is the law of large numbers appears to finally be kicking in, you know, or is it? I had a conversation this spring with a Gartner analyst, John Lovelock, and I asked him can this continue, so Alex, if you would, play the video and then we'll come back and talk about it. Can a company that size, in your experience, continue to grow at that pace?
>> Absolutely. There is nothing stopping AWS from taking advantage of this market. We are nowhere near saturated for cloud changes. Most of software spend is still on legacy and maintenance of software on-prem. There is still a great deal of money being spent on servers and infrastructure and networking equipment, and all of that gets bled out into the cloud eventually.
>> So you heard John Lovelock, he said there's really nothing there to stop AWS. They're going to continue to gain, and so now the question is will they bounce back to the growth rates that they had before, or is the law of large numbers kicking in? Let's talk about Microsoft a little bit. Microsoft is clearly gaining. So in 2017 Microsoft's cloud business, when you strip out, or you try to strip out, it's a little fuzzy, there's some gray area going on in there, but you do your best. I talked to the folks at Wikibon, and Ralph Finos tracks this stuff very closely. Azure was about 33% of AWS's business. If you go to 2018 it was about 41%, so you had Azure, you know, starting to get close to the $10 billion mark. In 2019 it's going to be closer to 50% of AWS's business, so let's say AWS comes in at $34, $35 billion this year; that puts Azure at, you know, $13, $14, $15, maybe even $16 billion. So they're starting to get to the high 40%s or even that 50% level, so Microsoft is making moves. Microsoft is partner-friendly. People in Amazon's ecosystem complain that they're sometimes concerned about Amazon competing with them. Microsoft is doing partnerships even with Oracle, so this is kind of interesting, that Oracle and Microsoft are partnering.
One of the areas that's been difficult to get into the cloud is mission-critical workloads, and that's really what Oracle and Microsoft are partnering on. It's giving Oracle customers an option, because they may not want to go to the Oracle cloud, they may not like the Oracle cloud. They may feel like it's too locked-in. They may feel like it's too deficient relative to Azure. They may be a big Microsoft customer and they feel more comfortable with Azure, so the deal between Oracle and Microsoft, the partnership, will allow more mission-critical workloads to go into Azure. We know that AWS, if you look at the case studies on AWS's website for the database migrations they've done, they've been very successful, but it's largely the analytic stuff, the data warehousing, the data marts. It's not a lot of mission-critical stuff, and you see Amazon itself is struggling to, you know, convert off of Oracle onto, you know, its own transaction database, and that's still taking, you know, a long time. They'll certainly tout the successes they've had in data warehousing, but the transaction stuff is much, much tougher, so that's something that we're watching as part of what we sometimes refer to as cloud 2.0. Will the mission-critical workloads migrate into the cloud? Now again, the Microsoft numbers are fuzzy. You've got to peel, you know, the onion back. You have to take out Office 365. You've got to ask, is Skype in there? What about GitHub, you know? They throw these things in. The companies, you know, they'll all play the kitchen sink game, but here's something I want you to think about. What percent of AWS's business comes from the Amazon retail side? It's got to be substantial. Amazon retail is easily a 10% customer of AWS, and likely much, much larger. Could Amazon retail account for $10 billion in AWS's revenue? You know, it's possible. How are transfer costs allocated from quarter to quarter? What does, you know, Amazon retail pay? I think they pay rack rates, but you know, we're not sure how those transfer costs are allocated. People talk about breaking up AWS. I read an article last week that said that Jeff Bezos may even spin it off before the government forces him to. I'm not sure that makes sense. I don't think it makes any sense to do that from a business standpoint, because right now AWS is subsidizing Amazon's entry into all these other markets. They're into grocery, they're into content, they're now into logistics. They're vertically integrating into logistics, and that's one of the items that they mentioned in their conference call last night, which was they're investing in logistics as potentially a future business, another, you know, big pillar. You know their ad business is really taking off, so you're seeing, you know, with Amazon, like Microsoft, a lot of ways to spend with those guys. So is this pullback, the stock's down about 34 points today, a buying opportunity? Yeah, probably yes, but cloud 2.0 undoubtedly in this next phase is going to see tougher competition, you know, particularly from Microsoft, but of course then you've got Alibaba and, you know, China, Inc. and the China cloud coming in, and you've got, you know, partners saying, "Hey, we have to be careful because if we don't move fast Amazon's going to gobble up some of our business," so they're hedging their bets, and you're seeing some of the customers hedge their bets as well.
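The Azure-versus-AWS sizing in this segment is back-of-envelope arithmetic. Here is a minimal sketch using the approximate figures cited above; the Azure carve-out is fuzzy, as noted, since Microsoft doesn't report Azure revenue separately, so these are estimates, not reported numbers.

```python
# Back-of-envelope: Azure IaaS revenue as a share of AWS revenue ($ billions).
# All figures are approximations cited in this analysis, not reported numbers.
estimates = {
    2017: {"aws": 17.5, "azure": 5.8},   # "about 33%"
    2018: {"aws": 25.7, "azure": 10.5},  # "about 41%"
    2019: {"aws": 34.5, "azure": 16.0},  # "high 40%s"
}
for year, rev in estimates.items():
    print(f"{year}: Azure at roughly {rev['azure'] / rev['aws']:.0%} of AWS")
```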
Bottom line, though: Amazon remains very, very strong, a leader, a continued share-gainer, so we're very positive on the company generally and AWS specifically. All right, this is Dave Vellante. Thank you for watching this episode of CUBE Insights powered by ETR. We'll see you next time. (techy music)
Wikibon Action Item | The Roadmap to Automation | April 27, 2018
>> Hi, I'm Peter Burris and welcome to another Wikibon Action Item. (upbeat digital music)
>> Cameraman: Three, two, one.
>> Hi. Once again, we're broadcasting from our beautiful Palo Alto studios, theCUBE studios, and this week we've got another great group. David Floyer in the studio with me along with George Gilbert. And on the phone we've got Jim Kobielus and Ralph Finos. Hey, guys.
>> Hi there.
>> So we're going to talk about something that's going to become a big issue. It's only now starting to emerge. And that is, what will be the roadmap to automation? Automation is going to be absolutely crucial for the success of IT in the future and the success of any digital business. At its core, many people have presumed that automation was about reducing labor. By introducing software and other technologies, we would effectively be able to substitute for administrative, operator, and related labor. And while that is absolutely a feature of what we're talking about, the bigger issue ultimately is that we cannot conceive of the more complex workloads that are capable of providing better customer experience, superior operations, all the other things a digital business ultimately wants to achieve, if we don't have a capability for simplifying how those underlying resources get put together, configured, organized, orchestrated, and ultimately sustained. So the other part of automation is to allow for much more work to be performed on the same resources, much faster. It's a basis for how we think about plasticity and the ability to reconfigure resources very quickly. Now, the challenge is that this industry, the IT industry, has always used standards as a weapon. We use standards as a basis of creating ecosystems, or scale, or mass, even for something like mainframes, where there weren't hundreds of millions of potential users. But IBM was successful at using that as a basis for driving their costs down and providing a superior product. That's clearly what Microsoft and Intel did many years ago: achieve that kind of scale by driving more, and more, and more volume of the technology, ultimately, and they won. But along the way, each generation has featured a significant amount of competition over how those interfaces came together and how they worked. And this is going to be the mother of all standards-oriented competition. How does one automation framework and another automation framework fit together? One being able to create value in a way that serves another automation framework, but ultimately, for many companies, a way of creating more scale onto their platform. More volume onto that platform. So this notion of how automation is going to evolve is going to be crucially important. David Floyer, are APIs going to be enough to solve this problem?
>> No. That's the short answer to that. This is a very complex problem, and I think it's worthwhile spending a minute just on what are the component parts that need to be brought together. We're going to have a multi-cloud environment. Multiple private clouds, multiple public clouds, and they've got to work together in some way. And you've got the Edge as well. So you've got a huge amount of data all across all of these different areas. And automation and orchestration across that are, as you said, not just about efficiency, they're about making it work. Making it able to work and to be available.
So all of the issues of availability, of security, of compliance, all of these difficult issues are subject to getting this whole environment to be able to work together through a set of APIs, yes, but a lot, lot more than that. And in particular, when you think about it, to me, volume of data is critical. It's who has access to that data.
>> Peter: Now, why is that?
>> Because if you're dealing with AI and you're dealing with any form of automation like this, the more data you have, the better your models are. And if you can increase that amount of data, as Google shows every day, you will maintain that handle on, all that control over, that area.
>> So you said something really important, because the implied assumption, and obviously, it's a major feature of what's going on, is that we've been talking about doing more automation for a long time. But what's different this time is the availability of AI and machine learning, for example,
>> Right.
as a basis for recognizing patterns, taking remedial action or taking predictive action to avoid the need for remedial action. And it's the availability of that data that's going to improve the quality of those models.
>> Yes.
Now, George, you've done a lot of work around this whole notion of ML for ITOM. What are the different approaches? If there are two ways that we're looking at it right now, what are the two ways?
>> So there are two ends of the extreme. One is I want to see end to end what's going on across my private cloud or clouds, as well as if I have different applications in different public clouds. But that's very difficult. You get end-to-end visibility but you have to relax a lot of assumptions about what's where.
>> And that's called the--
>> Breadth first. So the pro is end-to-end visibility. The con is you don't know how all the pieces fit together quite as well, so you get less fidelity in terms of diagnosing root causes.
>> So you're trying to optimize at a macro level while recognizing that you can't optimize at a micro level.
>> Right. Now the other approach, the other end of the spectrum, is depth first, where you constrain the set of workloads and services that you're building, and that you know about, and how they fit together. And then the models, based on the data you collect there, can become so rich that you have very, very high-fidelity root cause determination, which allows you to do very precise recommendations or even automated remediation. What we haven't figured out how to do yet is marry the depth first with the breadth first, so that you have multiple-focus depth first. That's very tricky.
>> Now, if you think about how the industry has evolved, we wrote some stuff about what we call, what I call, the iron triangle, which is basically a very tight relationship between specialists in technology, so the people who were responsible for a particular asset, be it storage, or the system, or the network; the vendors, who provided a lot of the knowledge about how that worked, and therefore made that specialist more or less successful and competent; and then the automation technology that that vendor ultimately provided. Now, that was not automation technology that was associated with AI or anything along those lines. It was kind of out of the box: buy our tool, and this is how you're going to automate various workflows or scripts, or whatever else it might be. And every effort to try to break that has been met with screaming because, well, you're now breaking my automation routines.
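To make the breadth-first/depth-first distinction George describes concrete, here is a minimal sketch, assuming a toy metrics feed; it illustrates the trade-off only and is not any vendor's implementation. The breadth-first detector watches one coarse end-to-end signal and can only say that something is wrong; the depth-first detector knows a constrained set of services and can point at a root-cause candidate.

```python
import statistics

def anomalous(series, latest, threshold=3.0):
    """Flag `latest` if it sits more than `threshold` std devs from the mean."""
    mean, stdev = statistics.mean(series), statistics.stdev(series)
    return stdev > 0 and abs(latest - mean) / stdev > threshold

# Breadth-first: one coarse, end-to-end signal (total request latency).
# You see THAT something is wrong somewhere in the stack, not WHERE.
e2e_latency_ms = [102, 98, 105, 99, 101, 103, 97, 100]
print("end-to-end anomaly:", anomalous(e2e_latency_ms, latest=210))

# Depth-first: per-component signals for a constrained set of services
# whose topology you know, so an alarm localizes a root-cause candidate.
per_service_latency_ms = {
    "load_balancer": ([5, 6, 5, 7, 6, 5, 6, 5], 6),
    "app_tier":      ([40, 42, 39, 41, 40, 43, 41, 40], 41),
    "database":      ([55, 50, 58, 52, 54, 53, 51, 55], 160),  # the culprit
}
for service, (history, latest) in per_service_latency_ms.items():
    if anomalous(history, latest):
        print(f"root-cause candidate: {service}")
```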
So the depth-first approach, even without ML, has been the way that we've done it historically. But, David, you're talking about something different. It's the availability of the data that starts to change that.
>> Yeah.
>> So are we going to start seeing new compacts put in place between users and vendors and OEMs and a lot of these other folks? And it sounds like it's going to be about access to the data.
>> Absolutely. So let's start at the bottom. You've got people who have a particular component, whatever that component is. It might be storage. It might be networking. Whatever that component is, they have products in that area which will be collecting data. And they will need, for their particular area, to provide a degree of automation, a degree of capability. And they need to do two things. They need to do that optimization and also provide data to other people. So they have to have an OEM agreement, not just for the equipment that they provide, but for the data that they're going to give and the data they're going to get back: the automation of the data, for example, going up, and the availability of data to help themselves.
>> So contracts effectively mean that you're going to have to negotiate value capture on the data side as well as the revenue side.
>> Absolutely.
>> The ability to do contracting historically has been around individual products, and so we're pretty good at that. So we can say, you will buy this product. I'm delivering you the value. And then the utility of that product is up to you. When we start going to service contracts, we get a little bit different kind of an arrangement. Now, it's an ongoing continuous delivery. But for the most part, a lot of those service contracts have been predicated on known-in-advance classes of functions, like Salesforce, for example, or the SaaS business, where you're able to write a contract that says over time you will have access to this service. When we start talking about some of this automation, though, now we're talking about ongoing, but highly bespoke, and potentially highly divergent over a relatively short period of time, such that you have a hard time writing contracts that will prescribe the range of behaviors and the promise about how those behaviors are actually going to perform. I don't think we're there yet. What do you guys think?
>> Well,
>> No, no way. I mean,
>> Especially when you think about realtime. (laughing)
>> Yeah. It has to be realtime to get to the end point of automating the actual reply, the actual action that you take. That's where you have to get to. It won't be sufficient otherwise. I think it's a very interesting area, this contracts area. If you think about solutions for it, I would be going straight towards blockchain-type architectures and dynamic blockchain contracts that would have to be put in place.
>> Peter: But they're not realtime.
>> The contracts aren't realtime. The contracts will never be realtime, but the
>> Accessed?
access to the data, and the understanding of what data is required. Those will be realtime.
>> Well, we'll see. I mean, Ethereum's what? Every 12 seconds?
>> Well. That's
>> Everything gets updated?
>> To me, that's good enough.
>> Okay.
>> That's realtime enough. It's not going to solve the problem of somebody
>> Peter: It's not going to solve the problem at the edge.
>> At the very edge, but it's certainly sufficient to solve the problem of contracts.
>> Okay.
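David's "dynamic blockchain contracts" can be sketched, very loosely, as a tamper-evident, hash-chained ledger of data-access grants. This is a toy illustration of the idea, not a real blockchain and not any product's smart-contract API; the field names are invented for the example.

```python
import hashlib, json, time

def add_grant(chain, grantor, grantee, dataset, allowed_uses):
    """Append a tamper-evident data-access grant to a hash-chained ledger (toy)."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {
        "grantor": grantor, "grantee": grantee, "dataset": dataset,
        "allowed_uses": allowed_uses, "timestamp": time.time(),
        "prev_hash": prev_hash,  # links each grant to the one before it
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)

ledger = []
add_grant(ledger, grantor="CustomerCo", grantee="StorageVendor",
          dataset="call-home telemetry",
          allowed_uses=["product improvement"])  # but not, say, resale
print(ledger[-1]["hash"][:16], "grants recorded:", len(ledger))
```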
>> But, and I would add to that and say, in addition to having all this data available, let's go back like 10, 20 years and look at Cisco. A lot of their differentiation and what entrenched them was sort of universal familiarity with their admin interfaces, and they might not expose APIs in a way that would make it common across their competitors. But if you had data from them and a constrained number of other providers, around which you would build, let's say, these modern big data applications, if you constrain the problem, you can get to the depth first.
>> Yeah, but Cisco is a great example of, it's an archetype for what I said earlier, that notion of an iron triangle. You had Cisco admins
>> Yeah.
that were certified to run Cisco gear and therefore had a strong incentive to ensure that more Cisco gear was purchased, utilizing a Cisco command line interface that did incorporate a fair amount of automation for that Cisco gear, and it was almost impossible for a lot of companies to penetrate that tight arrangement between the Cisco admin that was certified, the Cisco gear, and the CLI.
>> And the exact same thing happened with Oracle. The Oracle admin skillset was pervasive within large
>> Peter: Happened with everybody.
>> Yes, absolutely.
>> But,
>> Peter: The only reason it didn't happen in the IBM mainframe, David, was because of a
>> It did happen, yeah,
>> Well, but it did happen, but governments stepped in and said, this violates antitrust. And IBM was forced by law, by court decree, to open up those interfaces.
>> Yes. That's true.
>> But are we going to see the same type of thing
>> I think it's very interesting to see the shape of this market when we look a little bit ahead. People like Amazon are going to have IaaS, they're going to be running applications. They are going to go for the depth way of doing things across, or which way around is it?
>> Peter: The breadth. They're going to be end to end.
>> But they will go depth in individual--
>> Components.
>> Sort of, but they will put together their own type of things for their services.
>> Right.
>> Equally, other players like Dell, for example, have a lot of different products, a lot of different components in a lot of different areas. They have to go piece by piece and put together a consortium of suppliers to them, storage suppliers, chip suppliers, and put that together outside, and it's going to have to be a different type of solution that they put together. HP will have the same issue there. And there are people like CA, for example, where we'll see an opportunity for them to come in again with great products, looking over the whole of all of this data coming in.
>> Peter: Oh, sure. Absolutely.
>> So there's a lot of players who could be in this area. Microsoft, I missed out, of course; they will have the two ends that they can combine together.
>> Well, they may have an advantage that nobody else has--
>> Exactly. Yeah.
because they're strong in both places. But I have Jim Kobielus. Let me check, are you there now? Do we got Jim back?
>> Can you hear me?
>> Peter: I can barely hear you, Jim. Could we bring Jim's volume up a little bit? So, Jim, I asked the question earlier about, we have the tooling for AI. We know how to get data, how to build models, and how to apply the models in a broad-brush way. And we're certainly starting to see that happen within the IT operations management world.
The ITOM world, but we don't yet know how we're going to write these contracts that are capable of better anticipating, putting in place a regime that really describes, what are the limits of data sharing? What are the limits of derivative use? Et cetera. I argued, and here in the studio we generally agreed, that we still haven't figured that out, and that this is going to be one of the places where the tension, at least in the B2B world, between data availability and derivative use, and where you capture value and where those profits go, is going to be significant. But I want to get your take. Has the AI community
>> Yeah.
started figuring out how we're going to contractually handle obligations around data, data use, data sharing, data derivative use?
>> The short answer is, no they have not. The longer answer is, that, can you hear me, first of all?
>> Peter: Barely.
>> Okay. Should I keep talking?
>> Yeah. Go ahead.
>> Okay. The short answer is, no, the AI community has not addressed those IP protection issues. But there is a growing push in the AI community to leverage blockchain for such requirements, in terms of blockchains to store smart contracts related to downstream utilization of data and derivative models. But that's extraordinarily early on in its development, in terms of insight in the AI community, and in the blockchain community as well. In fact, one of the posts that I'm working on right now is looking at a company called 8base that's actually using blockchain to store all of those assets, those artifacts for the development lifecycle, along with the smart contracts to drive those downstream uses. So what I'm saying is that lots of smart people like yourselves are thinking about these problems, but there's no consensus, definitely, in the AI community for how to manage all those rights downstream.
>> All right. So very quickly, Ralph Finos, if you're there. I want to get your perspective
>> Yeah.
on what this means from markets, market leadership. What do you think? How's this going to impact who are the leaders, who's likely to continue to grow and gain even more strength? What're your thoughts on this?
>> Yeah. I think my perspective on this thing in the near term is to focus on simplification, and to focus on depth, because you can get return, you can get payback, for that kind of work, and it simplifies the overall picture, so when you're going broad, you've got less of a problem to deal with, to link all these things together. So I'm going to go with the Shaker kind of perspective on the world, which is to make things simple, and to focus there. And I think the complexity of what we're talking about for breadth is too difficult to handle at this point in time. I don't see it happening any time in the near future.
>> Although there are some companies, like Splunk, for example, that are doing a decent job of presenting more of a breadth approach, but they're not going deep into the various elements. So, George, really quick. Let's talk to you.
>> I beg to disagree on that one.
>> Peter: Oh!
>> They're actually, they built a platform, originally, that was breadth first. They built all these, essentially, forwarders which could understand the formats of the output of all sorts of different devices and services. But then they started building what they called curated experiences, which is the equivalent of what we call depth first. They're doing it for IT service management. They're doing it for what's called user behavior
analytics, which is a way of tracking bad actors or bad devices on a network. And they're going to be pumping out more of those. What's not clear yet is how they're going to integrate those, so that IT service management understands security and vice versa.
>> And I think that's one of the key things, George, is that ultimately, the real question will be, or not the real question, but when we think about the roadmap, it's probably that security is going to be one of the things that gets addressed here early on. And again, it's not just security from a perimeter standpoint. Some people are calling it a software-based perimeter. Our perspective is the data's going to go everywhere, and ultimately, how do you sustain a zero-trust world where you know your data is going to be out in the clear, so what are you going to do about it? All right. So look. Let's wrap this one up. Jim Kobielus, let's give you the first Action Item. Jim, Action Item.
>> Action Item. Wow. My Action Item on automation is just to follow the stack of assets that drive automation and figure out your overall sharing architecture for sharing out these assets. I think the core asset will remain orchestration models. I don't think predictive models in AI are a huge piece of the overall automation pie in terms of the logic. So just focus on building out and protecting and sharing and reusing your orchestration models. Those are critically important, in any domain, end to end or in specific automation domains.
>> Peter: David Floyer, Action Item.
>> So my Action Item is to acknowledge that the world of building your own automation yourself, around a whole lot of piece parts that you put together, is over. You won't have access to sufficient data. So enterprises must take a broad view of getting data, of getting components that have data and give them data. Make contracts with people to give them data, masked or whatever it is, and become part of a broader scheme that will allow them to meet the automation requirements of the 21st century.
>> Ralph Finos, Action Item.
>> Yeah. Again, I would reiterate the importance of keeping it simple. Taking care of the depth questions and moving forward from there. The complexity is enormous, and--
>> Peter: George Gilbert, Action Item.
>> I say, start with what customers always start with with a new technology, which is a constrained environment like a pilot, and there are two areas that are potentially high return. One is big data, where it's been a multi-vendor component mix, and a mess. And so you take that and you constrain that and make that a depth-first approach in the cloud, where there is data to manage that. And the second one is security, where we now have more and more trained applications just for that. I say, don't start with a platform. Start with those solutions and then start adding more solutions around that.
>> All right. Great. So here's our overall Action Item. The question of automation, or the roadmap to automation, is crucial for multiple reasons, but one of the most important ones is that it's inconceivable to us to envision how a business can institute even more complex applications if we don't have a way of improving the degree of automation on the underlying infrastructure. How this is going to play out, we're not exactly sure. But we do think that there are a few principles that are going to be important that users have to focus on. Number one is data.
Be very clear that there is value in your data, both to you as well as to your suppliers, and as you think about writing contracts, don't write contracts that are focused on a product now. Focus on even that product as a service over time, where you are sharing data back and forth in addition to getting some return out of whatever assets you've put in place. And make sure that the negotiations specifically acknowledge the value of that data to your suppliers as well. Number two, there is certainly going to be a scale question here, a volume question here, and as we think about it, a lot of the new approaches to this notion of automation are going to come out of the cloud vendors. Once again, the cloud vendors are articulating what the overall model is going to look like, what that cloud experience is going to look like. And it's going to be a challenge to other suppliers who are providing an on-premises, true private cloud and Edge orientation, where the data sometimes must live, not something they do just because they want to do it, but because that data requires it, to be able to reflect that cloud operating model. And expect, ultimately, that your suppliers also are going to have to have very clear contractual relationships with the cloud players and each other for how that data gets shared. Ultimately, however, we think it's crucially important that any CIO recognize that the existing environment that they have right now is not converged. The existing environment today remains operators, suppliers of technology, and suppliers of automation capabilities, and breaking that up is going to be crucial, not only to achieving automation objectives, but to achieving converged infrastructure, hyperconverged infrastructure, and multi-cloud arrangements, including private cloud, true private cloud, and the cloud itself. And this is going to be a management challenge that goes way beyond just products and technology, to actually incorporating how you think about how your shop is organized, how you institutionalize the work that the business requires, and therefore what you identify as the tasks that will be first to be automated. Our expectation: security is going to be addressed early on. Why? Because your CEO and your board of directors are going to demand it. So think about how automation can be improved and enhanced through a security lens, but do so in a way that ensures that over time you can bring new capabilities on with a depth-first approach, at least, to the breadth that you need within your shop and within your business, your digital business, to achieve the success and the results that you want. Okay. Once again, I want to thank David Floyer and George Gilbert here in the studio with us. On the phone, Ralph Finos and Jim Kobielus. Couldn't get Neil Raden in today, sorry Neil. And I am Peter Burris, and this has been an Action Item. Talk to you again soon. (upbeat digital music)
Action Item | How to get more value out of your data, April 06, 2018
>> Hi I'm Peter Burris and welcome to another Wikibon Action Item. (electronic music) One of the most pressing strategic issues that businesses face is how to get more value out of their data. In our opinion that's the essence of a digital business transformation: using data as an asset to improve your operations and take better advantage of market opportunities. The problem with data, though: it's shareable, it's copyable, it's reusable. It's easy to create derivative value out of it. One of the biggest misnomers in the digital business world is the notion that data is the new fuel or the new oil. It's not. You can only use oil once. You can apply it to a purpose, and not multiple purposes. Data you can apply to a lot of purposes, which is why you are able to get such interesting and increasing returns on that asset if you use it appropriately. Now, this becomes especially important for technology companies that are attempting to provide digital business technologies or services or other capabilities to their customers. In the consumer world, it's started to reach a head. Questions about Facebook's reuse of a person's data through an ad-based business model are now starting to lead people to question the degree to which the information asymmetry, about what I'm giving and how they're using it, is really worth the value that I get out of Facebook. That is something that consumers and certainly governments are starting to talk about. It's also one of the bases for GDPR, which is going to start enforcing significant fines in the next month or so. In the B2B world that question is going to become especially acute. Why? Because as we try to add intelligence to the services and the products that we are utilizing within digital business, some of that requires a degree of, or some sort of, relationship where some amount of data is passed to improve the models and machine learning and AI that are associated with that intelligence. Now, some companies have come out and said flat out they're not going to reuse a customer's data. IBM being a good example of that, when Ginni Rometty at IBM Think said, we're not going to reuse our customers' data. The question for the panel here is, is that going to be a part of a differentiating value proposition in the marketplace? Are we going to see circumstances in which some companies keep products and services cheap by reusing a client's data, while others, sustaining their experience and sustaining a trust model, say they won't? How is that going to play out in front of customers? So joining me today here in the studio, David Floyer.
>> Hi there.
>> And on the remote lines we have Neil Raden, Jim Kobielus, George Gilbert, and Ralph Finos. Hey, guys.
>> All: Hey.
>> All right so... Neil, let me start with you. You've been in the BI world as a user, as a consultant, for many, many years. Help us understand the relationship between data, assets, ownership, and strategy.
>> Oh, God. Well, I don't know that I've been in the BI world. Anyway, as a consultant, when we would do a project for a company, there were very clear lines of what belonged to us and what belonged to the client. They were paying us generously. They would allow us to come into their company and do things that they needed, and in return we treated them with respect. We wouldn't take their data. We wouldn't take the data models that we built, for example, and sell them to another company. That's just, as far as I'm concerned, that's just theft.
So if I'm housing another company's data because I'm a cloud provider or some sort of application provider, and I say, well, you know, I can use this data too, to me the analogy is: I'm a warehousing company, and independently I go into the warehouse and I say, you know, these guys aren't moving their inventory fast enough, I think I'll sell some of it. It just isn't right.
>> I think it's a great point. Jim Kobielus. As we think about the role that data and machine learning play, training models, delivering new classes of services, we don't have a clean answer right now. So what's your thought on how this is likely to play out?
>> I agree totally with Neil, first of all. If it's somebody else's data, you don't own it, therefore you can't sell it and you can't monetize it, clearly. But where you have derivative assets, like machine learning models that are derivative from data, it's the same phenomenon, it's the same issue at a higher level. You can build and train, or should, your machine learning models only from data that you have legal access to: you own it or you have a license, and so forth. So as you're building these derivative assets, first and foremost, make sure, as you're populating your data lake to build and to do the training, that you have clear ownership over the data. So with GDPR and so forth, we have to be doubly, triply vigilant to make sure that we're not using data that we don't have authorized ownership of or access to. That is critically important. And so, I get kind of queasy when I hear some people say we use blockchain to make the sharing of training data more distributed and federated or whatever. It's like, wait a second. That doesn't solve the issues of ownership. That makes it even more problematic. If you get this massive blockchain of data coming from hither and yon, who owns what? How do you know? Do you dare build any models whatsoever from any of that data? That's a huge gray area that nobody's really addressed yet.
>> Yeah well, it might mean that the blockchain has been poorly designed. I think that we talked in one of the previous Action Items about the role that blockchain design's going to play. But moving aside from the blockchain, it seems as though we generally agree that data is owned by somebody, typically, and that the ownership of it, as Neil said, means that you can't intercept it at some point in time, just because it is easily copied, and then generate rents on it yourself. David Floyer, what does that mean from an ongoing systems design and development standpoint? How are we going to assure, as Jim said, not only that we know what data is ours, but make sure that we have the right protection strategies, in a sense, in place to make sure that as the data moves, we have some influence and control over it?
>> Well, my starting point is that AI and AI-infused products are fueled by data. You need that data, and Jim and Neil have already talked about that. In my opinion, the most effective way of improving a company's products, whatever the products are, from manufacturing, agriculture, financial services, is to use AI-infused capabilities. That is likely to give you the best return on your money, and businesses need to focus on their own products. That's the first place you are trying to protect from anybody coming in. Businesses own that data. They own the data about their products, in use by their customers. Use that data to improve your products with AI-infused function, and use it before your competition eats your lunch.
>> But let's build on that.
So we're not saying that, for example, if you're a storage system supplier, since that's a relatively easy one: you've got very, very fast SSDs, very, very fast NVMe over Fabric, great technology. You can collect data about how that system is working, but that doesn't give you rights to then also collect data about how the customer's using the system.
>> There is a line which you need to make sure that you are covering. For example, Call Home on a product, any product: whose data is that? You need to make sure that you can use that data. You have some sort of agreement with the customer, and that's a win-win, because you're using that data to improve the product, prove things about it. But it's very, very clear that you should have a contractual relationship, as Jim and Neil were pointing out. You need the right to use that data. It can't come beyond the hand. But you must get it, because if you don't get it, you won't be able to improve your products.
>> Now, we're talking here about technology products, which often have very concrete and obvious ownership and people who are specifically responsible for administering them. But when we start getting into the IoT domain, or in other places where the device is infused with intelligence, it might be collecting data that's not directly associated with its purpose, just by virtue of the nature of the sensors that are out there, and the whole concept of digital twin introduces some tension in all this. George Gilbert. Take us through what's been happening with the overall suppliers of technology that are related to digital twin building, designing, etc. How are they securing, or making promises, committing to their customers, that they will not cross this data boundary as they improve the quality of their twins?
>> Well, as you quoted Ginni Rometty starting out, she's saying IBM, unlike its competitors, will not take advantage of and leverage and monetize your data. But it's a little more subtle than that, and digital twins are just sort of another manifestation of the industry-specific solution development that we've done for decades. The difference, as Jim and David have pointed out, is that with machine learning, it's not so much code that's at the heart of these digital twins, it's the machine learning models, and the data is what informs those models. Now... you don't want all your secret sauce to go from Mercedes-Benz to BMW, but at the same time, the economics of industry solutions mean that you do want some of the repeatability that we've always gotten from industry solutions. You might have parts that are just company-specific. And so in IBM's case, if you really parse what they're saying, they take what they learn in terms of the models from the data when they're working with BMW, and some of that is going to go into the industry-specific models that they're going to use when they're working with Mercedes-Benz. If you really, really sort of peel the onion back and ask them, it's not the models, it's not the features of the models, but it's the coefficients that weight the features, or variables, in the models that they will keep segregated by customer. So in other words, you get some of the benefits, the economic benefits, of reuse across customers with similar expertise, but you don't actually get all of the secret sauce.
>> Now, Ralph Finos--
>> And I agree with George here. I think that's an interesting topic. That's one of the important points.
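A minimal sketch of the arrangement George describes: industry-level feature engineering that is reused across customers, with the fitted coefficients, the "secret sauce," kept segregated per customer. The data here is synthetic and the customer names are just the examples from the discussion; this illustrates the idea, not IBM's actual practice.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_features(raw):
    """Industry-level feature engineering, reusable across customers."""
    return np.column_stack([raw, raw ** 2, np.ones(len(raw))])

def fit_coefficients(X, y, ridge=1e-3):
    """Per-customer weights: the part that stays segregated by customer."""
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)

per_customer_weights = {}  # never pooled or shared across customers
for customer in ("mercedes", "bmw"):
    raw = rng.normal(size=50)                       # that customer's telemetry
    y = 3.0 * raw + rng.normal(scale=0.1, size=50)  # toy target variable
    per_customer_weights[customer] = fit_coefficients(shared_features(raw), y)

print({c: w.round(2).tolist() for c, w in per_customer_weights.items()})
```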
It's not kosher to monetize data that you don't own, but conceivably, if you can abstract from that data at some higher level, like George is describing, in terms of weights and coefficients and so forth in a neural network that's derivative from the model, at some point in the abstraction you should be able to monetize. I mean, it's like a paraphrase of some copyrighted material. A paraphrase, I'm not a lawyer, but you can sell a paraphrase, because it's your own original work that's based, obviously, on your reading of Moby Dick or whatever it is you're paraphrasing.
>> Yeah, I think--
>> Jim, I--
>> Peter: Go ahead, Neil.
>> I agree with that, but there's a line. There was a guy who worked at Capital One, this was about ten years ago, and he was their chief statistician or whatever. This was before we had words like machine learning and data science; it was called statistics and predictive analytics. He left the company and formed his own company and rewrote and recoded all of the algorithms he had for about 20 different predictive models. Formed a company and then licensed that stuff to Sybase and Teradata and whatnot. Now, the question I have is, did that cross the line or didn't it? These were algorithms actually developed inside Capital One. Did he have the right to use those, even if he wrote new computer code to make them run in databases? So it's more than just data, I think. It's, well, it's a marketplace, and I think that if you own something, someone should not be able to take it and make money on it. But that doesn't mean you can't make an agreement with them to do that, and I think we're going to see a lot of that. IMSN gets data on prescription drugs, and IRI and Nielsen get scanner data, and they pay for it and then they add value to it and they resell it. So I think that's really the issue: the use has to be understood by all the parties, and the compensation has to be appropriate to the use.
>> All right, so Ralph Finos. As a guy who looks at market models and handles a lot of the fundamentals for how we do our forecasting, look at this from the standpoint of how people are going to make money, because clearly what we're talking about sounds like the idea that any derivative use is embedded in algorithms. Seeing how those contracts get set up, and I've got a comment on that in a second, but the promise, a number of years ago, was that people were going to start selling data willy-nilly as a way of capturing value out of their economic activities or their business activities. That hasn't matured yet, generally. Do we see this brand new data economy, where everybody's selling data to each other, being the way that this all plays out?
>> Yeah, I'm having a hard time imagining this as a marketplace. I think we pointed at the manufacturing industries, technology industries, where some of this makes some sense. But I think from a practitioner perspective, you're looking for variables that are meaningful, that are in a form you can actually use to make predictions, where you understand the history and the validity of that data. And in a lot of cases there's a lot of garbage out there that you can't use. And the notion of paying for something that ultimately you look at and say, oh crap, this isn't really helping me, is going to be... maybe not an insurmountable barrier, but it's going to create some obstacles in the market for adoption of this kind of thought process.
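Ralph's caution, that purchased data is only worth the lift it adds to your models, suggests a simple screen before paying for a feed: measure its incremental out-of-sample lift. A minimal sketch on synthetic data; the "feeds" here are invented stand-ins, not real data products.

```python
import numpy as np

rng = np.random.default_rng(1)

def holdout_r2(X, y, train=150):
    """Out-of-sample R^2 of an ordinary least-squares fit."""
    w, *_ = np.linalg.lstsq(X[:train], y[:train], rcond=None)
    resid = y[train:] - X[train:] @ w
    return 1 - resid.var() / y[train:].var()

# Synthetic stand-ins: your own variables, a useful external feed,
# and a "garbage" feed of the kind Ralph warns about.
n = 200
internal = rng.normal(size=(n, 2))
useful_feed = internal[:, 0:1] * 0.5 + rng.normal(scale=0.3, size=(n, 1))
garbage_feed = rng.normal(size=(n, 1))
y = internal @ np.array([1.0, -2.0]) + useful_feed[:, 0] + rng.normal(scale=0.5, size=n)

base = holdout_r2(np.column_stack([internal, np.ones(n)]), y)
for name, feed in [("useful", useful_feed), ("garbage", garbage_feed)]:
    lift = holdout_r2(np.column_stack([internal, feed, np.ones(n)]), y) - base
    print(f"{name} feed lift in holdout R^2: {lift:+.3f}")  # pay only for real lift
```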
We have to think about the utility of the data that feeds your models.
>> Yeah, I think there are going to be a lot of legal questions raised, and I recommend that people go look at a recent SiliconANGLE article written by Mike Wheatley and edited by our Editor in Chief Robert Hof about Microsoft letting technology partners own rights to joint innovations. This is quite a change for Microsoft, who used to send you, if you sent an email with an idea to them, an email back saying, oh, just to let you know, any correspondence we have here is the property of Microsoft. So there clearly is tension in the model about how we're going to utilize data and enable derivative use, and how we're going to share, how we're going to appropriate value and share in the returns of that. I think this is going to be an absolutely central feature of business models, certainly in the digital business world, for quite some time. The last thing I'll note, and then I'll get to the Action Items: one of the biggest challenges whenever we start talking about how we set up businesses and institutionalize the work that's done is to look at the nature of the assets and the scope of the assets, and in circumstances where an asset is used by two parties and is generating a high degree of value, as measured by the transactions against that asset, there's always going to be a tendency for one party to try to take ownership of it. The party that's able to generate greater returns than the other almost always makes moves to try to take more control over that asset, and that's the basis of governance. And so everybody talks about data governance as though it's something that you worry about with your backup and restore. Well, that's important, but this notion of data governance increasingly is going to become a feature of strategy and boardroom conversations about what it really means to create data assets, sustain those data assets, get value out of them, and how we determine whether or not the right balance is being struck between the value that we're getting out of our data and what third parties are getting out of our data, including customers. So with that, let's do a quick Action Item. David Floyer, I'm looking at you. Why don't we start here. David Floyer, Action Item.
>> So my Action Item is for businesses: you should focus. Focus on data about your products in use by your customers, to improve, help improve, the quality of your products, and fuse AI into those products as one of the most efficient ways of adding value to them. And do that before your competition has a chance to come in and get data that will stop you from doing that.
>> George Gilbert, Action Item.
>> I guess mine would be that... in most cases you want to embrace some amount of reuse because of the economics involved in your joint development with a solution provider. But if others are going to get some benefit from sort of reusing some of the intellectual property that informs models that you build, make sure you negotiate with your vendor that for any upgrades to those models, whether they're digital twins or in other forms, there's a canonical version that can come back and be an upgrade path for you as well.
>> Jim Kobielus, Action Item.
>> My Action Item is for businesses to regard your data as a product that you monetize yourself.
Or if you are unable to monetize it yourself, if there is a partner, like a supplier or a customer, who can monetize that data, then negotiate the terms of that monetization in your relationship and be vigilant on that so you get a piece of that stream, even if the bulk of the work is done by your partner. >> Neil Raden, Action Item. >> It's all based on transparency. Your data is your data. No one else can take it without your consent. That doesn't mean that you can't get involved in relationships where there's an agreement to do that. But the problem is most agreements, especially when you look at a business and a consumer, are so onerous that nobody reads them and nobody understands them. So the person providing the data has to have an unequivocal right to sell it to you, and the person buying it has to really understand what the limits are on what they can do with it. >> Ralph Finos, Action Item. You're muted Ralph. But it was brilliant, whatever it was. >> Well it was, and I really can't say much more than that. (Peter laughs) But I think from a practitioner perspective, and I understand from a manufacturing perspective how the value could be there. But as a practitioner, if you're fishing for data out there that someone has that might look like something you can use, chances are it's not. And you need to be real careful about spending money to get data that you're not really clear is going to help you. >> Great. All right, thanks very much team. So here's our Action Item conclusion for today. The whole concept of digital business is predicated on the idea of using data assets in a differential way to better serve your markets and improve your operations. It's your data. Increasingly, that is going to be the basis for differentiation. And any undertaking that allows that data to get out has the potential that someone else can, through their data science and their capabilities, re-engineer much of what you regard as your differentiation. We've had conversations with leading data scientists who say that if someone were to sell customer data into an open marketplace, it would take about four days for a great data scientist to re-engineer almost everything about your customer base. So as a consequence, we have to tread lightly here as we think about what it means to release data into the wild. Ultimately, the challenge there for any business will be: how do I establish the appropriate governance and protections, not just looking at the technology but rather looking at the overall notion of the data assets? If you don't understand how to monetize your data and nonetheless enter into a partnership with somebody else, by definition that partner is going to generate greater value out of your data than you are. There are significant information asymmetries here. So every company must undertake an understanding of how to generate value out of its data. We don't think that there's going to be a general-purpose marketplace for sharing data in a lot of ways.
This is going to be a heavily contracted arrangement, but it doesn't mean that we should not take important steps right now to start doing a better job of instrumenting our products and services, so that we can start collecting data about our products and services, because the path forward is going to demonstrate that we're going to be able to dramatically improve the quality of the goods and services we sell by reducing the asset specificities for our customers, by making them more intelligent and more programmable. Finally, is this going to be a feature of a differentiated business relationship through trust? We're open to that. Personally, I'll speak for myself, I think it will. I think that there is going to be an important element, ultimately, of being able to demonstrate to a customer base, to a marketplace, that you take privacy, data ownership, and intellectual property control of data assets seriously, and that you are very, very specific, very transparent, in how you're going to use those in derivative business transactions. All right. So once again, David Floyer, thank you very much here in the studio. On the phone: Neil Raden, Ralph Finos, Jim Kobielus, and George Gilbert. This has been another Wikibon Action Item. (electronic music)
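The closing advice, instrument your products now and treat any sharing as a contracted arrangement, can be pictured in a few lines of code. This is a minimal sketch under our own assumptions, not something the panel specified: the device, owner, and permitted-use values are hypothetical, and the point is only that ownership terms travel with the data from the moment it is collected.

```python
# Hypothetical sketch: tag product telemetry with ownership and contracted
# uses at the point of collection, so any derivative use can be checked
# against what the data's owner actually agreed to.
import json
import time
from dataclasses import dataclass, asdict, field

@dataclass
class TelemetryEvent:
    device_id: str
    metric: str
    value: float
    owner: str                 # the party who owns this data
    permitted_uses: tuple      # uses agreed to by contract
    collected_at: float = field(default_factory=time.time)

def record(event: TelemetryEvent, sink: list):
    # Persist the reading only with its ownership terms attached.
    sink.append(json.dumps(asdict(event)))

def usable_for(raw_event: str, purpose: str) -> bool:
    # Gatekeeper for derivative use: is this purpose in the agreed terms?
    return purpose in json.loads(raw_event)["permitted_uses"]

sink = []
record(TelemetryEvent("pump-17", "vibration_rms", 0.42,
                      owner="customer-acme",
                      permitted_uses=("product_quality", "support")), sink)
print(usable_for(sink[0], "product_quality"))  # True
print(usable_for(sink[0], "resale"))           # False
```

Any downstream monetization then has a machine-checkable record of the terms, which is the "understood by all the parties" condition Neil describes.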
Wikibon Action Item | March 23rd, 2018
>> Hi, I'm Peter Burris, and welcome to another Wikibon Action Item. (funky electronic music) This was a very interesting week in the tech industry, specifically because IBM's Think Conference aggregated a large number of people. Now, The CUBE was there. Dave Vellante, John Furrier, and myself all participated in somewhere in the vicinity of 60 or 70 interviews with thought leaders in the industry, including a number of very senior IBM executives. The reason why this becomes so important is because IBM made a proposal to the industry about how some of the digital disruption that the market faces is likely to unfold. The normal mindset that people have used is that startups, digital native companies, were going to change the way that everything was going to operate, and the dinosaurs were going to go by the wayside. IBM's interesting proposal is that the dinosaurs actually are going to learn to dance, playing on a book title from a number of years ago. And the specific argument was laid out by Ginni Rometty in her keynote, when she said that there are a number of factors that are especially important here. Factor number one is that increasingly, businesses are going to recognize that the role that their data plays in competition is on the ascent. It's getting more important. Now, this is something that Wikibon's been arguing for quite some time. In fact, we have said that the whole key to digital disruption and digital business is to acknowledge that the difference between business and digital business is the role that data and data assets play in your business. So we have strong agreement there. But on top of that, Ginni Rometty made the observation that 80% of the data that could be accessed and put to work in business has not yet been made available to the new activities, the new processes that are essential to changing the way customers are engaged, businesses operate, and overall change and disruption occurs. So her suggestion is that that 80%, that vast amount of data that could be applied but that's not being tapped, is embedded deep within the incumbents. And so the core argument from IBM is that the incumbent companies, not the digital natives, not the startups, but the incumbent companies are poised to have a significant role in disrupting how markets operate, because of the value of their data that hasn't currently been put to work and made available to new types of work. That was the thesis that we heard this week, and that's what we're going to talk about today. Are the incumbents really going to strike back? So Dave Vellante, let me start with you. You were at Think, you heard the same type of argument. What did you walk away with? >> So when I first heard the term incumbent disruptors, I was very skeptical, and I still am. But I like the concept and I like it a lot. So let me explain why I like it and why I think there are some real challenges. If I'm a large incumbent global 2,000, I'm not going to just roll over because the world is changing and software is eating my world. Rather, what I'm going to do is use my considerable assets to compete, and so that includes my customers, my employees, my ecosystem, the partnerships that I have there, et cetera. The reason why I'm skeptical is because incumbents aren't organized around their data assets. Their data assets are stovepiped, they're all over the place.
And the skills to leverage that data value, monetize that data, understand the contribution that data makes toward monetization, those skills are limited. They're bespoke and they're very narrow. They're within lines of business or divisions. So there's a huge AI gap between the true digital business and an incumbent business. Now, I don't think all is lost. I think a lot of strategies can work, from M&A to transformation projects, joint ventures, spin-offs. Yeah, IBM gave some examples. They put up Verizon and American Airlines. I don't see them yet as the incumbent disruptors. But then there was another example of IBM Maersk doing some very interesting and disrupting things, Royal Bank of Canada doing some pretty interesting things. >> But in a joint venture form, Dave, to your point, they specifically set up a joint venture that would be organized around this data, didn't they? >> Yes, and that's really the point I'm trying to make. All is not lost. There are certain things that you can do, many things that you can do as an incumbent. And it's really game on for the next wave of innovation. >> So we agree as a general principle that data is really important, David Floyer. And that's been our thesis for quite some time. But Ginni Rometty put something out there, that 80% of the data that could be applied to disruption, better customer engagement, better operations, new markets, is not being utilized. What do we think about that? Is that number real? >> If you look at the data inside any organization, there's a lot of structured data. And that has a better ability to move through an organization. Equally, there's a huge amount of unstructured data that goes in emails. It goes in voicemails, it goes in shared documents. It goes in diagrams, PowerPoints, et cetera, that also is data which is very much locked up in the way that Dave Vellante was talking about, locked up in a particular process or in a particular area. So is there a large amount of data that could be used inside an organization? Is it private, is it theirs? Yes, there is. The question is, how do you tap that data? How do you organize around that data to release it? >> So this is kind of a chicken and egg kind of a problem. Neil Raden, I'm going to turn to you. When we think about this chicken and egg problem, the question is, do we organize in anticipation of creating these assets? Do we establish new processes in anticipation of creating these data assets? Or do we create the data assets first and then re-institutionalize the work? And the reason why it's a chicken and egg kind of problem is because it takes an enormous amount of leadership will to affect the way a business works before the asset's in place. But it's unclear that we're going to get the asset that we want unless we effect the reorganization, the institutionalization. Neil, is it going to be the chicken? Is it going to be the egg? Or is this one of the biggest problems that these guys are going to have? >> Well, I'm a little skeptical about this 80% number. I need some convincing before I comment on that. But when David mentioned the PowerPoint slides or email or that sort of thing, I would rather see that information curated by the application itself, rather than dragged out as raw data and reinterpreted in something else. I think that's very dangerous. I think we saw that in data warehousing.
(mumbling) But when you look at building data lakes, you throw all this stuff into a data lake. And then after the fact, somebody has to say, "Well, what does this data mean?" So I find it kind of a problem. >> So Jim Kobielus, a couple weeks ago Microsoft actually introduced a technology or a toolkit that could in fact be applied to move this kind of advanced processing, for dragging value out of a PowerPoint or a Word document or something else, close and proximate to the application. What Neil just suggested I think is a very, very good point. Are we going to see these kinds of new technologies directly embedded within applications to help users narrowly, but businesses more broadly, lift that information out of these applications so it can be freed up for other uses? >> I think yeah, on some level, Peter, this is a topic called dark data. It's been discussed in data management circles for a long time. The vast majority, I think 75 to 80% is the number that I see in the research, is locked up in terms of it's not searchable, it's not easily discoverable. It's not mashupable, I'm making up a word. But the term mashup hasn't been used in years, though I think it's a good one. What it's all about is, if we want to make the most out of our incumbent's data, then we need to give the business, the business people, the tools to find the data where it is, to mash it up into new forms and analytics and so forth, in order to monetize it and sell it, make money off of it. So there are a wide range of data discovery and other tools that support a fairly self-service combination and composition of composite data objects. I don't know, however, that the culture of monetizing existing datasets and pulling dark data into productized forms has taken root in any organization anywhere. I think that's just something that consultants talk about as something that, gee, should be done, but I don't think it's happening in the real world. >> And I think you're probably correct about that, but I still think Neil raised a great point. And I would expect, and I think we all believe, that increasingly this is not going to come as a result of massive changes in adoption of new data science practices everywhere, but an embedding of these technologies. Machine learning algorithms, approaches to finding patterns within application data, in the applications themselves, which is exactly what Neil was saying. So I think that what we're going to see, and I wanted some validation from you guys about this, is increasingly tools being used by application providers to reveal data that's in applications, and not open source, independent tool chains that then ex post facto get applied to all kinds of different data sources in an attempt for the organization to pull the stuff out. David Floyer, what do you think? >> I agree with you. I think there's a great opportunity for the IT industry in this area to put together solutions which can go and fit in. On the basis of existing applications, there's a huge amount of potential, for example, for ERP systems to link in with IOT systems and provide data across an organization. Rather than designing your own IOT system, I think people are going to buy pre-made ones. They're going to put the devices in, the data's going to come in, and the AI work will be done as part of that, as part of implementing that.
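To make the dark data discussion concrete, here is a minimal sketch of surfacing text that is locked up in office documents, in place, without moving it to a lake first. It is not Microsoft's toolkit or any vendor's product; the folder path is hypothetical, and the only fact it relies on is that a .docx file is a zip archive whose body sits in word/document.xml.

```python
import re
import zipfile
from pathlib import Path

def docx_text(path: Path) -> str:
    # Pull the visible text runs out of a .docx using only the stdlib.
    with zipfile.ZipFile(path) as z:
        xml = z.read("word/document.xml").decode("utf-8")
    return " ".join(re.findall(r"<w:t[^>]*>([^<]*)</w:t>", xml))

def build_index(root: str) -> dict:
    # Index documents where they live, rather than copying them out.
    index = {}
    for path in Path(root).rglob("*.docx"):
        index[str(path)] = set(re.findall(r"[a-z]{3,}", docx_text(path).lower()))
    return index

def search(index: dict, term: str) -> list:
    return [doc for doc, terms in index.items() if term.lower() in terms]

index = build_index("/shared/contracts")   # hypothetical shared-drive path
print(search(index, "liability"))
```

An application provider could embed the same kind of extraction directly behind its own UI, which is the embedding pattern being described here.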
And right across the board, there is tremendous opportunity to improve the applications that currently exist, or put in new versions of applications, to address this question of data sharing across an organization. >> Yeah, I think that's going to be a big piece of what happens. And it also says, Neil Raden, something about whether or not enormous machine learning deities in the sky, some of which might start with the letter W, are going to be the best and only way to unlock this data. We're suggesting now that it's something that's going to be increasingly distributed closer to applications, less invasive and disruptive to people, more invasive and disruptive to the applications and the systems that are in place. What do you think, Neil? Is that a better way of thinking about this? >> Yeah, let me give you an example. Data science the way it's been practiced is a mess. You have one person who's trying to find the data, trying to understand the data, complete the selection, design experiments, do runs, and so forth, coming up with formulas and then putting them in the cluster with funny names so they can try to remember which one was which. And now what you have are a number of software companies who've come up with brilliant ways of managing that process, of really helping the data scientist create a work process for curating the data and so forth. So if you want to know something about this particular model, you don't have to go to the person and say, "Why did you do that model? What exactly were you thinking?" That information would be available right there in the workbench. And I think that's a good model for, frankly, everything. >> So let's-- >> Development pipeline toolkits. That's a hot theme. >> Yeah, it's a very hot theme. But Jim, I don't think you think this, but I'm going to test it. I don't think we're going to see AI pipeline toolkits be immediately accessed by your average end user who's putting together a contract, so that that data is automatically munched and ingested by some AI pipeline. This is going to happen in an application. So the person's going to continue to do their work, and then the tooling will or will not grab that information and then combine it with other things through the application itself into the pipeline. We got that right? >> Yeah, but I think this is all being, everything you described is being embedded in applications that are making calls to backend cloud services that have themselves been built by data scientists and exposed through REST APIs. Steve, Peter, everything you're describing is coming to applications fairly rapidly. >> I think that's a good point, but I want to test it. I want to test that. So Ralph Finos, you've been paying a lot of attention during reporting season to what some of the big guys are saying on some of their calls and in some of their public statements. One company in particular, Oracle, has been finessing a transformation, shall we say? What are they saying about how this is going, as we think about their customer base, the transformation of their customer base, and the degree to which applications are or are not playing a role in those transformations? >> Yeah, I think in their last earnings call a couple days ago the point that they were making around the decline and the-- >> Again, this is Oracle. So in Oracle's last earnings call, yeah. >> Yeah, I'm sorry, yeah.
And the decline in the revenue growth rate in the public cloud, the SaaS end of their business, was really a function of a slowdown in the original acquisitions they made to show up as a transformative cloud vendor, acquisitions that are basically beginning to run out of gas. And I think if you're looking at marketing applications and sales-related applications and content types of applications, those are kind of hitting a natural peak in their growth. And I think what they were saying is that from a migration perspective on ERP, that's going to take a while to get done. They were saying something like 10 or 15% of their customer base had just begun doing some sort of migration. And that's the data around ERP and those kinds of applications. So it's a long slog ahead of them, but I'd rather be in their shoes, I think, for the long run than trying to jazz up in the near term some kind of pseudo-SaaS cloud growth based on acquisition and low-hanging fruit. >> Yeah, because they have a public cloud, right? I mean, at least they're in the game. >> Yeah, and they have to show they're in the game. >> Yeah, and specifically they're talking about their applications as clouds themselves. So they're not just saying here's a set of resources that you can build to. They're saying here's a set of SaaS-based applications that you can build around. >> Dave: Right. Go ahead, Ralph, sorry. >> Yeah, yeah. And I think the notion there is the migration to their ERP and their systems-of-record applications, that they're saying this is going to take a long time for people to do that migration because of complexity in process. >> So the last point, Dave Vellante, did you have a point you want to make before I jump into a new thought here? >> I'd just compare and contrast IBM and Oracle. They have public clouds, they have SaaS. Many others don't. I think this is a major point of differentiation. >> Alright, so we've talked about whether or not this notion of data as a source of value is important, and we agree it is. We still don't know whether or not 80% is the right number, but it is some large number that's currently not being utilized and applied to work differently than the data currently is. And that likely creates some significant opportunities for transformation. Do we ultimately think that the incumbents, again, I mention the chicken and the egg problem, do we ultimately think that the incumbents are... Is this going to be a test of whether or not the incumbents are going to be around in 10 years, the degree to which they enact the types of transformation we've talked about? Dave Vellante, you said you were skeptical. You heard the story. We've had the conversation. Will incumbents who do this in fact be in a better position? >> Well, incumbents that do take action absolutely will be in a better position. But I think that's the real question. I personally believe that every industry is going to get disrupted by digital, and I think a lot of companies are not prepared for this and are going to be in deep trouble. >> Alright, so one more thought, because we're talking about industries overall. There are so many elements we haven't gotten to, but there's one absolute thing I want to talk about: specifically, the difference between B2C and B2B companies. Clearly the B2C industries have been disrupted, many of them pretty significantly, over the last few years.
I have multiple not-necessarily-good memories, from not too long ago, of running the aisles of Toys R Us sometime after 10 o'clock at night, right around December 24th. I can't do that anymore, and it's not because my kids are grown. Or I won't be able to do that soon anymore. So B2C industries seem to have moved faster, because the digital natives were able to take advantage of the fact that a lot of these B2C industries did not have direct and strong relationships with those customers. I would posit that a lot of the B2B industries are really where the action's going to take place. And the way I would think about it, and David Floyer, I'll turn to you first, is that in the B2C world, it's new markets and new ways of doing things, which is where the disruption's going to take place. So more of a substitution as opposed to a churn. But in the B2B markets, it's about driving greater efficiencies, greater automation, greater engagement with existing customers, as well as finding new businesses and opportunities. What do you think about that? >> I think the B2B market is much more stable. Relationships, business relationships, are very, very important. They take a long time to change. >> Peter: But much of that isn't digital. >> A lot of that is not digital. I agree with that. However, I think that the underlying change that's happening is one of automation. B2B companies are struggling to put into place automation, with robots, automation everywhere. What you see, for example, in Amazon is a dedication to automation, to making things more efficient. And that's, to me, the biggest challenge: owning up to the fact that they have to change their automation, get themselves far more efficient. And if they don't succeed in doing that, then their ability to survive, or their likelihood of being taken over in a reverse takeover, becomes higher and higher. So how you go about that huge increase in automation that is needed to survive, I think, is the biggest question for B2B players. >> And when we think about automation, David Floyer, we're not talking only about the manufacturing arms. We're talking about a lot of new software automation. Dave Vellante, Jim Kobielus, RPA is kind of a new thing. Dave, we saw some interesting things at Think. Bring us up to speed quickly on what the community at Think was talking about with RPA. >> Well, I tell you. There were a lot of people in financial services, which is IBM's stronghold. And they're using software robots to automate a lot of the backend stuff that humans were doing. That's a major, major use case. I would say 25 to 30% of the financial services organizations that I talked to had active RPA projects ongoing at the moment. I don't know. Jim, what are your thoughts? >> Yeah, I think backend automation is where B2B disruption is happening. The organizations that are able to automate more of their backend, digitize more of their backend functions and accelerate them and improve the throughput of transactions, are those that will clean up. I think for the B2C space, it's the frontend automation of the digitalization of the engagement channels.
But RPA is essentially a key that's unlocking backend automation for everybody, because it allows more of the frontend business analysts, and those who are not traditionally BPM or business process re-engineering professionals, to take standard administrative processes and begin to automate them from, as it were, the outside in. So I think RPA is a secret key for that. I think we'll see some of the more disruptive organizations and businesses take RPA and use it to essentially reverse-engineer, as it were, existing processes, but in an automated fashion, and drive that improvement in the backend with AI. >> I just love the term software robots. I think it so strongly evokes what's going to happen here. >> If I could add, I think there's a huge need to simplify that space. The other thing I witnessed at IBM Think is it's still pretty complicated. It's still a heavy lift. There's a big services component to this, which is probably why IBM loves it. But there's a massive market, I think, to simplify the adoption of RPA. >> I completely agree. We have to open the aperture as well. Again, the goal is not to train people on new things, new data science, new automation stuff, but to provide tools and increasingly embed those tools into stuff that people are already using, so that the disruption and the changes happen more as a consequence of continuing to do what people already do. Alright, so let's hit the action items, guys. It's been a great conversation. Again, we haven't talked about GDPR. We haven't talked about a wide array of different factors that are going to be an issue. I think this is something we're going to talk about again. But on the narrow issue of can the disruptors strike back? Neil Raden, let's start with you. Neil Raden, action item. >> I've been saying since 1975 that I should be hanging around with a better class of people, but I do spend a lot of time in the insurance industry. And I have been getting a consensus that in the next five to 10 years, there will no longer be underwriters for claims adjustments. That business is ready for massive, massive change. >> And those are disruptors, largely. Jim Kobielus, action item. >> Action item. In terms of business disruption, the point is not to imagine that because you were the incumbent in the past era, in some solution category that's declining, that automatically guarantees that your data makes you fit for seizing opportunities in the future. As we've learned from Blockbuster Video, the fact that they had all this customer data didn't give them any defenses against Netflix coming along and cleaning their clock, putting them out of business. So the next generation of disruptors will not have any legacy data to work from, and they'll be able to work miracles because they made a strategic bet on some frontend digital channel that made all the difference. >> Ralph Finos, action item. >> Yeah, I think there's a notion here of siege mentality. And I think the incumbents are inside the castle walls, and the disruptors are outside the castle walls. And sometimes the disruptors, you know, scale the walls. Sometimes they don't. But I think being inside the walls is a tougher place to be in the long run. >> Dave Vellante, action item. >> I want to pick up on something Neil said.
I think it's alluring for some of these industries, like insurance and financial services and healthcare, even parts of government, that really haven't been disrupted in a huge way yet, to say, "Well, I'll wait and I'll see what happens." I think that's a huge mistake. I think you have to start immediately thinking about strategies, particularly around your data, as we talked about earlier. Maybe it's M&A, maybe it's joint ventures, maybe it's spinning out new companies. But the time for waiting is past; you should be acting. >> David Floyer, action item. >> I think that it's easier to focus on something that you can actually do. So my action item is that the focus of most B2B companies should be looking at all of their processes and incrementally automating them, taking out the people cost, taking out other costs, automating those processes as much as possible. That, in my opinion, is the most likely path to being in a position where you can continue to be competitive. Without that focus, it's likely that you're going to be disrupted. >> Alright. So the one thing I'll say about that, David, is when I think you say people cost, I think you mean the administrative cost associated with people. >> And people doing things, automating jobs. >> Alright, so we have been talking here in today's Wikibon Action Item about the question, will the incumbents be able to strike back? The argument we heard at IBM Think this past week, and this is the third week of March, was that data is an asset that can be applied to significantly disrupt industries, and that incumbents have a lot of data that hasn't been brought into play in the disruptive flow. And IBM's argument is that we're going to see a lot of incumbents start putting more of their data assets into play. And that's going to have a significant impact ultimately on industry structure, customer engagement, and the nature of the products and services that are available over the course of the next decade. We agree. We generally agree. We might nitpick about whether it's 80% or whether it's 60%. But in general, the observation is that an enormous amount of data that exists within a large company, data related to how they conduct business, is siloed and locked away, is used once and is not made available, is dark and is not made available for derivative uses. That could, in fact, lead to significant consequential improvements in how a business's transaction costs are ultimately distributed. Automation's going to be a big deal. David Floyer's mentioned this in the past. I'm also of the opinion that there's going to be a lot of new opportunities for revenue enhancement and products. I think that's going to be as big, but it's very clear that to start, it makes an enormous amount of sense to take a look at where your existing transaction costs are, where existing information asymmetries exist, and see what you can do to unlock that data, make it available to other processes, and start to do a better job of automating locally and specifically to those activities. And we generally ask our clients to take a look at: what is your value proposition? What are the outcomes that are necessary for that value proposition? What activities are most important to creating those outcomes? And then find those where, by doing a better job of unlocking new data, you can better automate those activities. In general, our belief is that there's a significant difference between B2C and B2B businesses. Why?
Because a lot of B2C businesses never really had that direct connection, and therefore never really had as much of the market and customer data about what was going on. A lot of point-of-sale data perhaps, but not a lot of other types of data. And then the disruptors stepped in and created direct relationships, gathered that data, and were able to rapidly innovate products and services that served consumers differently. Where a lot of that new opportunity exists is in the B2B world. And here's where the real incumbents are going to start flexing their muscles over the course of the next decade, as they find those opportunities to engage differently, to automate existing practices and activities, change their cost model, and introduce new approaches to operating that are cloud-based, blockchain-based, data-based, based on data, and find new ways to utilize their people. If there's one big caution we have about this, it's this. Ultimately, the tooling is not broadly mature. The people necessary to build a lot of these tools are increasingly moving into the traditional disruptors, the legacy disruptors if you will: AWS, Netflix, Microsoft, companies more along those lines. That talent is still very dear in the industry, and it's going to require an enormous effort to bring in those new types of technologies that can in fact liberate some of this data. We looked at things like RPA, robotic process automation. We look to the big application providers to increasingly imbue their products and services with some of these new technologies. And ultimately, paradoxically perhaps, we look for the incumbent disruptors to find ways to disrupt without disrupting their own employees and customers. So embedding more of these new technologies in an ethical way directly into the systems and applications that serve people, so that the people face minimal changes and have to learn few new tricks, because the systems themselves have gotten much more automated and are able to learn and evolve and adjust much more rapidly, in a way that still corresponds to the way people do work. So our action item. Any company in the B2B space that is waiting for data to emerge as an asset in their business, so that they can then do all the re-institutionalizing of work, the reorganizing of work, and the new types of investment, is not going to be in business in 10 years. Or it's going to have a very tough time with it. The big challenge for the board and the CIO, and it's not successfully been done in the past, at least not too often, is to start the process today, without necessarily having access to the data, of starting to think about how the work's going to change, and to think about the way their organization's going to have to be set up. This is not business process re-engineering. This is organizing around the future value of data, the options that data can create, and employing that approach to start doing local automation, serving customers, and changing the way partnerships work, and ultimately planning out for an extended period of time how their digital business is going to evolve. Once again, I want to thank David Floyer here in the studio with me. Neil Raden, Dave Vellante, Ralph Finos, Jim Kobielus remote. Thanks very much guys. For all of our clients, once again, this has been a Wikibon Action Item. We'll talk to you again. Thanks for watching. (funky electronic music)
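Since RPA figured so prominently in this discussion, a small illustration may help pin down what a "software robot" does. This is a sketch of the idea only, not any vendor's product: the two back-office systems are stand-ins represented as plain dictionaries, where a real RPA tool would drive actual application screens or APIs, and the claim records are invented.

```python
# A minimal sketch of a "software robot" in the RPA sense: it mimics the
# rules a back-office clerk applies, here reconciling claims between an
# ERP system and a payments system, and escalates only the exceptions.
def fetch_claims(system):
    # Stand-in for screen-scraping or an API read of a real application.
    return system

def reconcile(erp_claims: dict, payments: dict) -> list:
    """Apply the clerk's rulebook: flag claims whose payment differs."""
    exceptions = []
    for claim_id, amount in erp_claims.items():
        paid = payments.get(claim_id)
        if paid is None:
            exceptions.append((claim_id, "missing payment"))
        elif abs(paid - amount) > 0.01:
            exceptions.append((claim_id, f"paid {paid}, expected {amount}"))
    return exceptions

erp = {"C100": 250.0, "C101": 75.5, "C102": 40.0}
pay = {"C100": 250.0, "C101": 70.0}
for claim, reason in reconcile(fetch_claims(erp), fetch_claims(pay)):
    print(f"route to human review: {claim} ({reason})")  # robot escalates
```

The matching records flow straight through with no human touch, which is the backend-throughput gain Jim and Dave describe; only exceptions reach a person.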
Wikibon Action Item | De-risking Digital Business | March 2018
>> Hi I'm Peter Burris. Welcome to another Wikibon Action Item. (upbeat music) We're once again broadcasting from theCube's beautiful Palo Alto, California studio. I'm joined here in the studio by George Gilbert and David Floyer. And then remotely, we have Jim Kobielus, David Vellante, Neil Raden and Ralph Finos. Hi guys. >> Hey. >> Hi >> How you all doing? >> This is a great, great group of people to talk about the topic we're going to talk about, guys. We're going to talk about the notion of de-risking digital business. Now, the reason why this becomes interesting is, the Wikibon perspective for quite some time has been that the difference between business and digital business is the role that data assets play in a digital business. Now, think about what that means: every business institutionalizes its work around what it regards as its most important assets. A bottling company, for example, organizes around the bottling plant. A financial services company organizes around the regulatory impacts or limitations on how it shares information and what is regarded as fair use of data and other resources and assets. The same thing exists in a digital business. There's a difference between, say, Sears and Walmart. Walmart makes use of data differently than Sears, and the specific assets that are employed have had a significant impact on how the retail business was structured. Along comes Amazon, which is even deeper in the use of data as a basis for how it conducts its business, and Amazon is institutionalizing work in quite different ways and has been incredibly successful. We could go on and on and on with a number of different examples of this, and we'll get into that. But what it means ultimately is that the tie between data and what is regarded as valuable in the business is becoming increasingly clear, even if it's not perfect. And so traditional approaches to de-risking data, through backup and restore, now need to be re-thought, so that it's not just de-risking the data, it's de-risking the data assets. And since those data assets are so central to the business operations of many of these digital businesses, that means de-risking the whole business. So, David Vellante, give us a starting point. How should folks think about this different approach to envisioning business? And digital business, and the notion of risk? >> Okay thanks Peter, I mean I agree with a lot of what you just said and I want to pick up on that. I see the future of digital business as really built around data, sort of agreeing with you, building on what you just said. Really where organizations are putting data at the core, and increasingly I believe that organizations that have traditionally relied on human expertise as the primary differentiator will be disrupted by companies where data is the fundamental value driver, and I think there are some examples of that and I'm sure we'll talk about it. And in this new world humans have expertise that leverages the organization's data model and creates value from that data with augmented machine intelligence. I'm not crazy about the term artificial intelligence. And you hear a lot about data-driven companies, and I think such companies are going to have a technology foundation that is increasingly described as autonomous, aware, anticipatory, and importantly in the context of today's discussion, self-healing. So able to withstand failures and recover very quickly.
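That "self-healing" property can be illustrated with a toy supervisor loop. The sketch below is our own construction, not anything Dave described: the service, its health probe, and the restore hook are hypothetical stand-ins for what would really be infrastructure-as-code templates and continuously replicated state.

```python
# A minimal sketch of self-healing: a supervisor loop detects a failed
# health probe and restores the service without human intervention.
import time

class Service:
    def __init__(self, name):
        self.name, self.healthy = name, True

    def probe(self) -> bool:
        return self.healthy

    def restore_from_replica(self):
        # In practice: re-provision from infrastructure-as-code templates
        # and rehydrate state from a continuously replicated copy.
        self.healthy = True

def supervise(service: Service, checks: int, interval: float = 0.1):
    for _ in range(checks):
        if not service.probe():
            print(f"{service.name}: failure detected, self-healing")
            service.restore_from_replica()
        time.sleep(interval)

svc = Service("orders-db")
svc.healthy = False          # simulate a failure
supervise(svc, checks=3)     # recovers on the first check
```

The design point is that recovery is a built-in behavior of the fabric rather than an operator-run restore job, which is what separates this from traditional backup and restore.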
So de-risking a digital business is going to require new ways of thinking about data protection and security and privacy. Specifically as it relates to data protection, I think it's going to be a fundamental component of the so-called data-driven company's technology fabric. This can be designed into applications, into data stores, into file systems, into middleware, and into infrastructure, as code. And many technology companies are going to try to attack this problem from a lot of different angles, trying to infuse machine intelligence into the hardware, software and automated processes. And the premise is that many companies will architect their technology foundations, not as a set of remote cloud services that they're calling, but rather as a ubiquitous set of functional capabilities that largely mimic a range of human activities, including storing, backing up, and virtually instantaneous recovery from failure. >> So let me build on that. So what you're kind of saying, if I can summarize, and we'll get into whether or not it's human expertise or some other approach or notion of business, is that increasingly patterns in the data are going to have absolutely consequential impacts on how a business ultimately behaves. We got that right? >> Yeah absolutely. And how you construct that data model, and provide access to the data model, is going to be a fundamental determinant of success. >> Neil Raden, does that mean that people are no longer important? >> Well no, no I wouldn't say that at all. I was talking with the head of a medical school a couple of weeks ago, and he said something that really resonated. He said that there are as many doctors who graduated at the bottom of their class as at the top of their class. And I think that's true of organizations too. You know what, 20 years ago I had the privilege of interviewing Peter Drucker for an hour, and he foresaw this. 20 years ago, he said that people who run companies have traditionally had IT departments that provided operational data, but they needed to start to figure out how to get value from that data, and not only get value from that data but get value from data outside the company, not just internal data. So he kind of saw this big data thing happening 20 years ago. Unfortunately, he had a prejudice for senior executives. You know, he never really thought about any other people in an organization except the highest people. And I think what we're talking about here is really the whole organization. I think that, I have some concerns about the ability of organizations to really implement this without a lot of fumbles. I mean it's fine to talk about the five digital giants, but there are a lot of companies out there where, you know, the bar isn't really that high for them to stay in business. And they just seem to get along. And I think if we're going to de-risk, we really need to help companies understand the whole process of transformation, not just the technology. >> Well, take us through it. What is this process of transformation? That includes the role of technology but is bigger than the role of technology. >> Well, it's like anything else, right. There has to be communication, there has to be some element of control, there has to be a lot of flexibility, and most importantly I think there has to be acceptability, by the people who are going to be affected by it, that it is the right thing to do.
And I would say you start with assumptions, I call it assumption analysis, in other words let's all get together and figure out what our assumptions are, and see if we can't line 'em up. Typically IT is not good at this. So I think it's going to require the help of a lot of practitioners who can guide them. >> So Dave Vellante, reconcile one point that you made. I want to come back to this notion of how we're moving from businesses built on expertise and people to businesses built on expertise resident as patterns in the data, or data models. Why is it that the most valuable companies in the world seem to be the ones that have the most real hardcore data scientists? Isn't that expertise and people? >> Yeah it is, and I think it's worth pointing out. Look, the stock market is volatile, but right now the top five companies, Apple, Amazon, Google, Facebook and Microsoft, in terms of market cap, account for about $3.5 trillion, and there's a big distance between them and the rest, and they've clearly surpassed the big banks and the oil companies. Now again, that could change, but I believe that it's because they are data-driven, so-called data-driven. Does that mean they don't need humans? No, but their human expertise surrounds the data, as opposed to most companies, where human expertise is at the center and the data lives in silos, and I think it's very hard to protect data, and leverage data, that lives in silos. >> Yes, so here's where I'll take exception to that, Dave. And I want to get everybody to build on top of this just very quickly. I think that human expertise has surrounded, in other businesses, the buildings. Or the bottling plant. Or the wealth management. Or the platoon. So I think that the organization of assets has always been the determining factor of how a business behaves, and we institutionalized work, in other words where we put people, based on the business' understanding of assets. Do you disagree with that? Is that, are we wrong in that regard? I think data scientists are an example of reinstitutionalizing work around a very core asset, in this case data. >> Yeah, you're saying that the most valuable asset is shifting from some of those physical assets, the bottling plant et cetera, to data. >> Yeah we are, we are. Absolutely. Alright, David Floyer. >> Neil: I'd like to come in. >> Panelist: I agree with that too. >> Okay, go ahead Neil. >> I'd like to give an example from the news. Cigna's acquisition of Express Scripts for $67 billion. Who the hell is Cigna, right? Connecticut General is just a sleepy life insurance company and INA was a second-tier property and casualty company. They merged a long time ago, they got into health insurance, and suddenly, who's Express Scripts? I mean that's a company that nobody ever even heard of. They're a pharmacy benefit manager, what is that? They're an information management company, period. That's all they do. >> David Floyer, what does this mean from a technology standpoint? >> So I wanted to emphasize one thing that evolution has always taught us. You have to be able to come from where you are. You have to be able to evolve from where you are and take the assets that you have. And the assets that people have are their current systems of record, other things like that. They must be able to evolve into the future to better utilize what those systems are. And the other thing I would like to say-- >> Let me give you an example just to interrupt you, because this is a very important point.
One of the primary reasons why the telecommunications companies, which so many people, analysts included, believed had a fundamental advantage because so much information flows through them, never capitalized on it, is that when you're writing assets off over 30 years, that kind of locks you into an operational mode, doesn't it? >> Exactly. And the other thing I want to emphasize is that the most important thing is the sources of data, not the data itself. So for example, real-time data is very, very important. So what is the source of your real-time data? If you've given that away to Google or your IOT vendor, you have made a fundamental strategic mistake. So understanding the sources of data, making sure that you have access to that data, is going to enable you to build those sorts of processes and data digitization. >> So let's turn that concept into kind of a Geoffrey Moore strategy bromide. At the end of the day you look at your value proposition, and then what activities are central to that value proposition, and what data is thrown off by those activities, and what data's required by those activities. >> Right, both internal-- >> We got that right? >> Yeah. Both internal and external data. What are those sources that you require? Yes, that's exactly right. And then you need to put together a plan which takes you from where you are, and the sources of data you have, and then focuses on how you can use that data to either improve revenue or to reduce costs, or a combination of those two things, as a series of specific exercises. And in particular, using that data to automate in real-time as much as possible. That to me is the fundamental requirement to actually be able to do this and make money from it. If you look at every example, it's all real-time. It's real-time bidding at Google, it's real-time allocation of resources by Uber. That is what people need to focus on. So it's those steps, practical steps, that organizations need to take that I think we should be giving a lot of focus on. >> You mention Uber. David Vellante, we're not just talking, once again, about the Uberization of things, are we? Or is that what we mean here? So, what we'll do is we'll turn the conversation very quickly over to you, George. There are existing today a number of different domains where we're starting to see a new emphasis on how we start pricing some of this risk. Because when we think about de-risking as it relates to data, give us an example of one. >> Well, we were talking earlier, in financial services risk itself is priced, just the way time is priced in terms of what premium you'll pay in terms of interest rates. But there's also something that's softer that's come into much more widely-held consciousness recently, which is reputational risk. Which is different from operational risk. Reputational risk is about, are you a trusted steward for data? Some of that could be personal information, and a use case that's very prominent now with the European GDPR regulation is, you know, if I ask you as a consumer or an individual to erase my data, can you say with extreme confidence that you have? That's just one example. >> Well, I'll give you a specific number on that. We've mentioned it here on Action Item before. I had a conversation with a Chief Privacy Officer a few months ago who told me that they had priced out what the fines to Equifax would have been had the problem occurred after GDPR fines were enacted. It was $160 billion, was the estimate.
There are not a lot of companies on the planet that could deal with a $160 billion liability, just like that. >> Okay, so we have a price now for something that might have been kind of, sort of mushy before. And the notion of trust hasn't really changed over time; what's changed is the technical implementations that support it. And in the old world, with systems of record, we basically collected as much data as we could from our operational applications and put it in the data warehouse and its data mart satellites. And we tried to govern it within that perimeter. But now we know that data basically originates and goes just about anywhere. There's no well-defined perimeter. It's much more porous, far more distributed. You might think of it as a distributed data fabric, and the only way you can be a trusted steward of that is if, across the silos, without trying to centralize all the data that's in them, you can enforce who's allowed to access it and what they're allowed to do, and audit who's done what to what type of data, when and where. And then there's a variety of approaches. Just to pick two: one is discovery-oriented, using machine learning to figure out what's going on with the data estate; Alation is an example. And then there's another, which is where you try to get everyone to plug into what's essentially a new system catalog, one that doesn't live in any one place but acts like the fabric for your distributed data. >> That's an example of one of the properties of coming at this. But when we think, Dave Vellante, coming back to you for a second. When we think about the conversation, there's been a lot of presumption or a lot of bromide. Analysts like to talk about, don't get Uberized. We're not just talking about getting Uberized. We're talking about something a little bit different, aren't we? >> Well yeah, absolutely. I think Uber's going to get Uberized, personally. But I think there's a lot of evidence, I mentioned the big five, but if you look at Spotify, Waze, Airbnb, yes Uber, yes Twitter, Netflix, Bitcoin is an example, 23andMe. These are all examples of companies that, I'll go back to what I said before, are putting data at the core and building human expertise around that core to leverage that expertise. And I think it's easy for some companies to sit back and say, "Well I'm going to wait and see what happens." But to me anyway, there's a big gap between kind of the haves and the have-nots. And I think that gap is around applying machine intelligence to data and applying cloud economics. Zero marginal cost economics and the API economy. An always-on sort of mentality, et cetera, et cetera. And that's what the economy, in my view anyway, is going to look like in the future. >> So let me put out a challenge, Jim, I'm going to come to you in a second, very quickly on some of the things that start looking like data assets. But today, when we talk about data protection, we're talking simply about a whole bunch of applications and a whole bunch of devices just spinning that data off, so we have it at a third site, sometimes in near real-time, and then, if there's a catastrophe, large or small, being able to restore it, often in hours or days. So we're talking about an improvement on RPO and RTO, but when we talk about data assets, and I'm going to come to you in a second with that, David Floyer, but when we talk about data assets, we're talking about not only the data, the bits.
So David, I'm sorry, Jim Kobielus, just really quickly, thirty seconds. Models, what do they look like? What does the new nature of some of these assets look like? >> Well, the new nature of these assets is the machine learning models that are driving so many business processes right now. And so really the core assets there are the data, obviously, from which they are developed and on which they are trained, but also very much the knowledge of the data scientists and engineers who build and tune this stuff. And so really, what you need to do is protect that knowledge and grow that knowledge base of data science professionals in your organization in a way that builds on it. And hopefully you keep the smartest people in house, and they can encode more of their knowledge in automated programs to manage the entire pipeline of development. >> We're not talking about files. We're not even talking about databases, are we, David Floyer? We're talking about something different: algorithms and models. Are today's technologies really set up to do a good job of protecting the full organization of those data assets? >> I would say that they're not even being thought about yet. And going back to what Jim was saying, those data scientists are the only people who understand that, in the same way that, in the year 2000, the COBOL programmers were the only people who understood what was going on inside those applications. And we as an industry have to allow organizations to protect the assets inside their applications, and use AI, if you like, to actually understand what is in those applications and how they are working. And that's an incredibly important piece of de-risking: ensuring that you're not dependent on a few experts who could leave at any moment, in the same way that the COBOL programmers could have left. >> But it's not just the data, and it's not just the metadata, it really is the data structure. >> It is the model, the whole way that this has been put together and the reason why, and the ability to continue to upgrade that and change that over time. So those assets are incredibly important, but at the moment there isn't technology available for you to actually protect those assets. >> So if I combine what you just said with what Neil Raden was talking about: David Vellante's put forward a good vision of what's required, and Neil Raden's made the observation that this is going to be much more than technology. There's a lot of change, not change management at a low level inside IT, but business change, and the technology companies also have to step up and be able to support this. We're seeing this; we're seeing a number of different vendor types start to enter into this space. Certainly storage guys, Dylon Sears, talking about doing a better job of data protection; we're seeing middleware companies, TIBCO and DISCO, talk about doing this differently; we're seeing file systems, Scality, WekaIO, talk about doing this differently; backup and restore companies, Veeam, Veritas. I mean, everybody's looking at this, and they're all coming at it. Just really quickly, David, where's the inside track at this point? >> For me it is so much whitespace as to be unbelievable. >> So nobody has an inside track yet. >> Nobody has an inside track.
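What the panel is describing, the model together with its structure, lineage, and the reasoning behind it as one protectable asset, is easy to picture as a registry record. The sketch below is hypothetical: the field names, the registry, and the example entry are invented for illustration, not any product's schema.

```python
from dataclasses import dataclass, field

@dataclass
class ModelAsset:
    """One protectable unit: the model, its structure, and its context."""
    name: str
    version: str
    algorithm: str        # how it was put together
    features: list        # the engineered predictors it depends on
    data_sources: list    # lineage back to the source datasets
    owners: list          # the human expertise behind it
    rationale: str        # "the reason why" it exists at all
    artifacts: dict = field(default_factory=dict)  # weights, code, configs

registry: dict = {}

def register(asset: ModelAsset) -> None:
    registry[(asset.name, asset.version)] = asset

# Illustrative entry only; none of these names refer to a real system.
register(ModelAsset(
    name="churn-predictor", version="1.3.0",
    algorithm="gradient-boosted trees",
    features=["tenure", "support_tickets_90d", "plan_type"],
    data_sources=["crm.accounts", "billing.invoices"],
    owners=["data-science@example.com"],
    rationale="Replaces rules-based churn flags; see design note DS-42.",
))
```

Protecting an entry like this, rather than just a weights file, is what restoring the relationships, the organization, and the metadata would actually mean.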
Just to start with a few things: it's clear that you should keep data where it is. The cost of moving data around an organization, from inside to out, is crazy. >> So companies that keep data in place, or technologies that keep data in place, are going to have an advantage. >> A much, much greater advantage. Sure, there must be backups somewhere, but you need to keep the working copies of data where they are, because it's the real-time access, usually, that's important. So if it originates in the cloud, keep it in the cloud. If it originates with a data provider, on another cloud, that's where you should keep it. If it originates on your premises, keep it where it originated. >> Unless you need to combine it. But that's a new origination point. >> Then you're taking subsets of that data and combining them at that new point. So that would be my first point. Beyond that, organizations are going to need to put together what George was talking about, this metadata about all the data: how it interconnects, how it's being used, the flow of data through the organization. It's amazing to me that when you go to an IT shop, they cannot define for you how the data flows through that data center or that organization. That's the requirement you have to have, and AI is going to be part of that solution, looking at all of the applications and the data and telling you where it's going and how it's working together.
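David's test, being able to define how data flows through the organization, amounts to keeping a lineage graph of which applications read and write which datasets. A toy sketch follows; the application and dataset names are invented for illustration.

```python
from collections import defaultdict

# Edges: application -> (datasets it reads, datasets it writes).
flows = {
    "order-entry":    ({"web.events"}, {"oltp.orders"}),
    "billing":        ({"oltp.orders"}, {"billing.invoices"}),
    "churn-training": ({"crm.accounts", "billing.invoices"}, {"models.churn"}),
}

def downstream_of(dataset):
    """Everything ultimately derived from a dataset: the question an
    IT shop should be able to answer on demand."""
    produced_by = defaultdict(set)
    for app, (reads, writes) in flows.items():
        for r in reads:
            produced_by[r] |= writes
    seen, frontier = set(), {dataset}
    while frontier:
        nxt = set().union(*(produced_by[d] for d in frontier)) - seen
        seen |= nxt
        frontier = nxt
    return seen

print(downstream_of("oltp.orders"))  # {'billing.invoices', 'models.churn'}
```

With even this much recorded, "what depends on this dataset" becomes a query instead of an archaeology project, and it is the natural substrate for the machine-assisted discovery the panel mentions.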
So the second thing would be that companies that are able to build, or conceive of, networks as data will also have an advantage. And I think I'd add a third one: companies that demonstrate a real understanding of the unbelievable change that's required. You can't just say, oh, Facebook wants this, therefore everybody's going to want it. There's going to be a lot of push marketing that goes on from the technology side. Alright, so let's get to some Action Items. David Vellante, I'll start with you. Action Item. >> Well, the future's going to be one where systems see, they talk, they sense, they recognize, they control, they optimize. It may be tempting to say, you know what, I'm going to wait, I'm going to sit back and wait to figure out how I'm going to close that machine intelligence gap. I think that's a mistake. I think you have to start now, and you have to start with your data model. >> George Gilbert, Action Item. >> I think you have to keep in mind the guardrails related to governance and trust when you're building applications on the new data fabric. And you can take the approach of a platform-oriented one, where you're plugging into an API like Apache Atlas, which Hortonworks is driving, or a discovery-oriented one, as David was talking about, which would be something like Alation, using machine learning. But if, let's say, the use case starts out as IoT edge analytics and cloud inferencing, that data science pipeline itself has to now be part of this fabric, including the output of the design time, meaning the models themselves, so they can be managed. >> Excellent. Jim Kobielus, you've been pretty quiet, but I know you've got a lot to offer. Action Item, Jim. >> I'll be very brief. What you need to do is protect your data science knowledge base. That's the way to de-risk this entire process. And that involves more than just a data catalog. You need a data science expertise registry within your distributed value chain, and you need to manage that as a very human asset that needs to grow. That is your number one asset going forward. >> Ralph Finos, you've also been pretty quiet. Action Item, Ralph. >> Yeah, I think you've got to be careful about what you're trying to get done. It depends on your industry: whether it's finance or the entertainment business, there are different requirements about data in those different environments. And you need to be cautious about that, and you need leadership on the executive, business side of things. The last thing in the world you want to do is depend on data scientists to figure this stuff out. >> And I'll give you the second-to-last answer, or Action Item. Neil Raden, Action Item. >> I think there's been a lot of progress lately in creating tools for data scientists to be more efficient, and they need to be, because the big digital giants are draining them from other companies. So that's very encouraging. But in general, I think becoming a data-driven, digitally transformed company is, for most companies, a big job, and I think they need to do it in piece parts, because if they try to do it all at once they're going to be in trouble. >> Alright, so that's great conversation, guys. Oh, David Floyer, Action Item. David's looking at me saying, ah, what about me? David Floyer, Action Item. >> (laughing) So my Action Item comes from an Irish proverb: if you ask for directions, they will always answer you, "I wouldn't start from here." So the Action Item that I have is, if somebody is coming in saying you have to redo all of your applications and rewrite them from scratch, and start in a completely different direction, that is going to be a 20-year job and you're never going to get it done. So you have to start from what you have, the digital assets that you have, and you have to focus on improving those with additional applications and additional data, using that as the foundation for how you build the business, with a clear long-term view. And if you look at some of the examples that were given earlier, particularly in the insurance industry, that's what they did.
And use that as a basis for then starting to put forward plans for bringing technologies in that are capable of not just supporting the data and protecting the data, but protecting the overall organization of data, in the form of these models, in the form of these relationships, so that the business can, as it creates and throws off these new assets, treat them as the special resource that the business requires. Once that is in place, we'll start seeing businesses more successfully reorganize and reinstitutionalize the work around data, and it won't just be the big technology companies, the ones people call digital natives, that are well down this path. I want to thank George Gilbert and David Floyer, here in the studio with me, and David Vellante, Ralph Finos, Neil Raden, and Jim Kobielus on the phone. Thanks very much, guys. Great conversation. And that's been another Wikibon Action Item. (upbeat music)
2018-01-26 Wikibon Action Item with Peter Burris
>> Hi, I'm Peter Burris. Welcome to Wikibon's Action Item. (light instrumental music) No one can deny that big data and related technologies have had significant impact on how businesses run, especially digital businesses. The evidence is everywhere. Just watch Amazon as it works its way through any number of different markets. It's highly dependent upon what you can get out of big data technologies to do a better job of anticipating customer needs, predicting best actions, making recommendations, et cetera. On the other hand, nobody can deny that the overall concept of big data has had significant issues from the standpoint of everybody being able to get similar types of value. It just hasn't happened. There have been a lot of failures. So today, from our Palo Alto studios, I've asked David Floyer, who's with me here, and Jim Kobielus, Ralph Finos, and George Gilbert, who are on the line, to talk about where we are with big data pipelines from a maturity standpoint, to increase the likelihood that all businesses are capable of getting value out of this. Jim, why don't you take us through it. What's the core issue as we think about the maturing of machine analytics and big data pipelines? >> Yeah, the core issue is the maturation of the machine learning pipeline. How mature is it? The way Wikibon looks at the maturation of the machine learning pipeline, independent of the platforms that are used to implement that pipeline, is through three issues. One, to what extent has it been standardized? Is there a standard conception of the various tasks, phases, functions, and their sequence? Number two, to what extent has this pipeline, at various points or end to end, been automated to enable end-to-end consistency? And number three, to what extent has this pipeline been accelerated, not just through automation but through collaboration and handling things like governance in a repeatable way? Those are the core issues in terms of the ML pipeline. But in the broader sense, the ML pipeline is only one work stream in the broader application development pipeline that includes code development and testing. So really, dev ops is the broader phenomenon here. The ML pipeline is one segment of the dev ops pipeline. >> So we need to start thinking about how we can envision the ML pipeline creating assets that businesses can use in a lot of different ways, those assets specifically being models, machine learning models that can be used in higher-value analytic systems. This pressure has been in place for quite a while. But David Floyer, there's a reason why right now this has become important. Why don't you give us a quick overview of where this goes. Why now? >> Why now? Why now is because automation is in full swing. You've just seen Amazon having the ability now to automate warehouses, and they've just announced the ability to automate stores, brick and mortar stores. You go in, you pick something up, you walk out, and that's all you have to do. No lines at checkout, no people at the checkout, a completely automated store. So that business model, automation of business processes, is, to me, what all this has to lead up to. We have to take the existing automation that we have, which is the systems of record and other automation that we've had for many years, and then we have to take the new capabilities of AI and other areas of automation, apply those to that existing automation, and start on this journey.
It's a 10-year journey or more to automating as many of those business processes as possible. Something like 80% or 90% of them are there and can be automated. It's an exciting future, but what we have to focus on is being able to do it now, and to start doing it now. >> So that requires that we really do take an asset-oriented approach to all of this. At the end of the day, it's impossible to imagine business taking on increasing complexity within the technology infrastructure if it hasn't taken care of business in very core ways, not the least of which is: do we, as a business, have a consistent approach to thinking about how we build these models? So Jim, you've noted that there are kind of three overarching considerations. Help us go into it a little bit. Where are the problems that businesses are facing? Where are they seeing the lack of standardization creating the greatest issues? >> Yeah, well, first of all, the whole notion of a machine learning pipeline has a long vintage. It actually descends from the notion of a data mining pipeline, and the data mining industry, years ago, consolidated on, or had a consensus around, a model called CRISP-DM. I won't bore you with the details there. Taking it forward to an analytical pipeline or a machine learning pipeline, the critical issue we see now is the type of asset that's being built and productionized: a machine learning model, which is a statistical model that is increasingly built on artificial neural networks, you know, to drive things like deep learning. Some of the critical things up front, the preparation of all the data in terms of ingest and transformation and cleansing, that's an old set of problems, well established, and there are a lot of tools on the market that do that really well. That's all critical data preparation prior to the modeling process truly beginning. >> So is that breaking down, Jim? Is that the part that's breaking down? Is that the upfront understanding of the processes, or is it somewhere else in the pipeline process? >> Yeah, it's in the middle, Peter. The modeling itself for machine learning is where, you know, there are a number of things that have to happen for these models to be highly predictive. A, you have to do something called feature engineering, and that's fundamentally looking for the predictors in large data sets that you can build into models, and you can use various forms of it. So feature engineering is a highly manual process that, to some extent, is increasingly being automated, but a lot of the really leading edge technology is in the research institutes of the world. How to automate more of the upfront feature engineering is a huge issue. That feeds into the second core issue, which is that there are 10 zillion ways to skin the statistical model cat, the algorithms, you know, from the older models, like support vector machines, to the newer convolutional artificial neural networks, blah, blah, blah. So a core issue is, okay, you have a feature set through feature engineering; which of the 10 zillion algorithms should you use to actually build the model based on that feature set? There are tools on the market that can accelerate some of the selection and testing of those alternate ways of building out those models, but once again, that traditionally manual process of selecting the algorithms and building the models still needs a lot of care and feeding to really be done right. It's human judgment. You really need high-powered data scientists.
And then three, once you have the models built, training them. Training is critical, with actual data, to determine whether the models actually are predictive, or do face recognition, or whatever it is, with a high degree of accuracy. Training itself is a very complicated pipeline in its own right. It takes a lot of time, it takes a lot of resources, a lot of storage. You've got, you know, your data lake and so forth. The whole issue of standardizing the training of machine learning models is a black art on its own. And I'm just scratching the surface of the issues that are outstanding in terms of actually getting greater automation into a highly manual, highly expert-driven process. Go ahead, David. >> Jim, can I just break in? You've mentioned three things, and they're very much in the AI portion of this discussion. The endpoint has to be something which allows automation of the business process, and fundamentally, it's real-time automation. I think you would agree with that. So the outcome of that model then has to be a piece of code that is going to be part of the overall automation system in the enterprise and has to fit in, and if it's going to be real-time, it's got to be really fast as well. >> In other words, the asset that's created by this pipeline is going to be used in some other set of activities? >> Correct, so it needs to be tested in that set of activities, as part of the normal cycle. So what is the automation? What is the process to get that code into a place where it can actually be useful to the enterprise and save money? >> Yeah, David, it's called dev ops, and really dev ops means a number of different things, including especially a source code control repository. You know, in the broader scheme of things, that repository for your code, for dev ops, for continuous release cycles, needs to be expanded in its scope to include machine learning models, deep learning, whatever it is you're building based on the data. What I'm getting at is a deepening repository of what I call the logic that is driving your applications. It's code, it's Java, C++, or C#, or whatever. It's statistical and predictive models. It's orchestration models you're using for BPM and so forth. It's maybe graph models. It's a deep and thickening layer of logic that needs to be pushed into your downstream applications to drive these levels of automation. >> Peter: So Jim? >> It has to be governed and consolidated. >> So Jim? The bottom line is we need maturity in the pipeline associated with machine learning and big data so that we can increase maturity in how we apply those assets elsewhere in the organization? Have I got that right? >> Right.
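Jim's three pain points, feature engineering, algorithm selection, and training, map onto steps that can at least be framed in code today. Below is a minimal sketch using scikit-learn on synthetic data: univariate feature selection stands in for feature engineering, a grid search over two model families stands in for choosing among the "10 zillion" algorithms, and a held-out split stands in for training and validation. It illustrates the shape of the pipeline, not a recommended configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for data that has already been ingested and cleansed.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=8,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One pipeline covering the steps Jim describes as largely manual today.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_classif)),  # crude feature selection
    ("model", SVC()),                               # placeholder, swapped below
])

# Algorithm selection: search across model families and their knobs.
param_grid = [
    {"select__k": [8, 16], "model": [SVC()], "model__C": [0.1, 1, 10]},
    {"select__k": [8, 16], "model": [RandomForestClassifier(random_state=0)],
     "model__max_depth": [5, None]},
]
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X_train, y_train)  # training and cross-validation, the third step

print(search.best_params_)
print("held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```

Even with this much automated, the judgment calls Jim describes, which features to engineer in the first place, which families to search, and when the score is good enough, remain with the data scientist.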
>> George, what is that going to look like? >> Well, I want to build on what Jim was talking about earlier. My way of looking at the pipeline is actually to break it out into four different ones, and actually, as Jim has pointed out, there are potentially more than four. The first is the design time for the applications, these new modern operational analytic applications, and I'll tie that back to the systems of record. The second is the run-time pipeline for these new operational analytic applications, and those applications really have a separate pipeline for the design time and run time of the machine learning models. The reason I keep them separate is that they are on a separate development, deployment, and administration scaffolding from the operational applications. And the way it works with the systems of record, which of course we're not going to be tearing out for decades: they might call out to one of these new applications, feed in some predictors, or have some calculated, and then they get a prediction or a prescription back for the system of record. I think the parts-- >> So George, what has to happen is we have to be able to ensure that the development activities that actually build the applications the business finds valuable, the processes by which we report some of the outcomes of these things into the business, and the pipelines associated with building these models, which are the artifacts and the assets created by the pipelines, all have to come together. Are we talking about a single machine learning or big data pipeline? George, you mentioned four. Are we going to see pipelines for machine learning and pipelines for deep learning and pipelines for other types of AI? Are we going to see a portfolio of pipelines? What do you guys think? >> I think so, but here's the thing. I think there's going to be a consolidated data lake from which all of these pipelines draw the data that's used for modeling and downstream deployment. But if you look at training of models, you know, deep learning models, which are, as their name indicates, deep and hierarchical, they're used for things like image recognition and so forth. The data there is video and speech and so forth. There are different kinds of algorithms used to build them, and there are different types of training that need to happen for deep learning versus other machine learning models versus whatever else-- >> So Jim, let me stop you because-- >> There are different processes. >> Jim, let me stop you. So I want to get to the meat of this, guys. Tell me what a user needs to do from a design standpoint to inform their choice of pipeline building, and then secondarily, what kind of tools they're going to need. Does it start with the idea that there are different algorithms, different assets being created at the model level? Is it really going to feed that, and that's going to lead to a choice of tools? Is it the application requirements? How mature, how standardized; can we really put in place conventions for doing this now, so it becomes a strategic business capability? >> I think there has to be a recognition that there are different use cases downstream, because these are different types of applications entirely, built from AI in the broadest sense, and they require different data and different algorithms. But you look at the use cases. So in other words, the use cases, like chatbots: that's a use case now for AI, and it's a very different use case from, say, a self-driving vehicle. Those need entirely different pipelines, in every capacity, to be able to build out and deploy and manage those disparate applications.
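George's call-out pattern, a system of record feeding predictors to a run-time analytic application and getting a prediction or prescription back, is essentially a scoring-service interface. A minimal sketch follows; the endpoint URL, payload shape, and field names are all hypothetical, and a production version would add authentication, retries, and a fallback path.

```python
import json
from urllib.request import Request, urlopen

def score_account(features: dict) -> dict:
    """A system of record hands predictors to the run-time ML pipeline
    and gets a prediction or prescription back. The URL is illustrative."""
    req = Request(
        "https://scoring.example.internal/v1/churn/predict",
        data=json.dumps({"features": features}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # Real-time automation implies a tight latency budget on the call.
    with urlopen(req, timeout=0.2) as resp:
        return json.load(resp)

# Hypothetical usage inside order processing:
# result = score_account({"tenure": 14, "tickets_90d": 3})
# if result["churn_risk"] > 0.8:
#     offer_retention_discount()
```

Keeping the interface this narrow is what lets the model's design-time pipeline evolve on its own scaffolding, which is exactly the separation George describes.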
>> Let me make sure I got this, Jim. What you're saying is that the process of creating a machine learning asset, a model, is going to be different at the pipeline level. It's not going to be different at the data level; it's going to be different at the pipeline level. George, does that make sense? Is that right? Do you see it that way, too, as we talk to folks? >> I do see what Jim is saying, in the sense that if you're using sort of operational tooling or guardrails to maintain the fidelity of a model that's being called by an existing system of record, that's very different tooling from what's going to be managing your IoT models, which have to get distributed, and which may have sort of a central canonical version and then an edge-specific instance. In other words, I do think we're going to see different tooling, because we're going to see different types of applications being fed and maintained by these models. >> Organizationally, we might have a common framework or approach, but the different use cases will drive different technology selections, and those pipelines themselves will be regarded as assets that generate machine learning and other types of assets that then get applied inside these automation applications. Have I got that right, guys? >> Yes. >> Yes. A quick example to illustrate exactly what we're referring to here. So IoT: George brought up IoT analytics with AI built into its edge applications. We're going to see a bifurcation between IoT analytic applications where the training of the models is done in a centralized way, because you've got huge amounts of data that's needed to train these very complex models that run in the cloud but drive all these edge nodes and gateways and so forth, and then you're going to have another pipeline for edge-based training of models, for things like autonomous operation, where more of the actual training will happen at the edges, at the perimeter. It'll be different types of training, using different types of data, with different types of time lags and so forth built in. But there will be distinct pipelines that need to be managed in a broader architecture. >> So issues like the ownership of the data, the intellectual property control of the data, the location of the data, the degree to which regulatory compliance is associated with it, how it gets tested, all those types of issues are going to have an impact on the nature of the pipelines that we build here. >> Yes. >> So look, one of the biggest challenges that every IT organization has, in fact every business has, is that if you have this much going on, the slowest part of it slows everything else down. So there's always an impedance mismatch organizationally. Are we going to see data science and application development routines, practices, and conventions forced to come together, because the app development world, which is being asked to go faster and faster, is at some point going to say, I can't wait for these guys to do their sandbox stuff? What do you think, guys? David, I'll look at you first, and Jim, I'll go to you next. >> Sure, I think that the central point of control for this is going to have to be the business case for developing this automation, and therefore, from that, what's required in that system of record. >> Peter: Where the money is. >> Where the money is. What is required to make that automation happen, and therefore, from that, what are you going to pick as your ways of doing it? And I think that at the moment, it seems to me as an outsider, it's much more driven by the data scientists rather than the business line, and eventually the application developers themselves. I think that shift has to happen.
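George's split between a central canonical model and edge-specific instances implies some record of which base version each edge node runs, so the central pipeline knows what it must stay compatible with and when to push an update. A toy sketch of that bookkeeping, with invented node names and a deliberately simple version rule:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelVersion:
    name: str
    version: tuple  # e.g. (2, 1, 0)

# Canonical model, trained centrally on pooled data.
canonical = ModelVersion("defect-detector", (2, 1, 0))

# Edge-specific instances: possibly fine-tuned locally, but pinned to a
# base version so the central pipeline knows what each node depends on.
edge_fleet = {
    "plant-osaka":  {"base": (2, 1, 0), "locally_tuned": True},
    "plant-austin": {"base": (2, 0, 3), "locally_tuned": False},
}

def needs_push(node: str) -> bool:
    """True when a node's base lags the canonical major/minor version."""
    return edge_fleet[node]["base"][:2] < canonical.version[:2]

for node in edge_fleet:
    print(node, "push update:", needs_push(node))
```

In practice this bookkeeping lives inside whatever fleet-management layer is used; the point is only that edge-based training doesn't remove the need for central version governance, it increases it.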
>> Well, yeah, one of our predictions has been that the tools are improving, and that's going to allow for a separation, an increased specialization, of the data science world, and we'll see the difference between people who are really doing data science and people who are doing support work. And I think what we're saying here is that those people who do support work are going to end up moving closer to the application development world. Jim, I think that's basically some research that you've done as well. Have I got that right? Okay, so let me wrap up our Action Item here. David Floyer, do you have a quick observation, a quick Action Item for this segment? >> For this segment? The Action Item to me is putting together the business case for automation, the fundamental reduction of costs and improvement of the business model; that, to me, is what starts this off. How are you going to save money? Where is it most important? Where in your business model is it most important? And what we've done in some very recent research is put out a starting point for this discussion, a business model of a 10 billion dollar company, and we're predicting that it saves 14 billion dollars. >> Let's come to that. The Action Item is basically: start getting serious about this stuff, based on business cases. All right, so let me summarize very quickly. For Jim Kobielus and George Gilbert and Ralph Finos, who seem to have disappeared off our screens, and David Floyer, our Action Item is this. The leaders in the industry, in the digital world, are starting to apply things like machine learning, deep learning, and other AI forms very aggressively to compete, and that's going to force everybody to get better at this. The challenge, of course, is that if you're spending most of your time on the underlying technology, you're not spending it figuring out how to actually deliver the business results. Our expectation is that over the course of the next year, one of the significant things that's going to happen within organizations will be a drive to make machine learning pipelines more standardized, reflecting good data science practices within the business, which will themselves vary with the nature of the business, regulated versus non-regulated businesses, for example; to have those activities be reflected in the tooling choices; to have those tooling choices then be reflected in the types of models you want to build; and to have those machine learning models ultimately reflect the needs of the business case. This is going to be a domain that requires a lot of thought in a lot of IT organizations, with a lot of invention yet to be done. But it's going to, we believe, drive a degree of specialization within the data science world as the tools improve, and a realignment of crucial value-creating activities within the business, so that what is data science becomes data science, and what's more support, what's more related to building and operating these pipelines, becomes more associated with dev ops and application development overall. All right, so for the Wikibon team, Jim Kobielus, Ralph Finos, George Gilbert, and here in the studio with me, David Floyer, this has been Wikibon's Action Item. We look forward to seeing you again. (light instrumental music)
Action Item with Peter Burris
>> Hi, I'm Peter Burris. Welcome to Wikibon's Action Item. On Action Item, every week I assemble the core of the Wikibon research team, here in our theCUBE Palo Alto studios as well as remotely, to discuss a seminal topic that's facing the technology industry, and business overall, as we navigate this complex transition of digital business. Here in the studio with me this week, I have David Floyer. David, welcome. >> Thank you. >> And then remotely, we have George Gilbert, Neil Raden, Jim Kobielus, and Ralph Finos. Guys, thank you very much for joining today. >> Hi, how are you doing? >> Great to be here. >> This week, we're going to discuss something that's a challenge to talk about in a small format, but we're going to do our best, and that is: given that the industry is maneuvering through this significant transformation from a product orientation to a services orientation, what's that going to mean for business models? Now this is not a small question, because there are some very, very big players that the technology industry has been extremely dependent upon to drive forward invention, and innovation, and new ideas, and customers, that are entirely dependent upon this ongoing stream of product revenue. On the other hand, we've got companies like AWS, and others that are much more dependent upon the notion of services revenue, where the delivery of the value is in a continuous service orientation. And we include most of the SaaS players in that as well, like Salesforce, etc. So how are those crucial companies, that have been so central to the development of the technology industry, and still are essential to its future, going to navigate this transition? Similarly, how are the services companies, for those circumstances in which the customer does want a private asset that they can utilize as a basis for performing their core business, going to introduce a product orientation? What's that mix, what's that match going to be? And that's what we're going to talk about today. So David, I've kind of laid it out, but really, where are we in this notion of product to service in some of these business model changes? >> It's early stage, but there are very, very profound changes going on. We can see it from the amount of business the cloud suppliers are providing. You can see that Amazon, Google, IBM, and Microsoft Azure, all of those are putting very large resources into creating services to be provided to the business itself. But equally, we are aware that services themselves need to be on premise as well, so we're seeing the movement to true private cloud, for example, which is going to be provided as a service as well. If we take some examples, like Oracle's Cloud at Customer, they're providing exactly the same service on premise as they provide in the cloud. >> And by service, you mean how the customer utilizes the technologies. >> Correct. >> The asset arrangement may be very different, but the proposition of what the customer gets out of the assets is essentially the same. >> Yes. The previous model was: we provide you with a product, you buy a number of those products, you put them together, you service it, you look after it.
The new model coming in with TPC, true private cloud, with the single throat to choke, is that the vendor will look after the maintenance of everything, putting in new releases, bringing things up to date, and they will have a smaller set of things that they will support, and as a result, it's win-win. It's a win for the customer, because their costs are lower and they can concentrate on differentiated services. >> And they secure and privatize their assets. >> Right. And the vendor wins because they have economies of scale; they can provide it at a much lower cost as well. And even more important to both sides is that the time to value of new releases is much, much quicker, and the time to close security exposures, the time to a whole number of other things, improves with this new model. >> So Jim, when we think about this notion of a services orientation, ultimately it starts to change the relationships between the customer and the vendor. And the consequence of that is, not surprisingly, that a number of different considerations, whether they be metrics or other elements, become more important. Specifically, we start thinking about the experience that the customer has of using something. Walk us through this kind of transition to an experience-oriented approach to conceiving of whether or not the business model's being successful. >> Right. Your customer will now perceive the experience in the context of an entire engagement that is multi-channel, multi-touchpoint, multi-device, multi-application, and so forth, where they're expecting the same experience, the same value, the same repeatable package of goodies, whatever it is they get from you, regardless of the channel through which you're touching them or they're touching you. That channel may be provided through a private, on-premises implementation of your stack, or through a public cloud implementation of your capability, or most likely through all of the above, combined into a hybrid true private cloud. Regardless of the packaging and the delivery of that value, in the context of the engagement the customer expects it to be increasingly self-service, predictable, managed by the solution provider, and guaranteed, with a fast continuous release and update cycle. So fundamentally it's an experience economy, because the customer has many other options to go to, providers that can give them as good or better an experience across the life cycle of things that you're doing for them. So bottom line, the whole notion of TPC really gets to the notion that the experience is the most important thing, the cloud experience, which can be delivered on-prem or delivered in the public environment. And that's really the new world, with multi-cloud as the master matrix of that seamless cross-channel experience. >> We like to think of the notion of a business model as worrying about three fundamental questions. How are you going to create value? How are you going to deliver value? And how are you going to capture value? The creation is how shared it's going to be: is it going to be a network of providers, are you going to have to work with OEMs? The delivery: is it going to be online, is it going to be on-prem? Those types of questions. But this notion of value capture is a key feature, David, of how this is changing. And George, I want to ask you a question. The historical norm is that value capture took place in the form of: I give you a product, you give me cash.
But when we start moving to a services orientation, where the service is perhaps being operated and delivered by the supplier, it introduces softer types of exchange mechanisms, like: how are you going to use my data? Are you going to improve the fidelity of the system by pooling me with a lot of other customers? Am I losing my differentiation? My understanding of customers, is that being appropriated and munged with others to create models? Take us through this soft value capture challenge that a service provider has, and, I guess, actually the real challenge that the customer has as they try to privatize their assets, George. >> So, it's a big question that you're asking, and let me use an example to help make the explanation concrete. So now we're not just selling software, but we might be selling sort of analytic data services. Let's say a vendor like IBM works with Airbus to build data services where the aircraft that Airbus sells to its airline customers provide feedback data that IBM has access to, to improve its models of how the aircraft work, and that data also goes back to Airbus. Now, Airbus can then use that data service to help its customers with prescriptions about how to operate better on certain routes, and how to do maintenance better, not just predictive maintenance, but how to do it more just in time, with fewer huge manuals. The key here is that since it's a data service that's being embedded with the product, multiple vendors can benefit from that data service. And the customer of the traditional software company, in this case Airbus being the customer of IBM, has to negotiate to make sure its IP is protected to some extent, but at the same time, they want IBM to continue working with that data feedback, because it makes the models, the models that Airbus gets access to, richer over time. >> But presumably that has to be factored into the contractual obligations both parties enter into, to make sure that those soft dollars are properly accounted for in the agreements. That's not something that we're seeing a lot of in the industry yet, but the model of how we work closely with our clients and our customers is an important one. And it's likely to change the way that IT thinks about itself as a provider of services. Neil, what kinds of behaviors is IT likely to start exhibiting as it finds itself, if not competing with, at least trying to mimic the classes of behaviors that we're seeing from service providers, inside their own businesses? >> Yeah, well, IT organizations grew organically over the last, I dunno, 50 years or so, and it was actually amazing how similar their habits, processes, and ways of doing things were across industries, and locations, and so forth. But the problem was that everything they had to deal with, whether it was the computers, or the storage, or the networks, and so forth, was all really expensive. So they were always in a process of managing from scarcity. The business wanted more and more from them, and they had lower and lower budgets, because they had to maintain what they had, so it created a lot of tension between IT and organizations, and because of that, whenever a conversation happened between other groups within the business and IT, IT always seemed to have the last word: no, or okay. Whatever the decision was, it was really IT's.
And what I see happening here is, when the IT business becomes less insular, I think a lot of this tension between IT and the rest of the organization will start to dissipate. And that's what I'm hoping will happen, because they started this concept of IT vs the business, but if you went out in an organization and asked 100 people what they did, not one of them would say, "I'm the business," right? They have a function, but IT created this us vs them thing, to protect themselves, and I think that once they're able to utilize external services for hardware, for software, for whatever else they have to do, they become more like a commercial operation, like supply-side, or procurement, or something, and managing those relationships, and getting the services that they're paying for, and I think ultimately that could really help organizations, by breaking down those walls in IT. >> So it used to be that an IT decision to make an investment would have uncertain returns, but certain costs, and there are multiple reasons why those returns would be uncertain, or those benefits would be uncertain. Usually it was because some other function would see the benefits under their umbrella, you know, marketing might see increased productivity, or finance would see increased productivity as a consequence of those investments, but the costs always ended up in IT. And that's one of the reasons why we yet find ourself in this nasty cycle of constantly trying to push costs down, because the benefits always showed up somewhere else, the costs always showed up inside IT. But it does raise this question ultimately of, does this notion of an ongoing services orientation, is it just another way of saying, we're letting a lock in back in the door in a big way? Because we're now moving from a relationship, a sourcing relationship that's procurement oriented, buy it, spend as little money as possible, get value out of it, as opposed to a services orientation, which is effectively, move responsibility for this part of the function off into some other service provider, perpetually. And that's going to have a significant implication, ultimately, on the question of whether or not we buy services, default to services. Ralph, what do you think, where are businesses going to end up on this, are we just going to see everything end up being a set of services, or is there going to be some model that we might use, and I'll ask the team this, some model that we might use to conceive when it should be a purchase, and when it should be a service? What do you think, Ralph? >> Yeah, I think the industry's gravitating towards a service model, and I think it's a function of differentiation. You know, if you're an enterprise, and you're running a hundred different workloads, and 15 of them are things that really don't differentiate you from your competition, or create value that's differentiable in some kind of way, it doesn't make any sense to own that kind of functionality. And I think, in the long run, more and more aspects, or a higher percentage of workload is going to be in that category. There will always be differentiation workloads, there will always be workloads requiring unique kinds of security, especially around transactions. But in the net, the slow march of service makes a lot of sense to me. >> What do you think, guys? Are we going to see, uh, do we agree with Ralph, number one? And number two, what about those exceptions? 
Is there a framework that we can start to utilize to start helping folks imagine what are the exceptions to that rule, what do you think David? >> Sure, I think that there are circumstances when... >> Well first, do we generally agree with the march? >> Absolutely, absolutely. >> I agree too. >> Yes, fully agree that more and more services are going to be purchased, and a smaller percentage of the IT budget from an enterprise will go into specific purchases of assets. But there are some circumstances where you will want to make sure that you have those assets on premise, that there is no other call on those assets, either from the court, or from difference of priority between what you need and what a service provider needs. So in both those circumstances, they may well choose to purchase it, or to have the asset on the premise so that it's clearly theirs, and clearly their priority of when to use it, and how to use it. So yes, clearly, an example might be, for example, if you are a bank, and you need to guarantee that all of that information is yours, because you need to know what assets are owned by who, and if you give it to a service provider, there are circumstances where there could be a legal claim on that service provider, which would mean that you'll essentially go out of business. So there are very clear examples of where that could happen, but in general, I agree. There's one other thing I'd like to add to this conversation. The interesting thing from an IT point of view, an enterprise IT, is that you'll have fewer people to do business with, you'll be buying a package of services. So that means many of the traditional people that you did business with, both software and hardware, will not be your customers anymore, and they will have to change their business models to deal with this. So for example, Permabit has become an OEM supplier of capabilities of data management inside. And Kaminario has just announced that it's becoming a software vendor. >> Nutanix. >> Nutanix is becoming a software vendor, and is either allowing other people to take the single throat to choke, or putting together particular packages where it will be the single throat to choke. >> Even NetAct, which is a pretty consequential business, has been been around for a long time, is moving in this direction. >> Yes, a small movement in that direction, but I think a key question for many of these vendors are, do I become an OEM supplier to the... >> Customer owner. >> The customer owner. Or what's my business model going to be? Should I become the OEM supplier, or should I try and market something directly in some sort of way to the vendors? >> Now this is a very important point, David, because one of the reasons, for a long time, why the OEM model ran into some challenges, is precisely over customer ownership. But when data from operations of the product, or of the service is capable of flowing, not only to the customer engagement originator, but also to the OEM supplier, the supplier has pretty significant, the OEM company has pretty significant visibility, ultimately, into what is going on with their product. And they can use that to continuously improve their product, while at the same time, reducing some of the costs associated with engagement. So the flowing of data, the whole notion of digital business allows a single data about operation to go to multiple parties, and as a consequence, all those parties now have viable business models, if they do it right. >> Yeah, absolutely. 
And Kaminario will be be a case in point. They need metadata about the whole system, as a whole, to help them know how to apply the best patches to their piece of software, and the same is true for other suppliers of software, the Permabit, or whoever those are, and it's the responsibility of that owner or the customer to make sure that all of those people can work in that OEM environment effectively, and improve their product as well. >> Yeah, so great conversation guys. This is a very, very rich and fertile domain, and I think it's one that we're going to come back to, if not directly, at least in talking about how different vendors are doing things, or how customers have to, or IT organizations have to adjust their behaviors to move from a procurement to a strategic sourcing set of relationships, etc. But what I'd like to do now, as we try to do every week, is getting to the Action Item round, and I'm going to ask each of you guys to give me, give our audience, give our users, the action item, what do they do differently on next Monday as a consequence of this conversation? And George Gilbert, I'm going to start with you. George, action item. >> Okay, so mine is really an extension of what we were talking about when I was raising my example, which is your OEM supplier, let's say IBM, or a company we just talked to recently, C3 IoT, is building essentially what are application data services that would accompany your products that you, who used to be a customer, are selling a supply chain master, say. So really trying to boil that down is, there is a model of your product or service could be the digital twin, and as your vendor keeps improving it, and you offer it to your customers, you need to make sure that as the vendor improves it, that there is a version that is backward compatible with what you are using. So there's the IP protection part, but then there's also the compatibility protection part. >> Alright, so George, your action item would be, don't focus narrowly on the dollars being spent, factor those soft dollars as well, both from a value perspective, as well an ongoing operational compatibility perspective. Alright, Jim Kobielus, action item. >> Action item's for IT professionals to take a quick inventory of what of your assets in computing you should be outsourcing to the cloud as services, it's almost everything. And also, to inventory, what of your assets must remain in the form of hard discreet tangible goods or products, and my contention is that, I would argue that the edge, the OT, the operational technology, the IOT, sensors and actuators that are embedded in your machine tools and everything else, that you're running the business on, are the last bastion of products in this new marketplace, where everything else becomes a service. Because the actual physical devices upon which you've built your OT are essentially going to remain hard tangible products forevermore, of necessity, and you'll probably want to own those, because those are the very physical fabric of your operation. >> So Jim, your action item is, start factoring the edge into your consideration of the arrangements of your assets, as you think about product vs services. >> Yes. >> Neil Raden, action item. >> Well, I want to draw a distinction between actually, sorry, between actually, ah damn, sorry. (laughs) >> Jim: I like your fan, Neil. >> Peter: Action item, get your monitor right. >> You know. 
I want to draw the distinction between actually moving to a service, as opposed to just doing something that's a funding operation. Suppose we have 500 Oracle applications in our company running on 35 or 40 Oracle instances, and we have this whole army of Oracle DBAs, and programmers, and instance tuners, and we say, well, we're going to give all the servers to the Salvation Army, and we're going to move everything to the Oracle cloud. We haven't really changed anything in the way the IT organization works. So if we're really looking for change in culture and operation, and everything else, we have to make sure we're thinking about how we're changing the way things get done and managed in the organization. And I think just moving to the cloud is very often just a budgetary thing. >> So your action item would be, as you go through this process, you're going to re-institutionalize the way you work, get ready to do it. Ralph Finos, action item. >> Yeah, I think if you're a vendor, if you're an IT industry vendor, you kind of want to begin to look a lot like, say, a Honda or Toyota, in terms of selling the hardware to get the service, the long-term relationship, and the lock-in. I think that's really where the hardware vendors, as one group of providers, are going to want to go. And as a user, an enterprise, I think you're going to want to drive your vendors in that direction. >> So your action item would be, for a user anyway, move from a procurement orientation that's focused on cost, to a vendor management orientation that's focused on co-development, co-evolution of the value that's being delivered by the service. David Floyer, action item. >> So my action item is for vendors, a whole number of smaller vendors. They have to decide whether they're going to invest in the single most expensive thing that they can do, which is an enterprise sales force, for direct selling of their products to enterprise IT, and/or whether they're going to take an OEM-type model, and provide services to a subset, for example, to focus on the cloud service providers, which Kaminario are doing, or focus on selling indirectly to all of the vendors who are owning the relationship with the enterprise. So that, to me, is a key decision, a very important decision, as the number of vendors will decline over the next five years.
And as a consequence, this significant reformation, from a product to a services orientation, is gripping the industry, and that's going to have significant implications for how both buyers and users of technology, and sellers and providers of technology, are going to behave. We believe that the fundamental question is going to come down to, what process are you going to use to create value, with partnerships, or going it alone? How are you going to deliver that value, through an OEM sales force, through a network of providers? And how are you going to capture value out of that process, through money, through capturing of data, or more of an advertising model? These are not just questions that feature in the consumer world, they're questions that feature significantly in the B2B world as well. Our expectation is that, over the next few years, we'll see a number of changes start to manifest themselves. We expect to see, for example, a greater drive towards experience of the customer as a dominant consideration. And today, it's the cloud experience that's driving many of these changes. Can we get the cloud experience, both in the public cloud and on premises, for example? Secondly, our expectation is that we're going to see a lot of emphasis on how soft exchanges of value take place, and how we privatize those exchanges. Hard dollars are always going to flow back and forth, even if they take on a subscription, as opposed to a purchase, orientation, but what about that data that comes out of the operations? Who owns that, and who gets to lay claim to future revenue streams as a consequence of having that data? Similarly, we expect to see that we will have a new model that IT can use to start focusing its efforts on more of a business orientation, and therefore not treating IT as the managers of hardware assets, but rather managers of business services that have to remain private to the business. And then finally, our expectation is that this march is going to continue. There will be a significant and ongoing drive to increase the role that a services business model plays in how value is delivered, and how value is captured, partly because of the increasingly dominant role that data's playing as an asset in digital business. But we do believe that there are some concrete formulas and frameworks that can be applied to best understand how to arrange those assets, how to institutionalize and work around those assets, and that's a key feature of how we're working with our customers today. Alright, once again, team, thank you very much for this week's Action Item. From theCUBE studios in beautiful Palo Alto, I want to thank David Floyer, George Gilbert, Jim Kobielus, Neil Raden, and Ralph Finos, this has been Action Item.
Action Item | 2018 Predictions Addendum
>> Hi, I'm Peter Burris. Welcome to Action Item. (upbeat electronic music) Every week I bring the Wikibon research team together to talk about some of the issues that are most important in the computing industry, and this week is no different. This week I'm joined by four esteemed Wikibon analysts, David Floyer, Neil Raden, Jim Kobielus, and Ralph Finos, and what we're going to do is talk for a few minutes about some of the predictions that we did not get into our recent predictions webinar. So, I'd like to start off with Jim Kobielus. Jim, one of the things that we didn't get a chance to talk about yesterday in the overall predictions webinar was some of the new AI frameworks that are on the horizon for developers. So, let's take a look at it. What's the prediction? >> The prediction for 2018, Peter, is that the AI community will converge on an open framework. An open framework for developing, training, and deploying deep learning and machine learning applications. In fact, in 2017, we've seen the momentum in this direction, strong momentum. If you were at AWS re:Invent just a few weeks ago, you'll have noticed that on the main stage, they discussed what they're doing in terms of catalyzing an open API for building AI, an open model interchange format, and an open model compilation framework, and they're not the only vendor who's behind this. Microsoft has been working with AWS, as well as independently and with other partners, to catalyze various aspects of this open framework. We also see Intel and Google and IBM and others marching behind a variety of specifications such as Gluon, (mumbles) NNVM, and so forth, so we expect continued progress along these lines in 2018, and we expect that other AI solution providers, as well as users and developers, will increasingly converge on this, basically an abstraction framework that will make it irrelevant whether you build your model in TensorFlow or MXNet or whatever; you'd be able to compile it and run it in anybody else's back end. >> So Jim, one question, then we'll move on to Neil really quickly, but one question that I have is, the relationship between tool choice and role in the organization has always been pretty tight. Roles have changed as a consequence of the availability of tools. Now, we talked about some of the other predictions, how the data scientist role is going to change. As we think about some of these open AI development frameworks, how are they going to accommodate the different people that are going to be responsible for building and creating business value out of AI and data? >> Pete, you hit on another level that I didn't raise in my recent predictions document, but I'll just quickly touch on it. We're also seeing the development of open devops environments within which teams of collaborators, data scientists, subject matter experts, data engineers, and so forth will be able to build and model and train and deploy deep learning and so forth within a standard workflow, where each one of them has task-oriented tools to enable their piece, but they all share a common governance around the models, the data, and so forth. In fact, we published a report several months ago, at Wikibon, talking about devops for data science, and this is a huge research focus for us going forward, and really, for the industry as a whole. It's the productionizing of AI in terms of building and deploying the most critical applications, the most innovative applications now in business. >> Great, Jim, thanks very much for that. So Neil, I want to turn to you now.
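To make Jim's interchange prediction concrete, here is a minimal sketch of the kind of workflow he describes, assuming ONNX as the open model interchange format and onnxruntime as the framework-neutral back end; the discussion itself doesn't name a winner, and the toy model, file name, and tensor shapes here are illustrative rather than anything from the talk.

```python
# A minimal sketch of the framework-interchange workflow Jim describes,
# assuming ONNX as the open format and onnxruntime as the neutral back end.
# The toy model and file name are illustrative.
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort

# Build a toy model in one framework (PyTorch here; MXNet or TensorFlow
# would play the same role via their own ONNX exporters).
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

# Export to the open interchange format.
dummy_input = torch.randn(1, 4)
torch.onnx.export(model, dummy_input, "toy_model.onnx")

# Load and run the same model in a framework-neutral runtime --
# "compile it and run it in anybody else's back end."
session = ort.InferenceSession("toy_model.onnx")
input_name = session.get_inputs()[0].name
output = session.run(None, {input_name: np.random.randn(1, 4).astype(np.float32)})
print(output[0].shape)  # (1, 2): same model, different back end
```

The design point is that the training framework becomes a front-end choice rather than a lock-in decision, which is exactly the abstraction Jim is predicting the community converges on.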
One of the challenges that the big data and the computing industry faces overall is how much longer we are going to be able to utilize the technologies that have taken us through the first 50 years at the hardware level, and there is some promise in some new approaches to thinking about computing. What's your prediction? >> Well, in 2018, you're going to see a demonstration of an actual quantum computer chip that's built on top of existing silicon technology and fabrication. This is a real big deal, because what this group at the University of New South Wales came up with was a way to layer traditional transistors and silicon on top of those wacky quantum bits to control them, and to deal with, I don't want to get too technical about that, but the point is that quantum computing has the promise of moving computing light years ahead of where we are now. We've managed to build lots of great software on things that go on or off, and quantum computing is much more than that. I think what you're going to see in 2018 is a demonstration of actual quantum computing chips built on this, and the big deal in that is that we can take these existing machines and factories and capital equipment designed for silicon, and start to produce quantum chips without basically developing a whole new industry. Now why is this important? It's only the first step, because these things are not going to be based on the existing Intel x86 instruction set, so all new software will have to be developed, software engineers are going to have to learn a whole new way of doing things, but the possibilities are endless. If you think about drug discovery, or curing disease, or dealing with the climate, or new forms of energy to propel us into space, that's where quantum computing is likely to take this. >> Yeah, quantum computing, just to put a kind of fine point on it, allows, at any given time, the machine to be in multiple different states, and it's that fact that allows, in many respects, a problem to be attacked from a large number of directions at the same time, and each of them tested out, so it has a natural affinity with some of the things that we think about in AI, so it's going to have an enormous impact over the course of the next few years, and it's going to be interesting to see how this plays out. So David Floyer, I now want to turn to you. We're not likely to see quantum computing at the edge anytime soon, by virtue of some of the technologies we face. More likely it'll be specialized processors up in the cloud service providers in the near term. But what are you going to talk about when we think about the role that the edge is going to play in the industry, and the impacts it's going to have on, quite frankly, the evolution of de facto standards? >> Well, I'd like to focus on the economics of edge devices. And my prediction is that the economics of consumer-led volume will dominate the design of IoT devices at the edge. If you take an IoT device, it's made up of sensors and advanced analytics and AI, and specifically designed compute elements, and together with the physical setup of fitting it into wherever you're going to put it, that is the overall device that will be put into the edge, and that's where all of the data is going to be generated, and obviously, if you generate data somewhere, the most efficient way of processing that data is actually at the edge itself, so you don't have to transport huge amounts of data.
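To put rough numbers on David's point about not transporting huge amounts of data, here is a back-of-envelope sketch. The 1 TB/day raw rate is an invented, illustrative assumption, and the reduction ratios are the 1,000-to-one to million-to-one figures David cites elsewhere in these discussions; none of these numbers come from this exchange itself.

```python
# A back-of-envelope sketch of edge data reduction. The 1 TB/day raw
# rate is an illustrative assumption; the reduction ratios are the
# 1,000:1 to 1,000,000:1 figures Wikibon has cited for edge processing.
RAW_BYTES_PER_DAY = 1_000_000_000_000  # assume 1 TB/day from a sensor fleet

def bytes_shipped(raw_bytes: int, reduction_ratio: int) -> float:
    """Bytes that still have to leave the edge after local processing."""
    return raw_bytes / reduction_ratio

for ratio in (1, 1_000, 1_000_000):
    shipped = bytes_shipped(RAW_BYTES_PER_DAY, ratio)
    print(f"{ratio:>9,}:1 reduction -> {shipped / 1e9:,.3f} GB/day to transport")
# 1:1 ships the full terabyte; 1,000:1 ships ~1 GB; 1,000,000:1 ships ~1 MB,
# which is why processing close to the sensors changes the economics.
```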
So the prediction is that new vendors with deep knowledge of the technology itself, using all the tools that Jim was talking about, and deep knowledge of the end user environments and the specific solutions that they're going to offer, will come out with much lower cost solutions than traditional vendors. So to put a little bit of color around it, let's take a couple of real-world examples where this is already in place in the consumer world, and will be the basis of solutions in the enterprise. If we take the Apple iPhone X, it has facial recognition built in, and that facial recognition is built into their A11 Bionic chips. They've got GPUs, they've got neural networks, all in the chip itself, and the total cost of that solution is around a hundred dollars in terms of these parts, and that includes the software. So if we take that hundred dollars and put it into what it would actually be priced at, that's around $300. So that's a much, much lower cost than a traditional IT vendor could ever do, and at least an order of magnitude, and probably two orders of magnitude, cheaper than an IT department could produce for its own use. So that leads to the (mumbles) conclusion that there are going to be a lot of new vendors. People like Sony, for example, Hitachi, Fujitsu, Honeywell. Possibly people like Apple and Microsoft. Nvidia, Samsung, and many companies that we predict are going to come out of India, China, and Russia, which have strong mathematical education programs. So the action item for CIOs is to really look carefully at the projects that you are looking at, and determine, do I really have the volume to be unique in this area? If it's a problem which is going to be industry-wide, the advice we would give is, wait for that device to come out from a specialized vendor rather than develop it yourself. And focus investment on areas where you have both the volume of devices and the volume of data that will allow you to be successful. >> All right, David, thank you very much. So let me wrap this week's Action Item, which has been kind of a bridge, but we've looked specifically at some of the predictions that didn't make it into our recent predictions webinar, and if I want to try to summarize, or try to bring all these things together, here's what I think we'd say. Number one, we'd say that the development community has to prepare itself for some pretty significant changes as a consequence of having an application development environment that's more probabilistic, driven by data and driven by AI and related technologies, and we think that there will be new frameworks that are deployed in 2018, and that's just where it's going to start, and they will mature over the next few years, as we heard from Jim Kobielus. We've also heard that there is going to be a new computing architecture that's going to drive change, perhaps for the next 50 years, and the whole concept of quantum computing is very, very real, and it's going to have significant implications. Now it will take some time to roll out, but again, software developers have to think about the implications of some of these new architectures on their work, because not only are they going to have to deal with technology approaches that are driven by data, but they're also going to have to look at entirely new ways of framing problems, because it used to be about something different than it is today.
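On the quantum point in that wrap-up: the sense in which such a machine is "in multiple different states" can be pinned down with standard textbook notation; this is generic quantum computing, not anything specific to the New South Wales chip Neil describes. A single qubit, unlike a bit, holds a superposition

$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

and a register of $n$ qubits occupies a state described by $2^n$ complex amplitudes,

$$|\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x |x\rangle, \qquad \sum_x |c_x|^2 = 1,$$

so a 50-qubit machine would manipulate on the order of $2^{50} \approx 10^{15}$ amplitudes at once, which is the precise sense in which a problem can be attacked from a large number of directions at the same time, and why the software has to be framed in entirely new ways.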
The next thing that we need to think about is that there still are going to be the economics of computing that are going to ultimately shape how all of this plays out. David Floyer talked about that specifically at the edge, where Wikibon believes it's going to have an enormous implication for the true cost of computing and how well some of these complex problems actually find their way into commercial and other domains. So with a background of those three things, we think, ultimately, that's an addendum to the predictions that we have, and once again, I'm Peter Burris. Thank you very much for joining us for Action Item, and we look forward to working with you more closely over the course of the next year, 2018, as we envision the new changes and the practice of how to make those changes a reality. From our Palo Alto theCUBE studios, this has been Action Item. (bright electronic music)
Wikibon Analyst Meeting | Dell EMC Analyst Summit
>> Welcome to another edition of Wikibon's Weekly Research Meeting on theCUBE. (techno music) I'm Peter Burris, and once again I'm joined by, in studio, George Gilbert, David Floyer. On the phone we have Dave Vellante, Stu Miniman, Ralph Finos, and Neil Raden. And this week we're going to be visiting Dell EMC's Analyst Summit. And we thought we'd take some time today to go deeper into the transition that Dell and EMC have been on in the past few years, touching upon some of the value that they've been creating for customers and addressing some of the things that we think they're going to have to do to continue on the path that they're on and continue to deliver value to the marketplace. Now, to look back over the course of the past year, it was about a year ago that the transaction actually closed. And in the ensuing year, there's been a fair amount of change. We've seen some interesting moves by Dell to bring the companies together, a fair amount of conversation about how bigger is better. And at the most recent VMworld, we saw a lot of great news, VMware in particular working more closely with AWS and others. So we've seen some very positive things happen in the course of the past year. But there are still some crucial questions that need to be addressed. And to kick us off, Dave Vellante, where are we one year in and what are we expecting to hear this week? >> Dave: First and foremost, Michael Dell was trying to transform his company. It wasn't happening fast enough. He had to go private. He wanted to be an enterprise player, and amazingly, he and Silver Lake came up with four billion dollars in cash. And they may very well pull off one of the greatest wealth creation trades in the history of the computer industry, because for four billion dollars, they're getting an asset that's worth somewhere north of 50 billion, and they're paying down the debt that they used to lever that acquisition through cash flow. So like I say, for a pittance (laughs) of four billion dollars, they're going to turn that into a lot of dough, tens and tens of billions. If you look at EMC pre the M and A, I'm sorry, if you look at Dell pre M and A, pre-merger, their transformation was largely failing. The company was making a lot of acquisitions but it wasn't able to reshape itself fast enough. If you look at EMC pre-merger, it was a powerhouse, but it was suffering from this decade-long collapse of infrastructure hardware and software pricing, which was very much a drag on growth and cash flow. So the company was forced to find a white knight, which came in the form of Michael Dell. So you had this low gross margin company, Dell's public gross margins before it went private were in the teens. EMC's were roughly 60%. Merge those together and you get a roughly 30% plus gross margin entity. I don't think they're there yet. I think they've got a lot of work to do. So a lot of talk about integration. And there's some familiarity between these two companies because they had a fairly large OEM deal for the better part of a decade in the 90s. But culturally, it's quite different. Dell's a very metrics-driven culture with a lot of financial discipline. EMC's kind of a take the hill, do whatever it takes culture. And they're in the process of bringing those together, and a lot of cuts are taking place. So we want to understand what impacts those will have on customers.
The other point I want to make is that without VMware, in my view anyway, the combination of these companies would not be nearly as interesting. In fact, it would be quite boring. The core businesses of these companies, you know, have faced a lot of challenges. But they do have VMware to leverage. And I think the challenge that customers really need to think about is how does this company continue to innovate now that they can't really do M and A? If you look at EMC, for years, they would spend money on R and D and make incremental improvements to its product lines and then fill the gaps with M and A. And there're many, many examples of that, Isilon, Data Domain, XtremIO, and dozens of others. That kept EMC competitive. So how does Dell continue that strength? It spends about four and a half billion a year on R and D, and according to Wikibon's figures, that's about 6% of revenue. If you compare that with other companies, Oracle, Amazon, they're in the 12% range. Google's mid-teens. Microsoft, obviously, at 12, 13%. Cisco's up there. EMC itself was spending 12% on R and D. So IBM's only about 6%, but remember, with IBM, about two thirds of the company is services. It's not R and D heavy. So Dell has got to cut costs. It's a must. And what implications does that have on the service levels that customers have grown to expect, and what are the implications for Dell's roadmap? I think we would posit that a lot of the cash cows are going to get funded in a way that allows them to have a managed decline in that business. And it's likely that customers are going to see reduced roadmap functions going forward. So a key challenge that I see for Dell EMC is growth. The strength is really VMware, and the leverage of VMware and their own install base I think gives Dell EMC the ability to keep pace with its competitors, because it's got kind of the inside baseball there. It's got a little bit of supply chain leverage, and of course its sales force and its channels are a definite advantage for this company. But it's got a lot of weaknesses and challenges. Complexity of the portfolio, it's got a big debt load that hamstrings its ability to do M and A. I think services is actually a big opportunity for this company. Servicing its large install base. And I think the key threat is cloud and China. I think China, with its low-cost structure, made a deal like this inevitable. So I come back to the point of Michael Dell's got to cut in order to stay competitive. >> Peter: Alright, so one of the, sorry- >> Dave: Next week, we'll hear a lot about sort of innovation strategies, which are going to relate to the edge. Dell EMC has not announced an edge strategy. It needs to. It's behind HPE in that regard, one of its major competitors. And it's got to get into the game. And it's going to be really interesting to see how they are leveraging data to participate in that IoT business. >> Great summary, Dave. So you mentioned that one of the key challenges that virtually every company faces is how do they reposition themselves in a world in which the infrastructure platform, foundation, is going to be more cloud-oriented. Stu Miniman, why don't you take us through, very quickly, where Dell EMC is relative to the cloud? >> Stu: Yeah, great question, Peter. And just to set that up, it's important to talk about one of the key initiatives from Dell and EMC coming together: one of the synergies that Michael Dell has highlighted is really around the move from converged infrastructure to hyper converged infrastructure.
And this is also the foundational layer that Dell EMC uses today for a lot of their cloud solutions. So EMC has done a great job with the first wave of converged infrastructure through partnering with Cisco. They created the Vblock, which is now VxBlock, which is now a multi-billion dollar revenue stream. And Dell did a really good job of jumping on early with the hyper converged infrastructure trend. So I'd written research years ago that, not only through partnerships but through OEM deals, if you looked at most of the solutions that were being sold on the market, the underlying server for them was Dell. And that was even before the EMC acquisition. Once they acquired EMC, they really got kind of control, if you will, of the VMware vSAN business, which is a very significant player. They have an OEM relationship with Nutanix, who's doing quite well in the space, and they put together their own full-stack solution, which takes Dell's hardware, the VMware vSAN, and the go-to-market processes of what used to be VCE, and they put together VxRail, which is doing quite well from a revenue and a growth standpoint. And the reason I set this all up to talk about cloud is that if you look at Dell's positioning, a lot of their cloud starts at that foundational infrastructure level. They have all of these enterprise hybrid clouds and different solutions that they've been offering for a few years. And underneath those, really it is a simplified infrastructure hardware offering. So whether that is the traditional VCE converged infrastructure solutions or the newer hyper converged infrastructure solutions, that's the base level. And then there's software that wraps on top of it. So they've done a decent amount of revenue. The concern I have is, you know, Peter, as you laid out, it's very much a software world. We've been talking a lot at Wikibon about the multi-cloud nature of what's going on. And while Dell and the Dell family have a very strong position in the on-premises market, that's really their center of strength, around hardware and the customer's enterprise data center. And the threat is public cloud and multi-cloud. And if it centers around hardware, and especially when you dig down and say, "okay, I want to sell more servers," which is one of the primary drivers that Michael wants to have with his whole family of solutions, how much can you really live across these various environments? Of course, they have partnerships with Microsoft. There's the VMware partnership with Amazon, which is interesting, and how they even partner with the likes of Google and others can be looked at. But that kind of center of strength is on premises, and therefore they're not really living heavily in the public and multi-cloud world, unless you look at Pivotal. So Pivotal's software, and that's where they're going to say the big push is, but it's these massive shifts of the large install bases of EMC, Dell, and VMware, compared to the public clouds that are doing the land grabs. So this is where it's really interesting to look at. And the announcement that we're interested to look at is how IoT and the edge fit into all of this. So David Floyer and you, Peter, have researched how- >> Peter: Yeah, well, we'll get to that. >> Stu: There's a lot of nuance there. >> We'll get to that in a second, Stu. But one of the things I wanted to mention to David Floyer is that certainly in the case of Dell, they have been a major player in the Intel ecosystem.
And as we think about what's going to happen over the course of the next couple of years, what's going to happen with Intel? It's going to continue to dominate. And what's that going to mean for Dell? >> Sure, Dell's success, I mean, what Stu has been talking about is the importance of volume for Dell, being a volume player. And obviously when they're looking at Intel, the PC is a declining market, and ARM is doing incredibly well in the mobile and other marketplaces. And Dell's success is essentially tied to Intel. So the question to ask is, if Intel starts to lose market share to ARM and maybe even IBM, what is the impact of that on Dell? And in particular, what is the impact on the edge? And so if you look at the edge, we put forward that there are two primary parts. There's the primary data, which is coming from the sensors themselves, from the cameras and other things like that. So there's the primary edge, and there's the secondary edge, which is after that data has been processed. And if you think about the primary edge, AI and DL go to the primary edge, because that's where the data is coming in, and you want the highest fidelity of data. So you want to do the processing as close as possible to that. So you're looking at these examples in autonomous cars. You're seeing it in security cameras, that all of that processing is going to much cheaper chips, very, very close to the data itself. What that means, or could mean, is that most of that IoT could go to other vendors, other than Intel, to the ARM vendors. And if you look at that market, it's going to be very specialized in the particular industry and the particular problem it's trying to solve. So it's likely that non-IT vendors are going to be in that business. And you're likely to be selling to OT and not IT. So all of those are challenges to Dell in attacking the edge. They can win the secondary edge, which is the compressed data, initially compressing it 1,000 to one, probably going to a million to one compression of the data coming from the sensors, to much higher value data but much, much smaller amounts, both on the compute side and on the storage side. So if that bifurcation happens at the edge, the size of the marketplace is going to be very considerably reduced for Intel. And Dell has, in my view, a strategic decision to make of whether they get into being part of that ARM ecosystem for the edge. There's a strong argument that's saying that they would need to do that. >> And they will be announcing something on Monday, I believe, or next week. We're going to hear a lot about that. But when we think, ultimately, about the software that Dell and EMC are going to have to think about, they're very strong in VMware, which is important, and there's no question that virtual machines will remain important, not only from an install base standpoint but, in the future, from how the cloud is organized and arranged and managed. Pivotal also is an interesting play, especially as it does a better job of incorporating more of the open source elements that are becoming very attractive to developers. But George, let me ask you a question, ultimately, about where is Dell in some of these more advanced software worlds? When we think about machine learning, when we think about AI, these are not huge markets right now, but they're leading indicators.
They're going to provide cues about where the industry's going to go and who's going to get a chance to provide the tooling for them. So what's our take right now, where Dell is, Dell EMC is, relative to some of these technologies? >> Okay, so that was a good lead-in for my take on all the great research David Floyer's done, which is that when we go through big advances in hardware, typically the relative price performance changes between CPU, memory, storage, and networking. When we see big relative changes between those, then there's an opportunity for the software to be re-architected significantly. So in this case, what we call unigrid, what David's called unigrid previously, is the ability to build scale-out, extremely high-performance clusters to the point where we don't have to bottleneck on shared storage like a SAN anymore. In other words, we can treat the private memory for each node as if it were storage, direct-attached storage, but it is now so fast in getting between nodes and to the memory in a node that, for all intents and purposes, it can perform as if you had a small shared-storage cluster before. Only now this can scale out to hundreds, perhaps thousands, of nodes. The significance of that is we are in an era of big data and big analytics. And so the issue here is, can Dell sort of work with the most advanced software vendors who are trying to push the envelope to build much larger-scale data management software than they've been able to. Now, Dell has an upward, sort of an uphill, climb with the cloud vendors. They build their own infrastructure hardware. But they've done pools of GPUs, for instance, to accelerate machine learning training. Dell could work with these data management vendors to get pools of this scale-out hardware in the clouds to take advantage of the NoSQL databases, the NewSQL databases. There's an opportunity to leapfrog. What we found out at Oracle, at their user conference this week, was even though they're building similar hardware, their database is not yet ready to take advantage of it. So there is an opportunity for Dell to start making inroads in the cloud where their generic infrastructure wouldn't. Now, one more comment on the edge, I know David was saying, on the sort of edge device, that's looking more and more like it doesn't have to be Intel-compatible. But if you go to the edge gateway, the thing that bridges OT and IT, that's probably going to be their best opportunity on the edge. The challenge, though, is it's not clear how easy it will be in the low-touch sort of go-to-market model that Dell is accustomed to, because like they discovered in the late 90s, it cost $6,000 per year per PC to support. And no one believed that number until Intel did a study on itself and verified it. The protocols from all the sensors on the OT side are so horribly complex and legacy-oriented that even the big auto manufacturers keep track of the different ones on a spreadsheet. So mapping the IT gateway server to all the OT edge devices may turn out to be horribly complex for a few years.
>> Oh, it's not a question of may. It is going to be horribly complex for the next few years. (laughing) I don't think there's any question about that. But look, here's what I want to do. I want to ask one more question. And I'm going to do a round table and ask everybody to give me what the opportunity is and what the threat is. But before I do that, the one thing we haven't discussed, and Dave Vellante, I'm going to throw it over to you, is that in the past, Dell has talked a lot about the advantages of its size and the economies of scale that it gets. And Dell's not in the semiconductor business, or at least not in a big way. And that's one place where you absolutely do get economies of scale. They've got VMware in the system software business, which is an important point. So there may be some economies there. But in manufacturing and assembly, as you said earlier, Dave, that is all under consideration when we think about where the real cost efficiencies are going to be. One of the key places may be in the overall engagement model. The ability to bring a broad portfolio, package it up, and make it available to a customer with the appropriate set of services, and I think this is why you said services is still an opportunity. But what does it mean to get to the Dell EMC overall engagement model as Dell finds, or looks to find, ways to cut costs, to continue to pay down its debt and show a better income statement? >> Dave: So let me take the customer view. I mean, I think you're right. This whole end-to-end narrative that you hear from Dell, for years you heard it from HP, I don't think it really makes that much of a difference. There is some supply chain leverage, no question. So you can get somewhat cheaper components, you can probably get supply, which is very tight right now. So there are definitely some tactical advantages for customers, but I think your point is right on. The real leverage is the engagement model. And the interesting thing from, I think, our standpoint is that you've got a very high-touch EMC direct sales force, and that's got to expand into the channel. Now, EMC's done a pretty good job with the channel over the last, you know, half a decade. Dell doesn't have as good a reputation there. It has many more channel partners, but they're perhaps not as sophisticated. So I think one of the things to watch is the channel transformation and then how Dell EMC brings its services and its packages to the market. I think that's very, very important for customers in terms of reducing a lot of the complexity in the Dell EMC portfolio, which just doubled in complexity. So I think that is something that is going to be a critical indicator. It's an opportunity, and at the same time, if they blow it, it's a big threat to this organization. I think it's one of the most important things, especially, as you pointed out, in the context of cost cutting. If they lose sight of the importance of the customer, they could hit some bumps in the road and open it up for competition to come in and swoop in on some of their business. I don't think they will. I think Michael Dell is very focused on the customer, and EMC's culture has always been that way. So I would bet on them succeeding there, but it's not a trivial task.
David Floyer, really quick, biggest threat that we're looking at next week? >> The biggest threat is the evolution of ARM processes, and if they keep to an Intel-only strategy, that to me is their biggest threat. Those could offer a competition in both mobile, increasing percentages of mobile, and also also in the IOT and other processor areas. >> Alright, George Gilbert, biggest threat? >> Okay, two, summarizing the comments I made before, one, they may not be able to get the cloud vendors to adopt pools of their scale-out infrastructure because the software companies may not be ready to take advantage of it yet. So that's cloud side. >> No, you just get one. Dave Vellante. >> Dave: Interest rates. (laughing) >> Peter: Excellent. Stu Miniman. >> Stu: Software. >> Peter: Okay, come on Stu. Give me an area. >> Stu: Dell's a hardware company! Everything George said, there's no way the cloud guys are going to adopt Dell EMC's infrastructure gear. This is a software play. Dell's been cutting their software assets, and I'm really worried that I'm going to see an edge box, you know, that doesn't have the intelligence that they need to put the intelligence that they say that they're going to put in. >> So, specifically, it's software that's capable of running the edge centers, so to speak. Ralph Finos. >> Ralph: Yeah, I think the hardware race to the bottom. That's a big part of their business, and I think that's a challenge when you're looking at going head on head, with HPE especially. >> Peter: Neil Raden, Neil Raden. >> Neil: Private managed cloud. >> Or what we call true private cloud, which goes back to what Stu said, related to the software and whether or not it ends up being manageable. Okay, threats. David Floyer. >> You mean? >> Or I mean opportunities, strengths. >> Opportunities, yes. The opportunity is being by far the biggest IT place out there, and the opportunity to suck up other customers inside that. So that's a big opportunity to me. They can continue to grow by acquisition. Even companies the size of IBM might be future opportunities. >> George Gilbert. >> On the opposite side of what I said earlier, they really could work with the data management vendors because we really do need scale-out infrastructure. And the cloud vendors so far have not spec'd any or built any. And at the same time, they could- >> Just one, George. (laughing) Stu Miniman. >> Dave: Muted. >> Peter: Dave Vellante. >> Dave: I would say one of the biggest opportunities is 500,000 VMware customers. They've got the server piece, the networking piece kind of, and storage. And combine that with their services prowess, I think it's a huge opportunity for them. >> Peter: Stu, you there? Ralph Finos. >> Stu: Sorry. >> Peter: Okay, there you go. >> Stu: Dave stole mine, but it's not the VMware install base, it's really the Dell EMC install base, and those customers that they can continue moving along that journey. >> Peter: Ralph Finos. >> Ralph: Yeah, highly successful software platform that's going to be great. >> Peter: Neil Raden. >> Neil: Too big to fail. >> Alright, I'm going to give you my bottom lines here, then. So this week we discussed Dell EMC and our expectations for the Analyst Summit and our observations on what Dell has to say. But very quickly, we observed that Dell EMC is a financial play that's likely to make a number of people a lot of money, which by the way has cultural implications because that has to be spread around Dell EMC to the employee base. 
Otherwise some of the challenges associated with cost cutting on the horizon may be something of an issue. So the cultural challenges faced by this merger are not insignificant, even as the financial engineering that's going on seems to be going quite well. Our observation is that the cloud world ultimately is being driven by software and the ability to do software, with the other observation that the traditional hardware plays tied back to Intel will by themselves not be enough to guarantee success in the multitude of different cloud options, or opportunities, that will become available to a wide array of companies. We do believe the true private cloud will remain crucially important, and we expect that Dell EMC will be a major player there. But we are concerned about how Dell is going to evolve as a, or Dell EMC is going to evolve as a player at the edge, and the degree to which they will be able to enhance their strategy by extending relationships to other sources of hardware and components and technology, including, crucially, the technologies associated with analytics. We went through a range of different threats. If we identify two that are especially interesting: one, interest rates. If interest rates go up, making Dell's debt more expensive, that's going to lead to some strategic changes. The second one, software. This is a software play. Dell has to demonstrate that it can, through its 6% of R and D, generate a platform that's capable of fully automating, or increasing the degree to which, Dell EMC technologies can be automated. In many conversations we've had with CIOs, they've been very clear. One of the key criteria for the future choices of suppliers will be the degree to which that supplier fits into their automation strategy. Dell's got a lot of work to do there. On the big opportunities side, the number one from most of us has been VMware and the VMware install base. A huge opportunity that presents a pathway for a lot of customers to get to the cloud that cannot be discounted. The second opportunity that we think is very important that I'll put out there is that Dell EMC still has a lot of customers with a lot of questions about how digital transformation's going to work. And if Dell EMC can establish itself as a thought leader in the relationship between business, digital business, and technology, and bring the right technology set, including software but also packaging of other technologies, to those customers in a true private cloud format, then Dell has the potential to bias the marketplace to their platform even as the marketplace chooses from an increasingly rich set of mainly SaaS and other public cloud options. Thanks very much, and we look forward to speaking with you next week on the Wikibon Weekly Research Meeting here on theCUBE. (techno music)