Wikibon Predictions Webinar with Slides


 

(upbeat music) >> Hi, welcome to this year's Annual Wikibon Predictions. This is our 2018 version. Last year, we had a very successful webinar describing what we thought was going to happen in 2017 and beyond, and we've assembled a team to do the same thing again this year. I'm very excited to be joined by the folks listed here on the screen. My name is Peter Burris. With me is David Floyer; Jim Kobielus is remote. George Gilbert's here in our Palo Alto studio with me. Neil Raden is remote. David Vellante is here in the studio with me. And Stuart Miniman is back in our Marlborough office. So thank you analysts for attending, and we look forward to a great teleconference today. Now what we're going to do over the course of the next 45 minutes or so is hit about 13 of the 22 predictions that we have for the coming year. So if you have additional questions, I want to reinforce this, if you have additional questions or things that don't get answered, if you're a client, give us a call. Reach out to us. We'll leave you with the contact information at the end of the session. But to start things off we just want to make sure that everybody understands where we're coming from, and let you know who Wikibon is. Wikibon is a company that starts with the idea that what's important is research communities. Communities are where the action is. Community is where the change is happening. And community is where the trends are being established. And so we use digital technologies like theCUBE, CrowdChat and others to really ensure that we are surfacing the best ideas that are in a community and making them available to our clients so that they can be more successful in their endeavors. When we do that, our focus has always been on a very simple premise. And that is that we're moving to an era of digital business. For many people, digital business can mean virtually anything. For us it means something very specific.
To us, the difference between business and digital business is data. A digital business uses data to differentially create and keep a customer. So borrowing from what Peter Drucker said, if the goal of business is to create and keep customers, the goal of digital business is to use data to do that. And that's going to inform an enormous number of conversations and an enormous number of decisions and strategies over the next few years. We specifically believe that all businesses are going to have to establish what we regard as the five core digital business capabilities. First, they're going to have to put in place concrete approaches to turning more data into work. It's not enough to just accrete data, to capture data or to move data around. You have to be very purposeful and planful in how you establish the means by which you turn that data into work so that you can create and keep more customers. Secondly, it's absolutely essential that we build the three core technology capabilities here, starting with doing a better job of capturing data, and IoT and people, or the internet of things and people, mobile computing for example, is going to be a crucial feature of that. Then, once you capture that data, you have to turn it into value. And we think this is the essence of what big data, and in many respects AI, is going to be all about. And then once you have the possibility, kind of the potential energy of that data in place, then you have to turn it into kinetic energy and generate work in your business through what we call systems of agency. Now, all of this is made possible by a significant transformation that happens to be conterminous with this transition to digital business. And that is the emergence of the cloud. The technology industry has always been defined by the problems it was able to solve, catalyzed by the characteristics of the technology that made it possible to solve them.
And cloud is crucial to almost all of the new types of problems that we're going to solve. So these are the five digital business capabilities that we're going to talk about, where we're going to have our predictions. Let's start first and foremost with this notion of turning more data into work. So our first prediction relates to how data governance is likely to change on a global basis. If we believe that we need to turn more data into work, well, businesses haven't generally adopted many of the principles associated with those practices. They haven't optimized to do that better. They haven't elevated those concepts within the business as broadly and successfully as they should. We think that's going to change, in part because of the emergence of GDPR, or the General Data Protection Regulation. It goes into full effect in May 2018. A lot has been written about it. A lot has been talked about. But our core issue ultimately is that the dictates associated with GDPR are going to elevate the conversation on a global basis. And it mandates something that's now called the data protection officer. We're going to talk about that in a second, David Vellante. But it is going to have real teeth. So we were talking with one chief privacy officer not too long ago who suggested that had the Equifax breach occurred under the rules of GDPR, the actual fines that would have been levied would have been in excess of 160 billion dollars, which is a little bit more than the zero dollars that have been levied thus far. Now we've seen new bills introduced in Congress, but ultimately our observation and our conversations with a lot of chief privacy officers or data protection officers is that in the B2B world, GDPR is going to strongly influence businesses' behavior regarding data not just in Europe but on a global basis.
Now that has an enormous implication, David Vellante, because it certainly suggests this notion of a data protection officer is something, well, now we've got another potential chief here. How do we think that's going to organize itself over the course of the next few years? >> Well thank you Peter. There are a lot of chiefs (laughs) in the house and sometimes it gets confusing. There's the CIO, there's the CDO and that's either chief digital officer or chief data officer. There's the CSO, could be strategy, sometimes that could be security. There's the CPO, is that privacy or product? As I say, it gets confusing sometimes. On theCUBE we talk to all of these roles, so we wanted to try to add some clarity to that. First thing we want to say is that the CIO, the chief information officer, that role is not going away. A lot of people predict that; we think that's nonsense. They will continue to have a critical role. Digital transformations are the priority in organizations. And so the chief digital officer is evolving from more than just a strategy role to much more of an operational role. Generally speaking, these chiefs tend, in our observation, to report to the chief operating officer, the president or COO. And we see the chief digital officer aligning with the COO and getting incremental responsibility that's more operational in nature. So the prediction really is that the chief digital officer is going to emerge as a charismatic leader amongst these chiefs. And by 2022, nearly 50% of organizations will position the chief digital officer in a more prominent role than the CIO, the CISO, the CDO and the CPO. Those will still be critical roles. The CIO will be an enabler. The chief information security officer has a huge role obviously to play, especially in terms of making security a team sport and not just having it fall on IT's shoulders or the security team's shoulders.
The chief data officer, who really emerged from a records and data management role in many cases, particularly within regulated industries, will still be responsible for that data architecture and data access, working very closely with the emerging chief privacy officer and maybe even the chief data protection officer. Those roles will be pretty closely aligned. So again, these roles remain critical, but the chief digital officer we see as increasing in prominence. >> Great, thank you very much David. So when we think about these two activities, what we're really describing is that over the course of the next few years, we strongly believe data will be regarded more as an asset within business, and we'll see resources devoted to it and we'll certainly see management devoted to it. Now, that leads to the next set of questions: as data becomes an asset, the pressure to acquire data becomes that much more acute. We believe strongly that IoT has an enormous implication longer term as a basis for thinking about how data gets acquired. Now, operational technology has been in place for a long time. We're not limiting ourselves just to operational technology when we talk about this. We're really talking about the full range of devices that are going to provide and extend information and digital services out to consumers, out to the Edge, out to a number of other places. So let's start here. Over the course of the next few years, Edge analytics are going to be an increasingly important feature overall of how technology decisions get made, how technology or digital business gets conceived and even ultimately how business gets defined. Now David Floyer's done a significant amount of work in this domain and we've provided that key finding on the right hand side.
And what it shows is that if you take a look at an Edge based application, a stylized Edge based application, and you presume that all the data moves back to a centralized cloud, you're going to increase your costs dramatically over a three year period. Now that motivates the need ultimately for an approach that brings greater autonomy, greater intelligence down to the Edge itself, and we think that ultimately IoT and Edge analytics become increasingly synonymous. The challenge though is that as we evolve, while this puts pressure on keeping more of the data at the Edge, a lot of the data exhaust can someday come to be regarded as valuable data. And so as a consequence of that, there's still a countervailing pressure to move all data, not at the moment of automation, but for modeling and integration purposes, back to some other location. The thing that's going to determine that is the rate at which the costs of moving the data around go down. And our expectation over the next few years is that some of the big cloud suppliers, Amazon, Google and others, that are building out significant networks to facilitate their business services, may in fact have a greater impact on the common carriers, or as great an impact on the common carriers, as they have on any server or other infrastructure company. So our prediction over the next few years is watch what Amazon and Google do as they try to drive costs down inside their networks, because that will have an impact on how much data moves from the Edge back to the cloud. It won't have an impact necessarily on the need for automation at the Edge, because latency doesn't change, but it will have a cost impact. Now that leads to a second consideration, and the second consideration is ultimately that when we talk about greater autonomy at the Edge, we need to think about how that's going to play out. Jim Kobielus.
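The cost dynamic behind this prediction can be sketched with a quick back-of-envelope model. All of the figures below, the device count, data rates, per-gigabyte transfer price and the 5% edge-retention ratio, are illustrative assumptions, not Wikibon's published model:

```python
# Back-of-envelope comparison: ship all raw edge data to the cloud
# versus filtering/aggregating at the edge first. All figures are
# illustrative assumptions, not Wikibon's model.

def monthly_transfer_cost_usd(devices, gb_per_device_per_day,
                              usd_per_gb, keep_fraction=1.0):
    """Cost of moving keep_fraction of the raw data for one month."""
    gb_moved = devices * gb_per_device_per_day * 30 * keep_fraction
    return gb_moved * usd_per_gb

DEVICES = 1_000          # sensors at one site (assumption)
GB_PER_DAY = 0.5         # raw telemetry per device (assumption)
USD_PER_GB = 0.09        # hypothetical egress/backhaul price

ship_everything = monthly_transfer_cost_usd(DEVICES, GB_PER_DAY, USD_PER_GB)
edge_filtered = monthly_transfer_cost_usd(DEVICES, GB_PER_DAY, USD_PER_GB,
                                          keep_fraction=0.05)  # 5% summaries

print(f"all raw data to cloud: ${ship_everything:,.0f}/month")
print(f"edge-filtered (5%):    ${edge_filtered:,.0f}/month")
```

Even at these modest rates, the all-raw-data option is twenty times the filtered one, which is why falling network prices, not latency, are the variable to watch here.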
>> Jim: Hey thanks a lot Peter. Yeah, so what we're seeing at Wikibon is that more and more application development involves AI, and more and more of the AI involves deployment of those models, deep learning, machine learning and so forth, to the Edges of the internet of things and people. And much of that AI will be operating autonomously with little or no round-tripping back to the cloud. In fact, we're seeing really about a quarter of the AI development projects (static interference with web-conference) targeting Edge deployment. What that involves is that more and more of those AI applications will be bespoke. They'll be one of a kind, or unique, or an unprecedented application, and what that means is that, you know, there's a lot of different deployment scenarios within which organizations will need to use new forms of learning to be able to ready those AI applications to do their jobs effectively, be it making predictions in real time, guiding an autonomous vehicle and so forth. Reinforcement learning is at the core of many of these kinds of projects, especially those that involve robotics. So really, software is eating the world, and you know, the biggest bites are being taken at the Edge. Much of that is AI, much of that autonomous, where there is no need, or less need, for real-time round-tripping, and a need for adaptive components, AI-infused components that can learn by doing. From environmental variables, they can adapt their own algorithms to take the right actions. So, this will have far reaching impacts on application development in 2018. For the developer, the new developer really is a data scientist at heart. They're going to have to tap into a new range of sources of data, especially Edge-sourced data from the sensors on those devices.
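The learn-by-doing loop Jim describes can be sketched with a minimal tabular Q-learning agent. The toy five-state corridor environment, reward and hyperparameters below are generic textbook choices, not any Edge product's algorithm:

```python
import random

# Minimal tabular Q-learning on a toy 5-state corridor: the agent earns a
# reward of 1.0 only for reaching the rightmost state. A generic textbook
# sketch of reinforcement learning's "learning by doing", nothing more.

N_STATES, ACTIONS = 5, (-1, +1)        # move left / move right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1  # learning rate, discount, exploration

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Environment: deterministic move, reward only at the right end."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1   # next state, reward, done

random.seed(0)
for _ in range(200):                          # episodes of acting and adapting
    s, done = 0, False
    while not done:
        a = random.choice(ACTIONS) if random.random() < EPSILON \
            else max(ACTIONS, key=lambda act: Q[(s, act)])
        nxt, r, done = step(s, a)
        best_next = max(Q[(nxt, b)] for b in ACTIONS)
        # Q-learning update: move Q toward the observed reward plus the
        # discounted value of the best next action (the cumulative reward).
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = nxt

policy = [max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)]
print(policy)   # the learned policy should move right in every state
```

No labeled training data is involved; the agent discovers the right policy purely from trial, error and the reward signal, which is what distinguishes this style of learning from supervised model training.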
They're going to need to do model training and testing, especially reinforcement learning, which doesn't involve training data so much as it involves being able to build an algorithm that can learn to maximize what's called a cumulative reward function, and do that training adaptively in real time at the Edge and so forth. So really, much of this will be bespoke in the sense that every Edge device increasingly will have its own set of parameters and its own set of objective functions which will need to be optimized. So that's one of the leading edge forces, trends, in development that we see in the coming year. Back to you Peter. >> Excellent Jim, thank you very much. The next question here is how are you going to create value from data? So we've gone through a couple of trends, and we have multiple others, about what's going to happen at the Edge. But as we think about how we're going to create value from data, Neil Raden. >> Neil: You know, the problem is that data science emerged rapidly out of sort of a perfect storm of big data and cloud computing and so forth. And people who had been involved in quantitative methods, you know, rapidly glommed onto the title because, let's face it, it was very glamorous and paid very well. But there weren't really good best practices. So what we have in data science is a pretty wide field of things that are called data science. My opinion is that the true data scientists are people who are scientists and are involved in developing new or improving algorithms, as opposed to prepping data and applying models. So the whole field really generated very quickly, in really just a few years. To me, I called it generation zero, which is more like data prep and model management all done manually. And it wasn't really sustainable in most organizations, for obvious reasons.
So in generation one, some vendors stepped up with tool kits or benchmarks or whatever for data scientists and made it a little better. And generation two is what we're going to see in 2018: the need for data scientists to no longer prep data, or at least not spend very much time on it, and not to do model management, because the software will not only manage the progression of the models but even recommend them and generate them and select the data and so forth. So it's in for a very big change, and I think what you're going to see is that the ranks of data scientists are going to sort of bifurcate into the old style, let me sit down and write some spaghetti code in R or Java or something, and those that use these advanced tool kits to really get the work done. >> That's great Neil, and of course, when we start talking about getting the work done, we are becoming increasingly dependent upon tools, aren't we George? But the tool marketplace for data science, for big data, has been somewhat fragmented and fractured. And it hasn't necessarily focused on solving the problems of the data scientists, but in many respects on solving the problems that the tools themselves have. What's going to happen in the coming year, when we start thinking about Neil's prescription, as the tools improve? What's going to happen to the tools? >> Okay so, the big thing that we see supporting what Neil was talking about is partly a symptom of a product issue and a go-to-market issue, where the product issue was we had a lot of best of breed products that weren't all designed to fit together. In the broader big data space, that's the same issue that we faced more narrowly with Hadoop, where, you know, we were trying to fit together a bunch of open source packages that had an admin and developer burden.
More broadly, what Neil is talking about is sort of richer end-to-end tools that handle everything from ingest all the way to the operationalization and feedback of the models. But part of what has to go on here is that with these open source tools, the price points and the functional footprints that many of the vendors are supporting right now can't feed an enterprise sales force. Everyone talks with their open source business models about land-and-expand and inside sales. But the problem is, once you want to go to wide deployment in an enterprise, you still need someone negotiating commercial terms at a senior level. You still need the technical people fitting the tools into a broader architecture. And most of the vendors that we have who are open source vendors today don't have either the product breadth or the deal size to support traditional enterprise software. An account team would typically carry a million and a half to two million dollar quota every year. So we see consolidation, the consolidation again driven by the need for simplicity for the admins and the developers, and for business model reasons, to support an enterprise sales force. >> All right, so what we're going to see happen in the course of the coming year is a lot of specialization and recognition of what is data science, what are the practices, how is it going to work, supported by an increasing quality of tools, and a lot of tool vendors are going to be left behind. Now the third kind of notion here, for those core technology capabilities, is that we still have to act based on data. The good news is that big data is starting to show some returns, in part because of some of the things that AI and other technologies are capable of doing. But we have to move beyond just creating the potential; we have to turn that into work, and that's what we mean ultimately by this notion of systems of agency.
The idea is that data driven applications will increasingly act on behalf of a brand, on behalf of a company, and building those systems out is going to be crucial. It's going to require a whole new set of disciplines and expertise. So when we think about what's going to be required, it always starts with this notion of AI. A lot of folks are presuming, however, that AI is going to be relatively easy to build or relatively easy to put together. We have a different opinion, George. What do we think is going to happen as these next few years unfold, related to AI adoption in large enterprises? >> Okay so, let's go back to the lessons we learned from sort of the big data era, the, you know, let's put a data lake in place, which was sort of the top of everyone's agenda for several years. The expectation was it was going to cure cancer, taste like chocolate and cost a dollar. And uh. (laughing) It didn't quite work out that way. Partly because we had a burden on the administrator, again, of so many tools that weren't all designed to fit together, even though they were distributed together. And then on the data scientists, the folks who had to take all this data that wasn't carefully curated yet and turn it into advanced analytics and machine learning models. We have many of the same problems now with tool sets that are becoming more integrated, but at lower levels. This is partly what Neil Raden was just talking about. What we have to recognize is something that we've seen all along, I mean since the beginning of (laughs) corporate computing. We have different levels of abstraction, and you know, at the very bottom, when you're dealing with things like TensorFlow or MXNet, that's not for mainstream enterprises. That's for, you know, the big sophisticated tech companies who are building new algorithms on those frameworks. There's a level above that where you're using something like a Spark cluster and the machine learning built into that.
That's slightly more accessible, but when we talk about mainstream enterprises taking advantage of AI, the low hanging fruit is for them to use the pre-trained models that the public cloud vendors have created with all the consumer data on speech, image recognition, natural language processing. And then some of those capabilities can be further combined into applications like managing a contact center, and we'll see more from the likes of Amazon, like recommendation engines, fulfillment optimization, pricing optimization. >> So our expectation ultimately, George, is that we're going to see a lot of this AI adoption happen through existing applications, because the vendors that are capable of acquiring talent, experimenting and creating value, the software vendors, are going to be where a lot of the talent ends up. So Neil, we have an example of that. Give us an example of what we think is going to happen in 2018 when we start thinking about exploiting AI in applications. >> Neil: I think that it's fairly clearly going to be the application of what's called advanced analytics and data science and even machine learning. It's rapidly becoming commonplace in organizations, not just at the bottom of the triangle here. But I like the example of Salesforce.com. What they've done with Einstein is they've made machine learning and, I guess you can say, AI applications available to their customer base, and why is that a good thing? Because their customer base already has a giant database of clean data that they can use. So you're going to see a huge number of applications being built with Einstein against Salesforce.com data. But there's another thing to consider, and that is that a long time ago Salesforce.com built connectors to a zillion kinds of external data. So, if you're a Salesforce.com customer using Einstein, you're going to be able to use those advanced tools without knowing anything about how to train a machine learning model, and start to build those things.
And I think that they're going to lead the industry in that sense. That's going to push their revenue next year to, I don't know, 11 billion dollars or 12 billion dollars. >> Great, thanks Neil. All right, so when we think about further evidence of this and further impacts, we ultimately have to consider some of the challenges associated with how we're going to create application value continually from these tools. And that leads to the idea that one of the cobbler's children that's going to benefit from AI will in fact be the developer organization. Jim, what's our prediction for how auto-programming impacts development? >> Jim: Thank you very much Peter. Yeah, automation, wow. Auto-programming, like I said, is the epitome of enterprise application development for us going forward. People know it as code generation, but that really understates the capability of auto-programming as it's evolving. In 2018, what we're going to see is the machine learning driven code generation approach coming to the forefront of innovation. We're seeing a lot of activity in the industry in which applications use ML to drive the productivity of developers for all kinds of applications. We're also seeing a fair amount of what's called RPA, robotic process automation. And really, how they differ is that ML will drive code generation from what I call the inside out, meaning creating reams of code that are geared to optimize a particular application scenario. Then there's RPA, which takes the outside-in approach, which is essentially the evolution of screen scraping: it's able to infer the underlying code needed for applications of various sorts from the external artifacts, the screens, and from sort of the flow of interactions and clicks and so forth for a given application. We're going to see that ML and RPA will complement each other in the next generation of auto-programming capabilities.
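Setting the ML aside, the basic "inside-out" mechanics of code generation, emitting source code from a higher-level specification, can be sketched in a few lines. The spec format, field names and generated function below are hypothetical, purely for illustration; real auto-programming tools would infer the spec rather than take it by hand:

```python
# Toy illustration of "inside-out" code generation: emit Python source
# from a small declarative spec, then compile it into a callable.
# Spec and names are hypothetical, not any vendor's API.

FIELD_SPEC = {
    "age":   {"type": int, "min": 0, "max": 130},
    "email": {"type": str, "contains": "@"},
}

def generate_validator_source(spec):
    """Emit the source of a validate(record) function for the spec."""
    lines = ["def validate(record):", "    errors = []"]
    for name, rules in spec.items():
        lines.append(f"    v = record.get({name!r})")
        lines.append(f"    if not isinstance(v, {rules['type'].__name__}):")
        lines.append(f"        errors.append('{name}: wrong type')")
        if "min" in rules:
            lines.append(f"    elif v < {rules['min']}:")
            lines.append(f"        errors.append('{name}: too small')")
        if "max" in rules:
            lines.append(f"    elif v > {rules['max']}:")
            lines.append(f"        errors.append('{name}: too large')")
        if "contains" in rules:
            lines.append(f"    elif {rules['contains']!r} not in v:")
            lines.append(f"        errors.append('{name}: malformed')")
    lines.append("    return errors")
    return "\n".join(lines)

source = generate_validator_source(FIELD_SPEC)
namespace = {}
exec(compile(source, "<generated>", "exec"), namespace)  # materialize it
validate = namespace["validate"]

print(validate({"age": 45, "email": "a@b.com"}))   # []
print(validate({"age": -3, "email": "nope"}))      # two errors
```

The generated "reams of code" stay mechanical and consistent while the human-maintained artifact shrinks to the spec, which is the productivity argument behind both the ML-driven and template-driven flavors.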
And so, you know, really, application development tedium is one of the enemies of productivity (static interference with web-conference). This is a lot of work, very detailed, painstaking work. And what developers need are better, more nuanced and more adaptive auto-programming tools to be able to build the code at the pace that's absolutely necessary for this new environment of cloud computing. So really, AI-related technologies can be applied, and are being applied, to application development productivity challenges of all sorts. AI is fundamental to RPA as well. We're seeing a fair number of the vendors in that space incorporate ML driven OCR and natural language processing and screen scraping and so forth into their core tools, to be able to quickly build up the logic to drive sort of the outside-in automation of fairly complex orchestration scenarios. In 2018, we'll see more of these technologies come together. But you know, they're not a silver bullet. 'Cause fundamentally, organizations that are considering going deeply into auto-programming are going to have to factor AI into their overall plans. They need to get knowledgeable about AI. They're going to need to bring more AI specialists into their core development teams to be able to select from the growing range of tools that are out there, RPA and ML driven auto-programming. Overall, really what we're seeing is that the data scientists, who have been the fundamental developers of AI, are coming into the core of development tools and skills in organizations. And they're going to be fundamental to this whole trend in 2018 and beyond. If AI gets proven out in auto-programming, these developers will then be able to evangelize the core utility of this technology, AI, in a variety of other backend but critically important investments that organizations will be making in 2018 and beyond.
Especially in IT operations and in management, AI is big in that area as well. Back to you there, Peter. >> Yeah, we'll come to that a little bit later in the presentation Jim, that's a crucial point. But the other thing we want to note here, regarding ultimately how folks will create value out of these technologies, is to consider the simple question of okay, how much will developers need to know about infrastructure? And one of the big things we see happening is this notion of serverless. And here we've called it serverless, developer more. Jim, why don't you take us through why we think serverless is going to have a significant impact on the industry, at least certainly from a developer perspective and a developer productivity perspective. >> Jim: Yeah, thanks. Serverless is really having an impact already and has for the last several years now. Now, many in the developer world are familiar with AWS Lambda, which is really the groundbreaking public cloud service that incorporates the serverless capabilities, which essentially is an abstraction layer that enables developers to build stateless code, microservices, that executes in a cloud environment without having to worry about the underlying management of containers and virtual machines and so forth. So in many ways, you know, serverless is a simplification strategy for developers. They don't have to worry about the underlying plumbing. They need to worry about the code, of course, what are called Lambda functions or functional methods and so forth. Now, functional programming has been around for quite a while, but now it's coming to the fore in this new era of serverless environments. What we're predicting for 2018 is that more than 50% of lean microservices in the public cloud will be deployed in serverless environments. There's AWS, and Microsoft has Azure Functions. IBM has their own. Google has their own.
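A Lambda-style function illustrates the stateless model Jim describes. The handler(event, context) signature matches AWS Lambda's Python convention; the thumbnail-request logic and event shape below are invented for illustration, and the sketch runs locally with a stub event rather than on any cloud:

```python
import json

# A stateless function in the AWS Lambda style: the handler takes an
# event and a context, keeps no state between invocations, and returns
# a response. On Lambda the platform supplies event/context and manages
# all underlying infrastructure; here we invoke it locally with a stub.

def handler(event, context):
    """Hypothetical thumbnail-request handler (illustrative logic only)."""
    body = json.loads(event.get("body", "{}"))
    width = int(body.get("width", 128))
    # ... real work (fetch image, resize, enqueue) would happen here ...
    return {
        "statusCode": 200,
        "body": json.dumps({"requested_width": width, "status": "queued"}),
    }

# Local invocation with a stub API-Gateway-style event; context unused.
event = {"body": json.dumps({"width": 256})}
response = handler(event, None)
print(response["statusCode"], response["body"])
```

Because the function holds no state of its own, the platform is free to spin instances up and down per request, which is exactly what frees the developer from container and VM management.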
There's also a variety of serverless cloud code bases for private deployment of serverless environments that we're seeing evolve and begin to deploy in 2018. They all involve functional programming, which really, you know, when coupled with serverless clouds, enables greater scale and speed in terms of development. And it's very agile friendly, in the sense that you can quickly get a functionally programmed serverless microservice up in a hurry without having to manage state and so forth. It's very DevOps friendly. In a very real sense, it's a lot faster than having to build and manage and tune, you know, containers and VMs and so forth. So it can enable a more real time, rapid and iterative development pipeline going forward in cloud computing. And really, fundamentally, what serverless is doing is pushing more of these Lambda functions to the Edge. If you were at AWS re:Invent last week or the week before, you noticed AWS is putting a big push on putting Lambda functions at the Edge, in devices for the IoT, as we're going to see in 2018 across pretty much the entire cloud arena. Everybody will push more of the serverless, functional programming to the Edge devices. It's just a simplification strategy. And that actually is a powerful tool for speeding up some of the development metabolism. >> All right, so Jim, let me jump in here and say that we've now introduced some of these benefits and really highlighted the role that the cloud is going to play. So, let's turn our attention to this question of cloud optimization. And Stu, I'm going to ask you to start us off by talking about what we mean by true private cloud and ultimately our prediction for private cloud. Why don't you take us through what we think is going to happen in this world of true private cloud? >> Stuart: Sure Peter, thanks a lot.
So when we at Wikibon launched the true private cloud terminology, which was two years ago next week, it was in some ways the coming together of a lot of trends similar to things that, you know, George, Neil and Jim have been talking about. So, it is nothing new to say that we needed to simplify the IT stack. We all know, you know, the tried and true discussion of, you know, way too much of the budget is spent kind of keeping the lights on rather than, what we'd like to say, kind of running the business. If you squint through this beautiful chart that we have on here, a big piece of this is operational staffing is where we need to be able to make a significant change. And what we've been really excited about, and what led us to this initial market segment, and what we're continuing to see good growth in, is the move from traditional, really siloed infrastructure to, you want to have, you know, infrastructure that is software based. You want IT to really be able to focus on the application services that they're running. And our focus for 2018 is, of course, the central point: it's the data that matters here. The whole reason we have this infrastructure is to be able to run applications, and one of the things that is a key determiner as to where and what I use is the data, and how can I not only store that data but actually gain value from that data. Something we've talked about time and again, and that is a major determining factor as to am I building this in a public cloud, or am I doing it in, you know, my core, or is it something that is going to live on the Edge. So what we're saying here with the true private cloud is not only are we going to simplify our environment, it's really the operational model that we talked about. So we often say the line, cloud is not a destination, but it's an operational model.
So a true private cloud gives me some of the, you know, feel and management type of capability that I had in the public cloud. It's, as I said, not just virtualization. It's much more than that. But how can I start getting services, and one of the extensions is that true private cloud does not live in isolation. When we have kind of a core public cloud and Edge deployments, I need to think about the operational models: where data lives, what processing happens in each environment, and what data we'll need to move between them, and of course there are fundamental laws of physics that we need to consider in that. So, the prediction of course is that we know how much gear and focus has been on the traditional data center, and true private cloud helps that transformation to modernization, and the big focus is that many of these applications we've been talking about and uses of data sets are starting to come into these true private cloud environments. So, you know, we've had discussions. There's Spark, there's modern databases. For many of these, there are going to be many reasons why they might live in the private cloud environment. And therefore that's something where we're going to see tremendous growth and a lot of focus. And we're seeing a new wave of companies that are focusing on this to deliver solutions that will do more than just a step-function improvement for infrastructure or getting us out of our silos, but really help us deliver on those cloud native applications where we pull in things like what Jim was talking about with serverless and the like. >> All right, so Stu, what that suggests ultimately is that data is going to dictate that everything's not going to end up in centralized public clouds, because of latency, cost, data governance and IP protection reasons. And there will be some others. At bare minimum, that means that most large enterprises are going to have at least a couple of clouds.
Talk to us about what this impact of multi cloud is going to look like over the course of the next few years. >> Stuart: Yeah, critical point there Peter. Because, right, unfortunately, we don't have one solution. There's nobody that we run into that says, oh, you know, I just do a single, you know, one environment. It would be great if we only had one application to worry about. But as you've shown in this lovely diagram here, we all use lots of SaaS, and increasingly, you know, Oracle, Microsoft, Salesforce, you know, are all pushing everybody to multiple SaaS environments, and that has major impacts on my security and where my data lives. Public cloud, no doubt, is growing by leaps and bounds. And many customers are choosing applications to live in different places. So just as in data centers, I would kind of look at it from an application standpoint and build up what I need. Often, there's, you know, Amazon doing phenomenally, but, you know, maybe there's things that I'm doing with Azure, maybe there's things that I'm doing with Google or others, as well as my service providers, for locality, for, you know, specialized services; there are reasons why people are doing it. And what customers would love is an operational model that can actually span between those. So we are very early in trying to attack this multi cloud environment. There's everything from licensing to security to, you know, just operationally how do I manage those. And a piece of this that we're touching on in this year's prediction is that Kubernetes actually can be a key enabler for that cloud native environment. As Jim talked about with serverless, what we'd really like is for our developer to be able to focus on building their application and not think as much about the underlying infrastructure, whether that be, you know, rack servers that I built myself or public cloud infrastructure. So we really want to think more at the data and application level.
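[Editor's note: Stu's point that Kubernetes lets teams describe applications rather than infrastructure can be sketched as follows. This is an illustrative sketch, not production tooling: it builds a minimal Kubernetes Deployment manifest as plain data, the same declarative object a conformant cluster in any cloud would accept. The application name and image registry path are made up for the example.]

```python
import json

def deployment_manifest(name, image, replicas=3):
    """Build a minimal Kubernetes apps/v1 Deployment as plain data.

    The same manifest can be handed to any conformant cluster, on-prem,
    AWS, Azure or Google, which is what makes Kubernetes a plausible
    foundation for a multi cloud operational model.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

# Hypothetical app and registry, for illustration only.
manifest = deployment_manifest("orders-api", "example.registry/orders-api:1.4")
print(json.dumps(manifest, indent=2))
```

The developer specifies the desired state (image, replica count) and never says which servers, or even which cloud, will run it; that is the "data and application level" abstraction Stu describes.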
It's SaaS and PaaS as the model, and Kubernetes holds the promise to solve a piece of this puzzle. Now Kubernetes is by no means a silver bullet for everything that we need. But it absolutely is doing very well. Our team was at the Linux Foundation's CNCF show, KubeCon, last week, and there is, you know, broad adoption from over 40 of the leading providers, including Amazon, which is now a member. Even Salesforce signed up to the CNCF. So Kubernetes is allowing me to be able to manage multi cloud workflows, and therefore the prediction we have here, Peter, is that 50% of development teams will be building and sustaining multi cloud environments with Kubernetes as a foundational component. >> That's excellent Stu. But when we think about it, the hardware technologies, especially because of the opportunities associated with true private cloud, are also going to evolve. There will be enough money here to sustain that investment. David Floyer, we do see another architecture on the horizon where, for certain classes of workloads, we will be able to collapse and replicate many of these things in an economical, practical way on premise. We call that UniGrid, and NVMe over fabrics is a crucial feature of UniGrid. >> Absolutely. So, NVMe over fabrics, or NVMe-oF, takes NVMe, which is out there as storage, and turns it into a system framework. It's a major change in system architecture. We call this UniGrid. And it's going to be a focus of our research in 2018. Vendors are already out there. This is the fastest movement from early standards into products themselves. You can see on the chart that IBM has come out with NVMe over fabrics with its 900 storage connected to Power9 systems. NetApp has the EF570. A lot of other companies are there. Mellanox is out there with high speed networks. Excelero has a major part of the storage software. And it's going to be used in particular with things like AI.
So what are the drivers and benefits of this architecture? The key is that data is the bottleneck for applications. We've talked about data. The amount of data is key to making applications more effective and higher value. So NVMe and NVMe over fabrics allow data to be accessed in microseconds as opposed to milliseconds, and they allow gigabytes of data per second as opposed to megabytes of data per second. And they also allow thousands of processes to access all of the data at very, very low latencies. And that gives us amazing parallelism. So what this is about is disaggregation of storage, network and processing. There are some huge benefits from that. Not least of which is you get back about 50% of the processor, because you don't have to do storage and networking on it. And you save from stranded storage, and you save from stranded processor and networking capabilities. So overall, it's going to be cheaper. But more importantly, it makes a basis for delivering systems of intelligence. And systems of intelligence are bringing together systems of record, the traditional systems, not rewriting them but attaching them to real time analytics, real time AI, and being able to blend those two systems together, because you've got all of that additional data you can bring to bear on a particular problem. So systems themselves have pretty well reached the limit of human management. So, one of the great benefits of UniGrid is to have a single metadata layer for all of that data, all of those processes. >> Peter: All those infrastructure elements. >> All those infrastructure elements. >> Peter: And applications. >> And applications themselves. So what that leads to is a huge potential to improve automation of the data center and the application of AI to operations, operational AI. >> So George, it sounds like it's going to be one of the key potential areas where we'll see AI be practically adopted within business.
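[Editor's note: David's microseconds-versus-milliseconds contrast is easy to put rough numbers on. The figures below are illustrative round numbers, not measured results; they show back-of-envelope arithmetic on how access latency bounds dependent reads and how bandwidth bounds a full scan.]

```python
def serial_reads_per_second(latency_seconds):
    """How many dependent (one-after-another) reads fit in one second."""
    return 1.0 / latency_seconds

def seconds_to_scan(dataset_bytes, bandwidth_bytes_per_sec):
    """Time to stream a whole dataset at a given bandwidth."""
    return dataset_bytes / bandwidth_bytes_per_sec

# Illustrative round numbers: roughly 100 microseconds vs 5 milliseconds
# per access, and gigabytes vs megabytes per second of bandwidth.
nvme_latency, disk_latency = 100e-6, 5e-3
nvme_bw, disk_bw = 3e9, 200e6  # 3 GB/s vs 200 MB/s
one_tb = 1e12

print(serial_reads_per_second(nvme_latency))  # roughly 10,000 dependent reads/s
print(serial_reads_per_second(disk_latency))  # roughly 200 dependent reads/s
print(seconds_to_scan(one_tb, nvme_bw))       # minutes to scan 1 TB
print(seconds_to_scan(one_tb, disk_bw))       # over an hour to scan 1 TB
```

The roughly 50x gap in dependent reads per second is why thousands of parallel processes can share the same data pool at low latency in the architecture David describes.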
What do we think is going to happen here as we think about the role that AI is going to play in IT operations management? >> Well, if we go back to the analogy with big data, that we thought was going to, you know, cure cancer, taste like chocolate and cost a dollar, it turned out that the most widespread application of big data was to offload ETL from expensive data warehouses. And what we expect is the first widespread application of AI embedded in applications for horizontal use, where Neil mentioned Salesforce and the ability to use Einstein with Salesforce data and connected data. Now, the applications we're building are so complex that, as Stu mentioned, you know, we have this operational model with a true private cloud, and it's actually not just the legacy stuff that's sucking up all the admin overhead. It's the complexity of the new applications and the stringency of the SLAs. That means we would have to turn millions of people into admins; like the old saw that when the telephone networks started, everyone was going to have to become an operator. The only way we can get past this is if we sort of apply machine learning to IT Ops and application performance management. The key here is that the models can learn how the infrastructure is laid out and how it operates. And they can also learn how all the application services and middleware work, how they behave independently and with each other, and how they tie to the infrastructure. The reason that's important is because all of a sudden you can get very high fidelity root cause analysis. With the old management technology, if you had an underlying problem, you'd have a whole storm of alerts, because there was no reliable way to really triangulate on, or triage, the root cause.
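[Editor's note: the root-cause triage George describes can be sketched with a toy example. The service topology, the latency numbers, and the simple standard-deviation rule are all hypothetical, stand-ins for what a real system would learn with trained models; the point is only that knowing the dependency graph lets a storm of alerts collapse to one root cause.]

```python
# Toy root-cause triage over a known service topology (illustrative only).
# A hypothetical three-tier app: frontend calls the API, which calls the DB.
DEPENDS_ON = {
    "web-frontend": ["orders-api"],
    "orders-api": ["database"],
    "database": [],
}

def is_anomalous(samples, latest, threshold=3.0):
    """Flag `latest` if it sits more than `threshold` std devs from the mean."""
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / len(samples)
    std = var ** 0.5 or 1.0  # avoid dividing by zero on flat histories
    return abs(latest - mean) / std > threshold

def root_causes(anomalous):
    """Keep only anomalous components none of whose dependencies are anomalous."""
    return [c for c in anomalous
            if not any(d in anomalous for d in DEPENDS_ON.get(c, []))]

# Made-up latency history vs latest sample for each component (milliseconds):
history = {"web-frontend": [50, 52, 48, 51], "orders-api": [20, 21, 19, 20],
           "database": [5, 5, 6, 5]}
latest = {"web-frontend": 400, "orders-api": 300, "database": 90}

anomalous = {c for c in history if is_anomalous(history[c], latest[c])}
print(root_causes(anomalous))  # -> ['database']
```

All three components fire, the classic alert storm, but only the database has no anomalous dependency, so it alone is reported as the root cause.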
Now, what's critical is if you have high fidelity root cause analysis, you can have really precise recommendations for remediation, or automated remediation, which is something that people will get comfortable with over time; that's not going to happen right away. But this is critical. And this is also the first large scale application of not just machine learning but machine data, and so this topology of collecting widely disparate machine data and then applying models and then reconfiguring the software is training wheels for IoT apps, where you're going to have it far more distributed, actuating devices instead of software. >> That's great, George. So let me sum up and then we'll take some questions. So very quickly, the action items that we have out of this overall session, and again, we have another 15 or so predictions that we didn't get to today. One is, as we said, digital business is the use of data assets to compete. And so ultimately, this notion is starting to diffuse rapidly. We're seeing it on theCUBE. We're seeing it on the CrowdChats. We're seeing it in the increase of our customers. Ultimately, we believe that users need to start preparing for even more business scrutiny over their technology management. For example, something very simple, and David Floyer, you and I have talked about this extensively in our weekly Action Item research meeting: in a digital business world, it's not just about backing up and restoring a system or an application, we're talking about restoring the entire business. That's going to require greater business scrutiny over technology management. It's going to lead to new organizational structures, new challenges of adopting systems, et cetera.
But, ultimately, our observation is that data is going to indicate technology directions across the board, whether we talk about how businesses evolve or the roles that technology takes in business, or we talk about the key digital business capabilities of capturing data, turning it into value and then turning it into work, or whether we talk about how we think about cloud architecture and how we organize the cloud resources we're going to utilize. It all comes back to the role that data's going to play in helping us drive decisions. The last action item we want to put here before we get to the questions is: clients, if we don't get to your question right now, contact us. Send us an inquiry, support@siliconangle.freshdesk.com, and we'll respond to you as fast as we can over the course of the next day or two to try to answer your question. All right, David Vellante, you've been collecting some questions here. Why don't we see if we can take a couple of them before we close out. >> Yeah, we've got about five or six minutes. In the chat room, Jim Kobielus has been awesome helping out, so there's a lot of detailed answers there. The first, there's some questions and comments. The first one was, are there too many chiefs? And I guess, yeah, there's some title inflation. I guess my comment there would be: titles are cheap, results aren't. So if you're creating chief X officers just to check a box, you're probably wasting money. So you've got to give them clear roles. But I think each of these chiefs has clear roles to the extent that they are, you know, empowered. Another comment came up, which is, we don't want, you know, Hadoop spaghetti soup all over again. Well, true that. Are we at risk of having Hadoop spaghetti soup as the centricity of big data moves from Hadoop to AI and ML and deep learning? >> Well, my answer is we are at risk of that, but there's customer pressure and vendor economic pressure to start consolidating.
And we'll also see something we didn't see in the on-prem big data era: with cloud vendors, they're just going to start making it easier to use some of the key services together. That's just natural. >> And I'll speak for Neil on this one too, very quickly, that the idea ultimately is, as the discipline starts to mature, we won't have people who probably aren't really capable of doing some of this data science stuff running around and buying a tool to try to supplement their knowledge and their experience. So, that's going to be another factor that I think ultimately leads to clarity in how we utilize these tools as we move into an AI oriented world. >> Okay, Jim is on mute, so if you wouldn't mind unmuting him. There was a question: is ML a more informative way of describing AI? Jim, when you and I were in our Boston studio, I sort of asked a similar question. AI is sort of the uber category. Machine learning is math. Deep learning is more sophisticated math. You have a detailed answer in the chat, but maybe you can give a brief summary. >> Jim: Sure, sure. I don't want to be too pedantic here, but deep learning is essentially a lot more hierarchical, deeper stacks of neural network layers, to be able to infer high level abstractions from data, you know, face recognition, sentiment analysis and so forth. Machine learning is the broader phenomenon. That simply spans various approaches for distilling patterns, correlations and algorithms from the data itself. What we've seen in the last, let's say, five, six, ten years is that all of the neural network approaches for AI have come to the forefront, and in fact are the core of the marketplace and the state of the art. AI is an old paradigm, probably older than you or me, that began and for the longest time was rules based systems, expert systems. Those haven't gone away.
The new era of AI we see as a combination of both statistical approaches as well as rules based approaches, and possibly even orchestration based approaches like graph models for building broader context for AI, for a variety of applications, especially distributed Edge applications. >> Okay, thank you, and then another question slash comment: AI, like graphics in 1985, will move from a separate category to a core part of all apps, AI infused apps. Again, Jim, you have a very detailed answer in the chat room, but maybe you can give the summary version. >> Jim: Well, quickly now, the most disruptive applications we see across the world, enterprise, consumer and so forth, involve AI, you know, with machine learning at their heart, and that's neural networking. I wouldn't say that every single application is doing AI. But the ones that are really blazing the trail in terms of changing the fabric of our lives, most of them have AI at their heart. That will continue as the state of the art of AI continues to advance. So really, one of the things we've been saying in our research at Wikibon is that data scientists, or those skills and tools, are the nucleus of the next generation application developer, really in every sphere of our lives. >> Great, a quick comment: we will be sending out these slides to all participants. We'll be posting these slides. So thank you Kip for that question. >> And very importantly Dave, over the course of the next few days, most of our predictions docs will be posted up on Wikibon and we'll do a summary of everything that we've talked about here. >> So now the questions are coming through fast and furious. But let me just try to rapid fire here 'cause we've only got about a minute left. True private cloud definition: just say this, we have a detailed definition that we can share, but essentially it's substantially mimicking the public cloud experience on prem.
The way we like to say it is, bringing the cloud operating model to your data versus trying to force fit your business into the cloud. So we've got detailed definitions there that, frankly, are evolving. There's also a question about PaaS. I think we have a prediction in one of our appendix predictions, but maybe a quick word on PaaS. >> Yeah, a very quick word on PaaS is that there's been an enormous amount of effort put on the idea of the PaaS marketplace. Cloud Foundry and others suggested that there would be a PaaS market that would evolve, because you want to be able to effectively have mobility and migration and portability for these large cloud applications. We're not seeing that happen necessarily, but what we are seeing is that developers are increasingly becoming a force in dictating and driving cloud decision making, and developers will start biasing their choices to the platforms that demonstrate that they have the best developer experience. So whether we call it PaaS, whether we call it something else, providing the best developer experience is going to be really important to the future of the cloud marketplace. >> Okay, great, and then George, George O, George Gilbert, you'll follow up with George O on that other question we need some clarification on. There's a question, really David, I think it's for you: will persistent DIMMs emerge first on public clouds? >> Almost certainly, because public clouds are where everything is going first. And when we talked about UniGrid, that's where it's going first. The NVMe over fabrics architecture is going to be in public clouds, and it has the same sorts of benefits there. And NVDIMMs will again develop pretty rapidly as part of NVMe over fabrics. >> Okay, we're out of time. We'll look through the chat and follow up with any other questions. Peter, back to you. >> Great, thanks very much Dave. So once again, we want to thank everybody here who has participated in the webinar today.
I apologize; I feel like Han Solo saying it wasn't my fault. But having said that, nonetheless, I apologize to Neil Raden and everybody who had to deal with us finding and unmuting people, but we hope you got a lot out of today's conversation. Look for those additional pieces of research on Wikibon that pertain to the specific predictions on each of these different things that we're talking about. And by all means, support@siliconangle.freshdesk.com if you have an additional question, and we will follow up with as many as we can from the significant list that's starting to queue up. So thank you very much. This closes out our webinar. We appreciate your time. We look forward to working with you more in 2018. (upbeat music)

Published Date : Dec 16 2017
