
Search Results for David Floyer:

#HybridStorage


 

From our studios in the heart of Silicon Valley — Palo Alto, California — this is a CUBE Conversation.

Hi, I'm Peter Burris, analyst at Wikibon. Welcome to another Wikibon/theCUBE digital community event, this one sponsored by HPE and focused on hybrid storage. Like all of our digital community events, this one will feature about 25 minutes of video followed by a CrowdChat, which will be your opportunity to ask your questions, share your experiences, and push forward the community's thinking on the important issues facing business today. So what are we talking about today? Again: hybrid storage. Let's get going.

So what is hybrid storage? In a lot of shops, most people have associated the cloud with public cloud. But as we gain experience with the challenges of transforming into a digital business, in which we use data as a singular value-producing asset, IT professionals are starting to recognize an important relationship between data, storage, and cloud services. In many respects that's what we're trying to master today: a better understanding of how the business is going to use data to drive significant changes in how it behaves in the marketplace. It's that question of behavior, of action, of location that is pushing businesses to think differently about how their cloud architectures are going to work. We're going to keep data proximate to where it's created, to where it's going to be used, to where it's going to generate value — which demands that we have storage resources in place close to that data, proximate to that value-producing activity, and that the cloud services follow. In many respects, that's what we mean by hybrid cloud today: the growing recognition that we're going to move cloud services to the data by default, rather than move the data into the public cloud. As we gain experience with this powerful set of technologies, data architecture is going to be increasingly distributed, storage will therefore be increasingly distributed, and cloud services will flow to where the data is required, utilizing the storage technologies that best serve each set of workloads. It's a more complex world that demands new levels of simplicity, ease of use, and optimization. That's where we're going to start our conversation.

These crucial questions of how data, storage, and cloud are going to come together to create hybrid architectures were the basis for a great CUBE Conversation between SiliconANGLE Wikibon's Dave Vellante and HPE's Sundip Arora. Let's hear what they had to say.

Let's break down those three things: cost efficiency, ease of use, and resource optimization. Start with cost efficiency. Obviously there's TCO, but there's also the way in which I consume — people, I presume, are looking for a different pricing model. Is that what you're hearing? Absolutely. As part of the cost of running their business and being able to operate like a cloud, everybody is looking at a variety of procurement and utilization models. One of the ways HPE provides a utilization model that can map to a public cloud journey is through GreenLake: the ability to consume data on demand and compute on demand across the entire HPE portfolio is essentially what a GreenLake journey looks like.

Let's go into ease of use. What do you mean by that? People think cloud, they think swipe the credit card and start deploying machines — what do you mean by "easy"? For us, ease of use translates into a simpler operating and support model, and the support model is the key for customers to realize the benefits of going to that cloud. To get to a simpler support model we use AIOps, and for us AIOps means a product called InfoSight. InfoSight uses deep learning and machine learning algorithms to look at a wide net of call-home data from physical resources in the field and make that data actionable — the action being predictiveness and prescriptiveness: creating automated support tickets and closing automated support tickets without anybody ever having to pick up a phone and call IT support. That InfoSight model is now being expanded across the board to all HPE products. It started with Nimble; InfoSight is now available on 3PAR, it's available on Synergy, and a recent announcement made it available on ProLiant as well. We expect InfoSight to become the glue — the automation and AI — that runs across the entire HPE portfolio. So this is a great example of applying AI to data; it's call-home taken to a whole new level, isn't it? It absolutely is. It uses the call-home data we've had for a long time with products like 3PAR — which was amazing data, but it wasn't being actioned in an automated fashion — and it creates automation tasks around that data, and many times those automation tasks lead to a much simpler support experience.
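To make the AIOps idea concrete, here is a minimal sketch of the kind of loop that InfoSight-style call-home analytics implies: watch a telemetry stream, flag readings that drift from their baseline, and open or close a support ticket automatically. This is not HPE's implementation; the threshold, metric names, and the open_ticket/close_ticket helpers are hypothetical stand-ins.

```python
# Illustrative only: a toy "call-home telemetry -> automated support ticket" loop.
import statistics
from dataclasses import dataclass
from typing import Optional

@dataclass
class Ticket:
    array_id: str
    metric: str
    detail: str
    is_open: bool = True

def open_ticket(array_id: str, metric: str, detail: str) -> Ticket:
    # A real system would call a support-case API; here we just record the event.
    print(f"[auto-open] {array_id}: {detail}")
    return Ticket(array_id, metric, detail)

def close_ticket(ticket: Ticket) -> None:
    print(f"[auto-close] {ticket.array_id}: {ticket.detail}")
    ticket.is_open = False

def check_latency(array_id: str, samples_ms: list, ticket: Optional[Ticket]) -> Optional[Ticket]:
    """Open a ticket when the latest latency drifts well above its baseline,
    and close it again once readings return to normal."""
    baseline = samples_ms[:-1]
    mean = statistics.mean(baseline)
    spread = statistics.pstdev(baseline) or 1.0
    latest = samples_ms[-1]
    anomalous = latest > mean + 3 * spread          # crude z-score style test
    if anomalous and ticket is None:
        return open_ticket(array_id, "read_latency_ms",
                           f"latency {latest:.1f} ms vs baseline {mean:.1f} ms")
    if not anomalous and ticket is not None and ticket.is_open:
        close_ticket(ticket)
        return None
    return ticket

history = [1.1, 1.2, 1.0, 1.1, 1.3, 1.2, 9.8]       # call-home samples, in ms
check_latency("array-42", history, ticket=None)
```

In a real system the ticket calls would hit a support-case API and the model would be far richer than a z-score test, but the open-act-close automation is the point Sundip is describing.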
All right, the third item you mentioned was resource optimization — let's drill down into that. I infer there are performance implications, maybe governance and compliance, physical placement. Can you elaborate and add some color? It's all of the above. It's definitely about applying the right performance level to the right set of applications. We call this application-aware storage: being able to understand which application is creating the data allows us to understand how that data needs to be accessed, which in turn tells us where it needs to reside. One of the things HPE is doing in the storage domain is creating a common storage fabric with the cloud — we call it the fabric for the cloud. The idea is that we have a single layer between on-premises and off-premises resources that allows us to move data as needed, depending on the needs of the application and the user.

So these crucial new factors that have to be incorporated into everyone's thinking — cost efficiency, ease of use, and resource optimization — are going to place new kinds of stress on the storage hierarchy, and they're going to require new technologies to better support digital transformation. David Floyer, an analyst here at Wikibon, has been a leading thinker on the relationship between the storage hierarchy, workloads, and digital business for quite some time. I had a great conversation with David not too long ago; let's hear what he had to say about this new storage hierarchy and the new technologies that are going to make these changes possible.

You've been looking at this notion of modern storage architectures for ten years now, and you've been relatively prescient in understanding what's going to happen — you were one of the first to predict, well in advance of everybody else, that the crossover between flash and HDD was going to happen sooner rather than later. So I'm not going to spend a lot of time quizzing you: what do you see as a modern storage architecture? Let it rip. Okay, let's start with one simple observation: the days of stand-alone systems for data are gone. We're in a software-defined world, and you want to be able to run those data architectures anywhere the data is — in your data center where it's created, in a public cloud, or at the edge. You want to be flexible enough to run all of the data services wherever the best place is, and that means everything has to be software-defined. Software-defined is the first proposition of a modern data architecture.

The second thing is that there are different types of technology. You have the very fastest storage, which is in the DRAM itself; you have NVDIMM, the next step down from that — expensive, but a lot cheaper than DRAM; then you have different sorts of flash — high-performance flash, and 3D flash with as many layers as you can get, which is much cheaper; and at the bottom you have HDDs and even tape as storage devices. The key question is how you manage that sort of environment. Well, let me stop you there, because it still sounds like we have a storage hierarchy. Absolutely. And it still sounds like that hierarchy is defined largely in terms of access speeds and price points. Yes — those are the two main dimensions, along with bandwidth and latency, which are tied into them.
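As a rough illustration of the hierarchy David describes, the sketch below models each tier as a latency/cost trade-off and picks the cheapest tier that still meets a workload's latency budget. The latency and $/GB figures are order-of-magnitude placeholders for illustration, not vendor specifications.

```python
# A toy model of the storage hierarchy: each tier trades latency against $/GB.
TIERS = [
    # (name,                          typical latency (s), rough $/GB)
    ("DRAM",                          100e-9,              3.00),
    ("NVDIMM / persistent memory",    350e-9,              1.50),
    ("high-performance flash",        100e-6,              0.25),
    ("3D NAND (capacity flash)",      200e-6,              0.08),
    ("HDD",                           8e-3,                0.02),
    ("tape",                          30.0,                0.005),
]

def cheapest_tier(latency_budget_s: float) -> str:
    """Pick the lowest-cost tier that still meets the workload's latency budget."""
    candidates = [(cost, name) for name, lat, cost in TIERS if lat <= latency_budget_s]
    if not candidates:
        raise ValueError("no tier meets this latency budget")
    return min(candidates)[1]

print(cheapest_tier(1e-3))    # OLTP-like budget -> a flash tier
print(cheapest_tier(60.0))    # cold archive     -> tape
```

A real data plane would also weigh bandwidth, endurance, and locality — exactly the extra dimensions David notes are tied into the hierarchy.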
So if you're going to have data everywhere and you need services everywhere, what you have to have is an architecture that takes away all of that complexity, so that from an application point of view all you see is data — how it gets there, how it's put away, how it's stored, how it's protected is all under the covers. The first thing you need is a virtualization of that data layer, of the physical layer. And second, you need that physical layer to extend to all the places that may be using the data. You don't want to be constrained to "this dataset lives here." You want to be able to say: I want to move this piece of programming to the data as quickly as I can — that's much, much faster than moving the data to the processing. So I want to know where all the data is for a particular dataset or file, how it connects together, what the latency is between everything; I want to understand that architecture, and I want a virtualized view across all the nodes that make up my hybrid cloud.

Let me be clear here: we're going to use software-defined infrastructure that lets us place physical devices with the right cost and performance characteristics where they need to be, based on the physical realities of latency, power availability, hardening, the network, and so on — but we want to mask that complexity from the application, the application developer, and the application administrator. Yes. And software-defined helps do that, but it doesn't do it completely. No — you also want services on top of all that, services that are recognizable by the developer, by the business person, by the administrator, as they think about how they use data toward those outcomes: not "use a storage array" or "use a device," but "use the data" to reach application outcomes. That's absolutely right, and that's what I call the data plane: a series of services that enable that to happen, driven by the application requirements. We've looked at this, and some of those services include compression, deduplication, backup and restore, security, and data protection. So those are the services the enterprise buyer now needs to think about, so they can be applied by policy, wherever they're required, based on the utilization of the data — based on where the event takes place. Correct.
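A minimal sketch of what "services applied by policy" can look like in practice: the policy names a business outcome and maps to a bundle of data services, so the administrator reasons about outcomes rather than devices. The policy names, service names, and RPO values below are illustrative, not a real product's catalog.

```python
# Illustrative only: resolving a named policy into a bundle of data services.
POLICIES = {
    "mission-critical": {
        "services": ["deduplication", "compression", "snapshot-every-15min",
                     "replicate-to-second-site", "end-to-end-encryption"],
        "rpo_minutes": 15,
    },
    "general-purpose": {
        "services": ["deduplication", "compression", "snapshot-daily", "backup-to-cloud"],
        "rpo_minutes": 24 * 60,
    },
    "archive": {
        "services": ["compression", "backup-to-cold-tier", "immutability-lock"],
        "rpo_minutes": 7 * 24 * 60,
    },
}

def services_for(dataset: str, policy_name: str) -> list:
    """Return the data services a dataset should receive under a named policy."""
    policy = POLICIES[policy_name]
    print(f"{dataset}: RPO {policy['rpo_minutes']} min")
    return policy["services"]

print(services_for("orders-db", "mission-critical"))
```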
And then at the bottom of all that you still have the different types of devices. You still want hard disks — they're not disappearing — but if you're going to use hard disks, you want to use them the right way: give them large blocks and have them running sequentially in and out all the time. So storage administration, the physical schema, and everything else are still important in all of this, but they're less important — less the centerpiece of the buying decision. Correct. Increasingly the question is how well the equipment supports the services the business is using to achieve its outcomes, and of course you want to use the lowest-cost option you can — there will be more and more options open. But the automation of that is absolutely key, and from a vendor point of view one of the key things they have to do is learn from usage across as broad a set of customers as they can — learn what works and what doesn't — so that they can put that automation into their own software and software services. So it sounds like we're talking about four things: software-defined; a storage hierarchy still defined by cost and performance, but mainly built on semiconductor technology; data services that are relevant to the business; and automation that masks the complexity, increasingly driven by AI. There are also many other things, but yes — fantastic.

So David's thinking on the new storage hierarchy, and how it's going to relate to new classes of workload, is a baseline for a lot of the changes happening in the industry today. But we still have to turn technology into services that deliver higher levels of value. Once again, let's go back to Dave Vellante's conversation with Sundip Arora and hear what Sundip has to say about some of the new data services that are going to be essential to supporting these hybrid storage capabilities.

What that gives us is the opportunity to look not just at call-home data from storage but also at call-home data from the compute side, and then correlate that data to get better predictability and outcomes for data center operations, rather than working only at the infrastructure layer. You also set out a vision of an orchestration layer — can you talk more about that? Are we talking about across all clouds, whether it's on-prem, at the edge, or in the public cloud? We are. We're talking about making it as simple as possible, so customers aren't necessarily picking and choosing; it gives them a strategy that spans the data center, whether that's a public cloud, building their own private infrastructure, or running on traditional on-premises infrastructure. That's what our cloud fabric vision allows customers to do.
And what about software-defined storage — where does that fit into this whole equation? I'm glad you mentioned that, because that's the third tenet of what HPE brings to our customers. Software-defined is something that allows us to maximize the utilization of the existing resources our customers have. We've partnered with a set of really strong software-defined vendors — Commvault, Cohesity, Qumulo, and Datera, and we work very closely with the likes of Veeam and Zerto — and the goal is to give our customers a whole range of options for building a software-defined infrastructure on top of the Apollo series of products. Apollo servers and storage products are extremely dense systems that allow for both cost and resource optimization.

So Sundip made some fantastic points about how new storage technologies are going to be turned into usable services that digital businesses will require as they conceive of their overall hybrid storage approach. Here's an opportunity to hear a little bit more about what HPE thinks about some of these crucial areas; let's hear what they have to say in this chalk-talk short take.

I'm going to introduce you to HPE Primera storage. If you want the agility of the public cloud but need the resiliency and speed of high-end storage for mission-critical applications, you're usually forced into a trade-off of agility for resiliency: high-end storage is fast and reliable but falls short on agility and simplicity. What if you could have it all — both agility and resiliency for your mission-critical apps? Introducing the world's most intelligent storage for mission-critical apps: HPE Primera. It delivers an on-demand experience, so storage is instantly available; app-aware resiliency, backed with a 100 percent availability guarantee; and predictive acceleration, so apps aren't fast some of the time but fast all of the time, with embedded AI. Let me tell you more about how HPE Primera was engineered to drive unique value in high-end storage. There are four areas we focus on: global intelligence, powered by InfoSight, the most advanced AI for infrastructure; an all-active architecture with multiple nodes for higher resiliency and limitless parallelization; a service-centric OS that eliminates risk and simplifies management; and timeless storage, a new ownership experience that keeps getting better. To learn more, go to hp.com/storage/primera.

That's been a great series of conversations about hybrid storage, and I want to thank Sundip Arora of HPE, David Floyer of Wikibon/SiliconANGLE, Jim Kobielus of Wikibon/SiliconANGLE, and my colleague Dave Vellante for helping out on the interview side. I'm Peter Burris, and this has been another Wikibon/theCUBE digital community event, sponsored by HPE. Now stay tuned for our CrowdChat, which will be your opportunity to ask your questions, share your experiences, and push forward the community's thinking on hybrid storage. Once again, thank you very much for watching — let's CrowdChat.

Published Date: Aug 21 2019


HPE Data Platform


 

From our studios in the heart of Silicon Valley — Palo Alto, California — this is a CUBE Conversation.

Hi, I'm Peter Burris, analyst at Wikibon. Welcome to another Wikibon/theCUBE digital community event, this one sponsored by HPE. Like all of our digital community events, this one will feature about 25 minutes of video followed by a CrowdChat, which will be your opportunity to ask your questions, share your experiences, and push forward the community's thinking on the important issues facing business today. So what are we talking about today? Over the course of the last six months or so we've had a lot of conversations with our customers about the core issues that multi-cloud is going to engender within business. One of them, clearly, is how we bring greater intelligence to how we move, manage, and administer data within the enterprise. Some of the more interesting conversations we've had have been with HPE, and that's what we're going to talk about today: we're going to spend a few minutes with a number of HPE professionals, as well as Wikibon professionals and thought leaders, talking about the challenges enterprises face as they consider intelligent data platforms. So let's get started. The first conversation is with Sandeep Singh, who is a vice president at HPE.

Sandeep, I started by making the observation that there's a mountain of data coming at a lot of enterprises. At the same time, the notion that data is going to create new classes of business value seems to be deeply ingrained in a lot of decision-makers: they want more value out of their data, but they're increasingly concerned about the volume of data that's going to hit them. In your conversations with customers, how are you hearing them talk about this fundamental challenge? That's a great question. Across the board, data is at the heart of applications — of pretty much everything organizations do. In conversations with customers it really boils down to a couple of areas. One is: how is my data just effortlessly available, all the time, and always fast? Because fundamentally that's driving the speed of my business, and that's incredibly important — and how can my various audiences, including developers, consume it like the public cloud, in a self-service fashion? The second part of the conversation is about this massive data storm, this mountain of data that's coming: how do I drive a competitive advantage, how do I unlock the hidden insights in that data to uncover new revenue streams and new customer experiences? Those are the areas we hear about, and underlying it all, the challenge for customers is: I have a lot of complexity — how do I ensure I have the necessary insight into infrastructure management so that I, and my IT staff, aren't beholden to fighting the IT fires that cause disruptions and delays to projects?

So fundamentally we want to take the time and attention spent on the infrastructure — on administering the devices that handle the data — and move it up into how we deliver data services, and ideally up into the applications that are actually going to generate a new class of work within a digital business. Did I get that right?
Absolutely. It's about infrastructure that just runs seamlessly: it's always on, it's always fast. People don't have to worry about whether it's going to go down, whether the data is available, or whether it's going to slow down — nobody wants "sometimes fast"; they want always fast. Right — and that governs the application performance you can ultimately deliver. And you made the point that if the data infrastructure just works seamlessly, then I can finally get to the applications and to building the right pipelines for mining that data and driving AI- and machine-learning-based, analytics-driven insights from there.

A great discussion about the importance of data in the enterprise and how it's changing the way we think about business. We're going to come back to Sandeep shortly, but first let's spend some time with David Floyer, the Wikibon analyst, on the new mindset required to take advantage of these technologies and solve these problems — specifically, the need to think increasingly in terms of data services. Let's hear what David has to say.

Explain what that new mindset is. Yes, I completely agree that a new mindset is required, and it starts with being able to deal with data wherever it's going to be. We are in a hybrid cloud world — your own clouds, other public clouds, partner clouds — and all of these need to be integrated, with data at the core of it. The requirement, then, is to think not about each individual piece but about services that are going to be applied to that data — and applied not only to the data in one place but across all of it. There isn't going to be just one set of services; there will be multiple sets available, though hopefully we'll see some degree of convergence, so there's a common lexicon and common concepts. The same kinds of capabilities will be needed within each of these architectures, but with different emphasis in different areas. We need to look at the way we administer data as a set of services that create outcomes for the business, rather than something that gets translated into individual devices.

So let's jump into what those services look like — it seems we can list off a couple of them. Sure. You must have data reduction techniques — deduplication and compression — and you want to apply them across as big a pool of data as you can: the more data you apply them to, the higher the levels of compression and deduplication you can achieve.
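To illustrate the data-reduction service David mentions first, here is a toy fixed-size-chunk deduplicator built on content hashing. Real systems use variable-size chunking, compression, and far more efficient indexes; this only shows why a wider pool of data yields better reduction.

```python
# Illustrative only: fixed-size chunk deduplication via content hashing.
import hashlib

def dedupe(data: bytes, chunk_size: int = 4096):
    """Split data into chunks, store each unique chunk once, and return the
    chunk store plus the list of hashes needed to reassemble the data."""
    store = {}      # digest -> chunk bytes
    recipe = []     # ordered digests describing the original data
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)     # keep only the first copy of each chunk
        recipe.append(digest)
    return store, recipe

payload = b"A" * 4096 * 100 + b"B" * 4096   # highly redundant data
store, recipe = dedupe(payload)
print(f"logical chunks: {len(recipe)}, unique chunks stored: {len(store)}")
# The wider the pool of data the hashes cover, the more duplicates they find --
# which is the point David makes about applying reduction broadly.
```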
You must also be able to back up data to another place and restore it quickly and easily — again, that's a service. And how quickly and how integrated that recovery is becomes a variable, a point of differentiation in the service. Exactly. You're going to need data protection in general — end-to-end protection of one sort or another. For example, you need end-to-end encryption: it's no longer good enough to say "this bit has been encrypted and that bit has been encrypted"; it has to be end to end, from one location to another, seamlessly provided. Let me press on that, because I think it's a really important point: the weakest link determines the strength of the chain. What you just described says that if you have encryption here and you don't have encryption there, then — because of the nature of digital, where you inevitably start bringing that data together — the weakest link determines the protection of the overall data. Absolutely, yes. Then you need services like snapshots, and other services that provide much better use of that data. One of the great things flash has brought about is that you can take a copy in real time, use it for a totally different purpose, and have it change in a different way — so there are some really significant improvements you can get from services like snapshots. And then there are services that are becoming even more important, in my opinion: the advent of bad actors in the world has brought about the requirement for things like air gaps — having your data, with its metadata, all in one place and completely separated from everything else. There are such things as logical air gaps, and as long as they're real in the sense that the two paths can't interfere with each other, those are going to become very important services. That's an example of the general class of security data services that are required.

So ultimately what we're describing is a new mindset: a storage administrator has to think about the services that the applications and the business require, and then seek out technologies that can provide those services at the right price point, with the right power consumption, space, environmental profile, and the type of maintenance and support required, based on physical location, the degree to which it's under their control, and so on. Is that how we should think about this? Absolutely — and again, if there are going to be multiple of these in the marketplace, one size is not going to fit all. If you want super-fast response time at an edge — where a late response is of no use whatsoever — you're going to have a different architecture, a different way of doing it, than if you need to be one hundred percent certain that every bit is captured, as in a financial environment. But from a service standpoint you want to be able to look at each specific solution in a common way — common policies, common capabilities. Correct.

Great observations from David Floyer. It's very clear that for enterprises to get more control over their data assets, and over how they create value out of data, they have to take a services mentality. But the challenge we all face is that a services mentality alone is not enough: we have to think about how to organize those services into a platform that's pertinent and relevant to how business operates in a digital sense. So let's go back to Sandeep Singh and talk with him about HPE's notion of the intelligent data platform.

You've been one of the leaders in the complex systems arena for a long time, and that includes storage. Where are you taking some of these technologies? Our strategy is to deliver an intelligent data platform. That platform begins with workload-optimized composable systems that can span mission-critical, general-purpose, secondary, and big data and AI workloads. We also deliver cloud data services that enable you to embrace hybrid cloud, and all of these systems — all the way out to the cloud data services — are plumbed with data mobility, so use cases such as modernizing protection, all the way to protecting data cost-effectively in the public cloud, are enabled.
But really, all of these systems are then imbued with a level of intelligence — a global intelligence engine. It begins with predicting and proactively resolving issues before they occur, but it goes well beyond that, delivering prescriptive insights built on top of global learning across hundreds of thousands of systems, with over a billion data points coming in on a daily basis. That lets us put information at the fingertips of even the virtual machine admins — to say, this virtual machine is sapping the performance of this node, and if you were to move it to this other node, the performance and SLA of the whole virtual machine farm would be better. We build on top of that to deliver pre-built automation, hooked in through a REST-API-first strategy, so developers can consume it from a containerized application orchestrated with Kubernetes, or leverage it as infrastructure as code with Ansible, Puppet, or Chef.
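As an illustration of what a REST-API-first, infrastructure-as-code consumption model looks like from the caller's side, here is a short sketch that provisions a volume over HTTP the way an Ansible module, a CI pipeline, or a Kubernetes provisioner might. The endpoint, payload fields, and token are hypothetical placeholders, not HPE's actual API.

```python
# Illustrative only: declarative, API-driven storage provisioning.
import requests

API = "https://storage.example.internal/api/v1"   # placeholder endpoint
TOKEN = "REPLACE_ME"                               # placeholder credential

def provision_volume(name: str, size_gib: int, policy: str) -> dict:
    """Request a volume declaratively over REST, as infrastructure-as-code tooling would."""
    resp = requests.post(
        f"{API}/volumes",
        json={"name": name, "size_gib": size_gib, "protection_policy": policy},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Would only succeed against a real service implementing the placeholder endpoint.
vol = provision_volume("orders-db-data", 512, "mission-critical")
print(vol)
```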
We accelerate all of the application workloads and bring app-aware data protection, so it's available for traditional business applications — whether they're built on SAP, Oracle, or SQL Server — for virtual machine farms, and for new-stack containerized applications. Customers can then build their AI and big data pipelines on top of that infrastructure with a plethora of tools — Kafka, Elastic, MapR, H2O — that flexibility is all there. And within HPE we're able to deliver all of this with an as-a-service experience through HPE GreenLake.

That's where I want to take you next: how invasive is this going to be for a large shop? It's completely seamless. With GreenLake we deliver a fully managed service experience with a cloud-like, pay-as-you-go consumption model, and by combining it with HPE Financial Services we can also transform the organization's journey and make it fully self-funding.

So today the typical shop has a bunch of administrators administering devices. That's starting to change — they've introduced automation, but it's typically associated with those devices. If we think three to five years out, people are going to be thinking more in terms of data services and how those services get consumed, and that's what the storage part of IT is going to be thinking about: they almost become data administrators. Did I get that right? Yes — intelligence is fundamentally changing everything, not only on the consumer side but on the business side. A lot of what we've been talking about comes down to intelligence being the game changer; we see the dawn of the intelligence era. Through this AI-driven experience, it means, first, a support experience customers absolutely love; second, infrastructure that's always on, always fast, always optimized; and third, in terms of the data services and data insights being unlocked, it's about enabling your innovators — the data scientists and data analysts — to shrink the time to insight from months down to minutes. Today there's a chasm between the great concept of leveraging AI technology and making it real: figuring out where it fits, then implementing an end-to-end solution and technology stack so you actually have a pipeline available. That chasm is literally a matter of months. What we're able to deliver, for example with HPE BlueData, is a catalog, self-service experience where you can select and seamlessly build a pipeline in a matter of minutes, all hosted seamlessly — essentially making AI and machine learning available to the mainstream.

So the intelligent data platform makes it possible for these new classes of applications to become routine without forcing the underlying storage administrators to become data scientists themselves. Absolutely.

All right — the intelligent data platform is a great concept, but it has to be made real, and it's being made real today by HPE. Calvin Zito is a thought leader at HPE, and he's done a series of chalk talks on improving storage and data management; one of the more interesting ones was specifically on the intelligent data platform. Let's watch Calvin Zito's chalk talk.

Hey guys, it's time for another Around the Storage Block chalk talk. In this chalk talk we're going to look at the intelligent data platform. Let me set up the discussion. At HPE we see the dawn of the intelligence era. The flash era brought speed — flash is now table stakes. The cloud era brought new levels of agility, and everyone now expects an as-a-service experience. Going forward, the intelligence era — with an AI-driven experience for infrastructure operations and AI-enabled unlocking of insights — is poised to catapult businesses forward. The intelligence era will see the rise of the intelligent enterprise: always on, always fast, always agile in responding to different challenges, but most of all built for innovation — innovation that can unleash new services, revenue streams, and business models. Every enterprise will need an intelligent data strategy, where your data is always on and always fast, automated and on demand, hybrid by design, with global intelligence applied for visibility and lifecycle management. Our strategy is to deliver an intelligent data platform that turns your data challenges into business opportunities. It begins with workload-optimized composable systems for multiple workloads, and we deliver cloud services for a hybrid cloud environment so that you can seamlessly move data throughout its lifecycle — more on this in a moment. The global intelligence engine infuses the entire infrastructure with intelligence: it starts with predicting and proactively resolving issues before they occur, and it creates a unique workload fingerprint; these workload fingerprints, combined with global learning, enable us to drive recommendations that keep your application workloads and supporting infrastructure always optimized and delivering predictable speed.
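To give a feel for what a "workload fingerprint" can be, the sketch below reduces I/O telemetry to a small feature vector and matches it against previously learned profiles to borrow their recommendations. The feature names and reference profiles are invented for the example; production fingerprinting is far more sophisticated.

```python
# Illustrative only: a toy workload fingerprint matched by nearest neighbour.
import math

def fingerprint(read_ratio: float, avg_io_kb: float, random_ratio: float) -> tuple:
    # Normalise block size to roughly 0-1 so no single feature dominates the distance.
    return (read_ratio, min(avg_io_kb / 1024.0, 1.0), random_ratio)

KNOWN = {
    "OLTP database":  fingerprint(0.70, 8,    0.95),
    "analytics scan": fingerprint(0.95, 512,  0.10),
    "backup target":  fingerprint(0.05, 1024, 0.05),
}

def closest_profile(sample: tuple) -> str:
    """Return the learned profile whose fingerprint is nearest (Euclidean distance)."""
    return min(KNOWN, key=lambda name: math.dist(KNOWN[name], sample))

observed = fingerprint(read_ratio=0.68, avg_io_kb=16, random_ratio=0.90)
print(closest_profile(observed))   # -> "OLTP database"
```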
We have a REST-API-first strategy and offer pre-built automation connectors. We bring app-aware protection for both traditional and modern new-stack application workloads, and you can use the intelligent data platform to build and deliver flexible big data and AI pipelines for driving real-time analytics. Let's take a quick look at the portfolio of workload-optimized composable systems: these are systems spanning mission-critical and general-purpose workloads, as well as secondary data and solutions for emerging big data and AI applications. Because our portfolio is built for the cloud, we offer comprehensive cloud data services for both production workloads and backup and archive in the cloud. HPE InfoSight provides the global intelligence across the portfolio, and we give you the flexibility of consuming these solutions as a service with HPE GreenLake. I want to close with one more thing: the HPE intelligent data platform has three main attributes. First, it's AI-driven — it removes the burden of managing infrastructure so that IT can focus on innovating, not administrating. Second, it's built for cloud, enabling easy data and workload mobility across hybrid cloud environments. Finally, it delivers an as-a-service experience, so you can be your own cloud provider. To learn more, go to hp.com/intelligentdata. I always love to hear from you on Twitter, where you can find me as Calvin Zito, and you can find my blog at hp.com/blog. Until next time, thanks for joining me on this Around the Storage Block chalk talk.

I think Calvin makes a compelling case that the opportunity to use these technologies is available today, not something we're waiting on for the future. That's good, because one of the most important things a business has to think about is how it's going to use these new AI and related technologies to alter the way it engages customers, runs the business, and handles operations — and ultimately to improve its overall efficiency and effectiveness in its markets. It's very clear that an intelligent data platform is required to do many of the advanced AI things business wants to do, but it also requires AI in the platform itself. So let's go back to Sandeep Singh and talk about how HPE foresees AI being embedded into the intelligent data platform, so that it can enable greater use of AI across the rest of the application portfolio.

So we've got this significant problem we now have to figure out how to architect for, because we want predictability, certainty, and cost clarity in how we do it. Part of what's pushing this is new use cases for AI: we're trying to push data up so we can build those new use cases, but it seems we also have to take some of those very same technologies and drive them down into the infrastructure, so we get greater intelligence, greater self-monitoring, and greater self-management and self-administration within the infrastructure itself. Did I get that right? Absolutely. What becomes important for customers, when you think about data and the storage that underlies it, is that you can build and deploy fast, reliable storage — but that only solves half the problem. Greater than 50 percent of issues actually arise from the higher layers. For example, you could change the firmware on a host bus adapter inside a server, and that can trickle down and cause a data unavailability or performance slowdown. You need to be able to predict that all the way up at that higher level and prevent it from occurring. Or your virtual machines might be in a state of memory over-commitment or CPU over-commitment at the server level — how do you discover those issues and prevent them from happening? The other area that's becoming important is this whole notion of cloud and hybrid cloud, where the complexity tends to multiply exponentially. As you build that hybrid cloud infrastructure, a fundamental challenge is: even when I have a new workload that I want to place on premises, because there have been lots of silos, how do I even figure out where to place workload A and how it will interact with workloads B and C on a given system? Now multiply that across hundreds of systems and multiple clouds, and you can see the challenge multiplying exponentially.
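Here is a toy version of the placement question Sandeep raises: score each candidate system by the headroom left after the new workload's demands, and penalize already-slow storage. The capacities, weights, and system names are made up for illustration only.

```python
# Illustrative only: a naive placement scorer for a new workload.
from dataclasses import dataclass

@dataclass
class System:
    name: str
    cpu_free: float      # fraction of CPU unused
    iops_free: float     # fraction of IOPS headroom
    latency_ms: float    # observed storage latency

def placement_score(system: System, cpu_need: float, iops_need: float) -> float:
    """Higher is better; negative infinity means the workload does not fit."""
    cpu_left = system.cpu_free - cpu_need
    iops_left = system.iops_free - iops_need
    if cpu_left < 0 or iops_left < 0:
        return float("-inf")
    # Reward remaining headroom, penalise already-slow storage.
    return cpu_left + iops_left - 0.1 * system.latency_ms

candidates = [
    System("on-prem-array-1", cpu_free=0.20, iops_free=0.15, latency_ms=0.8),
    System("on-prem-array-2", cpu_free=0.55, iops_free=0.60, latency_ms=1.2),
    System("cloud-pool-east", cpu_free=0.90, iops_free=0.40, latency_ms=3.5),
]
best = max(candidates, key=lambda s: placement_score(s, cpu_need=0.25, iops_need=0.20))
print(best.name)   # -> "on-prem-array-2" under these made-up numbers
```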
Well, I would say that with "where do I put workload A," the right answer today may be here, but the right answer tomorrow may be somewhere else — and you want to make sure the services required to run workload A are resident and available without a lot of administrative work needed to ensure commonality. That's kind of what we mean by this hybrid, multi-cloud world, isn't it? Absolutely. When you start to think it through, you fundamentally end up needing the data mobility aspect, because without the data you can't really move your workloads; you need consistency of data services, so that if your app is architected for reliability around a set of data services, those services just go along with the application; and on top of that you need portability for the application workload itself, consistently managed through a hybrid management interface.

So we want an intelligent data platform that's capable of assuring performance, availability, and security, and that goes beyond that to deliver a simplified, automated experience, so everything is available through a self-service interface. And it brings along a level of intelligence that's built in globally, so that instead of trying to predict manually — and landing in a reactive world after the IT fires have occurred — there's a sea of sensors and the infrastructure is automatically predicting and preventing issues before they ever occur. And beyond that, it can fingerprint the individual application workloads to deliver prescriptive insights that keep the infrastructure always optimized. So: discerning the patterns of data utilization so that, number one, the administrative cost of making sure data is available where it needs to be goes down; number two, data as an asset is made available to developers as they create new applications that generate new work; while also working closely with the administrators so they aren't bound up in an explosion of tasks they have to perform to keep it all working. Yes, across the board.

I want to thank Sandeep Singh and Calvin Zito, both of HPE, as well as Wikibon's David Floyer, for sharing their ideas on this crucially important topic of how we're going to take more of a platform approach to managing crucial data assets in today's and tomorrow's digital businesses. I'm Peter Burris, and this has been another Wikibon/theCUBE digital community event, sponsored by HPE. Now stay tuned for our CrowdChat, which will be your opportunity to ask your questions, share your experiences, and push forward the community's thinking on the important issues facing business today. Thank you very much for watching — and now, let's CrowdChat.

Published Date: Jul 26 2019
