Itumeleng Monale, Standard Bank | IBM DataOps in Action


 

>>From theCUBE studios in Palo Alto and in Boston, connecting with thought leaders all around the world, this is a CUBE Conversation.

>>Everybody, welcome back to theCUBE. This is Dave Vellante, and you're watching a special presentation, DataOps in Action, made possible by IBM. What's happening is that the innovation engine in the IT economy has really shifted. It used to be Moore's Law; today it's applying machine intelligence and AI to data, really scaling that and operationalizing that new knowledge. The challenge is that it's not so easy to operationalize AI and infuse it into the data pipeline, but what we're doing in this program is bringing in practitioners who have had a great deal of success in doing just that. I'm really excited to have Itumeleng Monale here. She's the executive head of data management for personal and business banking at Standard Bank of South Africa. Itumeleng, thanks so much for coming on theCUBE.

>>Thank you for having me, Dave.

>>You're very welcome. First of all, how are you holding up in this COVID situation? How are things in Johannesburg?

>>Things in Johannesburg are fine. We've been on lockdown now for, I think, day 33 if I'm not mistaken; I've lost count. But we're really grateful for the swift action of government. We have less than 4,000 cases in the country, and the infection rate is really slow, so I think we've been able to flatten the curve, and we're grateful for being protected in this way. We're all working from home and learning the new normal.

>>And we're all in this together. That's great to hear. Why don't you tell us a little bit about your role? You're a data person, and we're really going to get into it here. How do you spend your time?

>>I head up a data operations function within a data management function, which really is the foundational part of the data value chain that then allows other parts of the organization to monetize data and leverage it as the use cases apply. We monetize it ourselves as well, but really we're an enterprise-wide organization that ensures that data quality is managed, that data is governed, that we have effective practices applied across the entire lineage of the data, that ownership and curation are in place, and that everything else, from a regulatory as well as an opportunity perspective, can then be leveraged.

>>Historically, data has been viewed as sort of an expense: it's big, it's growing, it needs to be managed and deleted after a certain amount of time. Then, ten years ago, with the big data movement, data became an asset. You had a lot of shadow IT, people going off and doing things that maybe didn't comply with corporate edicts, which probably drove your part of the organization crazy. Talk about that: what has changed in the last five years or so in terms of how people approach data?

>>The story I tell my colleagues, who are all bankers obviously, is that the banker of 1989 mainly had to know debits and credits, and had to be able to look someone in the eye and know whether or not they'd be a credit risk: if we lend you money, will you pay it back? The banker of the late '90s had to contend with the emergence of technologies that made their lives easier and allowed automation and processes to run much more smoothly. In the early 2000s, I would say digitization was the big focus, and in fact my previous role was head of digital banking. At the time we thought digital was the panacea, the be-all and end-all, the thing that was going to make organizations succeed. Lo and behold, we realized that once you've got all your digital platforms ready, they are just the plate, or the pipe, and nothing is flowing through it; there's no food on the plate if data is not the meal. So really, data has always been an asset. I think organizations just never consciously knew it.

>>Okay, so it sounds like once you'd made that initial digital transformation, you really had to work it. What we're hearing from a lot of practitioners is that the toughest challenges involve different parts of the organization, different skill sets, and getting everybody to work together on the same page. Maybe you could take us back to when you started on this initiative around data ops. What was that like? What were some of the challenges you faced, and how did you get through them?

>>First and foremost, Dave, organizations used to believe that data was IT's problem, and that's probably why you then saw the emergence of things like shadow IT. But when you acknowledge that data is an asset, just like money is an asset, then you have to take accountability for it the same way you would for any other asset in the organization, and you would not hand its management to a separate function that isn't core to the business. Oftentimes IT is seen as a support or enabling function, not quite the main show, in most organizations. So what we did first was emphasize that data is a business capability, a business function. It resides in business, next to product management, next to marketing, next to everything else the business needs. Data management also has to be apportioned to every role in every function, to different degrees and in varying extents, and when you take accountability as the owner of a business unit, you also take accountability for the data in the systems that support that business unit. For us, that was the first step. Convincing my colleagues that data was their problem, and not something they could leave to us, was also a journey, but that was the first step in getting the data operations journey going. You had to first acknowledge...

>>Please, carry on.

>>You just had to first acknowledge that it's something you must take accountability for as a banker, not cede to a different part of the organization.

>>That's a real cultural mindset shift. In the game of rock-paper-scissors, culture kind of beats everything, doesn't it? It's almost like a trump card. So the business embraced that, but what did you do to support it? There has to be trust in the data, there has to be timeliness. Maybe you could take us through how you achieved those objectives, and maybe some other objectives the business demanded.

>>The one thing I didn't mention, Dave, is that obviously they didn't embrace it in the beginning. It wasn't an "oh yes, that makes sense, let's do that" type of conversation. What we had were a few very strategic people with the right mindset that I could partner with, people who understood the case for data management, and with that as an in, we developed a framework for a fully matured data operations capability in the organization and what that would look like in a target-state scenario. And then what you do is wait for a good crisis.
We had a bit of a challenge in that our local regulator found us a little wanting in terms of our data quality, and that brought the case for data quality management to the fore. Now there's a burning platform: people have an appetite to partner with you and say, okay, we need this to comply, help us out. And once they start seeing data ops in action, they buy into the concept. So sometimes you just need to wait for a good crisis and leverage it, and only do that which the organization will appreciate at the time. You don't have to go big bang. Data quality management was the use case at the time, five years ago, so we focused all our energy on that, and afterwards it gave us the leeway and license to really bring to maturity the other capabilities that the business might not have understood as well.

>>So when that crisis hit, thinking about people, process, and technology, you probably had to turn some knobs in each of those areas. Can you talk about that?

>>From a technology perspective, that's when we partnered with IBM to implement Information Analyzer, so that we could profile the data effectively. What was important for us was to make strides in showing the organization progress, but also to give them access to self-service tools that would give them insight into their data. That was, I think, the genesis of us implementing the IBM suite in earnest from a data management perspective. People-wise, we began a data stewardship journey in which we implemented business-unit stewards of data. I don't like using the word "steward," because in my organization it's taken lightly, almost like a part-time occupation, so we converted them; we call them data managers. The analogy I give is that any department with a P&L, any department worth its salt, has an FD, a financial director. If money is important to you, you have somebody helping you take accountability and execute on your responsibilities in managing that money. So if data is equally important as an asset, you will have a leader, a manager, helping you execute on your data ownership accountabilities. That was the people journey: firstly, I had soldiers planted in each department, the data managers, who would continue building the culture and maturing the data practices as applicable to each business unit's use cases. What was important is that every data manager in every business unit focused their energy on making that business unit happy, by ensuring their data was at the right compliance level and the right quality, with the right best practices from a process and management perspective, and was governed throughout. And then, in terms of process, it's really about spreading data management as a practice through the entire ecosystem. It can be quite lonely: unless the core business of an organization is managing data, everyone else is worried about doing what they do to make money, and in most business units a data person will be the only unicorn relative to everybody else. So it was important for us to have a community of practice, a process where all the data managers across the business, together with the technology teams and the specialists who are data management professionals, come together and make sure we work on specific use cases together.

>>I wonder if I can ask you: the industry likes to market this notion of DevOps applied to data, data ops. Have you applied that type of mindset, agile, continuous improvement? I'm trying to understand how much is marketing and how much is actually applicable in the real world. Can you share?

>>When I was reflecting on this before the interview, I realized that our very first use case of data ops was probably when we implemented Information Analyzer in our business unit, simply because it was the first time that IT and business, as well as data professionals, came together to spec the use case, and then we would literally, in an agile fashion, with a multidisciplinary team, come together to make sure we got the outcomes we required. To get a data quality management paradigm where we moved from 6% quality on our client data at one point to sitting at 99% now, and that 1% is literally just a timing issue, to get from 6 to 99 you have to make sure the entire value chain is engaged. Our business partners were the fundamental determinant of the business rules: what does quality mean, and what are the criteria for quality? Then we translate that into what we put in the catalog and ensure that the profiling rules we run are checked against those business rules that were defined up front. So you'd have an up-front determination of the outcome with business, and then the team would go into an agile cycle of maybe two-week sprints, where we'd develop certain things, have stand-ups, and come together, and the output would be dashboarded in prototype fashion so that business could double-check it. That was the first iteration, and I would say we've become much more mature at it since.
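To make the business-rules-to-profiling-rules translation concrete, here is a minimal sketch of the kind of completeness and validity checks a profiling pass might run against client records. It is illustrative only: the rule names, the example fields, and the pandas-based approach are assumptions for this sketch, not how Information Analyzer is actually implemented.

```python
import pandas as pd

# Hypothetical client records; in practice these come from source systems.
clients = pd.DataFrame({
    "client_id": ["C001", "C002", "C003", "C004"],
    "national_id": ["8001015009087", None, "9203054800085", "123"],
    "email": ["a@example.co.za", "b@example.co.za", None, "d@example.co.za"],
})

# Business rules as a business unit might define them: a national ID must
# be present and 13 digits long; an email address must be present.
rules = {
    "national_id_present": clients["national_id"].notna(),
    "national_id_13_digits": clients["national_id"].str.fullmatch(r"\d{13}").fillna(False),
    "email_present": clients["email"].notna(),
}

# Profiling pass: score each rule as the share of rows that satisfy it.
for name, passed in rules.items():
    print(f"{name}: {passed.mean():.0%} of records pass")

# An overall score could be the share of rows passing every rule; this is
# the kind of figure that climbs from 6% toward 99% as remediation lands.
overall = pd.concat(rules, axis=1).all(axis=1).mean()
print(f"overall: {overall:.0%} of records pass all rules")
```

The tooling varies, but the division of labor described here holds regardless: the rules come from the business, and the profiling engine merely measures conformance to them.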
We've got many more use cases now, and there's actually one that's quite exciting, which we achieved over the end of 2019 into the beginning of this year. So what we did was... Dave, I'm worried about the sunlight coming through the window.

>>It looks great to me, like a sunset in South Africa. We've been on theCUBE set when it's so bright we have to put on sunglasses. Please, carry on.

>>So the most recent one, in late 2019 coming into early this year: we had long since dealt with the compliance and regulatory burning-platform issues, and we're now in a place of opportunity, I think a luxury, where we can find use cases that are pertinent to business execution and business productivity. The one that comes to mind: we're a hundred and fifty-eight years old as an organization. This bank was born before technology, and it was born in the days of no integration, because every branch was a standalone entity. You'd have these big ledgers that transactions were documented in, and I think once every six months or so the ledgers would be taken by horse-drawn carriage to a central place to be reconciled between branches, on paper. The point is, if that is your legacy, the initial ERP implementations would have focused on process efficiency based on old ways of accounting for transactions and allocating information, so they were not optimized for the 21st century. Our architecture has carried a huge legacy burden, so getting to a place where you can be agile with data is something we're constantly working toward. We have hundreds of branches across the country, all of them attending to clients and servicing clients as usual, yet sales teams and executional teams were not able, in a short space of time, to see the impact of their tactics from a data perspective. In some cases, based on how our ledgers roll up and how reconciliation between various systems and accounts works, it would take six weeks to verify whether your tactics were effective, because actually seeing the revenue hit our general ledger and our balance sheet might take that long. That is an ineffective way to operate in such a competitive environment. So you had frontline sales agents literally documenting by hand the sales they had made, but unable to verify whether those sales brought in revenue until six weeks later. What we did was sit down and define all the requirements from a reporting perspective, and the objective was to move from six weeks of latency to 24 hours. Even 24 hours is not perfect; our ideal is that by close of day you can see what you've done that day, but that's the next epoch we'll go through. We literally had the frontline teams defining what they wanted to see in a dashboard, the business teams defining the business rules behind the quality and the definitions, and then an entire analytics team and the data management team working on sourcing, optimizing, and curating the data and making sure the latency came down. That's our latest use case for data ops, and now people can look at a dashboard, it's self-service, they can log in at any time and see the sales they've made, which is very important right now, in the time of COVID-19, from a productivity and executional-competitiveness perspective.

>>Those are two great use cases. The first one, going from 6% data quality to 99%: at 6%, all you do is spend time arguing about the data's veracity; at 99%, you're there, and you said it's basically a timing issue, latency in the timing. And the second one: instead of paving the cow path with an outdated, ledger-era data process, six weeks, you've compressed that down to 24 hours, and you want to get to end of day. So you've built agility into your data pipeline.
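The mechanics behind such a daily sales dashboard can be simple once the upstream curation is done; the hard part, as described above, is the sourcing and quality work. Here is a hedged sketch of a nightly roll-up job. The table, columns, and SQLite backend are invented for illustration and are not Standard Bank's actual schema or stack.

```python
import sqlite3
from datetime import date

# Hypothetical staging store populated by the curation pipeline.
conn = sqlite3.connect("sales_staging.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS sales (
        branch_id TEXT, agent_id TEXT, product TEXT,
        amount REAL, booked_on TEXT
    )
""")

def daily_sales_rollup(run_date: date) -> list:
    """Aggregate one day's sales per branch and product for the dashboard.

    Run nightly, the dashboard lags reality by at most 24 hours, down
    from the six weeks a ledger reconciliation cycle used to take.
    """
    return conn.execute(
        """
        SELECT branch_id, product, COUNT(*) AS deals, SUM(amount) AS revenue
        FROM sales
        WHERE booked_on = ?
        GROUP BY branch_id, product
        ORDER BY revenue DESC
        """,
        (run_date.isoformat(),),
    ).fetchall()

for row in daily_sales_rollup(date.today()):
    print(row)
```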
I'm going to ask you, then: when GDPR hit, were you able to leverage this capability very quickly, and then maybe other compliance edicts as well?

>>Actually, what we did just now was post-GDPR for us. We got GDPR right about three years ago, but literally all we got right was reporting for risk and compliance purposes. The use cases we have now are really around business opportunity, less so the risk. We prioritized compliance reporting a long time ago; we were able to do real-time reporting from a single-transaction perspective, suspicious transactions and so on, to our reserve bank and our regulator. That was what was prioritized in the beginning, during the initial crisis. So what you found was an entire engine geared toward making sure data quality was correct for reporting and regulatory purposes. But that is not the be-all and end-all, and if that's all we had done, I believe we would not have succeeded; it would have stayed dead. We succeeded because data monetization, the leveraging of data for business opportunity, is actually what tells you whether you've got the right culture or not. If you're just doing it to comply, it means the hearts and minds of the rest of the business still aren't in the data game.

>>I love this story, because it's nirvana. For so many years we've been pouring money into mitigating risk, and you have no choice but to do it: general counsel signs off on it, the CFO grudgingly signs off on it, but it's got to be done. For years, decades, we've been waiting to use these risk initiatives to actually drive business value. It kind of happened with the enterprise data warehouse, but that was too slow and complicated, and it certainly didn't happen with email archiving, which was just a tick-box exercise. It sounds like we're at that point today. I want to ask you, Itumeleng: we were talking earlier about the crisis that precipitated this cultural shift and how you took advantage of it. Now Mother Nature has dealt us a crisis like we've never seen before. How do you see your data infrastructure, your data pipeline, your data ops? What kind of opportunities do you see in front of you today as a result of COVID-19?

>>Because of the quality of the data we had, we were able to respond very quickly to COVID-19. In our context, the government put us on lockdown relatively early in the curve, in the cycle of infection, and it brought a shock to the economy, because small businesses suddenly had no source of revenue for potentially three to six weeks. Based on the data quality work we had done before, it was relatively easy to be agile enough to do the things we did. Within the first weekend of lockdown in South Africa, we were the first bank to proactively and automatically offer small businesses, and students with loans on our books, an instant payment holiday, assuming they were in good standing. We did that up front: it was an opt-out process, rather than customers having to phone in and arrange for it to happen, and I don't believe we could have done that if our data quality were not where it is. We have since launched many more initiatives to try to keep the economy going and keep our clients in a state of liquidity. Data quality, at a time like that, is critical to knowing who you're talking to, who needs what, and which solutions would best fit various segments.
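The opt-out payment holiday is a neat illustration of why trustworthy client data matters: eligibility can be computed across the whole book in one pass, with relief granted by default. A minimal sketch follows, with invented field names and a deliberately simplified notion of "good standing"; the real criteria would be far richer.

```python
from dataclasses import dataclass

@dataclass
class LoanAccount:
    account_id: str
    segment: str           # e.g. "small_business" or "student"
    days_in_arrears: int
    opted_out: bool = False

def eligible_for_holiday(acct: LoanAccount) -> bool:
    """Opt-out relief: qualifying segments in good standing get the
    payment holiday by default; only an explicit opt-out removes it."""
    in_scope = acct.segment in {"small_business", "student"}
    good_standing = acct.days_in_arrears == 0  # simplified criterion
    return in_scope and good_standing and not acct.opted_out

book = [
    LoanAccount("A1", "small_business", 0),
    LoanAccount("A2", "student", 0, opted_out=True),
    LoanAccount("A3", "small_business", 45),
]
granted = [a.account_id for a in book if eligible_for_holiday(a)]
print(granted)  # ['A1']: A2 opted out, A3 is in arrears
```

Inverting the default from opt-in to opt-out is only safe when the segment and arrears fields can be trusted, which is exactly the point about data quality being made here.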
I think the second component is that working from home brings an entirely different normal. If we had not been able to provide productivity dashboards and sales dashboards to management and all the users who require them, we would not be able to validate our productivity levels while people work from home. We still have essential-services workers who physically go into work, but a lot of our relationship bankers are operating from home, and the baseline and foundation we set, productivity tracking for various metrics that can be reported in a short space of time, has been really beneficial. The next opportunity for us: we've been really good at doing this for operational and frontline workers, but knowledge workers have not historically been big productivity reporters. They deliver an output, and that output might be six weeks down the line. In a world where teams are not co-located and work needs to flow in an agile fashion, we need to start using the same foundation and data pipeline we've laid down for the reporting of knowledge work and agile-team metrics. In developing new functionality and solutions, there's a flow within a multidisciplinary team, and the question is how those solutions get architected so that data assists the flow of information and solutions can be optimally developed.

>>It sounds like you're able to map the metrics the business lines care about into these dashboards, using a data-mapping approach, if you will, which makes it much more relevant for the business. As you said before, they own the data. That's got to be a huge business benefit; we talked about culture and we talked about speed, but the business impact of being able to do that has to be pretty substantial.

>>It really, really is, and the use cases are endless, because every department finds its own opportunity to utilize data. I also think the accountability factor has increased significantly, because as the owner of a specific domain of data, you know you're accountable not only to yourself and your own operation; people downstream of you depend on you, as a product and an outcome, to ensure that the quality of the data you produce is high. Curation of data is very important, and business is really starting to understand that. The cards department knows they are the owners of card data; the vehicle asset department knows they are the owners of vehicle data; and all of it links to a client profile, which creates an ecosystem around the client. When you come to a bank, you don't want to be known as a number, and you don't want to be known for just one product; you want to be known across everything you do with that organization. But most banks are not structured that way. They're still product houses, with product systems on which your data resides, and if those don't act in concert, we come across as extremely schizophrenic, as if we don't know our clients. So that's very, very important.

>>Itumeleng, I could go on for an hour talking about this topic, but unfortunately we're out of time. Thank you so much for sharing your deep knowledge and your story; it's really an inspiring one, and congratulations on all your success. I'll leave it with this: what's next? You gave us a glimpse of some of the things you want to do, compressing some of the elapsed times and cycle times, but where do you see this going in the midterm and longer term?

>>Obviously AI is a big opportunity for all organizations, and you don't get automation of anything right if the foundations are not in place, so we believe this is a great foundation for anything AI in terms of the use cases we can find. The second one is really providing an API economy in which certain data products can be shared with third parties; that's probably where we want to take things as well. We already utilize external third-party data sources in our data quality management suite to ensure the validity of client identity and residence and things of that nature. Going forward, because fintechs, banks, and other organizations are probably going to partner to be more competitive, we need to be able to provide data products that can be leveraged by external parties, and vice versa.
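For a flavor of what a shared "data product" in that API economy might look like, here is a hedged sketch of a read-only identity-verification endpoint. The framework choice (Flask), the route, and the response fields are assumptions made for illustration, not a description of Standard Bank's actual services.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical curated data product: verified client-identity attributes.
# A real deployment would sit behind authentication, consent checks,
# and the bank's data-governance controls.
VERIFIED_IDENTITIES = {
    "C001": {"identity_verified": True, "residence_verified": True},
    "C002": {"identity_verified": True, "residence_verified": False},
}

@app.route("/data-products/identity/<client_id>")
def identity_product(client_id: str):
    record = VERIFIED_IDENTITIES.get(client_id)
    if record is None:
        return jsonify({"error": "unknown client"}), 404
    # Expose only curated, governed attributes: the product, not the raw data.
    return jsonify({"client_id": client_id, **record})

if __name__ == "__main__":
    app.run(port=8080)
```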
>>Itumeleng, thanks again. It was great having you.

>>Thank you very much, Dave. I appreciate the opportunity.

>>And thank you for watching, everybody. We're digging into DataOps: we've got practitioners, we've got influencers, we've got experts, and we're going into the CrowdChat at crowdchat.net/dataops. Keep it right there; we'll be back with more coverage. This is Dave Vellante for theCUBE. [Music]

Published Date : Apr 28 2020

Omer Asad & Sandeep Singh, HPE | HPE Discover 2021


 

>>Welcome back to HPE Discover 2021, the virtual edition. My name is Dave Vellante, and you're watching theCUBE. We're here with Omer Asad, Vice President and GM of HPE's HCI, primary storage, and data management business, and Sandeep Singh, Vice President of Marketing for HPE's storage division. Welcome, gents. Great to see you.

>>Great to be here, Dave.

>>It's a pleasure to be here today.

>>Hey, so last month you made a big announcement, and now you're shining the spotlight on it here at Discover. Sandeep, maybe you can give us a quick recap. What do we need to know?

>>Yeah, Dave. We announced that we're expanding HPE GreenLake by transforming HPE Storage into a cloud-native, software-defined data services business. We unveiled a new vision for data that accelerates data-driven transformation for our customers, and we introduced a data services platform that consists of two game-changing innovations. Our first announcement was Data Services Cloud Console, a SaaS-based console that delivers cloud operational agility and is designed to unify data operations through a suite of cloud data services. Our second announcement is HPE Alletra, cloud-native data infrastructure to power your data from edge to cloud. It's managed natively with Data Services Cloud Console to bring the cloud operational model to our customers wherever their data lives. Together with the data services platform, HPE GreenLake brings the cloud experience to our customers' data across edge and on-premises environments, and it lays the foundation for customers to shift from managing storage to managing data.

>>Well, I think it lays the foundation for the next decade. When we entered this past decade, we used terms like "software-led"; that morphed into the software-defined data center and containers with Kubernetes. Let's zoom out for a minute, Omer. Maybe you could describe the problems you're trying to address with this announcement.

>>Thanks, Dave; it's always a pleasure talking with you on these topics. In my role as general manager for primary storage, I speak with hundreds of customers, and I consistently hear that data is at the heart of what they're doing, and that they're looking for a data-driven, transformative approach to their business. But as they engage, there are two challenges they consistently face. The first is that managing storage at scale is rife with complexity. While storage has gotten faster in the last 20 years, and managing a single array, or maybe two or three arrays, has gotten simpler over time, managing storage at scale, when you deploy fleets of storage as you continue to gather, store, and lifecycle data, is extremely frustrating. IT administrators are still firefighting, unable to innovate for their business, because data now spans all the way from edge to core to cloud, and with the advent of public cloud, another dimension of multi-cloud has been added to the data sprawl. Secondly, what we consistently hear is that IT administrators need to shift from managing storage to managing data. What this means is that IT wants to mobilize, protect, and provision data seamlessly across its lifecycle and across the locations where it's stored, so that IT leaders, and people throughout the organization, understand the context of the data they store and operate upon. Yet data management remains an extremely big challenge: a web of fragmented data silos across processes and infrastructure, all the way from test and dev, to administration, to production, to backup, to lifecycle data management. Up to now, data management has been tied up with storage management, and this needs to change for our customers, especially with the diversity of application workloads growing and customer footprints expanding across multi-cloud environments.

>>Just to add to Omer's response, we recently conducted a survey of IT decision makers, actually run by ESG, and what it showed is interesting: 93% of respondents indicated that storage and data management complexity is impeding their digital transformation; 95% indicated that solving that complexity is a top-10 business initiative for them; and 94% want to bring the cloud experience on premises.

>>I'll chime in. As you move to this software and container world and build affinity with developers, Omer, you talked about things like data protection; we always talk about security being bolted on, and now it's designed in, done at the point of creation rather than as an afterthought, and that's a big change we see coming. Let's talk about what else needs to change as customers move from managing storage to managing data. Omer, maybe you can take that one.

>>That's a very interesting problem: what has to be true in order for us to move into this new data management model? Dave, one of the things the public cloud got right is the cloud operational model, which sets the standard for agility and a fast pace for customers. In a classic IT on-prem model, if you wanted to stand up an application or a particular workload, you'd file a series of IT tickets, and then you'd be at the mercy of whatever complex processes exist within the organization; depending on the level of approvals, standing up a workload can take days, weeks, or in certain cases even months. What cloud did was bring that level of simplicity to anyone who wanted to instantiate an app: the provisioning of the underlying infrastructure that makes the workload possible is reduced to minutes, from days and weeks. What we intend to do here is bring the best of both worlds together, so that the cloud experience can be had everywhere, with ease and simplicity, and customers don't need to change their operating model. It's blending the two, and that's how we're trying to usher in this new era, in which data management and storage management become two independent things.

>>Great, thank you for that, Omer. Sandeep, I wonder if you could share with the audience the vision you unveiled. What does it look like, and how are you making it substantive and real?
>>Yeah, Dave, that's also a great question. Across the board, it's time to reimagine data management. All the challenges Omer shared are leading customers to break down the silos and complexity that plague these distributed data environments. Our vision is to deliver a new data experience that helps customers unleash the power of data. We call this vision Unified DataOps. Unified DataOps integrates data-centric policies to streamline data management, cloud-native control to bring the cloud operational model to wherever customers' data lives, and AI-driven insights to make the infrastructure invisible. It delivers a new data experience that simplifies and brings the agility of cloud to data infrastructure, streamlines data management, and helps customers innovate faster than ever before. We're making the promise of Unified DataOps real by transforming HPE Storage into a cloud-native, software-defined data services business and introducing a data services platform that expands HPE GreenLake.

>>You talk about the complexity; I look at it as you almost embracing the complexity. It's going to keep getting more complex as the cloud expands to the edge, on-prem, and across clouds; it gets more complex underneath. What you're doing is putting a layer over it and hiding that complexity from the end customer, so they can spend their time doing other things. Omer, I wonder if you can talk a little more about Data Services Cloud Console. Is it just another software layer to manage infrastructure? What exactly is it?

>>It's a lot more than that, Dave, and you're 100% right: with this release we're attacking that complexity head-on. Simply put, Data Services Cloud Console is a SaaS-based console that delivers the cloud operational model, and cloud operational agility, to our customers. It unifies data operations through a series of cloud data services delivered on top of the console in a continuous innovation stream. Going back to my earlier point about separating storage management from data management, we've taken the strong suits of each and brought them together in this SaaS-delivered console. We've separated data and infrastructure management from the physical hardware to provide a comprehensive, unified approach to managing data and infrastructure wherever they live: at the edge, in a colo, in the data center, or as data services deployed within the public cloud. Customers can now use Data Services Cloud Console to manage the entire lifecycle of their data, from deployment through upgrade and optimization, from a single console, from anywhere in the world. The console is designed to streamline data management with cloud data services that enable access to data; it allows policy-based data protection and organization-wide search across your storage assets, and it delivers 360-degree visibility into all your data from a single console the customer can reach from anywhere. In its first incarnation, Data Services Cloud Console gives you infrastructure services and cloud data services to begin doing data management. But this is the foundation we're placing in front of our customers: the SaaS console through which we can touch them on a daily basis. As customers come onto the SaaS platform, we will keep rolling in additional services on a true SaaS innovation cadence, ranging from data protection to multi-cloud data management, to visibility, to understanding the context of your data as it's stored across the enterprise. In addition, we're offering a consistent, unified API that lets customers build automation against their storage infrastructure without ever worrying that their integrations will break as the infrastructure changes. That is never going to happen, because from now on they're programming against a single SaaS-based API interface.

>>And that brings in this idea of infrastructure as code, because you talk about as-a-service, you talk about GreenLake, and my question is always: tell me what's behind that. If you're talking about boxes and widgets, that's a problem; if you're talking about services and APIs and microservices, that's really the future model. Infrastructure as code, and ultimately data as code, is part of that. All right, I know your branding folks give deep thought to this, and the second part of the announcement is the new product brand. Sandeep, maybe you can talk about that a little.

>>Sure. Ultimately, delivering the cloud operational model requires cloud-native data infrastructure that has been engineered to be natively managed from the cloud, and that's why we also introduced HPE Alletra. Omer, can you perhaps describe HPE Alletra some more?

>>Absolutely, thank you, Sandeep. With HPE Alletra we're launching a new brand of cloud-native hardware infrastructure to power our customers' data from the edge to the core to the cloud: smaller models for the edge, models for the data center, and services extending into the public cloud as well. All of the Alletra hardware devices are cloud-native and powered by our Data Services Cloud Console. We're announcing two models with this launch. HPE Alletra 9000 is for mission-critical workloads; it has its history and basis in HPE Primera, and it comes with a 100% availability guarantee, the first of its type in the industry, under the standard support contract, with no special terms required. We're also launching HPE Alletra 6000; these have their history in our Nimble Storage systems and target business-critical applications, especially the midrange of the storage market, optimizing price, performance, and efficiency. Both systems are all-NVMe storage, powered by our Timeless capabilities with data-in-place upgrades, and both deliver a unified infrastructure and data management experience through Data Services Cloud Console, with the unified AIOps experience of HPE InfoSight blended seamlessly into the offering.
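Omer's point about a single, stable, SaaS-based API is what makes infrastructure-as-code workflows practical against storage. As a hedged illustration only, the endpoint path, payload fields, and token handling below are invented for this sketch and are not the documented Data Services Cloud Console API; automation against such a console could look roughly like this:

```python
import json
import urllib.request

# Hypothetical SaaS console endpoint and bearer token; both are
# illustrative placeholders, not HPE's actual API surface.
BASE_URL = "https://console.example.com/api/v1"
TOKEN = "example-oauth-token"

def create_volume(name: str, size_gib: int, workload_intent: str) -> dict:
    """Provision a volume by declaring intent to the console, rather than
    configuring a specific array; the console decides placement."""
    payload = json.dumps({
        "name": name,
        "sizeGiB": size_gib,
        "intent": workload_intent,  # e.g. "oltp-database"
    }).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/volumes",
        data=payload,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# The same call keeps working as hardware generations change underneath,
# which is the stability guarantee being described.
print(create_volume("orders-db", 512, "oltp-database"))
```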
>>So this is what I was talking about before: it's sort of not your grandfather's storage business anymore. This is part of that unified vision, that layer I talked about, the APIs, the programmability. You're reaching into new territory here. Maybe you can give us an example of how customers experience what that looks like.

>>I'd love to, Dave. Essentially, we're changing the storage experience into a true cloud operational model for our customers. The announcements we just went through expand the cloud experience customers get with storage as a service through HPE GreenLake. A couple of examples to make this real. The first is simplified deployment: IT no longer has to go through complex startup and deployment processes. The systems are shipped and delivered to the customer's data center; operational staff just rack and stack them, connect the power cable, connect the network cable, and the job is done. From that point onward, Data Services Cloud Console takes over: you can onboard and provision the systems, and if you have organization-wide security and standards profiles already set up in the console, we can apply them automatically on your behalf and bring the systems online. From the customer's perspective, they can be anywhere in the world when they onboard the systems, driving in a car or sitting on a beach, and the systems are onboarded through the cloud operational model delivered by the SaaS application. Another big example I'd like to shed light on is intent-based provisioning. Dave, provisioning a workload within a data center is typically an extremely spreadsheet-driven, trial-and-error task. Which system do I land it on? Will my existing SLAs be affected? Which systems are loaded, and which are loaded enough that adding this workload would hurt performance? All of these decisions are made by trial and error, on a constant basis. With Data Services Cloud Console, and the new Alletra systems constantly feeding analytics back to the console, all you need to do is describe the type of workload and its intent, in terms of block size and the SLA you'd like to experience. The console consults InfoSight on the back end, we run through thousands of data points constantly reported by your fleet, and we come back with recommendations. You can accept a recommendation, and we fully deploy the workload on your behalf, or you can specify a particular system and we'll try to enforce the SLA on that system. It completely eliminates the guesswork and the planning you'd otherwise have to do.
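To make intent-based provisioning concrete, here is a minimal sketch of the placement decision it automates. The scoring heuristic, system attributes, and numbers are invented for illustration; the real console drives this from fleet telemetry and InfoSight analytics, not a toy formula like this one.

```python
from dataclasses import dataclass

@dataclass
class ArraySystem:
    name: str
    headroom_pct: float    # unused performance capacity
    free_tib: float
    latency_ms: float      # current average latency

@dataclass
class WorkloadIntent:
    size_tib: float
    max_latency_ms: float  # the SLA the user declares
    block_size_kib: int    # part of the declared intent

def recommend(intent: WorkloadIntent, fleet: list[ArraySystem]) -> ArraySystem | None:
    """Pick the system most likely to honor the declared SLA: filter out
    systems that cannot meet capacity or latency, then prefer the one
    with the most performance headroom."""
    candidates = [
        s for s in fleet
        if s.free_tib >= intent.size_tib and s.latency_ms <= intent.max_latency_ms
    ]
    return max(candidates, key=lambda s: s.headroom_pct, default=None)

fleet = [
    ArraySystem("array-a", headroom_pct=15, free_tib=40, latency_ms=0.9),
    ArraySystem("array-b", headroom_pct=60, free_tib=25, latency_ms=0.4),
]
intent = WorkloadIntent(size_tib=10, max_latency_ms=0.5, block_size_kib=8)
choice = recommend(intent, fleet)
print(choice.name if choice else "no placement meets the SLA")  # array-b
```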
And last but not least, upgrades have been a huge problem for our customers. When you're not in a constant feedback loop with your customers, it's often a big challenge to identify which release, bug fix, or update should go onto which particular machine. All of that has been taken off our customers' hands and fully automated. We run thousands of signatures across the installed base, we identify which upgrades need to be curated for which machines in a particular customer's fleet, and if an upgrade applies, we present it; if the customer accepts, we automatically upgrade the system. And finally, from a global management perspective, a customer now has an independent view of their data estate, separate from their storage estate, and Data Services Cloud Console can blend the two into a consistent view, or you can look at just the fleet view or just the data view.

>>It's kind of the holy grail. I've been in this business a long time, and I think IT people have dreamt about this kind of capability for a long, long time. I wonder if we could stay with the customers for a moment and talk about what's enabled now. Everybody's talking digital transformation; I joke, not funny, about the forced march to digital with COVID, which really wasn't planned for, but customers want to drive it now, and initiatives that were on the back burner have moved to the front burner. What are the outcomes that are enabled here, Omer?

>>Excellent. For a traditional IT customer, this cloud operational model means the information technology staff can move a lot faster and be far more productive on the things directly relevant to their business. They can get up to 99% of that time back to spend on strategic projects, or, best of all, with their families, rather than managing and upgrading fleets of infrastructure. For line-of-business owners, the new experience means their data infrastructure can be provisioned in a self-service, on-demand fashion; they don't have to be in the data center to make those decisions, and capacity management and performance management are dialed in and presented to them wherever they are, in an easy-to-consume SaaS model. And for data innovators, whether DBAs or data analysts, they can start to consume infrastructure, and ultimately data, as code to speed up their application development, because the context we're bringing forward is the context of the data, decoupled from storage. Storage management and data management are now two separate domains, presented through a single console that ties the end-to-end picture together. At the end of the day, what we've found is that customers really want to move forward with data management and leave infrastructure management to machine-oriented tasks, which we have completely automated on their behalf.

>>I'm sure you got the memo about HPE going all-in on as-a-service; it's clear the company is all-in. How does this announcement fit into that overall mission, Sandeep?
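The upgrade-curation idea, matching advisory "signatures" against fleet telemetry so each machine only sees the updates relevant to it, can be sketched in a few lines. Everything here (the signature structure, model names, and version strings) is a hypothetical stand-in for whatever HPE actually runs internally:

```python
from dataclasses import dataclass

@dataclass
class Signature:
    fix_id: str
    affected_model: str
    affected_versions: set[str]  # firmware versions known to hit the issue

@dataclass
class System:
    serial: str
    model: str
    firmware: str

def curate_upgrades(fleet: list[System],
                    signatures: list[Signature]) -> dict[str, list[str]]:
    """Return, per system serial, only the fixes that actually apply:
    the curated list a customer would then be asked to accept."""
    plan: dict[str, list[str]] = {}
    for system in fleet:
        hits = [
            sig.fix_id for sig in signatures
            if sig.affected_model == system.model
            and system.firmware in sig.affected_versions
        ]
        if hits:
            plan[system.serial] = hits
    return plan

signatures = [Signature("FIX-101", "M6000", {"1.2.0", "1.2.1"})]
fleet = [System("SN1", "M6000", "1.2.1"), System("SN2", "M9000", "1.2.1")]
print(curate_upgrades(fleet, signatures))  # {'SN1': ['FIX-101']}
```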
>>Dave, we believe the future is edge to cloud, and our mission is to be the edge-to-cloud platform-as-a-service company. As HPE transforms, HPE GreenLake is our unified cloud platform: it's how we deliver cloud services and agile cloud experiences to customers' applications and data across the edge and the cloud. With the storage announcement we made recently, we're expanding HPE GreenLake with the as-a-service transformation of the HPE Storage business into a cloud-native, software-defined data services business. This expands storage as a service, delivering a full cloud experience for our customers' data across edge and on-prem environments. Across the board, we're committed to being a strategic partner for every one of our customers and helping them accelerate their digital transformation.

>>Yeah, that's where the puck is going, guys. As always, a great conversation with our friends from HPE Storage. Thanks so much for the collaboration, and congratulations on the announcements. I know you're not done yet.

>>Thanks, Dave.

>>Thanks, Dave. It's a pleasure to be here.

>>You're very welcome. And thank you for being with us for HPE Discover 2021. You're watching theCUBE, the leader in digital tech coverage. Keep it right there; we'll be right back.

Published Date : Jun 23 2021

SUMMARY :

Great to see you. Great to be here. Hey, so uh, last month you guys, you made a big announcement and and now that delivers the cut operational agility and it's designed to unify data operations Hp Green Green Lake brings that cloud experience to our customers So the software defined data center containers with kubernetes, let's zoom and this needs to change for our customers especially with the diversity of the application 95% of the respondents indicated that solving storage to managing data or maybe you can take that one. What are the things that have to be true the vision that you guys unveiled, What does it look like? Um across the board it's time to reimagine saying, look, it's gonna keep getting more complex as the cloud expands to the edge on prem Cross cloud, Uh the A P I proof points are going to break for So the second part of the announcement is the new product brands and deep maybe you can talk about that data infrastructure and that has been engineered to be natively managed from Uh and and and at the back end, unified ai Ops experience with H of how the customers experience what that looks like. Council can blend the two to give a consistent view or you can just look at the fleet view on the back burner and now they're moving to the front burner. Uh They necessarily don't have to be in the data center to be able to make those decisions. Uh it's clear that the companies all in. customers, applications and data across the edge to cloud. on the announcements and I know you're not done yet. It's a pleasure to be here. the leader digital check coverage.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
DavidPERSON

0.99+

DavePERSON

0.99+

twoQUANTITY

0.99+

Sandeep SinghPERSON

0.99+

HPORGANIZATION

0.99+

95%QUANTITY

0.99+

100%QUANTITY

0.99+

hundredsQUANTITY

0.99+

Omar assadPERSON

0.99+

two challengesQUANTITY

0.99+

SandeepPERSON

0.99+

SandyPERSON

0.99+

94%QUANTITY

0.99+

second announcementQUANTITY

0.99+

2021DATE

0.99+

93%QUANTITY

0.99+

thousandsQUANTITY

0.99+

three arraysQUANTITY

0.99+

H P S H C IORGANIZATION

0.99+

HPDORGANIZATION

0.99+

OmarPERSON

0.99+

Hp Green LinkORGANIZATION

0.99+

HB ElectoralORGANIZATION

0.99+

first announcementQUANTITY

0.99+

H H PORGANIZATION

0.99+

two modelsQUANTITY

0.99+

HB Green LakeORGANIZATION

0.99+

Omer AsadPERSON

0.99+

davePERSON

0.99+

firstQUANTITY

0.99+

BothQUANTITY

0.99+

HPD Green LakeORGANIZATION

0.98+

hpORGANIZATION

0.98+

last monthDATE

0.98+

Hp Green Green LakeORGANIZATION

0.98+

first incarnationQUANTITY

0.98+

both worldsQUANTITY

0.98+

Data OpsORGANIZATION

0.98+

todayDATE

0.98+

HPEORGANIZATION

0.98+

next decadeDATE

0.97+

bothQUANTITY

0.97+

Hve Green LakeORGANIZATION

0.97+

first oneQUANTITY

0.96+

two gameQUANTITY

0.96+

single arrayQUANTITY

0.96+

eachQUANTITY

0.95+

HP Green LakeORGANIZATION

0.95+

oneQUANTITY

0.95+

HPVORGANIZATION

0.95+

single consoleQUANTITY

0.95+

CovidPERSON

0.95+

H. P. E. Electra 9000COMMERCIAL_ITEM

0.94+

up to 99%QUANTITY

0.94+

HBORGANIZATION

0.94+

H P E electraORGANIZATION

0.93+

two separate domainsQUANTITY

0.93+

secondlyQUANTITY

0.93+

second partQUANTITY

0.91+

customersQUANTITY

0.9+

thousands of data pointsQUANTITY

0.87+

HB electroORGANIZATION

0.86+

SAASTITLE

0.86+

HB electoral 6000COMMERCIAL_ITEM

0.85+

past decadeDATE

0.85+

HB GreenORGANIZATION

0.84+

Cloud CouncilORGANIZATION

0.84+

H P EORGANIZATION

0.82+

360°QUANTITY

0.81+

SASORGANIZATION

0.81+

10 businessQUANTITY

0.71+

singleQUANTITY

0.7+

Omer Asad & Sandeep Singh | HPE Discover 2021


 

>>Welcome back to HPD discovered 2021. The virtual edition. My name is Dave Volonte and you're watching the cube. We're here with Omar assad is the vice president GM of H P S H C I and primary storage and data management business. And Sandeep Singh was the vice president of marketing for HP storage division. Welcome gents. Great to see you. >>Great to be here. Dave, >>It's a pleasure to be here today. >>Hey, so uh, last month you guys, you made a big announcement and and now you're, you know, shining the spotlight on that here at discover Cindy. Maybe you can give us a quick recap, what do we need to know? >>Yeah, Dave. We announced that we're expanding HB Green Lake by transforming HB storage to a cloud native software defined data services business. We unveiled a new vision for data that accelerates data, dream of transformation for our customers. Uh and it introduced a and we introduced the data services platform that consists of two game changing innovations are first announcement was Data services cloud console. It's a SAS based console that delivers the cut operational agility and it's designed to unify data operations through a suite of cloud data services. Our 2nd announcement is HPE. Electra. It's cloud native data infrastructure to power your data edge to cloud. And it's managed natively with data services cloud console to bring that cloud operational model to our customers wherever their data lives together with the data services platform. Hp Green Green Lake brings that cloud experience to our customers data across edge and on premises environment and lays the foundation for our customers to shift from managing storage to managing data. >>Well, I think it lays the foundation for the next decade. You know, when we entered this past decade, we we were Ricky bobby's terms like software led that that sort of morphed into. So the software defined data center containers with kubernetes, Let's zoom out for a minute. If we can homer maybe you could describe the problems that you're trying to address with this announcement. >>Thanks dave. It's always a pleasure talking to you on these topics. So in my role as general manager for primary storage, I speak with the hundreds of customers across the board and I consistently hear that data is at the heart of what our customers are doing and they're looking for a data driven transformative approach to their business. But as they engage on these things, there are two challenges that they consistently faced. The first one is that managing storage at scale Is rife with complexity. So while storage has gotten faster in the last 20 years, managing a single array or maybe two or three arrays has gotten simpler over time. But managing storage at scale when you deploy fleet. So storage as customers continue to gather, store and lifecycle that data. This process is extremely frustrating for customers. Still I. T. Administrators are firefighting, they're unable to innovate for their business because now data spans all the way from edge to corridor cloud. And then with the advent of public cloud there's another dimension of multi cloud that has been added to their data sprawl. And then secondly what what we what we consistently hear is that idea administrators need to shift from managing storage to managing data. What this basically means is that I. D. Has a desire to mobilize, protect and provision data seamlessly across its lifecycle and across the locations that it is stored at. Uh This ensures that I. D. 
Leaders uh and also people within the organization understand the context of the data that they store and they operate upon. Yet data management is an extremely big challenge and it is a web of fragmented data silos across processes across infrastructure all the way from test and dev to administration uh to production uh to back up to lifecycle data management. Uh And so up till now data management was tied up with storage management and this needs to change for our customers especially with the diversity of the application workloads as they're growing and as customers are expanding their footprint across a multi cloud environment >>just to add to almost uh response there. We recently conducted a survey that was actually done by E. S. She. Um and that was a survey of IT. decision makers. And it's interesting what it showcased, 93% of the respondents indicated that storage and data management complexity is impeding their digital transformation. 95% of the respondents indicated that solving storage and data management complexity is a top 10 business initiative for them and 94% want to bring the cloud experience on premises, >>you know, al china. And I think as you guys move to the sort of software world and container world affinity to developers homer, you talked about, you know, things like data protection and we talk about security being bolted on all the time. Now. It's designed in it's it's done at sort of the point of creation, not as an afterthought. And that's a big change that we see coming. Uh But let's talk about, you know, what also needs to change as customers make the move from this idea of managing storage to to managing data or maybe you can take that one. >>That's a that's a that's a very interesting problem. Right. What are the things that have to be true in order for us to move into this new data management model? So, dave one of the things that the public cloud got right is the cloud operational model uh which sets the standard for agility and a fast pace for our customers in a classic I. T. On prime model, if you ever wanted to stand up an application or if you were thinking about standing up a particular workload, uh you're going to file a series of I. T. Tickets and then you're at the mercy of whatever complex processes exist within organization and and depending on what the level of approvals are within a particular organization, standing up a workload can take days, weeks or even months in certain cases. So what cloud did was they brought that level of simplicity for someone that wanted to instead she ate an app. This means that the provisioning of underlying infrastructure that makes that workload possible needs to be reduced to minutes from days and weeks. But so what we are intending to do over here is to bring the best of both worlds together so that the cloud experience can be experienced everywhere with ease and simplicity and the customers don't need to change their operating model. So it's blending the two together. And that's what we are trying to usher in into this new era where we start to differentiate between data management and storage management as two independent things. >>Great, thank you for that. Omer sometimes I wonder if you could share with the audience, you know, the vision that you guys unveiled, What does it look like? How are you making it actually substantive and and real? >>Yeah. Dave. That's also great question. Um across the board it's time to reimagine data management. Everything that homer shared. 
those challenges are leading customers to need to break down the silos and complexity that plague these distributed data environments. And our vision is to deliver a new data experience that helps customers unleash the power of data. We call this vision Unified DataOps. Unified DataOps integrates data-centric policies to streamline data management, cloud-native control to bring the cloud operational model to wherever customers' data lives, and AI-driven insights to make the infrastructure invisible. It delivers a new data experience to simplify and bring the agility of cloud to data infrastructure, streamline data management, and help customers innovate faster than ever before. We're making the promise of Unified DataOps real by transforming HPE Storage into a cloud-native, software-defined data services business and introducing a data services platform that expands HPE GreenLake. >>You know, you talk about the complexity. I look at it as you almost embracing the complexity, saying, look, it's going to keep getting more complex as the cloud expands to the edge, on-prem, cross-cloud. It gets more complex underneath. What you're doing is embracing that complexity, putting a layer over it, and hiding that complexity from the end customer, so they can spend their time doing other things. I wonder if you can talk a little bit more about the Data Services Cloud Console. Is it just another software layer to manage infrastructure? What exactly is it? >>It's a lot more than that, Dave, and you're 100% right. In this release we're attempting to attack that complexity head-on. Simply put, Data Services Cloud Console is a SaaS-based console that delivers the cloud operational model and cloud operational agility to our customers. It unifies data operations through a series of cloud data services that are delivered on top of this console in a continuous innovation stream. And what we have done, going back to the point I made earlier about separating storage and data management, is put the strong suits of each of those together into this SaaS-delivered console for our customers. We have separated data and infrastructure management away from the physical hardware to provide a comprehensive and unified approach to managing data and infrastructure wherever they live. From a customer's perspective, that could be at the edge, in a colo, in their data center, or a set of data services deployed within the public cloud. So now our customers, with Data Services Cloud Console, can manage the entire lifecycle of their data, all the way from deployment to upgrading and optimizing, from a single console, from anywhere in the world. This console is designed to streamline data management with cloud data services that enable access to data. It allows for policy-based data protection. It allows for an organization-wide search on top of your storage assets. And we deliver 360-degree visibility into all your data from a single console that the customer can experience from anywhere. So if you look at the journey, the way we're choosing to deliver this: in its first incarnation, Data Services Cloud Console gives you infrastructure and cloud data services to start to do data management.
But this is the foundation that we are placing in front of our customers: the SaaS console through which we can touch our customers on a daily basis. And as our customers get access to this SaaS platform, on the back end we will continue to roll in additional services throughout the years, on a true SaaS-based innovation cadence. These services will range all the way from data protection to multi-cloud data management to visibility to understanding the context of your data as it's stored across your enterprise. And in addition to that, we're offering a consistent, unified API, which allows our customers to build automation against their storage infrastructure without ever worrying that, as the infrastructure changes, their integrations are going to break. That is never going to happen, because they will be programming to a single SaaS-based API interface from now on. >>Right. And that brings in this idea of infrastructure as code, because you talk about as-a-service, you talk about GreenLake, and my question is always, okay, tell me what's behind that. And if you're talking about boxes and widgets, that's a problem. You're not; you're talking about services and APIs and microservices, and that's really the future model. Infrastructure as code, and ultimately data as code, is really part of that. All right, so you guys, I know some of your branding folks, you give deep thought to this. The second part of the announcement is the new product brand. Sandeep, maybe you can talk about that a little bit. >>Sure. Ultimately, delivering the cloud operational model requires cloud-native data infrastructure that has been engineered to be natively managed from the cloud. And that's why we have also introduced HPE Alletra. Omer, can you perhaps describe HPE Alletra in more detail? >>Absolutely, thank you, Sandeep. So with HPE Alletra we're launching a new brand of cloud-native hardware infrastructure to power our customers' data all the way from the edge to the core to the cloud. The releases include smaller models for the edge as well as models for the data center, with those services extending into the public cloud as well. All of these Alletra hardware devices are cloud-native and powered by our Data Services Cloud Console. We're announcing two models with this launch. HPE Alletra 9000 is for mission-critical workloads. It has its history and basis in HPE Primera, and it comes with a 100% availability guarantee, the first of its type in the industry, included with the standard support contract, no special terms required. And then we're also launching HPE Alletra 6000. These are based on our history with Nimble Storage systems and are aimed at business-critical applications, especially the midrange of the storage market, optimizing price, performance, and efficiency. Both of these systems are full NVMe storage, powered by our Timeless capabilities with data-in-place upgrades. They both deliver a unified infrastructure and data management experience through Data Services Cloud Console, and on the back end, a unified AIOps experience with HPE InfoSight is seamlessly blended in, along with the offering, for our customers.
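The automation model Omer describes lends itself to a short sketch. The snippet below imagines scripting against a single SaaS-delivered console API to onboard and standardize a fleet; the base URL, endpoint paths, payload fields, and token handling are hypothetical stand-ins for illustration, not HPE's published interface.

```python
import os
import requests

# Hypothetical SaaS console endpoint and token; illustrative only.
CONSOLE = "https://console.example.com/api/v1"
HEADERS = {"Authorization": f"Bearer {os.environ['CONSOLE_TOKEN']}"}

# One call lists every system in the fleet: edge, colo, or data center.
systems = requests.get(f"{CONSOLE}/systems", headers=HEADERS, timeout=30).json()

for system in systems:
    # Newly racked systems only need power and network cabling on site;
    # configuration happens here, from anywhere, through the console.
    if system["state"] == "onboarding":
        requests.post(
            f"{CONSOLE}/systems/{system['id']}/profiles",
            headers=HEADERS,
            json={"security_profile": "org-standard"},
            timeout=30,
        )
```

Because the script talks to one stable API rather than to individual arrays, the same automation keeps working as hardware generations change underneath it.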
>>So this is what I was talking about before. It's not your grandfather's storage business anymore. This is something that's part of that unified vision, that layer I talked about, the APIs, the programmability. You're reaching into new territory here. Maybe you can give us an example of how customers experience what that looks like. >>I'd love to, Dave. Essentially, what we're doing is changing the storage experience into a true cloud operational model for our customers. The recent announcements we just went through expand the cloud experience that our customers get with storage as a service with HPE GreenLake. So, a couple of examples to make this real. The first is simplified deployment. IT no longer has to go through complex startup and deployment processes. Now these systems are shipped and delivered to the customer's data center, operational staff just need to rack and stack and connect the power and network cables, and the job is done. From that point onwards, Data Services Cloud Console takes over. You can onboard these systems and provision them, and if you have pre-existing, organization-wide security and standards profiles set up in the console, we can automatically apply those on your behalf and bring the systems online. From a customer's perspective, they can be anywhere in the world to onboard these systems. They could be driving in a car; they could be sitting on a beach. The systems are automatically onboarded through this cloud operational model, delivered through the SaaS application. Another big example I'd like to shed light on is intent-based provisioning. Dave, typically, provisioning a workload within a data center is an extremely spreadsheet-driven, trial-and-error kind of task. Which system do I land it on? Are my existing SLAs going to be affected? Which systems are loaded, and which are loaded enough that if I put this additional workload on them, performance takes a hit? All of these decisions are trial and error on a constant basis. With Data Services Cloud Console, along with the new Alletra systems, which are constantly feeding telemetry and analytics back to the console, all you need to do is describe the type and the intent of the workload, in terms of block size and the SLA you would like to experience. At that point, Data Services Cloud Console consults with InfoSight on the back end, we run through thousands of data points that are constantly being given to us by your fleet, and we come back with a few recommendations. You can accept a recommendation, and at that time we go ahead and fully deploy the workload on your behalf, or you can specify a particular system, and we will try to enforce the SLA on that system. It completely eliminates the guesswork and the planning that you have to do in this regard.
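Intent-based provisioning, as described, reduces to stating what the workload needs and letting the console pick the placement. A minimal sketch of what such a request could look like, again with invented endpoint names and fields rather than a documented API:

```python
import requests

CONSOLE = "https://console.example.com/api/v1"
HEADERS = {"Authorization": "Bearer <token>"}  # hypothetical auth

# Describe the workload's intent, not the array it should land on.
intent = {
    "workload_type": "oracle-oltp",
    "capacity_gib": 4096,
    "block_size_kib": 8,
    "sla_tier": "mission-critical",
}

# The console cross-checks fleet telemetry and returns ranked placements.
recs = requests.post(
    f"{CONSOLE}/provisioning/recommendations",
    headers=HEADERS, json=intent, timeout=30,
).json()

# Accept the top recommendation; the console deploys and enforces the SLA.
choice = recs["recommendations"][0]["id"]
requests.post(
    f"{CONSOLE}/provisioning/deployments",
    headers=HEADERS, json={"recommendation_id": choice}, timeout=30,
)
```

The key design point is that the request never names an array; placement is the console's problem, informed by telemetry from the whole fleet.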
And last but not least, upgrades have been a huge problem for our customers. Typically, when you're not in this constant loop-back communication with your customers, it is often a big challenge to identify which release, bug fix, or update goes onto which particular machine. All of that has been completely taken away from our customers and fully automated. We run thousands of signatures across our installed base, we identify which upgrades need to be curated for which machines in a particular customer's fleet, and if it applies to that customer, we present it; if the customer accepts it, we automatically go ahead and upgrade the system. And last, from a global management perspective, a customer now has an independent view of their data estate, separate from the storage estate, and Data Services Cloud Console can blend the two to give a consistent view, or you can look at just the fleet view or just the data view. >>It's kind of the holy grail. I mean, I've been in this business a long time, and I think IT people have dreamt about this kind of capability for a long, long time. I wonder if we could stay on the customers for a moment here and talk about what's enabled now. Everybody's talking digital transformation. I joke about the joke that's not funny: the forced march to digital with COVID. It really wasn't planned for, but customers really want to drive those digital transformations now; some of them were on the back burner, and now they're moving to the front burner. What are the outcomes that are enabled here, Omer? >>Excellent. So, for a traditional IT customer, this cloud operational model means that information technology staff can move a lot faster and be a lot more productive on the things that are directly relevant to their business. They can get up to 99% of that time back to spend on strategic projects, or, best of all, spend time with their families, rather than managing and upgrading fleets of infrastructure. For line-of-business owners, the new experience means that their data infrastructure can be presented and provisioned with a self-service, on-demand type of capability. They don't necessarily have to be in the data center to make those decisions. Capacity management, performance management, all of that is dialed in and presented to them wherever they are, in easy-to-consume, SaaS-based models. And especially for data innovators, whether DBAs or data analysts, they can start to consume infrastructure, and ultimately data, as code to speed up their app development, because, again, the context we're bringing forward is the context of data, decoupling it from storage management. Storage management and data management are now two separate domains that can be presented through a single console to tie the end-to-end picture together for a customer. But at the end of the day, what we have found is that customers really want to move forward with data management and leave infrastructure management to machine-oriented tasks, which we have completely automated on their behalf. >>So I'm sure you got the memo about HPE going all in on as-a-service. It's clear the company is all in. How does this announcement fit into that overall mission, Sandeep? >>Dave, we believe the future is edge to cloud, and our mission is to be the edge-to-cloud platform-as-a-service company. As HPE transforms, HPE GreenLake is our unified cloud platform. HPE GreenLake is how we deliver cloud services and agile cloud experiences to customers' applications and data across the edge to cloud.
With the storage announcement that we made recently, we announced that we're expanding HPE GreenLake with the as-a-service transformation of the HPE Storage business into a cloud-native, software-defined data services business. This expands storage as a service, delivering the full cloud experience for our customers' data across edge and on-prem environments. Across the board, we're committed to being a strategic partner for every one of our customers and helping them accelerate their digital transformation. >>Yeah, that's where the puck is going, guys. Hey, as always, a great conversation with our friends from HPE Storage. Thanks so much for the collaboration, and congratulations on the announcements. And I know you're not done yet. >>Thanks, Dave. >>Thanks, Dave. It's a pleasure to be here. >>You're very welcome. And thank you for being with us for HPE Discover 2021. You're watching theCUBE, the leader in digital tech coverage. Keep it right there; we'll be right back.

Published Date : Jun 4 2021

SUMMARY :

Dave Vellante talks with HPE's Omer Asad and Sandeep Singh about last month's storage announcement: the expansion of HPE GreenLake through a cloud-native, software-defined data services business. They discuss Data Services Cloud Console, a SaaS-based console that unifies data operations; HPE Alletra, cloud-native data infrastructure for mission-critical and business-critical workloads; survey findings that storage and data management complexity is impeding digital transformation; intent-based provisioning and automated upgrades driven by fleet telemetry; and HPE's edge-to-cloud platform-as-a-service mission.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Dave Vellante | PERSON | 0.99+
Sandeep Singh | PERSON | 0.99+
Omer Asad | PERSON | 0.99+
HPE | ORGANIZATION | 0.99+
HPE HCI | ORGANIZATION | 0.99+
HPE GreenLake | ORGANIZATION | 0.99+
HPE Alletra | ORGANIZATION | 0.99+
HPE Alletra 9000 | COMMERCIAL_ITEM | 0.88+
HPE Alletra 6000 | COMMERCIAL_ITEM | 0.96+
HPE Discover | EVENT | 0.99+
Data Services Cloud Console | ORGANIZATION | 0.96+
Unified DataOps | ORGANIZATION | 0.96+
SaaS | TITLE | 0.94+
API | TITLE | 0.88+
COVID | EVENT | 0.92+
2021 | DATE | 0.99+
last month | DATE | 0.98+
next decade | DATE | 0.98+
past decade | DATE | 0.9+
today | DATE | 0.98+
two | QUANTITY | 0.99+
100% | QUANTITY | 0.99+
94% | QUANTITY | 0.99+
95% | QUANTITY | 0.99+
93% | QUANTITY | 0.99+
two challenges | QUANTITY | 0.99+
first announcement | QUANTITY | 0.99+
second announcement | QUANTITY | 0.99+
second part | QUANTITY | 0.99+
two models | QUANTITY | 0.99+
both | QUANTITY | 0.99+
thousands | QUANTITY | 0.99+
first | QUANTITY | 0.99+
single array | QUANTITY | 0.98+
three arrays | QUANTITY | 0.98+
both worlds | QUANTITY | 0.98+
first incarnation | QUANTITY | 0.97+
single console | QUANTITY | 0.97+
first one | QUANTITY | 0.97+
two independent things | QUANTITY | 0.96+
up to 99% | QUANTITY | 0.95+
two game-changing innovations | QUANTITY | 0.95+
secondly | QUANTITY | 0.95+
one | QUANTITY | 0.94+
each | QUANTITY | 0.92+
360° | QUANTITY | 0.9+
hundreds of customers | QUANTITY | 0.85+
vice president | PERSON | 0.84+
single | QUANTITY | 0.81+
thousands of data points | QUANTITY | 0.78+

Aliye Ozcan, IBM | DataOps CrowdChat Promo


 

>> Hi everybody, this is Dave Vellante with theCUBE. When we talk to practitioners about data and AI, they have trouble infusing AI into their data pipeline and automating that data pipeline. So we're bringing together the community, brought to you by IBM, to really understand how successful organizations are operationalizing the data pipeline. And with me to talk about that is Aliye Ozcan. Aliye, hello. Introduce yourself; tell us about who you are. >> Hi Dave, how are you doing? Yes, my name is Aliye Ozcan. I'm the DataOps Global Marketing Leader at IBM. >> So I'm very excited about this project. Go to crowdchat.net/dataops, add it to your calendar, and check it out. We have practitioners, Aliye, from Harley-Davidson, Standard Bank, Associated Bank. What are we going to learn from them? >> What we are going to learn from them is their data experiences. What are the data challenges that they are going through? What are the data bottlenecks that they had? And especially in these challenging times right now, with the industry going through this challenging period that we are all going through together: how is the foundation that they invested in now helping them pivot quickly to new market demands? That is fascinating to see, and I'm very excited to have individual conversations with those experts and bring those stories to the audience here. >> Awesome. And we also have Inderpal Bhandari from the CDO office at IBM. So go to crowdchat.net/dataops, add it to your calendar, and we'll see you in the crowd chat.

Published Date : May 6 2020

SUMMARY :

Dave Vellante previews an IBM-sponsored CrowdChat on DataOps with Aliye Ozcan, IBM's DataOps Global Marketing Leader. Practitioners from Harley-Davidson, Standard Bank, and Associated Bank will share their data experiences: the challenges and bottlenecks they faced, and how the data foundations they invested in are helping them pivot quickly to new market demands. Inderpal Bhandari of IBM's CDO office also joins.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Dave Vellante | PERSON | 0.99+
Aliye Ozcan | PERSON | 0.99+
Inderpal Bhandari | PERSON | 0.99+
IBM | ORGANIZATION | 0.99+
Standard Bank | ORGANIZATION | 0.99+
Associated Bank | ORGANIZATION | 0.99+
Harley-Davidson | ORGANIZATION | 0.99+
crowdchat.net/dataops | OTHER | 0.99+
theCUBE | ORGANIZATION | 0.85+
CrowdChat | TITLE | 0.67+
DataOps | ORGANIZATION | 0.61+
CDO | ORGANIZATION | 0.53+

Jay Limburn, IBM | DataOps CrowdChat Promo, Part One


 

>>Hi, I'm Jay Limburn, director of offering management, IBM DataOps. As an organization, we've been focusing on simplifying the data and AI lifecycle, allowing you to discover and prepare data, and then use that data to build, deploy, govern, and manage your models, with a range of capabilities that take advantage of machine and human intelligence. DataOps is a critical and complementary discipline to DevOps. The methodology enables agile data collaboration, driving speed and scale of operations throughout the data lifecycle. Learn more on May 27, when IBM and client leaders come together during the DataOps CrowdChat event online. I hope to see you then.

Published Date : May 4 2020

SUMMARY :

Jay Limburn, director of offering management for IBM DataOps, previews the May 27 DataOps CrowdChat: IBM has focused on simplifying the data and AI lifecycle, letting organizations discover and prepare data, then build, deploy, govern, and manage models at speed and scale.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Jay Limburn | PERSON | 0.99+
IBM | ORGANIZATION | 0.99+
IBM DataOps | ORGANIZATION | 0.97+
DataOps | ORGANIZATION | 0.69+
DataOps CrowdChat | EVENT | 0.9+
May 27 | DATE | 0.99+
part one | OTHER | 0.97+

Buno Pati, Infoworks.io | CUBE Conversation, January 2020


 

>> From the SiliconANGLE media office in Boston, Massachusetts, it's theCUBE. Now, here's your host, Dave Vellante. >> Hello everyone, and welcome to this CUBE Conversation. You know, theCUBE has been following the trends in the so-called big data space since 2010. And one of the things that we reported on for a number of years is the complexity involved in wrangling and making sense out of data. The allure of this idea of no schema on write, and very low cost platforms like Hadoop, became a data magnet. And for years, organizations would shove data into a data lake. And of course the joke was it became a data swamp, and organizations really struggled to realize the promised return on their big data investments. Now, while the cloud certainly simplified infrastructure deployment, it really introduced a much more complex data environment and data pipeline, with dozens of APIs and a mind-boggling array of services that required highly skilled data engineers to properly ingest, shape, and prepare that data, so that it could be turned into insights. This became a real time suck for data pros, who spent 70 to 80% of their time wrestling data. A number of people saw the opportunity to solve this problem and automate the heavy lift of data, and simplify the process to ingest, synchronize, transform, and really prepare data for analysis. And one of the companies that is attacking this challenge is InfoWorks. And with me to talk about the evolving data landscape is Buno Pati, CEO of InfoWorks. Buno, great to see you, thanks for coming in. >> Well thank you Dave, thanks for having me here. >> You're welcome. I love that you're in Palo Alto, and you come to MetroWest in Boston to see us (Buno laughs), that's great. Well, welcome. So, you heard my narrative. We're 10 years plus into this big data theme and meme. What did we learn? What are some of the failures and successes that we can now build on, from your point of view? >> All right, so Dave, I'm going to start from the top, with why big data, all right? I think this big data movement really started with the realization by companies that they need to transform their customer experience and their operations, in order to compete effectively in this increasingly digital world, right? And in that context, they also realized very quickly that data was the key asset on which this transformation would be built. So given that, you look at this and say, "What is digital transformation really about?" It is about competing with digital disruption, or fending off digital disruption. And this has become, over time, an existential imperative. You cannot survive and be relevant in this world without leveraging data to compete with others who would otherwise disrupt your business. >> You know, let's stay on that for a minute, because when we started covering the big data space, you didn't really hear about digital transformation. That's a more recent trend. So I've got to ask you, what's the difference between a business and a digital business, in your view? >> That is the foundational question behind big data. So if you look at a digital native, and there are many of them that you can name, these companies start by building a foundational platform on which they build their analytics and data programs. It gives them a tremendous amount of agility and the right framework within which to build a data-first strategy. A data-first strategy where business information is persistently collected and used at every level of the organization.
Furthermore, they automate this process. Because if you want to collect all your data and leverage it at every part of the business, it needs to be a highly automated system, and it needs to be able to seamlessly traverse on-premise, cloud, hybrid, and multi-cloud environments. Now, let's look at a traditional business. In a traditional enterprise, there is no foundational platform. There are things like point tools for ETL, and data integration, and you can name a whole slew of other things, that need to be stitched together and somehow made to work to deliver data to the applications that consume it. The strategy is not a data-first strategy. It is use case by use case. When there is a use case, people go and find the data, they gather the data, they transform that data, and eventually feed an application. A process that can take months to years, depending on the complexity of the project they're taking on. And they don't automate this. This is heavily dependent, as you pointed out, on engineering talent, highly skilled engineering talent that is scarce. And they have not seamlessly traversed the various clouds and on-premise environments, but rather fragmented those environments, where individual teams are focused on a single environment, building different applications, using different tools, and different infrastructure. >> So you're saying the digital native company puts data at the core. They organize around that data, as opposed to maybe around a bottling plant, or around people. And then they leverage that data for competitive advantage through a platform that's kind of table stakes. And then obviously there's cultural aspects and other skills that they need to develop, right? >> Yeah, they have an ability which traditional enterprises don't. Because of this choice of a data-first strategy with a foundational platform, they have the ability to rapidly launch analytics use cases and iterate on them. That is not possible in a traditional or legacy environment. >> So their speed to market and time to value is going to be much better than their competition. This gets into the risk of disruption. Sometimes we talk about cloud native and cloud naive. You could talk about digital native and digital naive. So it's hard for incumbents to fend off the disrupters, and then ultimately become disrupters themselves. But what are you seeing in terms of some of the trends where organizations are having success there? >> One of the key trends that we're seeing, or key attributes of companies that are seeing a lot of success, is when they have organized themselves around their data. Now, what do I mean by that? This is usually a high-level mandate coming down from the top of the company, where they're forming centralized groups to manage the data and make it available for the rest of the organization to use. There are a variety of names that are being used for this. People are calling it their data fabric. They're calling it data as a service, which is pretty descriptive of what it ends up being. And those are terms that are all representing the same concept of a centralized and, ideally, highly automated environment that serves the rest of the business with data. And the goal, ultimately, is to get any data at any time for any application.
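A rough sketch helps make "data as a service" concrete. The spec below is hypothetical, not any particular product's schema, but it captures the pattern Pati describes: declare the source, freshness, governance, and targets once, and let an automated platform compile that declaration into managed jobs.

```python
# Hypothetical declarative spec for a centralized data-as-a-service layer.
# The platform, not hand-written glue code, turns this into managed jobs
# for ingestion, freshness, masking, and delivery across environments.
pipeline_spec = {
    "source": {"type": "oracle", "connection": "crm_prod", "table": "ORDERS"},
    "ingest": {"mode": "incremental", "sync_every": "15m"},   # keep data fresh
    "catalog": {"domain": "sales", "owner": "data-platform"}, # discoverable
    "governance": {"mask_columns": ["customer_ssn"]},         # policy, not code
    "targets": [
        {"type": "object_store", "bucket": "analytics-lake"}, # cloud
        {"type": "warehouse", "name": "onprem_edw"},          # on premises
    ],
}

# A hypothetical platform client would accept the spec and do the rest:
# client.submit(pipeline_spec)
```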
So, why didn't it solve that problem? And you got companies like Amazon and Google and Microsoft, who are very adept at data. They're some of these data-first companies. Why is it that the cloud sort of in and of itself has not been able to solve this problem? >> Okay, so when you say solve this problem, it sort of begs the question, what's the goal, right? And if I were to very simply state the goal, I would call it analytics agility. It is gaining agility with analytics. Companies are going from a traditional world, where they had to generate a handful of BI and other reporting type of dashboards in a year, to where they literally need to generate thousands of these things in a year, to run the business and compete with digital disruption. So agility is the goal. >> But wait, the cloud is all about agility, is it not? >> It is, when you talk about agility of compute and storage infrastructure. So, there are three layers to this problem. The first is, what is the compute and storage infrastructure? The cloud is wonderful in that sense. It gives you the ability to rapidly add new infrastructure and spin it down when it's not in use. That is a huge blessing, when you compare it to the six to nine months, or perhaps even longer, that it takes companies to order, install, and test hardware on premise, and then find that it's only partially used. The next layer on that is what is the operating system on which my data and analytics are going to be run? This is where Hadoop comes in. Now, Hadoop is inherently complex, but operating systems are complex things. And Spark falls in that category. Databricks has taken some of the complexity out of running Spark because of their sort of manage service type of offering. But there's still a missing layer, which leverages that infrastructure and that operating system to deliver this agility where users can access data that they need anywhere in the organization, without intensely deep knowledge of what that infrastructure is and what that operating system is doing underneath. >> So, in my up front narrative, I talked about the data pipeline a little bit. But I'm inferring from your comments on platform that it's more than just this sort of narrow data pipeline. There's a macro here. I wonder if you could talk about that a little bit. >> Yeah. So, the data pipeline is one piece of the puzzle. What needs to happen? Data needs to be ingested. It needs to be brought into these environments. It has to be kept fresh, because the source data is persistently changing. It needs to be organized and cataloged, so that people know what's there. And from there, pipelines can be created that ultimately generate data in a form that's consumable by the application. But even surrounding that, you need to be able to orchestrate all of this. Typical enterprise is a multi-cloud enterprise. 80% of all enterprises have more than one cloud that they're working on, and on-premise. So if you can't orchestrate all of this activity in the pipelines, and the data across these various environments, that's not a complete solution either. There's certainly no agility in that. Then there's governance, security, lineage. All of this has to be managed. It's not simply creation of the pipeline, but all these surrounding things that need to happen in order for analytics to run at-scale within enterprises. >> So the cloud sort of solved that layer one problem. 
And you certainly saw this in the, not early days, but sort of mid-days of Hadoop, where the cloud really became the place where people wanted to do a lot of their Hadoop workloads. And it was kind of ironic that guys like Hortonworks, and Cloudera and MapR really didn't have a strong cloud play. But now, it's sort of flipping back where, as you point out, everybody's multi-cloud. So you have to include a lot of these on-prem systems, whether it's your Oracle database or your ETL systems or your existing data warehouse, those are data feeds into the cloud, or the digital incumbent who wants to be a digital native. They can't just throw all that stuff away, right? So you're seeing an equilibrium there. >> An equilibrium between ... ? >> Yeah, between sort of what's in the cloud and what's on-prem. Let me ask it this way: If the cloud is not a panacea, is there an approach that does really solve the problem of different datasets, the need to ingest them from different clouds, on-prem, and bring them into a platform that can be analyzed and drive insights for an organization? >> Yeah, so I'm going to stay away from the word panacea, because I don't think there ever is really a panacea to any problem. >> That's good, that means we got a good roadmap for our business then. (both laugh) >> However, there is a solution. And the solution has to be guided by three principles. Number one, automation. If you do not automate, the dependence on skill talent is never going to go away. And that talent, as we all know, is very very scarce and hard to come by. The second thing is integration. So, what's different now? All of these capabilities that we just talked about, whether it's things like ETL, or cataloging, or ingesting, or keeping data fresh, or creating pipelines, all of this needs to be integrated together as a single solution. And that's been missing. Most of what we've seen is point tools. And the third is absolutely critical. For things to work in multi-cloud and hybrid environments, you need to introduce a layer of abstraction between the complexity of the underlying systems and the user of those systems. And the way to think about this, Dave, is to think about it much like a compiler. What does a compiler do, right? You don't have to worry about what Intel processor is underneath, what version of your operating system you're running on, what memory is in the system. Ultimately, you might-- >> As much as we love assembly code. >> As much as we love assembly code. Now, so take the analogy a little bit further, there was a time when we wrote assembly code because there was no compiler. So somebody had to sit back and say, "Hey, wouldn't it be nice if we abstracted away from this?" (both laugh) >> Okay, so this sort of sets up my next question, which is, is this why you guys started InfoWorks? Maybe you could talk a little bit about your why, and kind of where you fit. >> So, let me give you the history of InfoWorks. Because the vision of InfoWorks, believe it or not, came out of a rear view mirror. Looking backwards, not forwards. And then predicting the future in a different manner. So, Amar Arsikere is the founder of InfoWorks. And when I met him, he had just left Zynga, where he was the general manager of their gaming platform. What he told me was very very simple. He said he had been at Google at a time when Google was moving off of the legacy systems of, I believe it was Netezza, and Oracle, and a variety of things. 
And they had just created Bigtable, and they wanted to move and create a data warehouse on Bigtable. So he was given that job. And he led that team. And that, as you might imagine, was a massive project that required a high degree of automation to make it all come together. He built that, and then he built a very similar system at Zynga when he was there: these foundational platforms, going back to what I was talking about with the digital natives. When I met him, he said, "Look, looking back, "Google may have been the only company "that needed such a platform. "But looking forward, "I believe that everyone's going to need one." And there's absolute truth in that, and it's what we're seeing today. Where, after going through this exercise of trying to write machine code, or assembly code, or whatever we'd like to call it, down at the detailed, complex level of an operating system or infrastructure, people have realized, "Hey, I need something much more holistic. "I need to look at this from an enterprise-wide perspective. "And I need to eliminate all of this dependence," kind of like the role the cloud plays, because it eliminates some of the dependence on, or the bottlenecks around, hardware and infrastructure, "and ultimately gain a lot more agility "than I'm able to with legacy methodology." So you were asking early on, what are the lessons learned from that first 10 years? A lot of technology goes through these types of cycles of hype and disillusionment, and we all know the curve. I think there are two key lessons. One is, just having a place to land your data doesn't solve your problem. That's the beginning of your problems. And the second is that legacy methodologies do not transfer into the future. You have to think differently. And looking to the digital natives as guides for how to think, when you're trying to compete with them, is a wonderful perspective to take. >> But those legacy technologies, if you're an incumbent, you can't just rip 'em and throw 'em out and convert. You're going to use them as feeders to your digital platform. So, presumably, you guys have products. You call this space Enterprise Data Ops and Orchestration, EDO2. Presumably you have products and a portfolio to support those higher-layer challenges that we talked about, right? >> Yeah, so that's a really important question. No, you don't rip and replace stuff. These enterprises have been built over years of acquisitions and business systems. These are layers, one on top of another. So think about the introduction of ERP. By the way, ERP is a good analogy for what happened, because those were point tools that were eventually combined into a single system called ERP. Well, these are point capabilities that are being combined into a single system for EDO2, or Enterprise Data Operations and Orchestration. The old systems do not go away. And we are seeing some companies wanting to move some of their workloads from old systems to new systems. But that's not the major trend. The major trend is that new things that get done, the things that give you holistic views of the company, and then analytics based on that holistic view, are all being done on the new platforms. So it's a layer on top. It's not a rip and replace of the layers underneath. What's in place stays in place. But for the layer on top, you need to think differently. You cannot use all the legacy methodologies and just assume they'll apply to the new platform or new system. >> Okay, so how do you engage with customers?
Take a customer who's got on-prem legacy infrastructure; they don't want to get disrupted, they want to be a digital native. How do you help them? You know, what do I buy from you? >> Yeah, so our product is called DataFoundry. It is an EDO2 system, built on the three founding principles that I mentioned earlier. It is highly automated. It integrates all the capabilities that surround pipelines. And ultimately, it's also abstracting, so we're able to very easily traverse one cloud to another, or on-premise to the cloud, or even back; there are some customers that are moving some workloads back from the cloud. Now, what's the benefit here? Well, first of all, we lay down the foundation for digital transformation. We enable these companies to consolidate and organize their data in these complex hybrid, cloud, and multi-cloud environments, and then generate analytics use cases 10x faster, with about a tenth of the resources. And I'm happy to give you some examples of how that works. >> Please do. I mean, maybe you could share some customer examples? >> Yeah, absolutely. So, let me talk about Macy's. >> Okay. >> Macy's is a customer of ours. They've been a customer for about 14 months at this point in time. And they had built a number of systems to run their analytics, but then recognized what we're seeing other companies recognize. And that is, there's a lot of complexity there. And building it isn't the end game. Maintaining it is the real challenge, right? So even if you have a lot of talent available to you, maintaining what you've built is a real challenge. So they came to us. And within a period of 12 months, I'll just give you some numbers that are mind-blowing. They are currently running 165,000 jobs a month. Now, what's a job? A job is an ingestion job, or a synchronization job, or a transformation. They have launched 431 use cases over a period of 12 months. And you know what? They're just ramping. They will get to thousands. >> Scale. >> Yeah, scale. And they have ingested a lot of data, brought in a lot of data sources. To do that in a period of 12 months is unheard of. It does not happen. Why is it important for them? So what problem are they trying to solve? They're a retailer. They are being digitally disrupted like (chuckles) no one else. >> They have an Amazon war room-- >> Right. >> No doubt. >> And they have had to build themselves out as an omni-channel retailer now. They are online, and they're also in brick and mortar stores. So you take a look at this. And the key to competing with digital disrupters is the customer experience. What is that experience? You're online; how does that meld with your in-store experience? What happens if I buy online and return something in a store? How does all this come together into a single unified experience for the consumer? And that's what they're chasing. So that was the first application that they came to us with. They said, "Look, let us go into a customer 360. "Let us understand the entirety "of that customer's interaction "and touchpoints with our business. "And having done so, we are in a position "to deliver a better experience." >> Now that's a data problem. I mean, different data sources, and trying to understand a 360, I mean, you've got data all over the place. >> All over the place. (speaking simultaneously) And there's historical data, there's stuff coming in from, you know, what's online, what's in the store. And then they progress from there.
I mean, they're not restricting it to customer experience and selling. They're looking at merchandising, and inventory, and fulfillment, and store operations. Simple problem: you order something online, where do I pull this from? A store or a warehouse? >> So this is, you know, big data 2.0, just to use a sort of silly term. But it's really taking advantage of all the investment. I've often said, you know, Hadoop, for all the criticism it gets, it did lower our cost of getting data into, you know, at least one virtual place. And it got us thinking about how to get insights out of data. And so, what you're describing is the ability to operationalize your data initiatives at scale. >> Yeah, you can absolutely get your insights off of Hadoop. And I know people have different opinions of Hadoop, given their experience. But what these customers have not achieved yet, most of them, is that agility, right? So, how easily can you get your insights off of Hadoop? Do I need to hire a boatload of consultants who are going to write code for me, and shovel data in, and create these pipelines, and so forth? Or can I do this with the click of a button, right? And that's the difference. That is truly the difference. The level of automation that you need, and the level of abstraction that you need, away from this complexity, has not been delivered. >> We did, it must have been 2011, I think, the very first big data market study from anybody in the world, and put it out on, you know, Wikibon, free research. And one of the findings was (chuckles) this is a huge services business. I mean, professional services is where all the money was going to flow, because it was so complicated. And that's kind of exactly what happened. But now we're entering, really it seems like, a phase where you can scale, and operationalize, and really simplify, and really focus your attention on driving business value, versus making stuff work. >> You are absolutely correct. So I'll give you the numbers. 55% of this industry is services. About 30% is software, and the rest is hardware. Break it down that way. 55%. So what's going on? People will buy a big data system. Call it Hadoop; it could be something in the cloud; it could be Databricks. And then, this is welcome to the world of SIs, because at this point, you need these SIs to write code and perform these services in order to get any kind of value out of that. And look, we have some dismal numbers that we're staring at. According to Gartner, only 17% of those who have invested in Hadoop have anything in production. This is after how many years? And you look at surveys from, well, pick your favorite. They all look the same. People have not been able to get the value out of this, because it is too hard. It is too complex, and you need too many consultants (laughs) delivering services to make this happen. >> Well, what I like about your story, Buno, is you're not, I mean, a lot of the data companies have pivoted to AI. Sort of like, we have a joke, you know, same wine, new bottle. But you're not talking about that. I mean, sure, machine intelligence fits in here, but you're talking about really taking advantage of the investments that you've made in the last decade, and helping incumbents become digital natives. That sounds like it's at least a part of your mission here. >> Not become digital natives, but rather compete with them. >> Yeah, right, right. >> Effectively, right? >> Yep, okay. >> So, yeah, that is absolutely what needs to get done.
So let me talk for a moment about AI, all right? Way back when, there was another wave of AI in the late 80s. I was part of that; I was doing my PhD at the time. And that obviously went nowhere, because we didn't have any data, we didn't have enough compute power or connectivity. So here it is again. Very little has changed, except that we do have the data, we have the connectivity, and we have the compute power. But do we really? So what's AI without the data? Just A, right? There's nothing there. So what's missing? Even for AI and ML to be effective, and I believe these are going to be powerful game changers, you need to provide data to them, and you need to be able to do so in a very agile way, so that you can iterate on ideas. No one knows exactly what AI solution is going to solve your problem or enhance your business. This is a process of experimentation. This is what a company like Google can do extraordinarily well, because of this foundational platform. They have this agility to keep iterating, and experimenting, and trying ideas. Because without trying them, you will not discover what works best. >> Yeah, I mean, for 50 years, this industry has marched to the cadence of Moore's Law, and that really was the engine of innovation. And today, it's about data, applying machine intelligence to that data. And the cloud brings, as you point out, agility and scale. That's kind of the new cocktail for innovation, isn't it? >> The cloud brings agility and scale to the infrastructure. >> And low risk, as you said, right? >> Yeah. >> Experimentation, fail fast, et cetera. >> But without an EDO2 type of system, that gives you a great degree of automation, you could spend six months to run one experiment with AI. >> Yeah, because-- >> In gathering data and feeding it in. >> 'Cause if the answer is people, and throwing people at the problem, then you're not going to scale. >> You're not going to scale, and you're never going to really leverage AI and ML capabilities. You need to be able to do that not in six months, but in six days, right, or less. >> So let's talk about your company a little bit. Can you give us the status, you know, where you're at? As the newly minted CEO, what are your goals, and what milestones should we be watching in 2020 and beyond? >> Yeah, so, newly minted CEO: I came in July of last year. This has been an extraordinary company. I started my journey with this company as an investor. And it was funded by two funds that I was associated with, the first being Nexus Venture Partners, and then Centerview Capital, where I'm still a partner. Myself and my other two partners looked at the opportunity and what the company had been able to do. And in July of last year, I joined as CEO. My partner, David Dorman, who used to be CEO of AT&T, joined as chairman. And my third partner, Ned Hooper, joined as President and Chief Operating Officer. Ned used to be the Chief Strategy Officer of Cisco. So we pushed pause on the funding, and that's about as all-in as a fund can get. >> Yeah, so you guys were operational experts who became investors, and said, "Okay, we're going to dive back in "and actually run the business." >> And here's why. We obviously see a lot of companies as investors, as they go out and look for funding. There are three things that come together very rarely. One is a massive market opportunity, combined with the second, which is the right product to serve that opportunity. But the third is pure luck: timing.
(Dave chuckles) It's timing. And timing, you know, is a very challenging thing to try to predict. You can get lucky and get it right, but then again, it's luck. This had all three. It was the absolute perfect time. And it's largely because of what you described: the 10 years that had elapsed, where people had run the experiment and were not going to get fooled again about how easy this was supposed to be by just getting one piece or the other. They recognized that they need to take a holistic approach and deploy something as an enterprise-wide platform. >> Yeah, I mean, you talk about a large market. I don't even know how you do a TAM; what's the TAM? It's data. (laughs) You know, it's the data universe, which is just, you know, massive. So, I have to ask you a question as an investor. I think you've raised, what, 50 million, is that right? >> We've raised 50 million. The last round was led by NEA. >> Right, okay. You've got great investors, a hefty amount. Although, you know, in this day and age, you're seeing just outrageous amounts being raised. Software obviously is a capital-efficient business, but today you need to raise a lot of money for promotion, right, to get your name out there. What are your thoughts, as a Silicon Valley investor, on this wave? I mean, get it while you can, I guess. You know, we're in the 10th year of this boom market. But your thoughts? >> You're asking me to put on my other hat. (Dave laughs) I think companies have, in general, raised too much money at too high a value too fast. And there's a penalty for that. And the down-round IPO, which has become fashionable these days, is one of those penalties. It's a clear indication. Markets are very rational; public markets are very rational. And the pricing in a public market, when it's significantly below the pricing in a private market, is telling you something. So, we are a little old-fashioned in that sense. We believe that a company has to lay down the right foundation before it adds fuel to the mix and grows. You have to have evidence that the machinery that you build, whether it's for sales, or marketing, or other go-to-market activities, or even product development, is working. And if you do not see all of those signs, you're building a very fragile company. And adding fuel in that setting is like flooding the carburetor. You don't necessarily go faster. (laughs) You just-- >> Consume more. >> You consume more. So there's a little bit of, perhaps, old-fashioned discipline that we bring to the table. And you can argue against it. You can say, "Well, why don't you just raise a lot of money, "hire a lot of sales guys, and hope for the best?" >> See what sticks? (laughs) >> Yeah. We are fully expecting to build a large institution here. And I use that word carefully. And for that to happen, you need the right foundation down first. >> Well, that resonates with us east coast people. So, Buno, thanks very much for comin' on theCUBE and sharing with us your perspectives on the marketplace. And best of luck with InfoWorks. >> Thank you, Dave. This has been a pleasure. Thank you for having me here. >> All right, we'll be watching, thank you. And thank you for watching, everybody. This is Dave Vellante for theCUBE. We'll see ya next time. (upbeat music fades out)

Published Date : Jan 14 2020

SUMMARY :

Dave Vellante talks with Buno Pati, CEO of Infoworks.io, about why big data initiatives stalled and how enterprises can compete with digital natives. Pati argues that digital natives build foundational, automated data platforms with a data-first strategy, while traditional enterprises stitch together point tools use case by use case. He lays out three principles for Enterprise Data Operations and Orchestration (EDO2): automation, integration, and abstraction, likening the abstraction layer to a compiler. They discuss the DataFoundry product, a Macy's deployment that launched 431 use cases in 12 months, the services-heavy economics of big data, AI's dependence on agile data delivery, and Pati's views on disciplined venture funding.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Dave Vellante | PERSON | 0.99+
Buno Pati | PERSON | 0.99+
David Dorman | PERSON | 0.99+
Ned Hooper | PERSON | 0.99+
Amar Arsikere | PERSON | 0.99+
Microsoft | ORGANIZATION | 0.99+
Amazon | ORGANIZATION | 0.99+
Google | ORGANIZATION | 0.99+
Zynga | ORGANIZATION | 0.99+
Cisco | ORGANIZATION | 0.99+
AT&T | ORGANIZATION | 0.99+
Centerview Capital | ORGANIZATION | 0.99+
Nexus Venture Partners | ORGANIZATION | 0.99+
Oracle | ORGANIZATION | 0.99+
Netezza | ORGANIZATION | 0.99+
Hortonworks | ORGANIZATION | 0.99+
InfoWorks | ORGANIZATION | 0.99+
Cloudera | ORGANIZATION | 0.97+
MapR | ORGANIZATION | 0.98+
Gartner | ORGANIZATION | 0.96+
Palo Alto | LOCATION | 0.99+
Silicon Valley | LOCATION | 0.99+
Boston | LOCATION | 0.99+
Boston, Massachusetts | LOCATION | 0.98+
January 2020 | DATE | 0.99+
2020 | DATE | 0.99+
2011 | DATE | 0.99+
2010 | DATE | 0.97+
late 80s | DATE | 0.98+
today | DATE | 0.97+
six months | QUANTITY | 0.99+
six | QUANTITY | 0.99+
six days | QUANTITY | 0.99+
nine months | QUANTITY | 0.99+
third partner | QUANTITY | 0.99+
80% | QUANTITY | 0.99+
10 years | QUANTITY | 0.99+
first 10 years | QUANTITY | 0.97+
12 months | QUANTITY | 0.99+
about 14 months | QUANTITY | 0.96+
two partners | QUANTITY | 0.99+
55% | QUANTITY | 0.99+
70 | QUANTITY | 0.99+
50 years | QUANTITY | 0.99+
thousands | QUANTITY | 0.99+
first application | QUANTITY | 0.99+
one piece | QUANTITY | 0.99+
10th year | QUANTITY | 0.99+
50 million | QUANTITY | 0.99+
two funds | QUANTITY | 0.99+
third | QUANTITY | 0.99+
three things | QUANTITY | 0.99+
first | QUANTITY | 0.99+
431 use cases | QUANTITY | 0.99+
second | QUANTITY | 0.99+
second thing | QUANTITY | 0.98+
two key lessons | QUANTITY | 0.99+
one | QUANTITY | 0.99+
single | QUANTITY | 0.98+
three layers | QUANTITY | 0.98+
dozens | QUANTITY | 0.98+
three principles | QUANTITY | 0.98+
10x | QUANTITY | 0.98+
17% | QUANTITY | 0.98+

Kellyn Pot'Vin Gorman, Delphix - Data Platforms 2017 - #DataPlatforms2017


 

>> Announcer: Live from the Wigwam in Phoenix, Arizona. It's theCUBE covering Data Platforms 2017. Brought to you by Qubole. >> Hey welcome back everybody. Jeff Frick here with theCUBE. We're at the historic Wigwam Resort, 99 years young, just outside of Phoenix, at Data Platforms 2017. I'm Jeff Frick here with George Gilbert from Wikibon, who's co-hosting with me all day. Getting to the end of the day. And we're excited to have our next guest. She is Kellyn Gorman, the technical intelligence manager and also the office of the CTO at Delphix, welcome. >> Yes, thank you, thank you so much. >> Absolutely, so what is Delphix for people that aren't familiar with Delphix? >> Most of us realize that the database, and data in general, is the bottleneck, and Delphix completely revolutionizes that. We remove it from being the bottleneck by virtualizing data. >> So you must love this show. >> Oh I do, I do. I'm hearing all about all kinds of new terms that we can take advantage of. >> Right, Cloud-Native and separate, you know, and I think just the whole concept of atomic computing. Breaking down, removing storage from server. Breaking it down into smaller parts. Sounds like it fits right into kind of you guys' wheelhouse. >> Yeah, I kind of want to containerize it all and be able to move it everywhere. But I love it. Yeah. >> So what do you think of this whole concept of Data Ops? We've been talking about Dev Ops for, I don't know how long... How long have we been talking about Dev Ops, George? Five years? Six years? A while? >> Yeah a while (small chuckle) >> But now... >> Actually maybe eight years. >> Jeff: You're dating yourself, George. (all laugh) Now we're talking about Data Ops, right? And there's a lot of talk of Data Ops. So this is the first time I've really heard it coined in such a way where it really becomes the primary driver in the way that you basically deliver value inside your organization. >> Oh absolutely. You know, I come from the database realm. I was a DBA for over two decades, and Dev Ops was a hard sell to a lot of DBAs. They didn't want to hear about it. I tried to introduce it over and over. The idea of automating, and taking us out of that kind of manual intervention that introduced, many times, human error. So Dev Ops was a huge step forward, getting that out of there. But the database, and data in general, was still this bottleneck. So Data Ops is the idea that you automate all of this, and if you virtualize that data, we found with Delphix, that removed that last hurdle. And that was my, I guess my session was on virtualizing big data. The idea that I could take any kind of structured or unstructured file and virtualize that as well, and instead of deploying it to multiple environments, I was able to deploy it once and actually do IO on demand. >> So let's peel the onion on that a little bit. What does it mean to virtualize data? And how does that break databases' bottleneck on the application? >> Well right now, when you talk about relational data or any kind of legacy data store, people are duplicating that through archaic processes. So if we talk about Oracle, they're using things like Data Pump. They're using transportable tablespaces. These are very cumbersome; they take a very long time. Especially with the introduction of the cloud, there's a lot of room for failure. It's not made for that, especially as the network is our last bottleneck, which is what we're also seeing for many of these folks.
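For readers who haven't lived through one of these refreshes, the duplication Gorman calls archaic looks roughly like the sketch below: a full Oracle Data Pump export and re-import for every environment you stand up. expdp and impdp are Oracle's real command-line utilities, but the connection strings, directory object, and schema name here are hypothetical placeholders; this is only an illustration of the pattern, not a recommended script.

```python
# Sketch of the traditional full-copy approach: every dev/test
# environment gets its own complete physical copy of the schema,
# so a large schema means hours of export, transfer, and import
# per environment. Connection strings, the Oracle DIRECTORY object,
# and the schema name are all hypothetical.
import subprocess

SCHEMA = "SALES"            # hypothetical schema to duplicate
DUMP_DIR = "DATA_PUMP_DIR"  # hypothetical Oracle DIRECTORY object

def full_copy(source_conn: str, target_conn: str, env: str) -> None:
    """Export the schema from source, then import it into one target."""
    dump_file = f"{SCHEMA}_{env}.dmp"
    # Full physical export: reads every block of the schema.
    subprocess.run(
        ["expdp", source_conn, f"schemas={SCHEMA}",
         f"directory={DUMP_DIR}", f"dumpfile={dump_file}"],
        check=True,
    )
    # Full physical import: writes every block again on the target.
    subprocess.run(
        ["impdp", target_conn, f"directory={DUMP_DIR}",
         f"dumpfile={dump_file}"],
        check=True,
    )

# One slow, storage-hungry pass per environment:
for env in ["dev1", "dev2", "qa", "uat"]:
    full_copy("system/pw@prod", f"system/pw@{env}", env)
```

Every environment pays the full read-and-write cost and consumes its own storage, which is the bottleneck that virtualizing the data is meant to remove.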
When we introduce big data, many of these environments, many of these, I guess you'd say projects, came out of open source. They were done as a need, as a necessity to fulfill. And they've got a lot of moving pieces. And to be able to containerize that and then deploy it once and then virtualize it, so instead of, let's say, having 16 gigs that you need to duplicate over and over again, especially if you're going on-prem or to the cloud, I'm able to do it once and then do that IO on demand and go back to a gold copy, a central location. And it makes it look like it's there. I was able to deploy a 16 gig file to multiple environments in less than a minute. And then each of those developers each have their own environment. Each tester has their own, and they actually have a read-write, full, robust copy. That's amazing to folks. All of a sudden, they're not held back by it. >> So our infrastructure analyst and our Wikibon research CTO David Floyer, if I'm understanding this correctly, talks about this where it's almost like a snapshot. >> Absolutely >> And it's a read-write snapshot, although you're probably not going to merge it back into the original. And this way dev, test, and whoever else wants to operate on live data can do that. >> Absolutely, it's full read-write, what we call data version control. We've always had version control at the code level. You may have had it at the actual server level. But you've rarely ever had it at the data level, for the database or with flat files. What I used was the cms.gov data. It's available to everyone, it's public data. And we realized that these files were quite large and cumbersome. And I was able to reproduce it and enhance what they were doing at TIME magazine. And create a use case that made sense to a lot of people. Things that they're seeing in their real-world environments. >> So, tell us more, elaborate how dev ops, I'm sorry, not dev ops, data ops... Take that as an example and generalize it some more, so that we see how, if DBAs were a bottleneck, they can now become an enabler. >> One, it's getting them to gain new skills. Many DBAs think that their value relies on those archaic processes. "It's going to take me three weeks to do this," so I have three weeks of value. Instead, by saying "I am going to be able to do this in one day," those other resources are now also valuable because they're doing their jobs. We're also seeing that data was seen as the centralized point. People were trying to come up with solutions to these pain points. We're able to take that out completely. And people are able to embrace agility. They have agile environments now. Dev Ops means that they're able to automate that very easily, instead of having that stopping point of constantly hitting the data and saying, "I've got to take time to refresh this." "How am I going to refresh it?" "Can I do just certain..." We hear about this all the time with testing. When I go to testing summits, they are trying to create synchronized, virtualized data. They're creating test data sets that they have to manage. It may not be the same as production, whereas I can actually create a container of the entire development or production environment and refresh that back. And people are working on their full product. There's no room for the errors you'd see if you were just taking a piece of it, or if you were only able to grab one tier of that environment because the data was too large before.
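Delphix's engine is proprietary, so this is not its implementation, but the thin-cloning idea Gorman describes — one gold copy, many instant writable views that share the same underlying blocks — can be approximated with copy-on-write snapshots in an ordinary filesystem. A minimal sketch using ZFS, where `zfs snapshot` and `zfs clone` are the real commands but the pool and dataset names are assumptions:

```python
# Sketch of thin cloning with ZFS copy-on-write snapshots: one gold
# copy, and each developer gets an instant, writable, space-efficient
# clone instead of a full duplicate. Pool/dataset names are hypothetical.
import subprocess

POOL = "tank"               # hypothetical ZFS pool
GOLD = f"{POOL}/gold_data"  # hypothetical dataset holding the gold copy

def zfs(*args: str) -> None:
    subprocess.run(["zfs", *args], check=True)

# Take a point-in-time snapshot of the gold copy
# (metadata-only, completes in seconds regardless of data size).
zfs("snapshot", f"{GOLD}@v1")

# Give each developer a writable clone of that snapshot. Blocks are
# shared with the gold copy until a clone writes to them, so a "16 gig"
# clone initially consumes almost no additional storage.
for dev in ["alice", "bob", "carol"]:
    zfs("clone", f"{GOLD}@v1", f"{POOL}/{dev}_data")

# Resetting an environment is just destroy-and-reclone:
zfs("destroy", f"{POOL}/alice_data")
zfs("clone", f"{GOLD}@v1", f"{POOL}/alice_data")
```

The design point is the same one made in the interview: because a clone is a view over shared blocks rather than a physical copy, "deploy once, IO on demand" replaces hours of export and import with seconds of metadata work.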
>> So would the automation part be a generation of one or more snapshots, and then the sort of orchestration, distribution, to get it to the intended audiences? >> Yes, and we would use >> Okay. things like Jenkins and Chef; normal dev ops tools work along with this, along with command-line utilities that are part of our product, to allow people to just create what they would create normally. But many times it's been siloed and, like I said, worked around that data. We've included the data as part of that, so that they can deploy it just as fast. >> So a lot of the conversation here this morning was really about putting the data all into your, pick your favorite, public cloud, to enable access to all the applications, to the APIs, through all different types of things. How does that impact what you guys do, conceptually? >> If you're able to containerize that, it makes you capable of deploying to multiple clouds. Which is what we're finding. About 60% of our customers are in more than one cloud, two to five exactly. As we're dealing with that and recognizing that, it's kind of like looking at your cloud environments like your phone providers. People see something shiny and new, a better price point, lesser dollar. We're able to provide that one by saving all that storage space. It's virtualized; it's not taking a lot of disk space. Second of all, we're seeing them say, "You know, I'm going to go over to Google." Oh, guess what? This project says they need the data, and they need to actually take the data source over to Amazon now. We're able to do that very easily. And we do it from multi-tier: flat files, the data, legacy data sources, as well as our application tier. >> Now, when you're doing these snapshots, my understanding, if I'm getting it right, is it's like a, it's not a full Xerox. It's more like the delta. Like if someone's doing test dev, they have some portion of the source of truth, and as they make changes to it, it grows to include the edits until they're done, at which point the whole thing is blown away. >> It depends on the technology you're looking at. Ours is able to track that. So when we're talking about a virtual database, we're using the native recovery mechanisms. Kind of think of it as a perpetual recovery state inside our Delphix engine. So those changes are going on, and then you have your VDBs that are a snapshot in time that they're working on. >> Oh, so like you take a snapshot and then it's like a journal. >> The transactional data from the logs is continually applied. Of course it's different depending on each technology. So we do it differently for Sybase versus Oracle versus SQL Server and so on and so forth. Virtual files, when we talk about flat files, are different as well. The parent, you take an exact snapshot of it. But it's really just projecting that NFS mount to another place. So that mount, if you replace those files or update them, of course, then you would be able to refresh and create a new snapshot of those files. So somebody said, "We refresh these files every single night." You would be able to then refresh and project them out to the new place. >> Oh so you're, it's almost like you're sub-classing them... >> Yes. >> Okay, interesting... When you go into a company that's got a big data initiative, where do you fit in the discussion, in the sequence? How do you position the value-add relative to the data platform that's sort of the center of the priority of getting a platform in place?
>> Well, that's what's so interesting about this, is that we haven't really talked to a lot of big data companies. We've been very relational over a period of time. But our product is very much a Swiss Army knife. It will work on flat files. We've been doing it for multi-tier environments forever. It's that our customers are now going, "I have 96 petabytes in Oracle. I'm about to move over to big data." So I was able to go out and say, how would I do this in a big data environment? And I found this use case being used by TIME magazine, and then created my environment. And did it off of Amazon. But it was just a use case, just a proof of concept that I built to show and demonstrate that. Yeah, my guys back at the office are going, "Kellyn, when you're done with it, you can just deliver it back to us." (laughing) >> Jeff: Alright Kellyn. Well thank you for taking a few minutes to stop by; pretty interesting story. Everything's getting virtualized: machines, databases... >> Soon us! >> And our data. >> Soon, George! >> Right, not me George... (George laughs) Alright, thanks again Kellyn >> Thank you so much. >> for stopping by. Alright, I'm with George Gilbert. I'm Jeff Frick. You're watching theCUBE from Data Platforms 2017 in Phoenix, Arizona. Thanks for watching. (upbeat electronic music)
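To make the "perpetual recovery state" exchange above concrete: the engine keeps a gold copy, continually applies the transaction-log changes streaming off the source, and a virtual copy is just the data replayed up to a chosen point in time. The toy Python sketch below illustrates that idea only; it is purely conceptual and not Delphix's API or on-disk format.

```python
# Conceptual sketch of "version control at the data level": changes
# stream into an append-only log against a gold copy, and a virtual
# copy (a "VDB") is just the state replayed up to a point in time.
from dataclasses import dataclass, field

@dataclass
class GoldCopy:
    base: dict                                 # initial state of the data
    log: list = field(default_factory=list)    # (timestamp, key, value)

    def apply(self, ts: float, key: str, value: str) -> None:
        """A continually applied change, like transaction logs shipping in."""
        self.log.append((ts, key, value))

    def materialize(self, ts: float) -> dict:
        """A 'snapshot in time': the data as of ts, built by replaying
        the log over the base copy rather than duplicating anything."""
        state = dict(self.base)
        for t, key, value in self.log:
            if t <= ts:
                state[key] = value
        return state

gold = GoldCopy(base={"acct_1001": "open"})
gold.apply(1.0, "acct_1001", "frozen")
gold.apply(2.0, "acct_2002", "open")

# Two point-in-time views of the same gold copy, no physical copies made:
assert gold.materialize(0.5) == {"acct_1001": "open"}
assert gold.materialize(1.5) == {"acct_1001": "frozen"}
```

A refresh in this model is just picking a newer timestamp, which is why, in the interview's terms, nightly file refreshes reduce to taking a new snapshot and projecting it out again.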

Published Date : May 26 2017
