Rich Gaston, Micro Focus | Virtual Vertica BDC 2020


 

(upbeat music) >> Announcer: It's theCUBE covering the virtual Vertica Big Data Conference 2020, brought to you by Vertica. >> Welcome back to the Vertica Virtual Big Data Conference, BDC 2020. You know, it was supposed to be a physical event in Boston at the Encore. Vertica pivoted to a digital event, and we're pleased that theCUBE could participate, because we've participated in every BDC since the inception. Rich Gaston is this year's guest, the global solutions architect for security, risk, and governance at Micro Focus. Rich, thanks for coming on, good to see you. >> Hey, thank you very much for having me. >> So you got a chewy title, man. You got a lot of stuff, a lot of hairy things in there. But maybe you can talk about your role as an architect in those spaces. >> Sure, absolutely. We handle a lot of different requests from the Global 2000 type of organization that will try to move various business processes, various application systems, databases, into new realms. Whether they're looking at opening up new business opportunities, whether they're looking at sharing data with partners securely, they might be migrating it to cloud applications, or doing a migration into a hybrid IT architecture. So we will take those large organizations and their existing installed base of technical platforms, data, and users, and try to chart a course to the future, using Micro Focus technologies, but also partnering with other third parties out there in the ecosystem. So we have large, solid relationships with the big cloud vendors, and also with a lot of the big database vendors. Vertica's our in-house solution for big data and analytics, and we are one of the first integrated data security solutions with Vertica. We've had great success out in the customer base with Vertica as organizations have tried to add another layer of security around their data. So what we will try to emphasize is an enterprise-wide data security approach, where you're taking a look at data as it flows throughout the enterprise from its inception, where it's created, where it's ingested, all the way through the utilization of that data, and then to the other uses, where we might be doing shared analytics with third parties. How do we do that in a secure way that maintains regulatory compliance, and that also keeps our company safe against data breaches? >> A lot has changed since the early days of big data, certainly since the inception of Vertica. You know, it used to be big data, everyone was rushing to figure it out. You had a lot of skunkworks going on, and it was just like, figure out data. And then as organizations began to figure it out, they realized, wow, who's governing this stuff? A lot of shadow IT was going on, and then the CIO was called in to sort of rein that back in. As well, you know, with all kinds of whatever, fake news, the hacking of elections, and so forth, the sense of heightened security has gone up dramatically. So I wonder if you can talk about the changes that have occurred in the last several years, and how you guys are responding. >> You know, it's a great question, and it's been an amazing journey, because I was walking down the street here in my hometown of San Francisco at Christmastime years ago and I got a call from my bank, and they said, we want to inform you your card has been breached by a hack at Target Corporation, and they got your card, and they also got your PIN. And so you're going to need to get a new card, we're going to cancel this. Do you need some cash?
I said, yeah, it's Christmastime, so I need to do some shopping. And so they worked with me to make sure that I could get that cash, and then get the new card and the new PIN. And being a professional on the inside of the industry, I really questioned, how did they get the PIN? Tell me more about this. And they said, well, we don't know the details, but you know, I'm sure you'll find out. And in fact, we did find out a lot about that breach and what it did to Target. The impact: $250 million of immediate impact, CIO gone, CEO gone. This was a big one in the industry, and it really woke a lot of people up to the different types of threats on the data that we're facing with our largest organizations. Not just financial data; medical data, personal data of all kinds. Flash forward to the Cambridge Analytica scandal that occurred, where Facebook is handing off data, they're making a partnership agreement with someone they think they can trust, and then that data is misused. And who's going to end up paying the cost of that? Well, it's going to be Facebook, to the tune of about five billion on that, plus some other fines that'll come along, and other costs that they're facing. So what we've seen over the course of the past several years has been an evolution from data breaches making the headlines, to my customers coming to us and saying, help us neutralize the threat of this breach. Help us mitigate this risk, and manage this risk. What do we need to be doing, what are the best practices in the industry? Clearly what we're doing on the perimeter security, the application security, and the platform security is not enough. We continue to have breaches, and we are the experts they come to for that answer. The follow-on fascinating piece has been the regulators jumping in now. First in Europe, but now we see California enacting a law just this year. The law they came in with is very stringent, and has a lot of deep, far-reaching protections around the personal data of consumers. Look at jurisdictions like Australia, where fiduciary responsibility now goes to the Board of Directors. That's getting attention. For a regulated entity in Australia, if you're on the Board of Directors, you better have a plan for data security. And if there is a breach, you need to follow protocols, or you personally will be liable. And that is a sea change that we're seeing out in the industry. So we're getting a lot of attention on both: how do we neutralize the risk of breach, but also how can we use software tools to maintain and support our regulatory compliance efforts as we work with, say, the largest money center bank out of New York. I've watched their audit year after year, and it's gotten more and more stringent, more and more specific: tell me more about this aspect of data security, tell me more about encryption, tell me more about key management. The auditors are getting better. And we're supporting our customers in that journey to provide better security for the data, to provide a better operational environment for them to be able to roll new services out with confidence that they're not going to get breached. With that confidence, they're not going to have a regulatory compliance fine or a nightmare in the press. And these are the major drivers that help us, with Vertica, sell together into large organizations to say, let's add some defense in depth to your data. And that's really a key concept in the security field, this concept of defense in depth.
We apply that to the data itself by changing the actual data element: Rich Gaston, I will change that name into ciphertext, and that then yields a whole bunch of benefits throughout the organization as we deal with the lifecycle of that data. >> Okay, so a couple things I want to mention there. So first of all, totally board-level topic; every board of directors should really have cyber and security as part of its agenda, and it does, for the reasons that you mentioned. The other is, GDPR got it all started. I guess it was May 2018 that the penalties went into effect, and that just created a whole domino effect. You mentioned California enacting its own laws, which, you know, in some cases are even more stringent. And you're seeing this all over the world. So I think one of the questions I have is, how do you approach all this variability? It seems to me, you can't just take a narrow approach. You have to have an end-to-end perspective on governance and risk and security, and the like. So are you able to do that? And if so, how so? >> Absolutely. I think one of the key areas in big data in particular has been the concern that we have a schema, we have database tables, we have columns, and we have data, but we're not exactly sure what's in there. We have application developers that have been given sandbox space in our clusters, and what are they putting in there? So can we discover that data? We have those tools within Micro Focus to discover sensitive data within your data stores, but we can also protect that data, and then we'll track it. And what we really find is that when you protect, let's say, five billion rows of a customer database, we can now know what is being done with that data on a very fine-grained and granular basis, to say that this business process has a justified need to see the data in the clear, we're going to give them that authorization, they can decrypt the data. SecureData, my product, knows about that and tracks it, and can report on it and say: at this date and time, Rich Gaston did the following thing to be able to pull data in the clear. And that could then be used to support the regulatory compliance responses and the audit, to say, who really has access to this, and what really is that data? Then in GDPR, we're getting down into much more fine-grained decisions around who can get access to the data, and who cannot. And organizations are scrambling. One of the funny conversations that I had a couple years ago as GDPR came into place was, it seemed a couple of customers were taking the sort of brute-force approach of, we're going to move our analytics and all of our data to Europe, to European data centers, because we believe that if we do this in the U.S., we're going to violate their law. But if we do it all in Europe, we'll be okay. And that simply was a short-term way of thinking about it. You really can't be moving your data around the globe to try to satisfy a particular jurisdiction. You have to apply the controls and the policies and put the software layers in place to make sure that anywhere that someone wants to get that data, we have the ability to look at that transaction and say it is or is not authorized, and that we have a rock-solid way of approaching that for audit and for compliance and risk management. And once you do that, then you really open up the organization to go back and use those tools the way they were meant to be used.
We can use Vertica for AI, we can use Vertica for machine learning, and for all kinds of really cool use cases that are being done with IoT, and with other kinds of cases that we're seeing that require data being managed at scale, but with security. And that's the challenge, I think, in the current era: how do we do this in an elegant way? How do we do it in a way that's future-proof when CCPA comes in? How can I lay this on as another layer of audit responsibility and control around my data so that I can satisfy those regulators as well as the folks over in Europe and Singapore and China and Turkey and Australia? It goes on and on. Each jurisdiction out there is now requiring audit. And like I mentioned, the audits are getting tougher. And if you read the news, the GDPR example I think is classic. They told us in 2016, it's coming. They told us in 2018, it's here. They're telling us in 2020, we're serious about this, and here are the fines, and you better be aware that we're coming to audit you. And when we audit you, we're going to be asking some tough questions. If you can't answer those in a timely manner, then you're going to be facing some serious consequences, and I think that's what's getting attention. >> Yeah, so the whole big data thing started with Hadoop, and Hadoop is open, it's distributed, and it just created a real governance challenge. I want to talk about your solutions in this space. Can you tell us more about Micro Focus Voltage? I want to understand what it is, and then get into sort of how it works, and then I really want to understand how it's applied to Vertica. >> Yeah, absolutely, that's a great question. First of all, we were the originators of format-preserving encryption. We developed some of the core basic research out of Stanford University that then became the company Voltage; that's the brand name we still apply even though we're part of Micro Focus. So the lineage still goes back to Dr. Boneh down at Stanford, one of my buddies there, and he's still at it, doing amazing work in cryptography and keeping the industry, and the science of cryptography, moving forward. It's a very deep science, and we want it peer-reviewed, we want it to be attacked, we want it to be proved secure, so that we're not selling something to a major money center bank that is potentially risky because it's obscure and we're private. So we have an open standard. For six years, we worked with the Department of Commerce to get our standard approved by NIST, the National Institute of Standards and Technology. They initially said, well, AES-256 is going to be fine. And we said, well, it's fine for certain use cases, but for your database, you don't want to change your schema, you don't want to have this increase in storage costs. What we want is format-preserving encryption. And what that does is turn my name, Rich, into a four-letter ciphertext. It can be reversed. The mathematics of that are fascinating, and really deep and amazing. But we really make that very simple for the end customer, because we produce APIs. So these application programming interfaces can be accessed by applications in C or Java or C#, among other languages. But they can also be accessed in a microservice manner via REST and web service APIs. And that's the core of our technical platform.
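To make that API shape concrete, here is a minimal sketch of what a protect/access round trip might look like against such a REST interface. The host, endpoint paths, payload fields, and format names below are illustrative assumptions, not the documented Voltage SecureData API:

```python
import requests

# Hypothetical SecureData-style endpoint; real hosts, paths, and payload
# shapes are product-specific and not given in this interview.
BASE_URL = "https://voltage.example.com/api"
AUTH = ("app_user", "app_secret")  # placeholder credentials

def protect(value: str, fmt: str) -> str:
    """Encrypt a value with format-preserving (FF1-style) encryption."""
    resp = requests.post(f"{BASE_URL}/protect",
                         json={"data": value, "format": fmt},
                         auth=AUTH, timeout=5)
    resp.raise_for_status()
    return resp.json()["protected"]

def access(value: str, fmt: str) -> str:
    """Decrypt ('access') a protected value, subject to authorization."""
    resp = requests.post(f"{BASE_URL}/access",
                         json={"data": value, "format": fmt},
                         auth=AUTH, timeout=5)
    resp.raise_for_status()
    return resp.json()["plaintext"]

# The point of format preservation: a four-letter name yields a
# four-letter ciphertext, so schemas and column widths never change.
cipher = protect("Rich", fmt="alpha_name")
assert len(cipher) == len("Rich")
assert access(cipher, fmt="alpha_name") == "Rich"
```

The property worth noticing is the contract itself: same length and character set in and out, which is why no schema change or extra storage is needed.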
We have an appliance-based approach, so we take a SecureData appliance, we'll put it on-prem, we'll make 50 of them if you're a big company like Verizon and you need to have these co-located around the globe, no problem; we can scale to the largest enterprise needs. But our typical customer will install several appliances and get going with a couple of environments like QA and prod to be able to start getting encryption going inside their organization. Once the appliances are set up and installed, it takes just a couple of days of work for a typical technical staff to get done. Then you're up and running to be able to plug in the clients. Now what are the clients? Vertica's a huge one. Vertica's one of our most powerful client endpoints, because you're able to now take that API and put it inside Vertica; it's all open on the internet. You can go and look at Vertica.com/secure data. You get all of our documentation on it. You understand how to use it very quickly. The APIs are super simple; they require three parameter inputs. It's a really basic approach to being able to protect and access data. And then it gets very deep from there, because you have data like credit card numbers, very different from a street address, and we want to take a different approach to that. We have data like birthdates, and we want to be able to do analytics on dates. We have deep approaches on managing analytics on protected data like dates without having to put them in the clear. So we've maintained a lead in the industry in terms of being an innovator of the FF1 standard; what we call FF1 is format-preserving encryption. We license that to others in the industry, per our NIST agreement. So we're the owner, we're the operator of it, and others use our technology. And we're the original founders of that, and so we continue to sort of lead the industry by adding additional capabilities on top of FF1 that really differentiate us from our competitors. Then you look at our API presence. We can definitely run in Hadoop, but we also run in open systems. We run on mainframe, we run on mobile. So anywhere in the enterprise or in the cloud, anywhere you want to be able to put secure data and be able to access the protected data, we're going to be there and be able to support you. >> Okay so, let's say, I've talked to a lot of customers this week, and let's say I'm running in Eon mode. And I've got some workload running in AWS, I've got some on-prem. I'm going to take an appliance or multiple appliances, I'm going to put them on-prem, but that will also secure my cloud workloads as part of a sort of shared responsibility model, for example? Or how does that work?
No, that's absolutely correct. We're really flexible in that we can run on-prem or in the cloud as far as our crypto engine goes. The key management is really hard stuff. Cryptography is really hard stuff, and we take care of all that; we've baked that in, and we can run that for you as a service, either in the cloud or on-prem on your small VMs. So it's really a lightweight footprint for running the infrastructure. When I look at the organization like you just described, it's a classic example of where we fit, because we will be able to protect that data. Let's say you're ingesting it from a third party, or from an operational system; you have a website that collects customer data. Someone has now registered as a new customer, and they're going to do e-commerce with you. We'll take that data, and we'll protect it right at the point of capture. And we can now flow that through the organization and decrypt it at will on any platform that you have that you need us to be able to operate on. So let's say you wanted to take that customer data from the operational transaction system, let's throw it into Eon, let's throw it into the cloud, let's do analytics there on that data, and we may need some decryption. We can place SecureData wherever you want to be able to service that use case. In most cases, what you're doing is a simple, tiny little atomic fetch across a protected tunnel, your typical TLS tunnel. And once that key is then cached within our client, we maintain all that technology for you. You don't have to know about key management or caching. We're good at that; that's our job. And then you'll be able to make those API calls to access or protect the data, and apply the authorization and authentication controls that you need to be able to service your security requirements. So you might have third parties having access to your Vertica clusters. That is a special need, and we can have that ability to say employees can get X and the third party can get Y, and that's a really interesting use case we're seeing for shared analytics on the internet now. >> Yeah, for sure, so you can set the policy how you want. You know, I have to ask you, in a perfect world, I would encrypt everything. But part of the reason why people don't is because of performance concerns. Can you talk about, and you touched upon it I think a moment ago with your sort of atomic fetch, but can you talk about, and I know it's Vertica, it's a Ferrari, etc., but anything that slows it down is going to be a concern. Are customers concerned about that? What are the performance implications of running encryption on Vertica? >> Great question there as well, and what we see is that we want to be able to apply scale where it's needed. And so if you look at ingest platforms, we find Vertica is commonly connected up to something like Kafka. Maybe StreamSets, maybe NiFi; there are a variety of different technologies that can route that data, pipe that data into Vertica at scale. SecureData is architected to go along with that architecture at the node, or at the executor, or at the lowest-level operator level. And what I mean by that is that we don't have a bottleneck where everything has to go through one process or one box or one channel to be able to operate. We don't put an interceptor in between your data coming and going. That's not our approach, because those approaches are fragile and they're slow. So we typically want to focus on integrating our APIs natively within those pipeline processes that come into Vertica. Within the Vertica ingestion process itself, you can simply apply our protection when you do the copy command in Vertica. So it's a really basic, simple use case that everybody is typically familiar with in Vertica land: be able to copy the data and put it into Vertica, and you simply say protect as part of the copy. So my first name is coming in as part of this ingestion; I'll simply put the protect keyword in the syntax, right in SQL; it's nothing other than just an extension of SQL. Very, very simple for the developer: easy to read, easy to write. And then you're going to provide the parameters that you need to say, oh, the name is protected with this kind of a format, to differentiate it between a credit card number and an alphanumeric stream, for example. So once you do that, you then have the ability to decrypt.
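As a rough sketch of that ingest-and-query flow, using the real vertica-python client: the Protect() and Access() function names, the 'alpha_name' format, and the table and connection details below are assumptions inferred from the description above, not documented Voltage syntax.

```python
import vertica_python

# Connection details are placeholders.
conn_info = {"host": "vertica.example.com", "port": 5433,
             "user": "etl_user", "password": "...", "database": "adtech"}

# FILLER and USING PARAMETERS are standard Vertica syntax; Protect() is a
# hypothetical stand-in for the Voltage UDF described in the interview.
copy_sql = """
    COPY customers (
        raw_name   FILLER VARCHAR(64),
        first_name AS Protect(raw_name USING PARAMETERS format='alpha_name')
    ) FROM STDIN DELIMITER ','
"""

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()

    # Ingest: encryption is applied inline as part of the COPY.
    with open("/data/customers.csv", "rb") as f:
        cur.copy(copy_sql, f)

    # Query: decrypt ("access") only where the caller is authorized to
    # see cleartext. Access() is likewise a hypothetical name.
    cur.execute(
        "SELECT Access(first_name USING PARAMETERS format='alpha_name') "
        "FROM customers LIMIT 10"
    )
    for (name,) in cur.fetchall():
        print(name)
```

The COPY expression protects at ingest; the SELECT decrypts only for authorized readers, which is the pattern elaborated next.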
Now, on decrypt, let's look at a couple of different use cases. First, within Vertica, we might be doing select statements, we might be doing all kinds of jobs that just operate at the SQL layer. Again, just insert the word "access" into the Vertica select string and provide us with the data that you want to access; that's our word for decryption, that's our lingo. And we will then, at the Vertica level, harness the power of its CPU, its RAM, its horsepower at the node to be able to execute that operation, the decryption request, if you will. So that gives us the speed and the ability to scale out. So if you start with two nodes of Vertica, we're going to operate at X hundreds of thousands of transactions a second, depending on what you're doing. Long strings are a little bit more intensive in terms of performance, but short strings like social security numbers are our sweet spot. So we operate at very, very high speed on those, and you won't notice the overhead with Vertica, per se, at the node level. When you scale Vertica up and you have 50 nodes, and you have large clusters of Vertica resources, then we scale with you. And we're not a bottleneck at any particular point. Everybody's operating independently, but they're all copies of each other, all doing the same operation. Fetch a key, do the work, go to sleep. >> Yeah, you know, I think this is, a lot of the customers have said to us this week that one of the reasons why they like Vertica is it's very mature, it's been around, it's got a lot of functionality, and of course, you know, look, security, I understand it's kind of table stakes, but it can also be a differentiator. You know, big enterprises that you sell to, they're asking for security assessments, SOC 2 reports, penetration testing, and I think I'm hearing, with the partnership here, you're sort of passing those with flying colors. Are you able to make security a differentiator, or is it just sort of everybody's kind of got to have good security? What are your thoughts on that? >> Well, there's good security, and then there's great security. And what I found with one of my money center bank customers, based here in San Francisco, was the concern around insider access when they had a large data store. The concern was that a DBA, a database administrator who has privilege to everything, could potentially exfiltrate data out of the organization and, in one fell swoop, create havoc for them because of the amount of data that was present in that data store, and the sensitivity of that data. So when you put Voltage encryption on top of Vertica, what you're doing now is putting a layer in place that would prevent that kind of a breach. So you're looking at insider threats, you're looking at external threats, and you're looking at also being able to pass your audit with flying colors. The audits are getting tougher. And when they say, tell me about your encryption, tell me about your authentication scheme, show me the access control list that says that this person can or cannot get access to something, they're asking tougher questions. That's where SecureData can come in and give you that quick answer of: it's encrypted at rest, it's encrypted and protected while it's in use, and we can show you exactly who's had access to that data, because it's tracked via a different layer, a different appliance. And I would even draw the analogy: many of our customers use a device called a hardware security module, an HSM.
Now, these are fairly expensive devices that were invented for military applications and adopted by banks. And now they're really spreading out, and people say, do I need an HSM? Well, with SecureData, we certainly protect your crypto very, very well. We have very, very solid engineering. I'll stand on that any day of the week, but your auditor is going to want to ask a checkbox question. Do you have an HSM? Yes or no. Because the auditor understands it's another layer of protection, and it provides another tamper-evident layer of protection around your key management and your crypto. And we, as professionals in the industry, nod and say, that is worth it. That's an expensive option that you're going to add on, but your auditor's going to want it. If you're in financial services, you're dealing with PCI data, you're going to enjoy the checkbox that says, yes, I have HSMs, and not get into some arcane conversation around, well no, but it's good enough. That's kind of the argument and conversation we get into when folks want to say, Vertica has great security, Vertica's fantastic on security. Why would I want SecureData as well? It's another layer of protection, and it's defense in depth for your data. When you believe in that, when you take security really seriously, and you're really paranoid, like a person like myself, then you're going to invest in those kinds of solutions that get you best-in-class results. >> So I'm hearing a data-centric approach to security. Security experts will tell you, you've got to layer it. I often say, we live in a new world. The game used to be to just build a moat around the queen, but the queen, she's leaving her castle in this world of distributed data. Rich, incredibly knowledgeable guest, and we really appreciate you being on the front lines and sharing with us your knowledge about this important topic. So thanks for coming on theCUBE. >> Hey, thank you very much. >> You're welcome, and thanks for watching, everybody. This is Dave Vellante for theCUBE; we're covering the Virtual Vertica BDC, Big Data Conference, wall-to-wall. Remotely, digitally, thanks for watching. Keep it right there. We'll be right back right after this short break. (intense music)

Published Date: Mar 31, 2020


Larry Lancaster, Zebrium | Virtual Vertica BDC 2020


 

>> Announcer: It's theCUBE! Covering the Virtual Vertica Big Data Conference 2020, brought to you by Vertica. >> Hi, everybody. Welcome back. You're watching theCUBE's coverage of the Vertica Virtual Big Data Conference. It was, of course, going to be in Boston at the Encore Hotel. Win big with big data at the new casino, but obviously coronavirus has changed all that. Our hearts go out, and we have empathy for those people who are struggling. We are going to continue our wall-to-wall coverage of this conference, and we're here with Larry Lancaster, who's the founder and CTO of Zebrium. Larry, welcome to theCUBE. Thanks for coming on. >> Hi, thanks for having me. >> You're welcome. So first question, why did you start Zebrium? >> You know, I've been dealing with machine data a long time. So for those of you who don't know what that is, if you can imagine servers or whatever goes on in a data center or in a SaaS shop, there's data coming out of those servers, out of those applications, and basically, you can build a lot of cool stuff on that. So there's a lot of metrics that come out, and there's a lot of log files that come out. And so I've basically spent my career building that sort of thing: tools on top of that, or products on top of that. The problem is that since log files, at least, are completely unstructured, you're always doing the same thing over and over again, which is going in and understanding the data and extracting the data and all that stuff. It's very time-consuming. If you've done it like five times, you don't want to do it again. So really, my idea was, at this point, with machine learning where it's at, there's got to be a better way. So Zebrium was founded on the notion that we can just do all that automatically. We can take a pile of machine data, we can turn it into a database, and we can build stuff on top of that. And so the company is really all about bringing that value to the market. >> That's cool. I want to get into that, to better understand who you're disrupting, and understand that opportunity better. But before I do, tell us a little bit about your background. You've got kind of an interesting background. Lot of tech jobs. Give us some color there. >> Yeah, so I started in the Valley I guess 20 years ago, and when my son was born I left grad school. I was in grad school over at Berkeley, in biophysics. And I realized I needed to go get a job, so I ended up starting in software and I've been there ever since. I mean, I guess I cut my teeth at NetApp, which was a storage company. And then I co-founded a business called Glassbeam, which was kind of an ETL database company. And then after that I ended up at Nimble Storage. Another company, EMC, ended up buying Glassbeam, so I went over there, and then there was Nimble, which is where I built the InfoSight platform. After that I was able to step back and take a year and a half and just go into my basement, actually, this is my kind of workspace here, and come up with the technology and actually build it, so that I could go raise money and get a team together to build Zebrium. So that's really my career in a nutshell. >> And you've got Hello Kitty over your right shoulder, which is kind of cool. >> That's right. >> And then up to the left you've got your monitor, right? >> Well, I had it. It's over here, yeah. >> But it was great! Pull it out, pull it out, let me see it. So, okay, so you got that. So what do you do? You just sit there and code all night, or what?
Yeah, that's right. So Hello Kitty's over here. I have a daughter and she set up my workspace here on this side with Hello Kitty and so on. And over on this side, I've got my recliner, where I basically lay it all the way back, and then I pivot this thing down over my face and put my keyboard on my lap, and I can just sit there for like 20 hours. It's great. Completely comfortable. >> That's cool. All right, better put that monitor back or our guys will yell at me. But so, obviously, we're talking to somebody with serious coding chops, and I'll also add that the Nimble InfoSight, I think, was one of the best pickups that HP, HPE, has had in a while. And the thing that interested me about that, Larry, is the ability the company had to take that InfoSight and port it very quickly across its product lines. So that says to me it was a modern architecture, I'm sure with APIs, microservices, and all those cool buzzwords, but the proof is in their ability to bring that IP to other parts of the portfolio. So, well done. >> Yeah, well thanks. I appreciate that. I mean, they've got a fantastic team there. And the other thing that helps is when you have the notion that you don't just build on top of the data: you extract the data, you structure it, you put that in a database, we used Vertica there for that, and then you build on top of that. Taking the time to build that layer is what lets you build a scalable platform. >> Yeah, so, why Vertica? I mean, Vertica's been around for a while. You remember, you had the old RDBMSs, the Oracles, the DB2s, SQL Server, and then the database was kind of a boring market. And then, all of a sudden, you had all of these MPP companies come out, a spate of them. They all got acquired, including Vertica. And they've all sort of disappeared and morphed into different brands, and Micro Focus has preserved the Vertica brand. But it seems like Vertica has been able to survive the transitions. Why Vertica? What was it about that platform that was unique and interested you? >> Well, I mean, they were the first ones to build what I would call a real column store that's kind of market-capable, right? So there was the C-Store project at Berkeley, which Stonebraker was involved in, and then that became sort of the seed from which Vertica was spawned. So you had this idea of, let's lay things out in a columnar way. And when I say columnar, I don't just mean that the data for every column is in a different set of files. What I mean by that is it takes full advantage of things like run-length encoding, delta encoding, and block compression, and so you end up with these massive orders-of-magnitude savings in terms of the data that's being pulled off of storage, as well as when it's moving through the pipeline internally in Vertica's query processing. So why am I saying all this? Because it was a fundamentally disruptive technology. I think column stores are ubiquitous now in analytics. And I think you could name maybe a couple of projects, which are mostly open source, that do something like Vertica does, but name me another one that's actually capable of serving an enterprise as a relational database. I still think Vertica is unique in being that one. >> Well, it's interesting, because you're a startup. And so a lot of startups would say, okay, we're going with a born-in-the-cloud database. Now Vertica touts that, well look, we've embraced cloud. You know, we run in the cloud, we run on-prem, all different optionality.
And you hear a lot of vendors say that, but a lot of times they're just taking their stack and stuffing it into the cloud. But, so why didn't you go with a cloud-native database, and is Vertica able to, I mean, obviously, that's why you chose it, but I'm interested, from a technologist standpoint, as to why you, again, made that choice given all these other choices out there. >> Right, I mean, again, I'm not, so... As I explained, a column store, which I think is the appropriate definition, I'm not aware of another cloud-native-- >> Hm, okay. >> I'm aware of other cloud-native transactional databases; I'm not aware of one that has the analytics performance, and I've tried some of them. So it was not like I didn't look. What I was actually impressed with, and I think what let me move forward using Vertica in our stack, is the fact that Eon really is built from the ground up to be cloud-native. And so we've been using Eon almost ever since we started the work that we're doing. So I've been really happy with the performance and with the reliability of Eon. >> It's interesting. I've been saying for years that Vertica's a diamond in the rough, and its previous owner didn't know what to do with it because it got distracted, and now Micro Focus seems to really see the value and is obviously putting some investments in there. >> Yeah. >> Tell me more about your business. Who are you disrupting? Are you kind of disrupting the do-it-yourself crowd? Or is there sort of a big whale out there that you're going to go after? Add some color to that. >> Yeah, so our broader market is monitoring software; that's kind of the high-level category. So you have a lot of people in that market right now. Some of them are entrenched large players, like Datadog would be a great example. Some of them are smaller upstarts. It's a pretty saturated market. But what's happened over the last, I'd say, two years is that there's been sort of a push towards what's called observability, in terms of at least how some of the products are architected, like Honeycomb, and how most of them are messaged these days. And what that really means is there's been sort of an understanding that's developed that MTTR is really what people need to focus on to keep their customers happy. If you're a SaaS company, MTTR is going to be your bread and butter. And it's still measured in hours and days. And the biggest reason for that is what's called unknown unknowns. Because of complexity. Nowadays, applications are ten times as complex as they used to be. And what you end up with is a situation where, if something is a known issue with a known symptom and a known root cause, then you can set up an automation for it. But the ones that really cost a lot of time in terms of service disruption are unknown unknowns. And now you've got to go dig into this massive pile of data. So observability is about making tools to help you do that, but it's still going to take you hours. And so our contention is, you need to automate the eyeball. The bottleneck is now the eyeball. And so you have to get away from this notion that a person's going to be able to do it infinitely more efficiently, and recognize that you need automated help. When you get an alert, it shouldn't be, "Hey, something weird's happening. Now go dig in." It should be, "Here's a root cause and a symptom." And that should be proposed to you by a system that actually does the observing. That actually does the watching.
And that's what Zebrium does. >> Yeah, that's awesome. I mean, you're right. The last thing you want is just another alert that says, "Go figure something out because there's a problem." So how does it work, Larry? In terms of what you built there, can you take us inside the covers? >> Yeah, sure. So right now there's really two kinds of data that we're ingesting: there's metrics and there's log files. For metrics, there's actually sort of a framework that's really popular in DevOps circles especially, but it's becoming popular everywhere, which is called Prometheus. And it's a way of exporting metrics so that scrapers can collect them. And so if you go look at a typical stack, you'll find that most of the open source components, and many of the closed source components, are going to have exporters that export all their stats to Prometheus. So by supporting that stack, we can bring in all of those metrics. And then there's also the log files. And so you've got host log files; in a containerized environment, you've got container logs, and you've got application-specific logs, perhaps living on a host mount. And you want to pull all those back, and you want to be able to say: this log that I've collected here is associated with the same container on the same host that this metric is associated with. But now what? So once you've got that, you've got a pile of unstructured logs. So what we do is we take a look at those logs and we say, let's structure those into tables, right? So where I used to have a log message, if I look in my log file and I see it says something like, X happened five times, right? Well, that event type's going to occur again, and it'll say, X happened six times, or X happened three times. So if I see that as a human being, I can say, "Oh clearly, that's the same thing." And what's interesting here is the times that X happened and the number in each message: I may want to know those numbers as a time series, the values of that column. And so you can imagine it as a table. So now I have a table for that event type, and every time it happens, I get a row. And then I have a column with that number in it. And so now I can do any kind of analytics I want, almost instantly, across my... If I have all my event types structured that way, everything changes. You can do real anomaly detection and incident detection on top of that data. So that's really how we go about doing it. How we go about being able to do autonomous monitoring in a way that's effective. >> How do you handle doing that for, like, a bespoke app? Do you have to, does somebody have to build a connector to those apps? How do you handle that? >> Yeah, that's a really good question. So you're right. If I go and install a typical log manager, there'll be connectors for different apps, and usually what that means is pulling in the stuff on the left, if you were to be looking at that log line, and it will be things like a timestamp, or a severity, or a function name, or various other things. And so the connector will know how to pull those apart, and then the stuff to the right will be considered the message, and that'll get indexed for search. And so our approach is, we actually go in with machine learning and we structure that whole thing. So there's a table, and it's going to have a column called severity, and timestamp, and function name. And then it's going to have columns that correspond to the parameters that are in that event. And it'll have a name associated with the constant parts of that event.
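To picture that structuring step, here is a toy sketch of the idea. It is not Zebrium's actual algorithm, which is machine-learning based; it only illustrates how the constant parts of a message become the event type while the variable parts become columns:

```python
import re
from collections import defaultdict

# Three occurrences of the same event type with a varying parameter.
LOGS = [
    "2020-03-31T10:00:01 X happened 5 times",
    "2020-03-31T10:05:42 X happened 6 times",
    "2020-03-31T10:09:13 X happened 3 times",
]

# Map each event type to its "table": one row per occurrence, with the
# timestamp and the extracted numeric parameters as columns.
tables = defaultdict(list)

for line in LOGS:
    ts, msg = line.split(" ", 1)
    params = [int(p) for p in re.findall(r"\d+", msg)]  # variable parts
    etype = re.sub(r"\d+", "<n>", msg)                  # constant parts
    tables[etype].append((ts, params))

for etype, rows in tables.items():
    print(etype)                 # "X happened <n> times"
    for ts, params in rows:
        print(" ", ts, params)   # each occurrence is a row
```

Once every event type is a table like this, anomaly and incident detection over the parameter columns become ordinary time-series operations, which is what makes automating the eyeball tractable.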
And so you end up with a situation where you've structured all of it automatically, so we don't need collectors. It'll work just as well on your home-grown app that has no collectors and no parsers defined or anything. It'll work immediately, just as well as it would work on anything else. And that's important, because you can't be asking people for connectors to their own applications. It just becomes, now they've got to stop what they're doing and go write code for you, for your platform, and they have to maintain it. It's just untenable. So you can be up and running with our service in three minutes. It'll just be monitoring those for you. >> That's awesome! I mean, that is really a breakthrough innovation. So, nice. Love to see that hittin' the market. Who do you sell to? What types of companies, and what role within the company? >> Well, definitely there's two main sorts of pushes that we've seen, or I should say pulls. One is from DevOps folks, SRE folks. So these are people who are tasked with monitoring an environment, basically. And then you've got people who are in engineering, and they have a staging environment. And what they actually find valuable is... Because when we find an incident in a staging environment, yeah, half the time it's because they're tearing everything up and it's not release-ready, whatever's in stage. That's fine, they know that. But the other half of the time it's new bugs, it's issues, and they're finding issues. So it's kind of diverged. You have engineering users, and they don't have titles like QA; they're dev engineers or dev managers that are really interested. And then you've got DevOps and SRE people there (mumbles). >> And how do I consume your product? Is it SaaS? I sign up, and you say within three minutes I'm up and running. I'm paying by the drink. >> Well, (laughs) right. So there's a couple of ways. So, right, so the easiest way is if you use Kubernetes. So Kubernetes is what's called a container orchestrator. So these days, you know, Docker and containers and all that, so now container orchestrators have become, I wouldn't say ubiquitous, but they're very popular now. So it's kind of on that inflection curve. I'm not exactly sure of the penetration, but I'm going to say probably 30-40% of the shops that we're interested in are using container orchestrators. So if you're using Kubernetes, basically you can install our Kubernetes chart, which basically means copying and pasting a URL and so on into your little admin panel there. And then it'll just start collecting all the logs and metrics, and then you just log in on the website. And the way you do that is just go to our website and it'll show you how to sign up for the service, and you'll get your little API key and a link to the chart, and you're off and running. You don't have to do anything else. You can add rules, you can add stuff, but you don't have to. You shouldn't have to, right? You should never have to do any more work. >> That's great. So it's a SaaS capability and I just pay for... How do you price it? >> Oh, right. So it's priced on volume, data volume. I don't want to go too much into it, because I'm not the pricing guy. But what I'll say is that, as far as I know, it's as cheap or cheaper than any other log manager or metrics product. It's in that same neighborhood as the very low-priced ones. Because right now, we're not trying to optimize for take. We're trying to make a healthy margin and get the value of autonomous monitoring out there. Right now, that's our priority.
>> And it's running in the cloud, is that right? AWS West-- >> Yeah, that's right. Oh, I should've also pointed out that you can have a free account; if it's less than some number of gigabytes a day, we're not going to charge. Yeah, so we run in AWS. We have a multi-tenant instance in AWS, and we have a Vertica Eon cluster behind that. And it's been working out really well. >> And on your freemium, have you used the Vertica Community Edition? Because they don't charge you for that, right? So is that how you do it, or... >> No, no. We're, no, no. So, I don't want to go into that, because I'm not the bizdev guy. But what I'll say is that if you're doing something that winds up being OEM-ish, you can work out the particulars with Vertica. It's not like you're going to just go pay retail, and they won't let you distinguish between test, and prod, and paid, and all that. They'll work with you. Just call 'em up. >> Yeah, and that's why I brought it up, because Vertica, they have a community edition, which is not neutered. It runs Eon; it's just that there are limits on clusters and storage. >> There's limits. >> But it's still fully functional, though. >> So to your point, we want it multi-tenant. So it's big just because it's multi-tenant. We have hundreds of users on that (audio cuts out). >> And then, what's your partnership with Vertica like? Can we close on that, and just describe that a little bit? >> What's it like? I mean, it's pleasant. >> Yeah, I mean (mumbles). >> You know what, so the important thing... Here's what's important. What's important is that I don't have to worry about that layer of our stack. When it comes to being able to get the performance I need, being able to get the economy of scale that I need, being able to get the absolute scale that I need, I've not been disappointed ever with Vertica. And frankly, being able to have ACID guarantees and everything else, like a normal mature database that can join lots of tables and still be fast, that's also necessary at scale. And so I feel like it was definitely the right choice to start with. >> Yeah, it's interesting. I remember in the early days of big data a lot of people said, "Who's going to need these ACID properties and all this complexity of databases?" And of course, ACID properties and SQL became the killer features and functions of these databases. >> Who didn't see that one coming, right? >> Yeah, right. And then, so you guys have done a big seed round. You've raised a little over $6 million, and you've got the product-market fit down. You're ready to rock, right? >> Yeah, that's right. So we're doing a launch, probably, well, when this airs it'll probably have been the day before this airs. Basically, yeah. We've got people... Like literally in the last, I'd say, six to eight weeks, it's just been this sort of spike of interest. All of a sudden, everyone kind of gets what we're doing, realizes they need it, and we've got a solution that seems to meet expectations. So it's like... It's been an amazing... Let me just say this, it's been an amazing start to the year. I mean, at the same time, it's been really difficult for us, but more difficult for some other people that haven't been able to go to work over the last couple of weeks and so on. But it's been a good start to the year, at least for our business. So... >> Well, Larry, congratulations on getting the company off the ground, and thank you so much for coming on theCUBE and being part of the Virtual Vertica Big Data Conference. >> Thank you very much.
>> All right, and thank you everybody for watching. This is Dave Vellante for theCUBE. Keep it right there. We're covering wall-to-wall Virtual Vertica BDC. You're watching theCUBE. (upbeat music)

Published Date: Mar 31, 2020


Ron Cormier, The Trade Desk | Virtual Vertica BDC 2020


 

>> Announcer: It's theCUBE covering the virtual Vertica Big Data Conference 2020, brought to you by Vertica. >> Hello everybody, and welcome to this special digital presentation of theCUBE. We're tracking the Vertica Virtual Big Data Conference; this is, I think, theCUBE's fifth year doing the BDC. We've been to every big data conference that they've held, and we're really excited to be helping with the digital component here in these interesting times. Ron Cormier is here, principal database engineer at the Trade Desk. Ron, great to see you. Thanks for coming on. >> Hi, David, my pleasure, good to see you as well. >> So we were talking a little bit about your background. You're basically a Vertica and database guru, but tell us about your role at Trade Desk, and then I want to get into a little bit about what Trade Desk does. >> Sure, so I'm a principal database engineer at the Trade Desk. The Trade Desk was one of my customers when I was working at HP, as a member of the Vertica team, and I joined the Trade Desk in early 2016. And since then, I've been working on building out their Vertica capabilities and expanding the data warehouse footprint in an ever-growing database technology, data volume environment. >> And the Trade Desk is an ad tech firm, and you are specializing in real-time ad serving and pricing. And I guess, you know, people talk about real time a lot; we define real time as before you lose the customer. Maybe you can talk a little bit about, you know, the Trade Desk and the business, and maybe how you define real time. >> Totally, so to give everybody kind of a frame of reference: anytime you pull up your phone or your laptop and you go to a website, or you use some app, and you see an ad, what's happening behind the scenes is an auction is taking place, and people are bidding on the privilege to show you an ad. And across the open internet, this happens seven to 13 million times per second. And so the ads, the whole auction dynamic, and the display of the ad need to happen really fast. So that's about as real time as it gets outside of high-frequency trading, as far as I'm aware. So the Trade Desk participates in those auctions. We bid on behalf of our customers, which are ad agencies, and the agencies represent brands, so the agencies are the Mad Men companies of the world, and they have brands under their guidance, and so they give us budget to spend, to place the ads and to display them. So we bid on hundreds of thousands of auctions per second, and anytime we do make a bid, some data flows into our data platform, which is powered by Vertica. And so we're getting hundreds of thousands of events per second. We have other events that flow into Vertica as well. And we clean them up, we aggregate them, and then we run reports on the data. And we run about 40,000 reports per day on behalf of our customers. The reports aren't as real time as I was talking about earlier; they're more batch-oriented. Our customers like to see big chunks of time, like a whole day or a whole week or a whole month, on a single report. So we wait for that time period to complete, and then we run the reports on the results. >> So you have one of the largest commercial infrastructures in the big data sphere. Paint a picture for us. I understand you've got a couple of, like, 320-node clusters; we're talking about petabytes of data. But describe what your environment looks like. >> Sure, so like I said, we've been very good customers for a while.
We started out with a bunch of enterprise clusters. Enterprise Mode is the traditional Vertica deployment, where the compute and the storage are tightly coupled, with RAID arrays on the servers. We had four of those and we were doing okay, but our volumes were ever increasing: we wanted to store more data and run more reports in a shorter period of time, and the demand kept pushing. So we had these four clusters, and then we started talking with Vertica about Eon Mode, which is Vertica's separation of compute and storage, where compute and storage can be scaled independently. We can add storage without adding compute, or vice versa, or we can add both. That was something we were very interested in for a couple of reasons. One, our enterprise clusters were running out of disk, and adding disk is expensive. In Enterprise Mode it's kind of a pain: you have to add compute at the same time, so you end up in an unbalanced place. In Eon Mode that problem gets a lot better. We can add essentially infinite disk because it's backed by S3, and compute is really easy to scale; to grow the number of things we run in parallel, the concurrency, we just add a subcluster. The clusters are in two regions, US East and US West on Amazon, so reasonably diverse. And the real benefit is that we can stop nodes when we don't need them. Our workload is fairly lumpy, I call it. We ingest and aggregate all day, but the final hour of the day needs to complete after the day closes, and once that's done, the number of reports we need to run spikes up really high. We spin up a bunch of extra compute on the fly, run those reports, and then spin it back down, and we don't have to pay for that compute for the rest of the day. So Eon has been a nice boon for us for both those reasons. >> I'd love to explore Eon a little bit more. It's relatively new: I think Vertica announced Eon Mode in 2018, so it's only been out there a couple of years. So I'm curious, for the folks that haven't moved to Eon Mode, and presumably they want to for the same reasons you mentioned (why buy storage in chunks if you don't have to?): what were some of the challenges you faced in going to Eon Mode? What kind of things did you have to prepare for? Were there any out-of-scope expectations? Can you share that experience with us? >> Sure, so we were an early adopter; we participated in the beta program. I think it's fair to say we actually drove the requirements in a lot of ways, because we approached Vertica early on. So the challenges were what you'd expect any early adopter to go through: the sort of getting-things-working-as-expected cases, a number of which I could touch upon. For example, we found an inefficiency in the way it accessed the data on S3; it was accessing the data too frequently, which ended up being expensive, so our S3 bill went up pretty significantly for a couple of months. That was a challenge, but we worked through it. Another area where Vertica recently made huge strides was the ability to stop and start nodes, have them start very quickly, and, when they start, not interfere with any running queries. When we wanted to spin up a bunch of compute, there was a point in time when it would break certain queries that were already running.
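To make the elasticity Ron describes concrete, here is a minimal sketch of that daily spin-up/spin-down cycle. It assumes a secondary subcluster already exists; the database and subcluster names are hypothetical, and the admintools verbs follow the Vertica 9.3/10-era Eon Mode tooling as we understand it, so verify them against your version.

```python
# Hypothetical orchestration of the "lumpy" daily pattern Ron describes:
# restart a reporting subcluster after the final hourly load completes,
# run the report queue, then stop the nodes to avoid paying for idle compute.
import subprocess

DB = "tradedesk"           # hypothetical database name
SUBCLUSTER = "reporting"   # hypothetical secondary subcluster

def admintools(*args: str) -> None:
    """Run an admintools command on a cluster node and fail loudly on error."""
    subprocess.run(["admintools", *args], check=True)

def scale_up() -> None:
    # Restart the stopped secondary subcluster; its nodes resume serving
    # queries from communal storage on S3 once their depots warm up.
    admintools("-t", "restart_subcluster", "-d", DB, "-c", SUBCLUSTER)

def scale_down() -> None:
    # Stop the subcluster's nodes (and, on AWS, stop the instances) so the
    # other ~18 hours of the day cost almost nothing in compute.
    admintools("-t", "stop_subcluster", "-d", DB, "-c", SUBCLUSTER)

if __name__ == "__main__":
    scale_up()
    # ... run the post-midnight report queue here ...
    scale_down()
```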
That was a challenge too, but the Vertica team has been quite responsive to solving these issues, and now that's behind us. For those looking to get started, there are a number of things to think about. Off the top of my head, there are new configuration items you'll want to consider, like instance type. Amazon has a variety of instance types, and it's important to consider one of Vertica's architectural advantages here: Vertica has a caching layer on the instances themselves. What that does is, if we can keep the data in cache, what we've found is that the performance is basically the same as Enterprise Mode. So having a good-sized cache when you need it really matters. We went with the i3 instance types, which have a lot of local NVMe storage, so we can cache data and get good performance. That's one thing to think about. The number of nodes, the instance type, and certainly the number of shards are technical items that need to be considered; the shard count is how the data gets distributed, sort of a layer on top of the segmentation that some Vertica engineers will be familiar with. And probably one of the big things to consider is how to get data into the database. If you have an existing database, there's no nice tool yet to pull all the data into an Eon database. I think they're working on that, but we got there ourselves: we exported all our data out of the enterprise cluster, dumped it out to S3, and then had the Eon cluster load that data. >> Awesome advice, thank you for sharing that with the community. So, at the end of the day, it sounds like you had some learning to do, some tweaking to do, and obviously had to work out how to get the data in. At the end of the day, was it worth it? What was the business impact? >> Yeah, it definitely was worth it for us. Right now we have four times the data in our Eon cluster that we have in our enterprise clusters. We still run some enterprise clusters; we started with four at the peak, now we're down to two, and we have the two Eon clusters. I think our business would say it's been a huge win. We're doing things that we really never could have done before; accessing that much data on enterprise would have been really difficult. It would have required non-trivial engineering, like daisy-chaining clusters together and then aggregating data across clusters, which would, again, be non-trivial. Now we have all the data we want, we can continue to grow data, and we're running reports on seasonality, so our customers can compare their campaigns last year versus this year, which is something we just haven't been able to do in the past. So we grew the data vertically, and we've expanded it horizontally as well: we're adding columns to our aggregates, and we're enriching the data much more than we have in the past. So while we still have enterprise kicking around, I'd say our Eon clusters are doing the majority of the heavy lifting. >> And the cloud was part of the enablement here, particularly with scale, is that right? And are you running certain... >> Definitely. >> And are you running on prem as well, or are you in a hybrid mode? Or is it all AWS? >> Great question, so yeah: when I've been speaking about enterprise, I've been referring to on prem.
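For readers weighing the same move, here is a rough sketch of the export-to-S3 path Ron describes, using Vertica's Parquet export and COPY. Table names, bucket, and connection details are hypothetical stand-ins, and it assumes matching DDL already exists on the Eon cluster and that S3 credentials are configured on both sides.

```python
# One-time migration sketch: export each table out of the Enterprise Mode
# cluster to S3 as Parquet, then have the Eon cluster load it with COPY.
import vertica_python  # pip install vertica-python

ENTERPRISE = {"host": "enterprise-node1", "port": 5433,
              "user": "dbadmin", "password": "...", "database": "tradedesk"}
EON = {"host": "eon-node1", "port": 5433,
       "user": "dbadmin", "password": "...", "database": "tradedesk_eon"}

TABLES = ["bids_daily_agg", "impressions_daily_agg"]  # hypothetical tables
BUCKET = "s3://tradedesk-migration"                   # hypothetical bucket

with vertica_python.connect(**ENTERPRISE) as src:
    cur = src.cursor()
    for t in TABLES:
        # Write the table out as Parquet files under a per-table prefix.
        cur.execute(
            f"EXPORT TO PARQUET(directory = '{BUCKET}/{t}') "
            f"AS SELECT * FROM {t};"
        )

with vertica_python.connect(**EON) as dst:
    cur = dst.cursor()
    for t in TABLES:
        # Load the exported files into the pre-created table on the Eon side.
        cur.execute(f"COPY {t} FROM '{BUCKET}/{t}/*.parquet' PARQUET;")
```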
We have physical machines in data centers, so yes, we are running a hybrid now. And it's really hard to get an apples-to-apples direct comparison of enterprise on prem versus Eon in the cloud. One thing that I touched upon in my presentation: if I try to get apples to apples, I think about how I would run the entire workload on enterprise, or entirely on Eon, and how many CPU cores we would need to do that. Basically, it would be about the same number of cores for enterprise on prem versus Eon in the cloud. However, about half the Eon cores only need to be running about six hours out of the day; the other 18 hours I can shut them down and not be paying for them, mostly. >> Interesting, okay. So I've got to ask you: notwithstanding the fact that you've got a lot invested in Vertica, and a lot of experience there, there are a lot of emerging cloud databases. You know a lot about databases, not just Vertica; you're a database guru in many areas, traditional RDBMS as well as MPP and the new cloud databases. What is it about Vertica that works for you in this specific sweet spot that you've chosen? What's really the difference there? >> Yeah, so I think the key difference is maturity. I'm familiar with a number of other database platforms, in the cloud and otherwise, column stores specifically, that don't have the maturity that we're used to and that we need at our scale. Being able to specify alternate projections, so different sort orders on my data, is huge, and there are other platforms where we don't have that capability. Vertica is, of course, the original column store, and they've had time to build up a lead in terms of their maturity and features; I think the other column stores, cloud or otherwise, are playing a little bit of catch-up in that regard. Of course, Vertica is playing catch-up on the cloud side. But if I had to pick whether I wanted to write a column store from scratch or a cloud file system from scratch, I think it would be easier to write the cloud file system. The column store is where the real smarts are. >> Interesting. Let's talk a little bit about some of the challenges you have in reporting. You have a very dynamic nature of reporting: like you said, your clients want a time series, they don't just want a snapshot of a slice. But at the same time, your reporting is probably pretty lumpy, a very dynamic demand curve. So first of all, is that accurate? Can you describe that dynamism and how you're handling it? >> Yep, that's exactly right. It is lumpy, and that's the exact word that I use. At the end of the UTC day, when UTC midnight rolls around, that's when we do the final ingest and the final aggregate, and then the queue of reports that need to run spikes. The majority of those 40,000 reports we run per day are run in the four to six hours after that spike. So that's when we need to have all the compute come online, and that's what helps us answer all those queries as fast as possible. That's a big reason why Eon is an advantage for us: the rest of the day we don't necessarily need all that compute, and we can shut it down and not pay for it.
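Ron's apples-to-apples comparison is easy to reproduce as back-of-the-envelope arithmetic. The numbers below are illustrative stand-ins, not Trade Desk figures; the point is how the "same cores, fewer hours" math produces the savings he describes.

```python
# Roughly the same total core count either way, but in Eon Mode about half
# the cores only need to run ~6 hours a day (the post-midnight report spike).
TOTAL_CORES = 1000          # hypothetical core count for the full workload
ALWAYS_ON_SHARE = 0.5       # share of cores running around the clock (ingest)
BURST_HOURS_PER_DAY = 6     # reporting window after UTC midnight

enterprise_core_hours = TOTAL_CORES * 24
eon_core_hours = (TOTAL_CORES * ALWAYS_ON_SHARE * 24
                  + TOTAL_CORES * (1 - ALWAYS_ON_SHARE) * BURST_HOURS_PER_DAY)

print(f"Enterprise-style core-hours/day: {enterprise_core_hours:,.0f}")
print(f"Eon-style core-hours/day:        {eon_core_hours:,.0f}")
print(f"Compute savings: {1 - eon_core_hours / enterprise_core_hours:.0%}")
```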
>> So Ron, I wonder if you could share with us, just to sort of wrap here, where you want to take this. You're obviously very close to Vertica; are you driving them hard on Eon Mode? You mentioned before that the ability to load data into Eon Mode would have been nice for you, but I guess you're kind of over that hump. What are the kinds of things you'd want? If Colin Mahony were here in the room, what would you tell him you want the engineering team at Vertica to work on that would make your life better? >> I think the things that need the most attention near term are about smoothing out some of the edges, making it a little more seamless in terms of the cloud aspects. Our goal is to be able to start instances and have them join the cluster in less than five minutes; we're not quite there yet. If you look at some of the other cloud database platforms, they're beating that handily, so I know the team is working on it. Some of the other things are about control. Like I mentioned, while we like the control the column store gives us, we also want control on the cloud side of things, in terms of being able to dedicate clusters to specific workloads. We can pin workloads against a specific subcluster and take advantage of the cache that's over there, and say, okay, this resource pool belongs here. The subcluster is a relatively new concept for Vertica, so what we're after is being able to control many things at the subcluster level: resource pools, configuration parameters, and so on. >> Yeah. I mean, I personally have always been impressed with Vertica and their ability to ride the waves and adopt new trends. They do have a robust stack; it's been around 10-plus years. They embraced Hadoop, they're embracing machine learning, and we've been talking about the cloud. So I actually have a lot of confidence in them, especially when you compare them to the other mid-last-decade MPP column stores that came out; Vertica is one of the few remaining, certainly as an independent brand. I think that speaks to the team there and the engineering culture. But you get the final word: final thoughts on your role, the company, Vertica, wherever you want to take it. >> Yeah, we're really appreciative, and we value the partnership that we have, so I think it's been a win-win. Our volumes keep growing, and I know that we have some data that got pulled into their test suite. So I think it's been a win-win for both sides, and it'll be a win for other Vertica customers and prospects, knowing that they're working with some of the highest volume, velocity, and variety data that (mumbles) >> Well, Ron, thanks for coming on. I wish we could have met face to face at the Encore in Boston; I think next year we'll be able to do that. But I appreciate that technology allows us to have these remote conversations. Stay safe, all the best to you and your family, and thanks again. >> My pleasure, David, good speaking with you. >> And thank you for watching, everybody. This is theCUBE's coverage of the Vertica Virtual Big Data Conference. I'm Dave Vellante, we'll be right back right after this short break. (soft music)
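As a footnote to Ron's wish list: subcluster-scoped resource pools did begin appearing in Vertica around this time. Here is a hedged sketch of what pinning a reporting workload to a dedicated subcluster might look like. Pool, subcluster, and user names are hypothetical, and the syntax follows our reading of the Vertica 9.3+ documentation, so verify it against your version.

```python
# Create a resource pool scoped to the reporting subcluster, so sessions that
# land on those nodes get dedicated memory and concurrency.
import vertica_python

conn_info = {"host": "eon-node1", "port": 5433, "user": "dbadmin",
             "password": "...", "database": "tradedesk_eon"}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # Subcluster-scoped pool: only sessions on 'reporting' nodes draw from it.
    cur.execute("""
        CREATE RESOURCE POOL report_pool FOR SUBCLUSTER reporting
            MEMORYSIZE '60%'
            MAXCONCURRENCY 24
            QUEUETIMEOUT 300;
    """)
    # Make the report-running service account default into the pool.
    # (Depending on version, this may also require GRANT USAGE on the pool.)
    cur.execute("ALTER USER report_svc RESOURCE POOL report_pool;")
```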

Published Date : Mar 31 2020


Keynote Analysis | Virtual Vertica BDC 2020


 

(upbeat music) >> Announcer: It's theCUBE, covering the Virtual Vertica Big Data Conference 2020. Brought to you by Vertica. >> Dave Vellante: Hello everyone, and welcome to theCUBE's exclusive coverage of the Vertica Virtual Big Data Conference. You're watching theCUBE, the leader in digital event tech coverage, and we're broadcasting remotely from our studios in Palo Alto and Boston. We're pleased to be covering this digital event wall-to-wall. Now, as you know, originally BDC was scheduled this week at the new Encore Hotel and Casino in Boston. Their theme was "Win big with big data". Oh sorry, "Win big with data". That's right, got it. And I know the community was really looking forward to that meet-up, but look, we're making the best of it, given these uncertain times. We wish you and your families good health and safety, and this is the way that we're going to broadcast for the next several months. Now, we want to unpack Colin Mahony's keynote, but before we do that, I want to give a little context on the market. First, theCUBE has covered every BDC since its inception, since the BDC's inception that is. It's a very intimate event, with a heavy emphasis on user content. Historically, the data engineers and DBAs in the Vertica community have comprised the majority of the content at this event, and that's going to be the same for this virtual, or digital, production. Now, theCUBE is going to be broadcasting for two days, concurrent with the Virtual BDC. We've got practitioners coming on the show, DBAs, data engineers, database gurus; we've got security experts coming on, and really a great lineup. And, of course, we'll also be hearing from Vertica execs: Colin Mahony himself right off the keynote, folks from product marketing, partners, and a number of experts, including some from Micro Focus, which is, of course, the owner of Vertica. But I want to take a moment to share a little bit about the history of Vertica. The company, as you know, was founded by Michael Stonebraker, and Vertica started out as a SQL platform for analytics. It was the first, or at least one of the first, to really nail the MPP column store trend. Not only did Vertica have an early-mover advantage in MPP, but the efficiency and scale of its software, relative to traditional DBMS and also other MPP players, is underscored by the fact that Vertica, and the Vertica brand, really thrives to this day. But I have to tell you, it wasn't without some pain, and I'll talk a little bit about that and about how we got here today. So first, think about traditional transaction databases, like Oracle or IBM DB2, or even enterprise data warehouse platforms like Teradata: they were simply not purpose-built for big data. Vertica was, along with a whole bunch of other players: Netezza, which was bought by IBM; Aster Data, which is now Teradata; Actian; ParAccel, which was the basis for Amazon's Redshift; and Greenplum, which was bought, in the early days, by EMC. These companies were really designed to run as massively parallel systems that smoked traditional RDBMS and EDW for particular analytic applications. Back in the big data days, I often joked that, like an NFL draft, there was a run on MPP players, like when you see a run on pulling guards: once one goes, they all start to fall. And that's what you saw with the MPP columnar stores: IBM, EMC, and then HP getting into the game.
So, it was 2011, and Leo Apotheker was the new CEO of HP. Frankly, he had no clue, in my opinion, what to do with Vertica, and he totally missed one of the biggest trends of the last decade: the data trend, the big data trend. HP picked up Vertica for a song; it wasn't disclosed, but my guess is that it was around 200 million. So, rather than build a bunch of smart offerings around Vertica, which I always called the diamond in the rough, Apotheker basically permanently altered HP for years. He kind of ruined HP, in my view, with a 12-billion-dollar purchase of Autonomy, which turned out to be one of the biggest disasters in recent M&A history. HP was forced to spin-merge, and ended up selling most of its software to Micro Focus. (laughs) Luckily, during its time at HP, CEO Meg Whitman was largely distracted with what to do with the mess that she inherited from Apotheker, so Vertica was left alone. Now, the upshot is Colin Mahony, who was then the GM of Vertica, and still is. By the way, he's really the CEO; he just doesn't have the title, and I actually think they should give it to him. But anyway, he's been at the helm the whole time, and Colin, as you'll see in our interview, is a rockstar: he's got technical and business chops, and people love him in the community. Vertica's culture is really engineering-driven, and they're all about data. Despite the fact that Vertica is a 15-year-old company, they've really kept pace and not been polluted by legacy baggage. Vertica early on embraced Hadoop and the whole open-source movement, and that helped give it tailwinds. It leaned heavily into cloud, as we're going to talk about further this week, and they've got a good story around machine intelligence and AI. So, whereas many traditional database players are really getting hurt, and some are getting killed, by cloud database providers, Vertica is actually doing a pretty good job of servicing its install base and is in a reasonable position to compete for new workloads. On its last earnings call, the Micro Focus CFO, Stephen Murdoch, said they're investing 70 to 80 million dollars in two key growth areas: security and Vertica. Now, Micro Focus is running its SUSE play on these two parts of its business. What I mean by that is they're investing and allowing them to be semi-autonomous, spending on R&D and go-to-market. And they have no hardware agenda, unlike when Vertica was part of HP, or HPE, I guess HP, before the spin-out. Now, let me come back to the big trend in the market today: there's something going on around analytic databases in the cloud. You've got companies like Snowflake and AWS with Redshift, and as we've reported numerous times, they're doing quite well; they're gaining share, especially of new workloads that are emerging, particularly in the cloud-native space. They combine scalable compute, storage, and machine learning, and, importantly, they're allowing customers to scale compute and storage independently of each other. Why is that important? Because you don't have to buy storage every time you buy compute, or vice versa, in chunks. If you can scale them independently, you've got granularity. Vertica is keeping pace. In talking to customers, Vertica is leaning heavily into the cloud, supporting all the major cloud platforms, as we heard from Colin earlier today, adding Google.
And while my research shows that Vertica has some work to do in cloud and cloud native, to simplify the experience, its more robust and mature stack, which supports many different environments, deep SQL, and ACID properties, gives Vertica the DNA to compete with these cloud-native database suppliers. Now, Vertica might lose out in some of those native workloads. But I have to say, from my experience talking with customers, if you're looking for a great MPP column store that scales and runs in the cloud or on-prem, Vertica is in a very strong position. Vertica claims to be the only MPP columnar store to allow customers to scale compute and storage independently, both in the cloud and in hybrid environments on-prem, and across clouds as well. So, while Vertica may be at a disadvantage in a pure cloud-native bake-off, its more mature stack, combined with its multi-cloud strategy, gives Vertica a compelling set of advantages. We heard a lot of this from Colin Mahony, who announced Vertica 10.0 in his keynote. He really emphasized Vertica's multi-cloud affinity and its Eon Mode, which allows that separation, or scaling, of compute independent of storage, both in the cloud and on-prem. Vertica 10, according to Mahony, is making big bets on in-database machine learning (he talked about that, and AI, along with some advanced regression techniques). He talked about PMML models and Python integration, which was actually something that they talked about doing with Uber and some other customers. Now, Mahony also stressed the trend toward object stores. Vertica now supports Eon on S3 in Google Cloud, in addition to AWS, and Pure Storage and HDFS support Eon Mode as well. Mahony also stressed, as I mentioned earlier, a big commitment to on-prem and the whole cloud-optionality thing. So 10.0, according to Colin Mahony, is all about really doubling down on these industry waves: enabling native PMML models, running them in Vertica, and really doing all the work that's required around ML and AI. They also announced support for TensorFlow. So, object store optionality is important, as is what he talked about with Eon Mode and the news of support for Google Cloud as well as HDFS. And finally, a big focus on deployment flexibility and migration tools, which is really a critical focus on improving ease of use, and you hear this from a lot of customers. So, these are the critical aspects of Vertica 10.0, an announcement that we're going to be unpacking all week with some of the experts that I talked about. So, I'm going to close with this. My long-time co-host, John Furrier, and I have talked for some time about this new cocktail of innovation. No longer is Moore's Law the mainspring of innovation; it's now about taking all these data troves, bringing machine learning and AI into that data to extract insights, and then operationalizing those insights at scale, leveraging cloud. And one of the things I always look for from cloud is, if you've got a cloud play, you can attract innovation in the form of startups. It's part of the success equation, certainly for AWS, and I think it's one of the challenges for a lot of the legacy on-prem players. Vertica, I think, has done a pretty good job in this regard, and we're going to look this week for evidence of that innovation. One of the interviews that I'm personally excited about this week is a new-ish company, I would consider them a startup, called Zebrium.
What they're doing, is they're applying AI to do autonomous log monitoring for IT ops. And, I'm interviewing Larry Lancaster, who's their CEO, this week, and I'm going to press him on why he chose to run on Vertica and not a cloud database. This guy is a hardcore tech guru and I want to hear his opinion. Okay, so keep it right there, stay with us. We're all over the Vertica Virtual Big Data Conference, covering in-depth interviews and following all the news. So, theCUBE is going to be interviewing these folks, two days, wall-to-wall coverage, so keep it right there. We're going to be right back with our next guest, right after this short break. This is Dave Vellante and you're watching theCUBE. (upbeat music)
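To ground the keynote's in-database ML and PMML claims, here is a minimal sketch of what that workflow looks like from a client. Table, model, and connection names are hypothetical; the function signatures follow the Vertica 10 documentation as we understand it.

```python
# Train and score a regression without moving data out of the database, then
# import an externally trained PMML model, as announced for Vertica 10.0.
import vertica_python

with vertica_python.connect(host="vertica-host", port=5433, user="dbadmin",
                            password="...", database="analytics") as conn:
    cur = conn.cursor()

    # Train a linear regression in-database on a training table.
    cur.execute("""
        SELECT LINEAR_REG('price_model', 'train_data', 'price',
                          'beds, baths, sqft');
    """)

    # Score new rows with the stored model.
    cur.execute("""
        SELECT PREDICT_LINEAR_REG(beds, baths, sqft
                                  USING PARAMETERS model_name = 'price_model')
        FROM new_listings;
    """)

    # Vertica 10 can also import models trained elsewhere, exported as PMML.
    cur.execute("""
        SELECT IMPORT_MODELS('/models/churn_model.pmml'
                             USING PARAMETERS category = 'PMML');
    """)
```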

Published Date : Mar 31 2020


Dan Woicke, Cerner Corporation | Virtual Vertica BDC 2020


 

(gentle electronic music) >> Hello everybody, welcome back to the Virtual Vertica Big Data Conference. My name is Dave Vellante and you're watching theCUBE, the leader in digital coverage. This is the Virtual BDC; as I said, theCUBE has covered every Big Data Conference from the inception, and we're pleased to be a part of this, even though it's challenging times. I'm here with Dan Woicke, the senior director of CernerWorks Engineering. Dan, good to see ya. How are things where you are, in the middle of the country? >> Good morning. Challenging times, as usual. We're trying to adapt to having the kids at home, out of school, trying to figure out how they're supposed to get on their laptops and do virtual learning. We all have to adapt to it and figure out how to get by. >> Well, it sure would've been my pleasure to meet you face to face in Boston at the Encore Casino; hopefully next year we'll be able to make that happen. But let's talk about Cerner and CernerWorks Engineering. What is that all about? >> So, CernerWorks Engineering used to be part of what's called IP, or Intellectual Property, which is basically the organization at Cerner that does all of our software development. But we made a decision about five years ago to align my team with CernerWorks, which is the hosting side of Cerner. About 80% of our clients choose to have their domains hosted within one of the two Kansas City data centers: we have one in Lee's Summit, in south Kansas City, and then we have a brand new one on our main campus in downtown, north Kansas City. We have about 27,000 environments that we manage in the Kansas City data centers. What my team does is develop software to make it easier for us to monitor, manage, and keep those clients healthy within our data centers. >> Got it. I mean, I think of Cerner as a real advanced health tech company. It's the combination of healthcare and technology, the collision of those two. But maybe describe a little bit more about Cerner's business. >> So we have, like I said, 27,000 facilities across the world, growing each day, thank goodness. Our goal is to ensure that we reduce errors and digitize the entire medical record for all of our clients. We do that by having a consulting practice, we do that with engineering, and then we do that with my team, which manages those particular clients. And that's how we got introduced to the Vertica side as well, when we brought them in about seven years ago; we were able to take a tremendous leap forward in how we manage our clients. And I'd be more than happy to talk deeper about how we do that. >> Yeah, and as we get into it, I want to understand: healthcare is all about outcomes, about patient outcomes, and you work back from there. IT, for years, has obviously been a contributor, but removed, and somewhat indirect from those outcomes. But in this day and age, especially in an organization like yours, it really starts with the outcomes. I wonder if you could speak to that and talk about what that means for Cerner. >> Sorry, are you talking about medical outcomes? >> Yeah, outcomes of your business. >> So, there's two different sides to Cerner, right? There's the medical side, the clinical side, which is obviously our main practice, and then there's the side that I manage, which is more of the operational side. Both are very important, but they go hand in hand together.
On the operational side, the goal is to ensure that our clinicians are on the system and don't know they're on the system, right? Things are progressing; doctors don't want to be on the system, trust me. My job is to ensure they're having the most seamless experience possible while they're on the EMR, and have it just be a side task, as opposed to something taking their attention away from the patients. Does that make sense? >> Yeah, it does. I mean, EMR and meaningful use, around the Affordable Care Act, really dramatically changed the industry. People had to demonstrate usage in order to get paid, and so that became sort of an unfunded mandate for folks, and you really had to respond to that, didn't you? >> We did; we did that about three to four years ago. We had to help our clients get through what's called meaningful use; there were different stages of meaningful use. And what we did is we have this website called the Lights On Network, which is free to all of our clients. Once you get onto the Lights On Network website, you can actually see how you're measured and whether or not you're completing the different tasks necessary to get those payments for meaningful use. It also allows you to see what the performance is on your domain: how the clinicians are doing on the system, how many hours they're spending on the system, how many orders they're executing. All of that is completely free and visible to our clients on the Lights On Network, and that's actually backed by some of the Vertica software that we've invested in. >> Yeah, so before we get into that, it sounds like your mission, really, is just great user experiences for the people that are on the network. Full stop. >> We do. So, one of the things that we invented about 10 years ago is called RTMS timers: the Response Time Measurement System. It started off as a way of proving that clients are actually using the system, and now it's turned into more of a user-outcomes measure. We collect 2.5 billion timers per day across all of our clients across the world, and every single one of those records goes to the Vertica platform. We've also developed a system on top of that which allows us, in real time, to see whether or not users are deviating from their normal. We compute baselines for every hour of the week, and if users are deviating from those baselines, we can immediately call a service center and have them engage the client before the client calls in. >> So, Dan, I wonder if you could paint a picture (by the way, that's awesome) of your analytics environment. What does it look like? Maybe give us a sense of the scale. >> Okay. So, I've been describing how we operate our remote-hosted clients in the two Kansas City data centers, but the software that we write helps our client-hosted sites as well. Not only do we take care of what's going on in the Kansas City data centers, but we write software to ensure that all of our clients are treated the same, and we provide the same level of care and performance management across all those clients. So we have 90,000 agents split across all these clients across the world, and every single hour we're committing a billion rows of operational data to Vertica. I talked a little bit about the RTMS timers, but we do things just like everyone else does for CPU, memory, and Java heap.
We can tell you how many concurrent users are on the system; I can tell you if there's an application that goes down unexpectedly, like a crash; I can tell you the response time from the network, as most of us use Citrix at Cerner. We measure the amount of time it takes from the client-side PCs sitting in the hospitals, round trip to the Citrix servers sitting in the Kansas City data center. That's called the RTT, our round-trip transactions. And over the last couple of years, we've switched from just summarizing CPU and memory and all that high-level stuff to going down to a user level. What are you doing, Dr. Smith, today? How many hours are you using the EMR? Have you experienced any slowness? Have you experienced any hourglassing within your application? Have you experienced, unfortunately, maybe a crash? Have you experienced any slowness compared to your normal use case? That's the step we've taken over the last few years: from summarization of high-level CPU and memory, over to outcome metrics, which are what is really happening with a particular user. >> So, really granular views of how the system is being used, and deep analytics on that. I wonder... go ahead, please. >> And we weren't able to do that by summarizing things in traditional databases. You have to actually have the individual rows; you can't summarize information, you have to have individual metrics that point to exactly what's going on with a particular clinician. >> So, okay: the MPP architecture, the columnar store, the scalability of Vertica, that's what's key. That was my next question. Take us back to the days of traditional RDBMS, and then you brought in Vertica. Maybe you could give us a sense as to why, and what that did for you: the before and after. >> Right. So, I've been painting a picture here of how, traditionally, eight years ago, all we could do was summarize information. If CPU jumped up 8%, I could alert the data center and say, hey, listen, CPU looks higher; maybe an application's hanging more than it has been in the past; things are a little slower. But I wouldn't be able to tell you who's affected. That's where the whole thing changed when we brought Vertica in six years ago: we're able to take those 90,000 agents, commit a billion rows per hour of operational data, and tell you exactly what's going on with each of our clinicians. It's important for an entire domain to be healthy, but what about the 10 doctors who are experiencing frustration right now? If you summarize that information and roll it up, you'll never know what those 10 doctors are experiencing, and then guess what happens? They call the data center and complain, right? The squeaky wheels? We don't want that; we want to be able to show exactly who's experiencing bad performance right now and reach out to them before they call the help desk. >> So you're able to be proactive there. You've gone from "Houston, we have a problem, we really can't tell you what it is, go figure it out" to "we see that there's an issue with these docs, or these users", and you go figure that out and focus narrowly on where the problem is, as opposed to playing whack-a-mole. >> Exactly.
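A rough sketch of the hour-of-week baseline check Dan describes: compare each user's last hour against that same hour-of-week baseline and flag big deviations. The schema and threshold are hypothetical stand-ins for Cerner's RTMS data, not their actual implementation.

```python
# Flag users whose last-hour response times deviate from their hour-of-week
# baseline, so the service center can reach out before the phone rings.
import vertica_python

FLAG_SQL = """
WITH baseline AS (
    SELECT user_id,
           DAYOFWEEK(event_ts) AS dow,
           HOUR(event_ts)      AS hod,
           AVG(response_ms)    AS avg_ms,
           STDDEV(response_ms) AS sd_ms
    FROM rtms_timers
    WHERE event_ts >= CURRENT_TIMESTAMP - INTERVAL '56 days'
    GROUP BY 1, 2, 3
),
last_hour AS (
    SELECT user_id,
           DAYOFWEEK(event_ts) AS dow,
           HOUR(event_ts)      AS hod,
           AVG(response_ms)    AS avg_ms
    FROM rtms_timers
    WHERE event_ts >= CURRENT_TIMESTAMP - INTERVAL '1 hour'
    GROUP BY 1, 2, 3
)
SELECT l.user_id, l.avg_ms, b.avg_ms AS baseline_ms
FROM last_hour l
JOIN baseline b USING (user_id, dow, hod)
WHERE l.avg_ms > b.avg_ms + 3 * b.sd_ms;   -- flag 3-sigma deviations
"""

with vertica_python.connect(host="vertica-host", port=5433, user="dbadmin",
                            password="...", database="cernerworks") as conn:
    cur = conn.cursor()
    cur.execute(FLAG_SQL)
    for user_id, now_ms, base_ms in cur.fetchall():
        # In production this would page the service center proactively.
        print(f"user {user_id}: {now_ms:.0f} ms vs baseline {base_ms:.0f} ms")
```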
And the other big thing that we've been able to do is correlation. We operate two gigantic data centers, and there are things that are shared: switches, network, shared storage. So if there's an issue with one of those pieces of equipment, it can affect multiple clients. Now that we have every row in Vertica, we have a new program in place called performance abnormality flags. We're able to provide a website, in real time, that goes through the entire stack, from Citrix to network to database to back-end tier, all the way to the end-user desktop. So if something is related, because a network switch is going out in the data center or something's backing up slowly, you can actually see which clients are on that switch. What we did five years ago, before this, is we would deploy five different teams to troubleshoot, right? Because five clients would call in, and they would all have the same problem. So here you are with separate teams trying to investigate why the same problem is happening. Now that we have all of the data within Vertica, we're able to show that in real time, through a very transparent dashboard. >> And so, operational metrics throughout the stack, right? A game changer. >> It's very compact, right? I just labeled five different tiers: the stack from your end-user device, all the way through the back end to your database, and all the way back. All of that has to work properly, including the network. >> How big is this? What are we talking about, however you measure it: terabytes, clusters? What can you share there? >> Sorry, you mean the amount of data that we process within our data centers? >> Give us a fun fact. >> Absolutely petabytes, yeah, for sure. In Vertica right now we have two petabytes of data, and we purge it every year, keeping one year's worth of data within two different clusters. So across the two data centers I've been describing, we've set Vertica up to be in both data centers, highly redundant. One of those is configured to do real-time analysis and correlation research, and the other one provides service to what I described earlier as our Lights On Network: a very dedicated, hardened cluster in one of our data centers that provides transparency directly to our clients. We want that one to be pristine and fast, and nobody touches it, as opposed to the other one, where people are doing real-time, ad hoc queries, which sometimes aren't the best thing in the world. No matter what kind of database you have or how fast it is, people do bad things in databases, and we just don't want that to affect what we show our clients in a transparent fashion. >> Yeah, I mean, for our audience: Vertica has always been aimed at these big, hairy analytic problems. It's not for a tiny little data mart in a department, it's really for the big-scale problems. I wonder if I could ask you: you guys are obviously healthcare, with HIPAA and privacy. Are you doing anything in the cloud, or is it all on-prem today? >> So, in the operational space that I manage, it's all on-premises, and that is changing. As I was describing earlier, we have an initiative to go to AWS and provide levels of service to countries like Sweden, which does not want any data to leave the country's walls, whether it be operational data or PHI. And so we have to be able to adopt Vertica Eon Mode in order to provide the same services within Sweden.
Obviously, Cerner's not going to go build a data center in every single country that requires this, so we're going to leverage our partnership with AWS to make it happen. >> Okay, so you're not running Eon Mode today; it's something that you're obviously interested in, and AWS will allow you to keep the data locally in that region. In talking to a lot of practitioners, they're intrigued by this notion of being able to scale storage independently of compute. They've said they wished for that; it's a much more efficient way: I don't have to buy in chunks, and if I'm out of storage, I don't have to buy compute, and vice versa. So maybe you could share with us what you're thinking. I know it's early days, but what's the logic behind the business case there? >> I think you're 100% correct in your assessment of separating compute from storage. We do exactly what you say: we buy a server, and it has so much compute on it and so much storage, and obviously it's not scaled properly, right? Either storage runs out first or compute runs out first, but you're still paying big bucks for the entire server itself. So that's exactly why we're doing the POC right now for Eon Mode. And I sit on Vertica's TAB, the technical advisory board, and they've been doing a really good job of taking our requirements and listening to what we need. Separating storage from compute was probably number one or two on everybody's list, and that's exactly what we're trying to do right now. >> Yeah, it's interesting. I've talked to some other customers that are on the customer advisory board, and Vertica is one of these companies that're pretty transparent about what goes on there. I think that for the early adopters of Eon Mode there were some challenges with getting data into the new system; I know Vertica has been working on that very hard. But you guys push Vertica pretty hard, and from what I can tell, they listen. Your thoughts? >> They do listen; they do a great job. And even though the Big Data Conference was canceled, they're committed to having us attend the advisory board meeting virtually on Monday, so I'm looking forward to that. They do listen to our requirements, and they've been very, very responsive. >> Nice. So I wonder if you could give us some final thoughts as to where you want to take this thing. If you look down the road a year or two, what does success look like, Dan? >> That's a good question. Success means that we're a little bit more nimble, as far as the different regions across the world that we can provide our services to. I want to do more correlation. I want to gather more information about what users are actually experiencing. I want our phone to never ring in the data center; I know that's a grand thought. But I want to look forward to measuring the data internally, reaching out to our clients when they have issues, and doing the proper correlation so that I can understand how things are intertwined when multiple clients are having an issue. That's the goal going forward. >> Well, in these trying times, during this crisis, it's critical that your operations run smoothly. The last thing that organizations need right now, especially in healthcare, is disruption. So thank you for all the hard work that you and your teams are doing. I wish you and your family all the best. Stay safe, stay healthy, and thanks so much for coming on theCUBE. >> I really appreciate it, thanks for the opportunity.
>> You're very welcome, and thank you, everybody, for watching, keep it right there, we'll be back with our next guest. This is Dave Vellante for theCUBE. Covering Virtual Vertica Big Data Conference. We'll be right back. (upbeat electronic music)
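As an illustration of the correlation Dan describes, here is a simplified sketch of the kind of query his performance abnormality flags enable once every tier's metrics land in one place: ask which shared component the currently flagged clients have in common. The schema is hypothetical.

```python
# Find shared switches that several currently flagged clients sit behind, so
# one team investigates the switch instead of five teams chasing five
# identical client complaints.
import vertica_python

CORRELATE_SQL = """
SELECT t.switch_id,
       COUNT(DISTINCT f.client_id) AS affected_clients
FROM   perf_abnormality_flags f
JOIN   client_topology t USING (client_id)
WHERE  f.flagged_at >= CURRENT_TIMESTAMP - INTERVAL '15 minutes'
GROUP  BY t.switch_id
HAVING COUNT(DISTINCT f.client_id) >= 3   -- several clients, one component
ORDER  BY affected_clients DESC;
"""

with vertica_python.connect(host="vertica-host", port=5433, user="dbadmin",
                            password="...", database="cernerworks") as conn:
    cur = conn.cursor()
    cur.execute(CORRELATE_SQL)
    for switch_id, n in cur.fetchall():
        print(f"switch {switch_id}: {n} clients flagged")
```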

Published Date : Mar 31 2020


Gabriel Chapman, Pure Storage | Virtual Vertica BDC 2020


 

>> Announcer: It's theCUBE covering the Virtual Vertica Big Data Conference 2020, brought to you by Vertica. >> Hi everybody, and welcome to this CUBE special presentation of the Vertica Virtual Big Data Conference. theCUBE is running in parallel with day one and day two of the Vertica Big Data event. By the way, theCUBE has been at every single Big Data event, and it's our pleasure to be here in the virtual, slash, digital event as well. Gabriel Chapman is here. He's the director of Flash Blade Product Solutions Marketing at Pure Storage. Great to see you, thanks for coming on. >> Great to see you too. How's it going? >> It's going very well. I mean, I wish we were meeting in Boston at the Encore Hotel, but hopefully we'll be able to meet at Accelerate at some point in the future, or at one of the regional shows you guys are doing; we've been covering those as well. But I really want to get into it. At the last Accelerate, in September 2019, Pure and Vertica announced a partnership. I remember Joy ran up to me and said, hey, you've got to check this out: the separation of compute and storage by Eon Mode, now available on Flash Blade. And I believe Pure is still the only company that can support that separation and independent scaling both on-prem and in the cloud. So I want to ask: what were the trends in analytical databases and cloud that led to this partnership? >> Realistically, I think what we're seeing is that there's been a larger shift when it comes to modern analytics platforms, away from the traditional Hadoop-type architecture, where we were leveraging a lot of direct-attached storage, primarily because of the limitations of how that solution was architected. When we look at the larger trends toward how organizations want to do this type of work on premises, they're looking at solutions that allow them to scale the compute and storage pieces independently, and therefore the Flash Blade platform ended up being a great solution to support Vertica in their transition to Eon Mode, leveraging it essentially as an S3 object store. >> Okay, so let's circle back on that. In your announcement of Flash Blade, you make the claim that Flash Blade is the industry's most advanced file and object storage platform ever. That's a bold statement. So defend that, what? >> I would like to go beyond that and just say: as we've developed Flash Blade as a platform, and keep in mind it's been a product that's been around for over three years now and has been very successful for Pure Storage, the reality is that fast file and fast object as a combined storage platform is a direction that many organizations are looking to go, and we believe that we're a leader in that fast-object, fast-file storage space. Realistically, as we start to see more organizations build solutions that leverage cloud storage characteristics, but do so on-prem for a multitude of different reasons, we've built a platform that really addresses a lot of those needs: around simplicity, around speed (fast matters for us; simple is smart). We can provide cloud integrations across the spectrum, and there's a subscription model that fits into that as well.
All of that falls under the umbrella of what we consider the modern data experience, and it's something that we've built into the entire Pure portfolio. >> Okay, so I want to get into the architecture of Flash Blade a little bit, and then understand the fit for analytic databases generally, but specifically for Vertica. So it is a blade, so you've got compute and network included; it's a key-value-store-based system, so you're talking about scale-out, unlike Pure's initial products, which were scale-up; and it is a fabric-based system. I want to understand what all that means, so take us through the architecture, some of the quote-unquote firsts that you guys talk about. Let's start with the blade aspect. >> Yeah, the blade aspect of what we call the Flash Blade. If you look at the actual platform, you have primarily a chassis with built-in networking components, right? There's a fabric interconnect inside the platform that connects to each one of the individual blades, and the individual blades have their own compute that drives, basically, Pure Storage flash components inside. It's not like we're just taking SSDs and plugging them into a system, like you would with traditional commodity off-the-shelf hardware designs. This is very much an engineered solution, built toward the characteristics that we believe are important for fast file and fast object: scalability, massive parallelization when it comes to performance, and the ability to really grow and scale, from essentially seven blades right now up to 150. That's the kind of scale that customers are looking for, especially as we start to address these larger analytics pools: multi-petabyte data sets, a single addressable object space, and file performance that is beyond what most traditional scale-up storage platforms are able to deliver. >> Yes. I interviewed Coz last September at Accelerate, and Pure has been attacked by some of the competitors for not having scale-out. I asked him his thoughts on that, and he said, well, first of all, our Flash Blade is scale-out. He said, look, anything that adds complexity, we avoid; but for the workloads that are associated with Flash Blade, scale-out is the right approach. Maybe you could talk about why that is. >> Well, realistically, I think that approach is better when we're starting to work with large, unstructured data sets. Flash Blade is uniquely architected to allow customers to achieve superior resource utilization for compute and storage, while at the same time significantly reducing the complexity that has arisen around the bespoke, or siloed, nature of big data and analytics solutions. We really look at this from the standpoint that organizations have built and delivered applications in the public cloud space that address object storage and unstructured data, and for some organizations the important thing is bringing that on-prem. We do see repatriation coming up for a lot of organizations as data egress charges continue to expand and grow, and organizations that want even higher performance than what they're able to get in the public cloud are bringing that data back on-prem. They're looking at it from this standpoint: we still want to be able to scale the way we scale in the cloud.
We still want to operate the same way we operate in the cloud, but we want to do it within the control of our own borders. That's one of the bigger pieces to this: we look at how we can deliver cloud characteristics, dynamics, and consumption models, as well as the benefits and efficiencies of scale that the cloud affords, but allow customers to do that inside their own data centers. >> So, you were talking about the trends earlier: you have these cloud-native databases that allowed the scaling of compute and storage independently, and Vertica comes in with Eon. A lot of times we talk about these partnerships as Barney deals, you know: I love you, you love me, here's a press release, and then we go on, or they're just straight go-to-market. Are there other aspects of this partnership that are non-Barney-deal-like? In other words, any specific engineering, or other go-to-market programs? Could you talk about that a little bit? >> Yeah,
>>Okay. You know, of course, you used to be a Gartner analyst, and you've gone over to the vendor side now, but as a Gartner analyst you're obviously objective. You see it all. You know, well, there are a lot of ways to skin a cat. There are strengths, weaknesses, opportunities, threats, etcetera for every vendor. So you have Vertica, who's got a very mature stack, and talking to a number of the customers out there who are using Eon mode, you know, there are certain workloads where these cloud-native databases make sense. It's not just the economics of scaling compute and storage independently, I want to talk more about that, there's a flexibility aspect as well. But Vertica really has to play its trump card, which is: look, we've got a big on-premise estate, and we're going to bring that Eon capability both on-prem and embrace the cloud now. They obviously had to play catch-up in the cloud, but at the same time they've got a much more mature stack than a lot of these other cloud-native databases that might have just started a couple of years ago. So, you know, there are trade-offs that customers have to make. How do you sort through that? Where do you see the interest in this? And what's the sweet spot for this partnership? >>You know, we've been really excited to build the partnership with Vertica, and, you know, we're really proud to provide pretty much the only on-prem storage platform that's validated with Eon mode to deliver a modern data experience for our customers together. You know, it's that partnership that allows us to go into customers in that on-prem space. Not to say that nobody wants to go to the cloud; I think there are aspects and solutions that work very well there. But for the vast majority, I still think that, you know, your data center is not going away, and you do want to have control over many of the assets inside the operational confines. So therefore we start to look at: how do we do the best of what cloud offers, but on-prem? And that's realistically where we start to see the stronger push from those customers who still want to manage their data locally, as well as maybe even work around some of the restrictions that they might have around cost and complexity: hiring, you know, the different types of skill sets that are required to bring applications purely cloud-native. It's still that larger part of the digital transformation that many organizations are going forward with, and realistically, I think they're taking a look at the pros and cons. We've been doing cloud long enough that people recognize, you know, it's not perfect for everything, and that there are certain things that we still want to keep inside our own data center. So, I mean, realistically, as we move forward, that's, ah, the better option when it comes to a modern architecture: we can deliver and address a diverse set of performance requirements and allow the organization to continue to grow and model the data, you know, based on the data that they're actually trying to leverage. And that's really what FlashBlade was built for. It was built to be a platform that could address small files or large files, high throughput, low latency, scale to petabytes in a single namespace, in a single rack, as we like to put it. I mean, we see customers that have put 150 blades into production as a single namespace. It's significant for organizations that are making that drive towards a modern data experience with modern analytics platforms. Pure and Vertica have delivered an experience that can address that for a wide range of customers that are implementing, uh, you know, the Vertica technology.
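In practice, "validated with Eon mode" comes down to a Vertica database whose communal storage lives in an S3 bucket served by the on-prem array rather than by AWS. A hedged sketch of that redirection follows, using Vertica's documented AWSEndpoint and AWSEnableHttps parameters; the hostnames are hypothetical, and the exact statement form should be checked against your Vertica version:

```python
# Illustrative only: direct a database's communal-storage S3
# traffic to an on-prem endpoint. Hostnames and credentials are
# hypothetical; verify parameter syntax for your Vertica version.
import vertica_python

with vertica_python.connect(
    host="vertica.example.internal",
    port=5433,
    user="dbadmin",
    password="...",
    database="analytics",
) as conn:
    cur = conn.cursor()
    # Point the S3 client at the local object store instead of AWS.
    cur.execute(
        "ALTER DATABASE DEFAULT SET PARAMETER "
        "AWSEndpoint = 'objectstore.example.internal'"
    )
    # Plain HTTP inside the data center in this sketch; many
    # deployments would keep HTTPS on instead.
    cur.execute("ALTER DATABASE DEFAULT SET PARAMETER AWSEnableHttps = 0")
```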
>>I'm interested in exploring the use case a little bit further. You just sort of gave some parameters and some examples and some of the flexibility that you have, um, so take us through kind of what the customer discussions are like. Obviously you've got a big customer base, you and Vertica, that's on-prem. That's the unique advantage of this, but there are others. It's not just the economics of the granular scaling of compute and storage independently; there are other aspects. So take us through a primary use case or use cases. >>Yeah, you know, I mean, I can give you a couple of customer examples. We have a large SaaS analytics company which uses Vertica on FlashBlade to authenticate the quality of digital media in real time. You know, for them it makes a big difference, as they're doing their streaming and whatnot, that they can fine-tune and granularly control that. So that's one aspect that we address. We have a multinational car company which uses Vertica on FlashBlade to make thousands of decisions per second for autonomous vehicle decision-making trees. You know, that's what these new modern analytics platforms were really built for. Um, there's another healthcare organization that uses Vertica on FlashBlade to enable healthcare providers to make decisions in real time that impact lives, especially when we look at, you know, the current state of affairs with COVID and the coronavirus. You know, those types of technologies are really going to help us, and help bend that curve downward. So, you know, there are all these different areas where we can address the goals and the achievements that we're trying to move forward with real-time analytics decision-making tools like Vertica. And, you know, realistically, as we have these conversations with customers, they're looking to get beyond the ability of just, you know, a data scientist or a data architect looking to just kind of drive information. >>We were talking about Hadoop earlier. We're kind of going well beyond that now. And I guess what I'm saying is that in the first phase of cloud, it was all about infrastructure. It was about, you know, spinning up compute and storage, with a little bit of networking in there. It seems like the next new workload that's clearly emerging, it started with the cloud-native databases, but then bringing in, you know, AI and machine learning tooling on top of that, ah, and then being able to really drive these new types of insights. And it's really about taking this bog of data that we've collected over the last 10 years, a lot of that driven by Hadoop, bringing machine intelligence into the equation, scaling it with either public cloud or bringing that cloud experience on-prem, scaling, you know, across organizations and across your partner network. That really is a new emerging workload. Do you see that? And maybe talk a little bit about what you're seeing with customers. >>Yeah, I mean, it really is. We see several trends. You know, one of those is the ability to take this approach and move it out of the lab and into production. Um, you know, especially when it comes to data science projects, machine learning projects that traditionally start out as kind of small proofs of concept, easy to spin up in the cloud. But when a customer wants to scale and move towards really deriving significant value from that, they do want to be able to control more characteristics, right? And we know machine learning, you know, needs to learn from massive amounts of data to provide accuracy. There's just too much data to retrieve from the cloud for every training job. At the same time, predictive analytics without accuracy is not going to deliver the business advantage that everyone is seeking. You know, we see the visualization of data analytics as traditionally deployed on a continuum, with, you know, the things that we've been doing in the past, data warehousing and data lakes, on one end, and AI on the other end. But now we're starting to see it manifest in organizations that are looking towards getting more utility and better elasticity out of the data that they're working with. So they're not looking to just build up silos of bespoke AI environments. They're looking to leverage, ah, you know, a platform that can allow them to, you know, do AI for one thing, machine learning for another, and leverage multiple protocols to access that data, because the tools are so different. Um, you know, it's a growing diversity of use cases that you can put on a single platform, and I think that's what organizations are looking for as they try to scale these environments.
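The two threads here, training data that is too big to pull from a cloud bucket for every job, and multiple protocols against one copy of the data, fit together in a simple sketch. This is illustrative only, not a FlashBlade-specific recipe: it assumes a platform that exposes the same dataset over both an NFS mount and an S3 endpoint, and every path, bucket, endpoint, and credential below is hypothetical.

```python
# Illustrative multi-protocol access to one on-prem dataset.
# Assumes the same data is reachable as a file path (NFS mount)
# and as an object (S3). All names are hypothetical.
import boto3

# File-protocol access, e.g. for a training framework that
# expects a filesystem.
with open("/mnt/datasets/train/part-0001.parquet", "rb") as f:
    head_file = f.read(4)

# Object-protocol access to what is logically the same dataset,
# for S3-native tools.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.internal",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)
obj = s3.get_object(Bucket="datasets", Key="train/part-0001.parquet")
head_s3 = obj["Body"].read(4)

# One copy of the data on-prem, two access protocols -- no
# per-training-job egress from a cloud bucket.
print(head_file == head_s3)
```

The design point is that a file-oriented training framework and an S3-native tool can share one on-prem copy of the data, instead of each job pulling the dataset back over the wire from a cloud bucket.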
>>I think it's going to be a big growth area in the coming years. Gabe, I wish we were in Boston together. You would have painted your little corner of Boston orange, I know that you guys would have. But really appreciate you coming on theCUBE. Wall-to-wall coverage, two days of the Vertica Virtual Big Data Conference. Keep it right there. We'll be right back after this short break.

Published Date : Mar 31 2020


Gabriel Chapman grphx full


 

hi everybody and welcome to this cube special presentation of the verdict of virtual Big Data conference the cube is running in parallel with day 1 and day 2 of the verdict big data event by the way the cube has been at every single big data event and it's our pleasure to be here in the virtual / digital event as well Gabriel Chapman is here is the director of flash blade product solutions marketing at pure storage gave great to see you thanks for coming on great to see you - how's it going it's going very well I mean I wish we were meeting in Boston at the Encore Hotel but you know and and hopefully we'll be able to meet it accelerate at some point you cheer or one of the the sub shows that you guys are doing the regional shows but because we've been covering that show as well but I really want to get into it and the last accelerate September 2019 pure and Vertica announced a partnership I remember a joint being ran up to me and said hey you got to check this out the separation of Butte and storage by a Eon mode now available on flash played so and and I believe still the only company that can support that separation and independent scaling both on permit in the cloud so Gabe I want to ask you what were the trends in analytical database and cloud that led to this partnership you know realistically I think what we're seeing is that there's been in kind of a larger shift when it comes to modern analytics platforms towards moving away from the the traditional you know Hadoop type architecture where we were doing on and leveraging a lot of direct attached storage primarily because of the limitations of how that solution was architected when we start to look at the larger trends towards you know how organizations want to do this type of work on premises they're looking at solutions that allow them to scale the compute storage pieces independently and therefore you know the flash play platform ended up being a great solution to support Vertica in their transition to Eon mode leveraging is essentially as an s3 object store okay so let's let's circle back on that you guys in your in your announcement of a flash blade you make the claim that flash blade is the industry's most advanced file and object storage platform ever that's a bold statement so defend that it's supposed to yeah III like to go beyond that and just say you know so we've really kind of looked at this from a standpoint of you know as as we've developed flash blade as a platform and keep in mind it's been a product that's been around for over three years now and has you know it's been very successful for pure storage the reality is is that fast file and fast object as a combined storage platform is a direction that many organizations are looking to go and we believe that we're a leader in that fast object of best file storage place in realistically would we start to see more organizations start to look at building solutions that leverage cloud storage characteristics but doing so on prem or multitude different reasons we've built a platform that really addresses a lot of those needs around simplicity around you know making things assure that you know vast matters for us simple is smart we can provide you know cloud integrations across the spectrum and you know there's a subscription model that fits into that as well we fall that that falls into our umbrella of what we consider the modern data experience and it's something that we've built into the entire pure portfolio okay so I want to get into the architecture a little bit of 
Flash blade and then better understand the fit for analytic databases generally but specifically Vertica so it is a blade so you got compute and a network included it's a key value store based system so you're talking about scale out unlike unlike viewers sort of you know initial products which were scale up and so I want to under in as a fabric base system I want to understand what that all mean so take us through the architecture you know some of the quote-unquote firsts that you guys talk about so let's start with sort of the blade aspect yeah the blade aspect meaning we call it a flash blade because if you look at the actual platform you have a primarily a chassis with built in networking components right so there's a fabric interconnect with inside the platform that connects to each one of the individual blades the individual blades have their own compute that drives basically a pure storage flash components inside it's not like we're just taking SSDs and plugging them into a system and like you would with the traditional commodity off-the-shelf hardware design this is a very much an engineered solution that is built towards the characteristics that we believe were important with fast file and fast object scalability you know massive parallelization when it comes to performance and the ability to really kind of grow and scale from essentially seven blades right now to a hundred and fifty that's that's the kind of scale that customers are looking for especially as we start to address these larger analytic spools they have multi petabyte datasets you know that single addressable object space and you know file performance that is beyond what most of your traditional scale-up storage platforms are able to deliver yes I interviewed cause last September and accelerate and and Christopher's been you know attacked by some of the competitors is not having a scale out I asked him his thoughts on that he said well first of all our Flash blade is scale-out and he said look anything that that that adds the complexity you know we avoid but for the workloads that are associated with Flash blade scale-out is the right sort of approach maybe you could talk about why that is well you know realistically I think you know that that approach is better when we're starting to learn to work with large unstructured data sets I mean flash plays uniquely architected to allow customers to achieve you know a superior resource utilization for compute and storage well at the same time you know reducing significantly the complexity that is arisen around these kind of bespoke or siloed nature of big data and analytic solutions I mean we really kind of look at this from a standpoint of you have built and delivered or created applications in the public cloud space that address you know object storage and and unstructured data and and for some organizations the importance is bringing that on Prem I mean we do seek repatriation that coming on on for a lot of organizations as these data egress charges continue to expand and grow and then organizations that want even higher performance in the what we're able to get into the public cloud space they are bringing that data back on Prem they are looking at from a standpoint we still want to be able to scale the way we scale on the cloud we still want to operate the same way we operate in the cloud but we want to do it within control of our own you know our own borders and so that's you know that's one of the bigger pieces to that is we start to look at how do we address cloud 
characteristics and dynamics and consumption metrics or models as well as the benefits and efficiencies of scale that they're able to afford but allowing customers that do that with inside their own data center yes are you talking about the trends earlier you had these cloud native databases that allowed the scaling of compute and storage independently of Vertica comes in with eon of a lot of times we talk about these these partnerships as Barney deals of you know I love you you love me here's a press release and then we go on or they're just straight you know go to market are there other aspects of this partnership that are that are non Barney deal like in other words any specific you know engineering you know other go to market programs can you talk about that a little bit yeah it's it's it's more than just you know I then what we consider a channel meet in the middle or you know that Barney type of deal it's the realistically you know we've done some first with Vertica that I think are really important if they think you look at the architecture and how we do have we've brought this to market together we have solutions teams in the back end who are you know subject matter experts in this space if you talk to joy and the people from vertigo they're very high on or very excited about the partnership because it often it opens up a new set of opportunities for their customers to to leverage Eon mode and you know get into some of the the nuanced aspects of how they leverage the depot for Depot with inside each individual compute node and adjustments with inside there I reach additional performance gains for customers on Prem and at the same time for them there's still the ability to go into that cloud model if they wish to and so I think a lot of it is around how do we partner as two companies how do we do a joint selling motions you know how do we show up and and you know do white papers and all of the the traditional marketing aspects that we bring devote to the market and then you know joint selling opportunities as exists where they are and so that's realistically I think like any other organization that's going to market with a partner or an ISP that they have a strong partnership with you'll continue to see us you know talking about our chose mutually beneficial relationships and the solutions that we're bringing to the market okay you know of course he used to be a Gartner analyst and you go over to the vendor side now but as but as it but as a gardener analyst you're obviously objective you see it all you know well there's a lot of ways to skin a cat there are there are there are strengths weaknesses opportunities threats etc for every vendor so you have you have Vertica who's got a very mature stack and and talking to a number of the customers out there we're using Eon mode you know there's certain workloads where these cloud native databases make sense it's not just the economics of scaling compute and storage independently I want to talk more about that there's flexibility aspects as well but Vertica really you know has to play its trump card which is look we've got a big on-premise state and we're gonna bring that you know Eon capability both on Prem and we're embracing the cloud now they're obviously you have to they had to play catch-up in the cloud but at the same time they've got a much more mature stack than a lot of these other you know cloud native databases that might have just started a couple of years ago so you know so there's trade-offs that customers have to make how 
do you sort through that where do you see the interest in this and and and what's the sweet spot for this partnership you know we've been really excited to build the partnership with Vertica and we're providing you know we're really proud to provide pretty much the only on Prem storage platform that's validated with the vertical yawn mode to deliver a modern data experience for our customers together you know it's it's that partnership that allows us to go into customers that on Prem space where I think that they're still you know not to say that not everybody wants to go the cloud I think there's aspects and solutions that work very well there but for the vast majority I still think that there's you know the your data center is not going away and you do want to have control over some of the many of the different facets with inside the operational confines so therefore we start to look at how do we can do the best of what cloud offers but on Prem and that's realistically where we start to see the stronger push for those customers who still want to manage their data locally as well as maybe even work around some of the restrictions that they might have around cost and complexity hiring you know the different types of skills skill sets that are required to bring you know applications purely cloud native it's still that larger part of that digital transformation that many organizations are going for going forward with and realistically I think they're taking a look at the pros and cons and we've been doing cloud long enough for people recognize that you know it's not perfect for everything and that there's certain things that we still want to keep inside our own data center so I mean realistically as we move forward that's that that better option when it comes to a modern architecture they can do it you know we can deliver and address a diverse set of performance requirements and allow the organization to continue to grow the model to the data you know based on the data that they're actually trying to leverage and that's really what flash Wood was built or it was built for a platform that can address small files or large files or high throughput high throughput low latency scale to petabytes in a single namespace in a single rack as we like to put it in there I mean we see customers that have put you know 150 flash blades into production as a single namespace it's significant for organizations that are making that drive towards modern data experience with modern analytics platforms pure and Vertica have delivered an experience that can address that to a wide range of customers that are implementing you know the verdict technology I'm interested in exploring the use case a little bit further you just sort of gave some parameters and some examples and some of the flexibility that you have in but take us through kind of what the discuss the customer discussions are like obviously you've got a big customer base you and Vertica that that's on prem that's the the the unique advantage of this but there are others it's not just the economics of the the granular scaling of compute and storage independently there are other aspects so to take us through that sort of a primary use case or use cases yeah you know I mean I can give you a couple customer examples and we have a large SAS analyst company which uses verdict on flash play to authenticate the quality of digital media in real time and you know then for them it makes a big difference is they're doing they're streaming and whatnot that they can 
they can fine tune and grandly control that so that's one aspect that that we get address we have a multi national car con company which uses verdict on flash blade to make thousands of decisions per second for autonomous vehicle decision-making trees that you know that's what really these new modern analytics platforms were built or there's another healthcare organization that uses Vertica on flash blade to enable healthcare providers to make decisions in real time the impact Ives especially when we start to look at and you know the current state of affairs with Kovac in the coronavirus you know those types of technologies are really going to help us kind of get love and and help lower and been you know bend that curve downward so you know there's all these different areas where we can address the goals and the achievements that we're trying to look bored with with real-time analytic decision making tools like Berta and you know realistically as we have these conversations with customers they're looking to get beyond the ability of just you know you know a data scientist or a data architect looking to just kind of drive in information we were talking about Hadoop earlier we're kind of going well beyond that now and I guess what I'm saying is that in the first phase of cloud it was all about infrastructure it was about you know spinning up you know compute and storage a little bit of networking in there seems like the the a next a new workload that's clearly emerging is you've got and it started with the cloud databases but then bringing in you know AI and machine learning tooling on top of that and then being able to really drive these new types of insights and it's really about taking data these bogs this bog of data that we've collected over the last 10 years a lot of that you know driven by Hadoop bringing machine intelligence into the equation scaling it with either cloud public cloud or bringing that cloud experience on prams scale you know across your organizations and across your partner network that really is a new emerging work load do you see that and maybe talk a little bit about you know what you're seeing with customers yeah I mean it really is we see several trends you know one of those is the ability to take a take this approach to move it out of the lab but into production you know especially when it comes to you know data science projects machine learning projects that traditionally start out as kind of small proofs of concept easy to spin up in the cloud but when a customer wants to scale and move towards a real you know it derived a significant value from that they do want to be able to control more characteristics right and we know machine learning you know needs to needs to learn from a massive amounts of data to provide accuracy there's just too much data to retrieve in the cloud for every training job at the same time predictive analytics without accuracy is not going to deliver the business advantage of what everyone is seeking you know we see this the visualization of data analytics is traditionally deployed as being on a continuum with you know the things that we've been doing in the long you know in the past you know with data warehousing data lakes AI on the other end but but this way we're starting to manifest it in organizations that are looking towards you know getting more utility and better you know elasticity out of the data that they are working for so they're not looking to just build ups you know silos of bespoke AI environments they're looking to 
leverage you know a platform that can allow them to you know do a I for one thing machine learning for another leverage multiple protocols to access that data because the tools are so much different you know it is a growing diversity of of use cases that you can put on a single platform I think organizations are looking for as they try to scale these environments I think there's gonna be a big growth area in the coming years gay ball I wish we were in Boston together you would have painted your little corner of Boston Orange I know that you guys are sharing but I really appreciate you coming on the cube wall-to-wall coverage two days at the vertical Vertica virtual big data conference keep you right there but right back right after this short break [Music]

Published Date : Mar 30 2020

**Summary and Sentiment Analysis are not been shown because of improper transcript**

ENTITIES

EntityCategoryConfidence
JimPERSON

0.99+

DavePERSON

0.99+

JohnPERSON

0.99+

JeffPERSON

0.99+

Paul GillinPERSON

0.99+

MicrosoftORGANIZATION

0.99+

DavidPERSON

0.99+

Lisa MartinPERSON

0.99+

PCCWORGANIZATION

0.99+

Dave VolantePERSON

0.99+

AmazonORGANIZATION

0.99+

Michelle DennedyPERSON

0.99+

Matthew RoszakPERSON

0.99+

Jeff FrickPERSON

0.99+

Rebecca KnightPERSON

0.99+

Mark RamseyPERSON

0.99+

GeorgePERSON

0.99+

Jeff SwainPERSON

0.99+

Andy KesslerPERSON

0.99+

EuropeLOCATION

0.99+

Matt RoszakPERSON

0.99+

Frank SlootmanPERSON

0.99+

John DonahoePERSON

0.99+

Dave VellantePERSON

0.99+

Dan CohenPERSON

0.99+

Michael BiltzPERSON

0.99+

Dave NicholsonPERSON

0.99+

Michael ConlinPERSON

0.99+

IBMORGANIZATION

0.99+

MeloPERSON

0.99+

John FurrierPERSON

0.99+

NVIDIAORGANIZATION

0.99+

Joe BrockmeierPERSON

0.99+

SamPERSON

0.99+

MattPERSON

0.99+

Jeff GarzikPERSON

0.99+

CiscoORGANIZATION

0.99+

Dave VellantePERSON

0.99+

JoePERSON

0.99+

George CanuckPERSON

0.99+

AWSORGANIZATION

0.99+

AppleORGANIZATION

0.99+

Rebecca NightPERSON

0.99+

BrianPERSON

0.99+

Dave ValantePERSON

0.99+

NUTANIXORGANIZATION

0.99+

NeilPERSON

0.99+

MichaelPERSON

0.99+

Mike NickersonPERSON

0.99+

Jeremy BurtonPERSON

0.99+

FredPERSON

0.99+

Robert McNamaraPERSON

0.99+

Doug BalogPERSON

0.99+

2013DATE

0.99+

Alistair WildmanPERSON

0.99+

KimberlyPERSON

0.99+

CaliforniaLOCATION

0.99+

Sam GroccotPERSON

0.99+

AlibabaORGANIZATION

0.99+

RebeccaPERSON

0.99+

twoQUANTITY

0.99+

Gabriel Chapman


 

hi everybody and welcome to this cube special presentation of the verdict of virtual Big Data conference the cube is running in parallel with day 1 and day 2 of the verdict big data event by the way the cube has been at every single big data event and it's our pleasure to be here in the virtual / digital event as well Gabriel Chapman is here is the director of flash blade product solutions marketing at pure storage gave great to see you thanks for coming on great to see you - how's it going it's going very well I mean I wish we were meeting in Boston at the Encore Hotel but you know and and hopefully we'll be able to meet it accelerate at some point you cheer or one of the the sub shows that you guys are doing the regional shows but because we've been covering that show as well but I really want to get into it and the last accelerate September 2019 pure and Vertica announced a partnership I remember a joint being ran up to me and said hey you got to check this out the separation of Butte and storage by a Eon mode now available on flash played so and and I believe still the only company that can support that separation and independent scaling both on permit in the cloud so Gabe I want to ask you what were the trends in analytical database and cloud that led to this partnership you know realistically I think what we're seeing is that there's been in kind of a larger shift when it comes to modern analytics platforms towards moving away from the the traditional you know Hadoop type architecture where we were doing on and leveraging a lot of direct attached storage primarily because of the limitations of how that solution was architected when we start to look at the larger trends towards you know how organizations want to do this type of work on premises they're looking at solutions that allow them to scale the compute storage pieces independently and therefore you know the flash play platform ended up being a great solution to support Vertica in their transition to Eon mode leveraging is essentially as an s3 object store okay so let's let's circle back on that you guys in your in your announcement of a flash blade you make the claim that flash blade is the industry's most advanced file and object storage platform ever that's a bold statement so defend that it's supposed to yeah III like to go beyond that and just say you know so we've really kind of looked at this from a standpoint of you know as as we've developed flash blade as a platform and keep in mind it's been a product that's been around for over three years now and has you know it's been very successful for pure storage the reality is is that fast file and fast object as a combined storage platform is a direction that many organizations are looking to go and we believe that we're a leader in that fast object of best file storage place in realistically would we start to see more organizations start to look at building solutions that leverage cloud storage characteristics but doing so on prem or multitude different reasons we've built a platform that really addresses a lot of those needs around simplicity around you know making things assure that you know vast matters for us simple is smart we can provide you know cloud integrations across the spectrum and you know there's a subscription model that fits into that as well we fall that that falls into our umbrella of what we consider the modern data experience and it's something that we've built into the entire pure portfolio okay so I want to get into the architecture a little bit of 
Flash blade and then better understand the fit for analytic databases generally but specifically Vertica so it is a blade so you got compute and a network included it's a key value store based system so you're talking about scale out unlike unlike viewers sort of you know initial products which were scale up and so I want to under in as a fabric base system I want to understand what that all mean so take us through the architecture you know some of the quote-unquote firsts that you guys talk about so let's start with sort of the blade aspect yeah the blade aspect meaning we call it a flash blade because if you look at the actual platform you have a primarily a chassis with built in networking components right so there's a fabric interconnect with inside the platform that connects to each one of the individual blades the individual blades have their own compute that drives basically a pure storage flash components inside it's not like we're just taking SSDs and plugging them into a system and like you would with the traditional commodity off-the-shelf hardware design this is a very much an engineered solution that is built towards the characteristics that we believe were important with fast file and fast object scalability you know massive parallelization when it comes to performance and the ability to really kind of grow and scale from essentially seven blades right now to a hundred and fifty that's that's the kind of scale that customers are looking for especially as we start to address these larger analytic spools they have multi petabyte datasets you know that single addressable object space and you know file performance that is beyond what most of your traditional scale-up storage platforms are able to deliver yes I interviewed cause last September and accelerate and and Christopher's been you know attacked by some of the competitors is not having a scale out I asked him his thoughts on that he said well first of all our Flash blade is scale-out and he said look anything that that that adds the complexity you know we avoid but for the workloads that are associated with Flash blade scale-out is the right sort of approach maybe you could talk about why that is well you know realistically I think you know that that approach is better when we're starting to learn to work with large unstructured data sets I mean flash plays uniquely architected to allow customers to achieve you know a superior resource utilization for compute and storage well at the same time you know reducing significantly the complexity that is arisen around these kind of bespoke or siloed nature of big data and analytic solutions I mean we really kind of look at this from a standpoint of you have built and delivered or created applications in the public cloud space that address you know object storage and and unstructured data and and for some organizations the importance is bringing that on Prem I mean we do seek repatriation that coming on on for a lot of organizations as these data egress charges continue to expand and grow and then organizations that want even higher performance in the what we're able to get into the public cloud space they are bringing that data back on Prem they are looking at from a standpoint we still want to be able to scale the way we scale on the cloud we still want to operate the same way we operate in the cloud but we want to do it within control of our own you know our own borders and so that's you know that's one of the bigger pieces to that is we start to look at how do we address cloud 
characteristics and dynamics and consumption metrics or models as well as the benefits and efficiencies of scale that they're able to afford but allowing customers that do that with inside their own data center yes are you talking about the trends earlier you had these cloud native databases that allowed the scaling of compute and storage independently of Vertica comes in with eon of a lot of times we talk about these these partnerships as Barney deals of you know I love you you love me here's a press release and then we go on or they're just straight you know go to market are there other aspects of this partnership that are that are non Barney deal like in other words any specific you know engineering you know other go to market programs can you talk about that a little bit yeah it's it's it's more than just you know I then what we consider a channel meet in the middle or you know that Barney type of deal it's the realistically you know we've done some first with Vertica that I think are really important if they think you look at the architecture and how we do have we've brought this to market together we have solutions teams in the back end who are you know subject matter experts in this space if you talk to joy and the people from vertigo they're very high on or very excited about the partnership because it often it opens up a new set of opportunities for their customers to to leverage Eon mode and you know get into some of the the nuanced aspects of how they leverage the depot for Depot with inside each individual compute node and adjustments with inside there I reach additional performance gains for customers on Prem and at the same time for them there's still the ability to go into that cloud model if they wish to and so I think a lot of it is around how do we partner as two companies how do we do a joint selling motions you know how do we show up and and you know do white papers and all of the the traditional marketing aspects that we bring devote to the market and then you know joint selling opportunities as exists where they are and so that's realistically I think like any other organization that's going to market with a partner or an ISP that they have a strong partnership with you'll continue to see us you know talking about our chose mutually beneficial relationships and the solutions that we're bringing to the market okay you know of course he used to be a Gartner analyst and you go over to the vendor side now but as but as it but as a gardener analyst you're obviously objective you see it all you know well there's a lot of ways to skin a cat there are there are there are strengths weaknesses opportunities threats etc for every vendor so you have you have Vertica who's got a very mature stack and and talking to a number of the customers out there we're using Eon mode you know there's certain workloads where these cloud native databases make sense it's not just the economics of scaling compute and storage independently I want to talk more about that there's flexibility aspects as well but Vertica really you know has to play its trump card which is look we've got a big on-premise state and we're gonna bring that you know Eon capability both on Prem and we're embracing the cloud now they're obviously you have to they had to play catch-up in the cloud but at the same time they've got a much more mature stack than a lot of these other you know cloud native databases that might have just started a couple of years ago so you know so there's trade-offs that customers have to make how 
do you sort through that where do you see the interest in this and and and what's the sweet spot for this partnership you know we've been really excited to build the partnership with Vertica and we're providing you know we're really proud to provide pretty much the only on Prem storage platform that's validated with the vertical yawn mode to deliver a modern data experience for our customers together you know it's it's that partnership that allows us to go into customers that on Prem space where I think that they're still you know not to say that not everybody wants to go the cloud I think there's aspects and solutions that work very well there but for the vast majority I still think that there's you know the your data center is not going away and you do want to have control over some of the many of the different facets with inside the operational confines so therefore we start to look at how do we can do the best of what cloud offers but on Prem and that's realistically where we start to see the stronger push for those customers who still want to manage their data locally as well as maybe even work around some of the restrictions that they might have around cost and complexity hiring you know the different types of skills skill sets that are required to bring you know applications purely cloud native it's still that larger part of that digital transformation that many organizations are going for going forward with and realistically I think they're taking a look at the pros and cons and we've been doing cloud long enough for people recognize that you know it's not perfect for everything and that there's certain things that we still want to keep inside our own data center so I mean realistically as we move forward that's that that better option when it comes to a modern architecture they can do it you know we can deliver and address a diverse set of performance requirements and allow the organization to continue to grow the model to the data you know based on the data that they're actually trying to leverage and that's really what flash Wood was built or it was built for a platform that can address small files or large files or high throughput high throughput low latency scale to petabytes in a single namespace in a single rack as we like to put it in there I mean we see customers that have put you know 150 flash blades into production as a single namespace it's significant for organizations that are making that drive towards modern data experience with modern analytics platforms pure and Vertica have delivered an experience that can address that to a wide range of customers that are implementing you know the verdict technology I'm interested in exploring the use case a little bit further you just sort of gave some parameters and some examples and some of the flexibility that you have in but take us through kind of what the discuss the customer discussions are like obviously you've got a big customer base you and Vertica that that's on prem that's the the the unique advantage of this but there are others it's not just the economics of the the granular scaling of compute and storage independently there are other aspects so to take us through that sort of a primary use case or use cases yeah you know I mean I can give you a cup of customer examples and we have a large SAS analyst company which uses verdict on flash play to authenticate the quality of digital media in real time and you know then for them it makes a big difference is they're doing they're streaming and whatnot that they can 
they can fine tune and grandly control that so that's one aspect that we get address we have a multi national car con company which uses verdict on flash blade to make thousands of decisions per second for autonomous vehicle decision-making trees that you know that's what really these new modern analytics platforms were built or there's another healthcare organization that uses Vertica on flash blade to enable healthcare providers to make decisions in real time the impact Ives especially when we start to look at and you know the current state of affairs with Kovac in the coronavirus you know those types of technologies are really going to help us kind of get love and and help lower and been you know bend that curve downward so you know there's all these different areas where we can address the goals and the achievements that we're trying to look bored with with real-time analytic decision making tools like Berta and you know realistically as we have these conversations with customers they're looking to get beyond the ability of just you know you know a data scientist or a data architect looking to just kind of drive in information we were talking about Hadoop earlier we're kind of going well beyond that now and I guess what I'm saying is that in the first phase of cloud it was all about infrastructure it was about you know spinning up you know compute and storage a little bit of networking in there seems like the the a next a new workload that's clearly emerging is you've got and it started with the cloud databases but then bringing in you know AI and machine learning tooling on top of that and then being able to really drive these new types of insights and it's really about taking data these bogs this bog of data that we've collected over the last 10 years a lot of that you know driven by Hadoop bringing machine intelligence into the equation scaling it with either cloud public cloud or bringing that cloud experience on prams scale you know across your organizations and across your partner network that really is a new emerging work load do you see that and maybe talk a little bit about you know what you're seeing with customers yeah I mean it really is we see several trends you know one of those is the ability to take a take this approach to move it out of the lab but into production you know especially when it comes to you know data science projects machine learning projects that traditionally start out as kind of small proofs of concept easy to spin up in the cloud but when a customer wants to scale and move towards a real you know it derived a significant value from that they do want to be able to control more characteristics right and we know machine learning you know needs to needs to learn from a massive amounts of data to provide accuracy there's just too much data to retrieve in the cloud for every training job at the same time predictive analytics without accuracy is not going to deliver the business advantage of what everyone is seeking you know we see this the visualization of data analytics is traditionally deployed as being on a continuum with you know the things that we've been doing in the long you know in the past you know with data warehousing data lakes AI on the other end but but this way we're starting to manifest it in organizations that are looking towards you know getting more utility and better you know elasticity out of the data that they are working for so they're not looking to just build ups you know silos of bespoke AI environments they're looking to leverage 
you know a platform that can allow them to you know do a I for one thing machine learning for another leverage multiple protocols to access that data because the tools are so much different you know it is a growing diversity of of use cases that you can put on a single platform I think organizations are looking for as they try to scale these environments I think there's gonna be a big growth area in the coming years gay ball I wish we were in Boston together you would have painted your little corner of Boston Orange I know that you guys are sharing but I really appreciate you coming on the cube wall-to-wall coverage two days at the vertical Vertica virtual big data conference keep you right there but right back right after this short break [Music]

Published Date : Mar 30 2020

**Summary and Sentiment Analysis are not been shown because of improper transcript**

ENTITIES

EntityCategoryConfidence
September 2019DATE

0.99+

Gabriel ChapmanPERSON

0.99+

BostonLOCATION

0.99+

two companiesQUANTITY

0.99+

BarneyORGANIZATION

0.99+

VerticaORGANIZATION

0.99+

GabePERSON

0.99+

GartnerORGANIZATION

0.98+

two daysQUANTITY

0.98+

ChristopherPERSON

0.98+

last SeptemberDATE

0.98+

first phaseQUANTITY

0.97+

a hundred and fiftyQUANTITY

0.97+

one aspectQUANTITY

0.97+

over three yearsQUANTITY

0.97+

seven bladesQUANTITY

0.97+

pureORGANIZATION

0.96+

day 2QUANTITY

0.96+

bothQUANTITY

0.95+

oneQUANTITY

0.95+

single rackQUANTITY

0.95+

firstsQUANTITY

0.94+

Boston OrangeLOCATION

0.94+

coronavirusOTHER

0.93+

Encore HotelLOCATION

0.93+

thousands of decisions per secondQUANTITY

0.93+

single namespaceQUANTITY

0.92+

each oneQUANTITY

0.92+

single platformQUANTITY

0.92+

HadoopTITLE

0.91+

day 1QUANTITY

0.91+

150 flash bladesQUANTITY

0.9+

singleQUANTITY

0.89+

Big DataEVENT

0.88+

firstQUANTITY

0.86+

BertaORGANIZATION

0.86+

a couple of years agoDATE

0.85+

KovacORGANIZATION

0.84+

last 10 yearsDATE

0.82+

PremORGANIZATION

0.81+

each individualQUANTITY

0.8+

IvesORGANIZATION

0.7+

big dataEVENT

0.66+

one of the bigger piecesQUANTITY

0.66+

the sub showsQUANTITY

0.66+

every singleQUANTITY

0.64+

VerticaTITLE

0.61+

EonTITLE

0.57+

dataEVENT

0.56+

egressORGANIZATION

0.56+

timesQUANTITY

0.54+

EonORGANIZATION

0.54+

petabytesQUANTITY

0.53+

s3TITLE

0.49+

UNLISTED DO NOT PUBLISH Woicke Edit Suggestions


 

six five four three two one hi everybody and welcome to this cube special presentation of the verdict of virtual big data conference the cube is running in parallel with day 1 and day 2 of the verdict the big data event by the way the cube has been at every single big data event and it's our pleasure to be here in the virtual / digital event as well Gabriel Chapman is here is the director of flash blade product solutions marketing at pure storage Gabe great to see you thanks for coming on great to see you - how's it going it's going very well I mean I wish we were meeting in Boston at the Encore hotel but you know and and hopefully we'll be able to meet it accelerate at some point you cheer or one of the the sub shows that you guys are doing the regional shows but because we've been covering that show as well but I really want to get into it and the last accelerate September 2019 pure and Vertica announced a partnership I remember a joint being ran up to me and said hey you got to check this out the separation of Butte and storage by a Eon mode now available on flash played so and and I believe still the only company that can support that separation and independent scaling both on prime and in the cloud so gave I want to ask you what were the trends in analytical database and plowed that led to this partnership you know realistically I think what we're seeing is that there's been kind of a larger shift when it comes to modern analytics platforms towards moving away from the the traditional you know Hadoop type architecture where we were doing on and leveraging a lot of direct mass storage primarily because of the limitations of how that solution was architected when we start to look at the larger trends towards you know how organizations want to do this type of work on premises they're looking at solutions that allow them to scale the compute storage pieces independently and therefore you know the flash blade platform ended up being a great solution to support Vertica in their transition to Eon mode leveraging >> essentially as an s3 object store okay so let's let's circle back on that you guys in your in your announcement of a flash blade you make the claim that flash blade is the industry's most advanced file and object storage platform ever that's a bold statement I defend that it's supposed to yeah I I like to go beyond that and just say you know so we've really kind of looked at this from a standpoint of you know as as we've developed flash Wade as a platform and keep in mind it's been a product that's been around for over three years now and has you know it's been very successful for pure storage the reality is is that fast file and fast object as a combined storage platform is a direction that many organizations are looking to go and we believe that we're a leader in that fast object of best file storage place in realistically which we start to see more organizations start to look at building solutions that leverage cloud storage characteristics but doing so on prem for a multitude of different reasons we've built a platform that really addresses a lot of those needs around simplicity around you know making things assure that you know vast matters for us simple is smart we can provide you know cloud integrations across the spectrum and you know there's a subscription model that fits into that as well we fall that falls into our umbrella of what we consider the modern day day experience and it's something that we've built into the entire pure portfolio okay so I want to get into the 
architecture a little bit of Flash blade and then better understand the fit for analytic databases generally but specifically for Vertica so it is a blade so you got compute and a network included it's a key value store based system so you're talking about scale out unlike unlike viewers sort of you know initial products which were scale up and so I want to as a fabric base system I want to understand what that all mean so take us through the architecture you know some of the quote-unquote firsts that you guys talk about so let's start with sort of the blade aspect yeah the blade aspect mean we call it a flash blade because if you look at the actual platform you have a primarily a chassis with built in networking components right so there's a fabric interconnect with inside the platform that connects to each one of the individual blades the individual blades have their own compute that drives basically a pure storage flash components inside it's not like we're just taking SSDs and plugging them into a system and like you would with the traditional commodity off-the-shelf hardware design this is a very much an engineered solution that is built towards the characteristics that we believe were important with fast file and fast object scalability you know massive parallelization when it comes to performance and the ability to really kind of grow and scale from essentially seven blades right now to a hundred and fifty that's that's the kind of scale that customers are looking for especially as we start to address these larger analytics pools mayo multi petabyte datasets you know that single addressable object space and you know file performance that is beyond what most of your traditional scale-up storage platforms are able to deliver yeah I saw you interviewed cause last September and accelerate and and Christopher's been you know attacked by some of the competitors is not having a scale out I asked them his thoughts on that he said well first of all our flash plate is scale out he said look anything that that that adds the complexity you know we avoid but for the workloads that are associated with Flash blade scale out is the right sort of approach maybe you could talk about why that is well you know realistically I think you know that that approach is better when we're starting to learn to work with large unstructured data sets I mean flash plays uniquely architected to allow customers to achieve you know a superior resource utilization for compute and storage well at the same time you know reducing significantly the complexity that is arisen around these kind of bespoke or siloed nature of big data and analytic solutions I mean we really kind of look at this from a standpoint of you have built and delivered or created applications in the public cloud space that address you know object storage and and unstructured data and and for some organizations the importance is bringing that on Prem I mean we do seek repatriation that coming on for a lot of organizations as these data egress charges continue to expand and grow and then organizations that want even higher performance in the what we're able to get into the public cloud space they are bringing that data back on Prem they are looking at from a standpoint we still want to be able to scale the way we scale on the cloud we still want to operate the same way we operate in the cloud but we want to do it within control of our own you know our own borders and so that's you know that's one of the bigger pieces to that is we start to look at how do 
>> So you were talking about the trends earlier. You had these cloud-native databases that allowed the scaling of compute and storage independently, and Vertica comes in with Eon. A lot of times we talk about these partnerships as Barney deals, you know, I love you, you love me, here's a press release, and then we go on, or they're just straight, you know, go-to-market. Are there other aspects of this partnership that are non-Barney-deal-like? In other words, any specific, you know, engineering, other go-to-market programs? Could you talk about that a little bit? >> Yeah, it's more than just, you know, what we'd consider a channel meet-in-the-middle or, you know, that Barney type of deal. Realistically, you know, we've done some firsts with Vertica that I think are really important, if you look at the architecture and how we've brought this to market together. We have solutions teams on the back end who are, you know, subject matter experts in this space. If you talk to Joy and the people from Vertica, they're very high on, they're very excited about the partnership, because it opens up a new set of opportunities for their customers to leverage Eon Mode and, you know, get into some of the nuanced aspects of how they leverage the depot inside each individual compute node, and adjustments inside there to reach additional performance gains for customers on-prem, while at the same time, for them, there's still the ability to go into that cloud model if they wish to. And so I think a lot of it is around how we partner as two companies, how we do joint selling motions, you know, how we show up and, you know, do white papers and all of the traditional marketing aspects that we bring into the market, and then, you know, joint selling opportunities where they exist. And so, realistically, I think like any other organization that's going to market with a partner or an ISV that they have a strong partnership with, you'll continue to see us, you know, talking about our mutually beneficial relationship and the solutions that we're bringing to the market.
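Since the depot comes up here, a brief hedged sketch of what "adjusting the depot inside each individual compute node" can look like in practice. It assumes the open-source vertica-python client; the SQL names used (the STORAGE_LOCATIONS system table and the ALTER_LOCATION_SIZE function) are recalled from Vertica's Eon Mode documentation, not taken from the interview, and should be verified against the version you run. Host, credentials, and node name are hypothetical.

```python
# Hedged sketch: inspect Eon Mode storage locations and resize one node's
# depot so more hot data stays local to compute. SQL object names are as
# recalled from the Vertica docs; verify them for your version. All
# connection details below are hypothetical.
import vertica_python

conn_info = {
    "host": "vertica.example.internal",
    "port": 5433,
    "user": "dbadmin",
    "password": "...",
    "database": "analytics",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()

    # Where does each node keep its depot?
    cur.execute(
        "SELECT node_name, location_path, location_usage "
        "FROM storage_locations WHERE location_usage = 'DEPOT';"
    )
    for node_name, path, usage in cur.fetchall():
        print(node_name, path, usage)

    # Grow the depot on one node (assumed function; check your docs).
    cur.execute("SELECT alter_location_size('depot', 'v_analytics_node0001', '80%');")
```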
>> Okay. You know, you used to be a Gartner analyst, and you've gone over to the vendor side now, but as a Gartner analyst you're obviously objective, you see it all, and, you know, there's a lot of ways to skin a cat. There are strengths, weaknesses, opportunities, threats, etc. for every vendor. So you have Vertica, who's got a very mature stack, and talking to a number of the customers out there who are using Eon Mode, you know, there are certain workloads where these cloud-native databases make sense. It's not just the economics of scaling compute and storage independently, I want to talk more about that, there are flexibility aspects as well. But Vertica really, you know, has to play its trump card, which is, look, we've got a big on-premise estate, and we're going to bring that, you know, Eon capability both on-prem and embracing the cloud. Now, obviously they had to play catch-up in the cloud, but at the same time they've got a much more mature stack than a lot of these other, you know, cloud-native databases that might have just started a couple years ago. So, you know, there are trade-offs that customers have to make. How do you sort through that? Where do you see the interest in this, and what's the sweet spot for this partnership? >> You know, we've been really excited to build the partnership with Vertica, and we're really proud to provide pretty much the only on-prem storage platform that's validated with Vertica Eon Mode, to deliver a modern data experience for our customers together. You know, it's that partnership that allows us to go into that on-prem space. Not to say that everybody wants to stay out of the cloud; I think there are aspects and solutions that work very well there. But for the vast majority, I still think that, you know, your data center is not going away, and you do want to have control over many of the different facets inside the operational confines. So we start to look at how we can do the best of what cloud offers, but on-prem, and that's realistically where we start to see the stronger push from those customers who still want to manage their data locally, as well as maybe even work around some of the restrictions that they might have around cost and complexity, hiring, you know, the different types of skill sets that are required to bring applications up purely cloud-native. It's still that larger part of the digital transformation that many organizations are going forward with, and realistically I think they're taking a look at the pros and cons, and we've been doing cloud long enough that people recognize, you know, it's not perfect for everything, and that there are certain things that we still want to keep inside our own data center. So, I mean, realistically, as we move forward, that's the better option when it comes to a modern architecture: we can deliver and address a diverse set of performance requirements and allow the organization to continue to grow and to model to the data, you know, based on the data that they're actually trying to leverage. And that's really what FlashBlade was built for. It was built as a platform that can address small files or large files, high throughput, low latency, scale to petabytes in a single namespace in a single rack, as we like to put it. I mean, we see customers that have put, you know, 150 FlashBlades into production as a single namespace. It's significant for organizations that are making that drive toward a modern data experience with modern analytics platforms, and Pure and Vertica have delivered an experience that can address that for a wide range of customers that are implementing, you know, the Vertica technology. >> I'm interested in exploring the use case a little bit further. You just sort of gave some parameters and some examples and some of the flexibility that you have, but take us through what the customer discussions are like. Obviously you've got a big customer base, you and Vertica, that's on-prem; that's the unique advantage of this. But there are others; it's not just the economics of the granular scaling of compute and storage independently, there are other aspects. So take us through a primary use case or use cases.
>> Yeah, you know, I mean, I can give you a couple of customer examples. We have a large SaaS analytics company which uses Vertica on FlashBlade to authenticate the quality of digital media in real time, and, you know, for them it makes a big difference, as they're doing their streaming and whatnot, that they can fine-tune and granularly control that. So that's one aspect that we address. We have a multinational car company which uses Vertica on FlashBlade to make thousands of decisions per second for autonomous-vehicle decision-making trees. You know, that's what these new modern analytics platforms were really built for. There's another healthcare organization that uses Vertica on FlashBlade to enable healthcare providers to make decisions in real time, and the impact that provides, especially when we start to look at, you know, the current state of affairs with COVID and the coronavirus, you know, those types of technologies are really going to help us, and help lower and, you know, bend that curve downward. So, you know, there are all these different areas where we can address the goals and the achievements that we're trying to move forward with, with real-time analytic decision-making tools like Vertica. And, you know, realistically, as we have these conversations with customers, they're looking to get beyond the ability of just, you know, a data scientist or a data architect looking to just kind of drive in information, you know, I'm going to set this model up and we'll come back in a day. Now we need to make these decisions quickly, and the performance characteristics that Eon Mode and Vertica allow for can get us toward this almost near-real-time analytics decision-making process. And those are the kinds of conversations that we're having with customers, who really need to be able to turn this around very quickly instead of waiting.
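To make the contrast between "come back in a day" and near-real-time decisioning concrete, here is a small hypothetical sketch, not from the interview: instead of a nightly batch, a decision loop re-runs a lightweight aggregate every few seconds and acts on the answer. The table, columns, and threshold are invented; the vertica-python client is assumed, as in the earlier sketch.

```python
# Hypothetical near-real-time decision loop: poll a small aggregate every few
# seconds instead of waiting for a nightly batch. Table, columns, threshold,
# and connection details are all invented for illustration.
import time
import vertica_python

conn_info = {"host": "vertica.example.internal", "port": 5433,
             "user": "dbadmin", "password": "...", "database": "analytics"}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    while True:
        cur.execute(
            "SELECT count(*) FROM stream_events "
            "WHERE event_time > now() - INTERVAL '10 seconds' "
            "AND quality_score < 0.5;"
        )
        bad = cur.fetchone()[0]
        if bad > 100:
            # Act within seconds, not after tomorrow's report.
            print("degraded media quality detected, rerouting stream")
        time.sleep(5)
```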
>> Well, I think you're hitting on something that is actually pretty relevant, and that is that near-real-time analytic, you know, database. We were talking about Hadoop earlier; we're kind of going well beyond that now. And I guess what I'm saying is that in the first phase of cloud, it was all about infrastructure, it was about, you know, spinning up, you know, compute and storage, a little bit of networking in there. It seems like the next new workload that's clearly emerging, and it started with the cloud-native databases, is bringing in, you know, AI and machine learning tooling on top of that, and then being able to really drive these new types of insights. And it's really about taking this bog of data that we've collected over the last 10 years, a lot of that, you know, driven by Hadoop, bringing machine intelligence into the equation, scaling it with either public cloud or bringing that cloud experience on-premise, scaling, you know, across your organizations and across your partner network. That really is a new emerging workload. Do you see that, and maybe talk a little bit about, you know, what you're seeing with customers? >> Yeah, I mean, it really is. We see several trends. You know, one of those is the ability to take this approach and move it out of the lab and into production, you know, especially when it comes to, you know, data science projects and machine learning projects that traditionally start out as kind of small proofs of concept, easy to spin up in the cloud. But when a customer wants to scale and move toward, you know, deriving significant value from that, they do want to be able to control more characteristics, right? And we know machine learning, you know, needs to learn from massive amounts of data to provide accuracy; there's just too much data to retrieve from the cloud for every training job. At the same time, predictive analytics without accuracy is not going to deliver the business advantage that everyone is seeking. You know, we see the visualization of data analytics as traditionally deployed on a continuum, with, you know, the things that we've been doing in the past, you know, data warehousing and data lakes, on one end, and AI on the other end. But the way this is starting to manifest in organizations is that they're looking toward, you know, getting more utility and better, you know, elasticity out of the data that they are working with. So they're not looking to just build, you know, silos of bespoke AI environments; they're looking to leverage, you know, a platform that can allow them to, you know, do AI for one thing, machine learning for another, and leverage multiple protocols to access that data, because the tools are so much different. You know, it is a growing diversity of use cases that you can put on a single platform, and I think that's what organizations are looking for as they try to scale these environments. I think it's going to be a big growth area in the coming years. >> Gabe, well, I wish we were in Boston together. You would have painted your little corner of Boston orange, I'm sure. But I really appreciate you coming on theCUBE, and thank you very much. >> Thank you, have a great day. >> You too. Okay, thank you everybody for watching. This is theCUBE's wall-to-wall coverage, two days of the virtual Vertica Big Data Conference. Keep it right there, we'll be back right after this short break.

Published Date : Mar 30 2020


Jeff Healey, Vertica at Micro Focus | CUBEConversations, March 2020


 

>> Narrator: From theCUBE studios in Palo Alto and Boston, connecting with top leaders all around the world, this is theCUBE Conversation. >> Hi everybody, I'm Dave Vellante, and welcome to the Vertica Big Data Conference virtual. This is our digital presentation, wall to wall coverage actually, of the Vertica Big Data Conference. And with me is Jeff Healey, who directs product marketing at Vertica. Jeff, good to see you. >> Good to see you, Dave. Thanks for the opportunity to chat. >> You're very welcome. Now, I'm excited about the products that you guys announced, and you're hardcore into product marketing, but we're going to talk about the Vertica Big Data Conference. It's been a while since you guys had this. Obviously, new owner, new company, some changes, but that new company, Micro Focus, has announced that it's investing, I think the number was $70 million, into two areas. One was security and the other, of course, was Vertica. So we're really excited to be back at the virtual Big Data Conference. And let's hear it from you, what are your thoughts? >> Yeah, Dave, thanks. And we love having theCUBE at all of these events. We're thrilled to have the next Vertica Big Data Conference. Actually it was a physical event; we're moving it online. We know it's going to be a big hit because we've been doing this for some time, particularly with two of the webcast series we have every month. One is the Under the Hood webcast series, which is led by our engineers, and the other is what we call the Data Disruptors webcast series, which is led by all customers. So we're really confident this is going to be a big hit; we've seen the registration spike. We just hit 1,000, and we're planning on having about 1,000 at the physical event. It's growing and growing. We're going to see those big numbers, and it's not going to be a one time thing. We're going to keep the conversation going, make sure there's plenty of best practices learning throughout the year. >> We've been at all the big BDCs, and the first ones were really in the heart of the Big Data Movement, a really exciting time, and the interesting thing about this event is it was always sort of customers talking to customers. There weren't a lot of commercials; it was an intimate event. Of course I loved it because it was in our hometown. But I think you're trying to carry that theme obviously into the digital sphere. Maybe you can talk about that a little bit. >> Yeah, Dave, absolutely right. Of course, nothing replaces face to face, but everything that you just mentioned that makes it special about the Big Data Conference, and you know, you guys have been there throughout and shown great support in talking to so many customers and leaders and what have you. We're doing the same thing, all right. So we had about 40 plus sessions planned for the physical event. We're going to run half of those, and we're not going to lose anything though, that's the key point. So what makes the Vertica Big Data Conference really special is that the only presenters that are allowed to present are either engineers, Vertica engineers, or best practices engineers, and then customers. Customers that actually use the product. There's no sales or marketing pitches or anything like that. And I'll tell you, as far as the customer lineup that we have, we've got five or six already lined up as part of those 20 sessions, customers like Uber, customers like the Trade Desk, customers like Philips talking about predictive maintenance, and the list goes on and on.
You won't want to miss it if you're on the fence or if you're trying to figure out if you want to register for this event. Best part about it, it's all free, and if you can't attend it live, there will be a live Q&A chat on every single one of those sessions, and we promise we'll answer every question, if we don't get to it live, as we always do. They'll all be available on demand. So no reason not to register and attend or watch later. >> Thinking about the content over the years, in the early days of the Big Data Conference, of course Vertica started before the whole Big Data Conference meme really took off, and then as it took off, plugged right into it. But back then the discussion was a lot of, what do I do with big data, Gartner's three Vs, and how do I wrangle it all, and what's the best approach, and this stuff, Hadoop, is really complicated. Of course Vertica was an alternative to RDBMSs that really couldn't scale or give that type of performance for analytical databases, so you had your foot in that door. But now the conversation, and it's interesting, your theme, it's win big with data. Of course, the physical event was at the Encore, which is the new casino in Boston. But my point is, the conversation is no longer about how to wrangle all this data, you know, how to lower the cost of storing this data, how to make it go faster and actually make it work. It's really about how to turn data into insights and transform your organizations and, quote unquote, win with big data. >> That's right. Yeah, that's a great point, Dave. And that's why, I mean, we chose the title really, because it's about our customers and what they're able to do with our platform. And we know it's not just one platform; it's all of the ecosystem, all of our incredible partners. Yeah, it's funny, when I started with the organization about seven years ago, we were closing lots of deals, and I was following up on case studies, and it was like, okay, why did you choose Vertica? Well, the queries went fast. Okay, so what does that mean for your business? We knew we were kind of in the early adopter stage, and we were disrupting the data warehouse market. Now we're talking to our customers, and their volumes are growing, growing and growing, and they really have these analytical use cases; again, they talk to the value the entire organization is gaining from it. That's the difference between now and a few years ago, just like you were saying. When Vertica disrupted the database market, but also the data warehouse market, you could speak to our customers and they could tell you exactly what's happening, how it's moving the needle or really advancing the entire organization, regardless of the analytical use case, whether it's an internet of things one around predictive maintenance, or customer behavior analytics. They can speak confidently of it, more than just, hey, our queries went faster.
And so my point is that we've seen Vertica have staying power throughout. I think it's a function of the architecture that Stonebraker originally envisioned; you guys were early on, the market had a lot of good customer traction, and you've been very responsive to a lot of the trends. Colin Mahony will talk about how you adopted and really embraced cloud, for example, and different data formats. And so you've really been able to participate in a lot of the new emerging waves that have come out to the market. And I would imagine some of that's cultural. I wonder if you could just address that in the context of BDC. >> Oh, yeah, absolutely. You hit on all the key points here, Dave. So, a lot of changes in the industry. We're in the hottest industry, the tech industry, right now. There's lots of competition. But one of the things we'll say in terms of, hey, who do you compete with? You compete with these players in the cloud, open source alternatives, traditional enterprise data warehouses. That's true, right. And one of the things we've stayed true to, and Colin has really kind of led the charge on for the organization, is that we know who we are, right. So we're an analytical database platform. And we're constantly just working on that one sole source code base, to make sure that we don't provide a bunch of different technologies and databases, different types of technologies that you need to stitch together. This platform just has unbelievable universal capabilities, everything from running analytics at scale, to in-database machine learning with a different approach, to all the different types of deployment models that are supported, right. We don't go to companies and say, yeah, we take care of all your problems, but you have to stitch together all these different types of technologies. It's all based on that core Vertica engine, and we've expanded it to meet all these market needs. So what Colin believes, and what he tells the team, what we lead with, is that one core platform that can address all these analytical initiatives. So we know who we are, we continue to improve on it, regardless of the pivots and the drastic measures that some of the other competitors have taken. >> You know, I got to ask you, so we're in the middle of this global pandemic with coronavirus and COVID-19, and things change daily, by the hour sometimes, by the minute. I mean, every day you get up to something new. So you see a lot of forecasts, you see a lot of probability models, best case, worst case, likely case, even though nobody really knows what that likely case looks like. So there's a lot of analytics going on, and a lot of data that people are crunching; new data sources come in every day. Are you guys participating directly in that, specifically your customers? Are they using your technology? You can't use a traditional data warehouse for this. It's just, you know, too slow, too asynchronous, the process is cumbersome. What are you seeing in the customer base as it relates to this crisis? >> Sure, well, I mean, naturally we have a lot of customers that are healthcare technology companies, companies like Cerner, companies like Philips, right, that are kind of leading the charge here. And of course, our whole motto has always been, don't throw away any of the data; there's value in that data, and you don't have to with Vertica, right. So you've got petabyte-scale types of analytics across many of our customers. Again, just a few years ago, we called it the petabyte club.
Now a majority of our large enterprise software companies are approaching those petabyte volumes. So it's important to be able to run those analytics at that scale and that volume. The other thing we've been seeing from some of our partners is really putting those analytics to use with visualizations. So one of the customers that's going to be presenting as part of the Vertica Big Data Conference is Domo. Domo has a really nice demo around being able to track the coronavirus outbreak, and how we're getting care, and things like that, in a visual manner, and you're seeing more of those. Well, Domo embeds Vertica, right. So that's another customer of ours. So think of Vertica as that embedded analytical engine to support those visualizations, so that just anyone in the world can track this. And hopefully, as we see over time, cases go down and we overcome this. >> Talk a little bit more about that. Because again, the BDC has always been engineers presenting to audiences. You just mentioned the demo by Domo, and you have a lot of brand names that we've interviewed on theCUBE before, but maybe you could talk a little bit more about some of the customers that are going to be speaking at the virtual event, and what people can expect. >> Sure, yeah, absolutely. So we've got Uber that's presenting. Just a quick fact around Uber: really, the analytical data warehouse is all Vertica, right, and it works very closely with open source or what have you. Just a quick stat on Uber: 14 million rides per day. What Uber is able to do is connect the riders with the drivers so that they can determine the appropriate pricing. So Uber is going to be a great session that everyone will want to tune in on. Others like the Trade Desk, right, a massive ad tech company, 10 billion ad auctions daily, it may even be per second or per minute; the amount of scale and analytical volume that they have, that they are running the queries across, can really only be accomplished with a few platforms in the world, and that's Vertica. So that's another hot one, the Trade Desk. Philips is going to be presenting IoT analytical workloads. We're seeing more and more of those, across not only telematics, which you would expect within automotive, but predictive maintenance that cuts across all the original manufacturers, and Philips has got a long history of being able to handle sensor data to be able to apply to those business cases where you can improve customer satisfaction and lower costs related to services. So around their MRI machines and predictive maintenance initiative, again, Vertica is kind of that heartbeat, that analytical platform, that's driving those initiatives. So the list goes on and on. Again, the conversation is going to continue with the Data Disruptors and the Under the Hood webcast series. Any customers that weren't able to present, and we had a few that just weren't able to do it, they've already signed up for future months. So we're already booked out six months, and you're going to hear more and more customer stories from Vertica.com. >> Awesome, and we're going to be sharing some of those on theCUBE as well. The BDC, it's always been an intimate event, one of my favorites, with a lot of substance, and I'm sure the online version, the virtual digital version, is going to be the same. Jeff Healey, thanks so much for coming on theCUBE and giving us a little preview of what we can expect at the Vertica BDC 2020. >> You bet. >> Thank you. >> Yeah, Dave, thanks to you and the whole CUBE team.
Appreciate it. >> All right, and thank you for watching, everybody. Keep it right here for all the coverage of the virtual Big Data Conference 2020. You're watching theCUBE. I'm Dave Vellante, we'll see you soon.

Published Date : Mar 20 2020


Colin Mahony, Vertica | MIT CDOIQ 2019


 

>> From Cambridge, Massachusetts, it's theCUBE, covering MIT Chief Data Officer and Information Quality Symposium 2019, brought to you by SiliconANGLE Media. >> Welcome back to Cambridge, Massachusetts, everybody. You're watching theCUBE, the leader in tech coverage. My name is Dave Vellante, here with my cohost Paul Gillin. This is day one of our two day coverage of the MIT CDOIQ conference. CDO, Chief Data Officer; IQ, information quality. Colin Mahony is here, he's a good friend and long time CUBE alum. I haven't seen you in awhile, >> I know >> But thank you so much for taking some time, you're like a special guest here >> Thank you, yeah, it's great to be here, thank you. >> Yeah, so, this is not, you know, something that you would normally attend. I caught up with you, invited you in. This conference started as, like, back office governance, information quality, kind of wonky stuff, hidden. And then when the big data meme took off, kind of around the time we met, the Chief Data Officer role emerged, the whole Hadoop thing exploded, and then this conference kind of got bigger and bigger and bigger. Still intimate, but very high level, very senior. It's kind of come full circle, as we've been saying, you know, information quality still matters. You have been in this data business forever, so I wanted to invite you in just to get your perspectives; we'll talk about what's new with what's going on in your company, but let's go back a little bit. When we first met and even before, you saw it coming, you kind of invested your whole career into data. So, take us back 10 years, I mean it was so different, remember it was Batch, it was Hadoop, but it was cool. There was a lot of cool >> It's still cool. (laughs) projects going on, and it's still cool. But, take a look back. >> Yeah, so it's changed a lot. Look, I got into it a while ago; I've always loved data. I had no idea about the explosion in the three V's of data that we've seen over the last decade. But data's really important, and it's just going to get more and more important. But as I look back, I think what's really changed, and even if you just go back a decade, I mean, there's an insatiable appetite for data. And that is not slowing down, it hasn't slowed down at all, and I think everybody wants that perfect solution that they can ask any question of and get immediate answers to. We went through the Hadoop boom; I'd argue that we're going through the Hadoop bust. But what people actually want is still the same. You know, they want real answers, accurate answers, they want them quickly, and they want it against all their information and all their data. And I think that Hadoop evolved a lot as well, you know, it started as one thing 10 years ago, with MapReduce, and I think in the end what it's really been about is disrupting the storage market. But if you really look at what's disrupting storage right now, public clouds, S3, right? That's the new data lake. So there's always a lot of hype cycles; everybody talks about, you know, now it's Cloud, everything. For maybe the last 10 years it was a lot of Hadoop, but at the end of the day I think what people want to do with data is still very much the same. And a lot of companies are still struggling with it, hence the role for Chief Data Officers to really figure out, how do I monetize data on the one hand and how do I protect that asset on the other hand.
And we love tech, we love talking about this; this is why I love having you on. We kind of have a little Vertica thread that I've created here, so Colin, essentially, is the current CEO of Vertica, I know that's not your title, you're GM and Senior Vice President, but you're running Vertica. So, Michael Stonebraker's coming on tomorrow, >> Yeah, excellent. >> Chris Lynch is coming on tomorrow, >> Oh, great, yeah. >> we've got Andy Palmer >> Awesome, yeah. >> coming up as well. >> Pretty cool. (laughs) >> So we have this connection, why is that important? It's because, you know, Vertica is a very cool company and is all about data, and it was all about disrupting, sort of, the traditional relational database. It's kind of doing more with data, and if you go back to the roots of Vertica, it was like, how do you do things faster? How do you really take advantage of data to really drive new business? And that's kind of what it's all about. And the tech behind it is really cool; we did your conference for many, many years. >> It's coming back, by the way. >> Is it? >> Yeah, this March, so March 30th. >> Oh, wow, mark that down. >> At Boston, at the new Encore Hotel. >> Well we better have theCUBE there, bro. (laughs) >> Yeah, that's great. And yeah, you've done that conference >> Yep. >> haven't you before? So very cool customers, kind of leading edge, so I want to get to some of that, but let's talk about the disruption for a minute. So you guys started with the whole architecture, MPP and so forth. And you talked about Cloud; Cloud really disrupted Hadoop. What are some of the other technology disruptions that you're seeing in the market space? >> I think, I mean, you know, it's hard not to talk about AI and machine learning, and what one means versus the other, who knows, right? But I think one thing that is definitely happening is people are leveraging the volumes of data, and they're trying to use all the processing power and storage power that we have, to do things that humans either are too expensive to do or simply can't do at the same speed and scale. And so, I think we're going through a renaissance where a lot more is being automated, certainly on the Vertica roadmap, and our path has always been, initially, to get the data in, and then we want the platform to do a lot more for our customers: lots more analytics, lots more machine learning in the platform. So that's definitely where a lot of the buzz has been, but what's really funny is, when you talk to a lot of customers, they're still struggling with just some basic stuff. Forget about the predictive thing; first you've got to get to what happened in the past. Let's give accurate reporting on what's actually happening. The other big thing I think as a disruption is, I think IoT, for all the hype that it's getting, is very real. And every device is kicking off lots of information; the feedback loop of A/B testing or quality testing for predictive maintenance is happening almost instantly. And so you're getting massive amounts of new data coming in, it's all this machine sensor type data, you've got to figure out what it means really quick, and then you actually have to do something and act on it within seconds. And that's a whole new area for so many people.
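As an illustrative aside, the feedback loop Colin describes, acting on machine and sensor data within seconds, reduces at its simplest to a sliding-window check over an incoming stream. A toy sketch in plain Python; the readings, window size, and threshold are all made up:

```python
# Toy sliding-window check on a sensor stream: act as soon as the recent
# window looks abnormal. Values, window size, and threshold are made up.
from collections import deque

WINDOW = 50          # number of recent readings to consider
THRESHOLD = 80.0     # e.g. a temperature limit for predictive maintenance

window = deque(maxlen=WINDOW)

def on_reading(value: float) -> None:
    window.append(value)
    if len(window) == WINDOW and sum(window) / WINDOW > THRESHOLD:
        # A real system would open a work order or throttle the machine here.
        print("window average above limit, schedule maintenance now")

# Stand-in for a live sensor feed.
for reading in [79.5, 80.2, 81.0] * 20:
    on_reading(reading)
```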
It's not their traditional enterprise data warehouse, and you know, back to your comment on Stonebraker, he got a lot of this right from the beginning, you know, and I think he looked at the architectures, he took a lot of the best in class designs, we didn't necessarily invent everything, but we put a lot of that together. And then I think the other thing you've got to do is constantly reinvent your platform. We came out with our Eon Mode to run cloud native; we just got rated the best cloud data warehouse from a net promoter score rating perspective. So, but we've got to keep going, you know, we've got to keep reinventing ourselves, but leverage everything that we've done in the past as well. >> So one of the things that you said, which is kind of relevant for here, Paul, is you're still seeing a real data quality issue that customers are wrestling with, and that's a big theme here, isn't it? >> Absolutely, and, what goes around comes around, as Dave said earlier, we're still talking about information quality 13 years after this conference began. Have the tools to improve quality improved all that much? >> I think the tools have improved. I think that's another area where machine learning, if you look at Tamr, and I know you're going to have Andy here tomorrow, they're leveraging a lot of the augmented things you can do with the processing to make it better. But I think one thing that makes the problem worse now is it's gotten really easy to pour data in. It's gotten really easy to store data without having to have the right structure, the right quality; you know, 10 years ago, 20 years ago, everything was perfect before it got into the platform. Right, everything was, there was quality, everything was there. What's been happening over the last decade is you're pumping data into these systems, nobody knows if it's redundant data, nobody knows if the quality's any good, and the amount of data is massive. >> And it's cheap to store >> Very cheap to store. >> So people keep pumping it in. >> But I think that creates a lot of issues when it comes to data quality. So, I do think the technology's gotten better, I think there's a lot of companies that are doing a great job with it, but I think the challenge has definitely upped. >> So, go ahead. >> I'm sorry. You mentioned earlier that we're seeing the death of Hadoop, but I'd like you to elaborate on that because (Dave laughs) Hadoop actually came up this morning in the keynote, it's part of what GlaxoSmithKline did. Came up in a conversation I had with the CEO of Experian last week, I mean, it's still out there, why do you think it's in decline? >> I think, I mean, first of all, if you look at the Hadoop vendors that are out there, they've all been struggling. I mean, some of them are shutting down, two of them have merged, and they've gotten killed lately. I think there are some very successful implementations of Hadoop. I think Hadoop as a storage environment is wonderful, I think you can process a lot of data on Hadoop, but the problem with Hadoop is it became the panacea that was going to solve all things data. It was going to be the database, it was going to be the data warehouse, it was going to do everything. >> That's usually the kiss of death, isn't it? >> It's the kiss of death. And, you know, the killer app on Hadoop, ironically, became SQL. I mean, SQL's the killer app on Hadoop. If you want a SQL engine, you don't need Hadoop.
But what we did was, in the beginning, Mike sort of made fun of it, Stonebraker, and joked a lot about how he'd heard of MapReduce, it's called Group By, (Dave laughs) and that created a lot of tension between the early Vertica and Hadoop. I think, in the end, we embraced it. We sit next to Hadoop, we sit on top of Hadoop, we sit behind it, we sit in front of it, it's there. But I think what the reality check of the industry has been, certainly by the business folks in these companies, is it has not fulfilled all the promises, it has not fulfilled a fraction of the promises that they bet on, and so they need to figure those things out. So I don't think it's going to go away completely, but I think its best success has been disrupting the storage market, and I think there are some much larger disruptions from technologies that frankly are better than HDFS to do that. >> And the Cloud was a gamechanger >> And a lot of them are in the cloud. >> Which is ironic, 'cause you know, Cloudera, (Colin laughs) they didn't really have a cloud strategy, neither did Hortonworks, neither did MapR, and it just so happened Amazon had one, Google had one, and Microsoft has one, so, it's just convenient to-- >> Well, how is that affecting your business? We've seen this massive migration to the cloud (mumbles) >> It's actually been great for us, so one of the things about Vertica is we run everywhere, and we made a decision a while ago, we had our own data warehouse as a service offering. It might have been ahead of its time, never really took off. What we did instead is we pivoted and we said, "you know what? "We're going to invest in that experience "so it's a SaaS-like experience, "but we're going to let our customers "have full control over the cloud. "And if they want to go to Amazon they can, "if they want to go to Google they can, "if they want to go to Azure they can." And we really invested in that and that experience. We're up on the Amazon marketplace, we have lots of customers running up on Amazon Cloud as well as Google and Azure now, and then about two years ago we went down and did this endeavor to completely re-architect our product so that we could separate compute and storage, so that our customers could actually take advantage of the cloud economics as well. That's been huge for us, >> So you scale independent-- >> Scale independently, cloud native, add compute, take away compute, and for our existing customers, they're loving the hybrid aspect, they love that they can still run on premise, they love that they can run up on a public cloud, they love that they can run in both places. So we will continue to invest a lot in that. And it is really, really important, and frankly, I think cloud has helped Vertica a lot, because being able to provision hardware quickly, being able to tie in to these public clouds, into our customers' accounts, give them control, has been great, and we're going to continue on that path. >> Because Vertica's an ISV, I mean you're a software company. >> We're a software company. >> I know you were a part of HP for a while, and HP wanted to mash that in and run it on its hardware, but software runs great in the cloud. And then to you it's another hardware platform. >> It's another hardware platform, exactly. >> So give us the update on Micro Focus. Micro Focus acquired Vertica as part of the HPE software business, how many years ago now? Two years ago? >> Less than two years ago. >> Okay, so how's that going, >> It's going great. >> Give us the update there.
>> Yeah, so first of all it is great. HPE and HP were wonderful to Vertica, but it's great being part of a software company. Micro Focus is a software company. And more than just a software company, it's a company that has a lot of experience bridging the old and the new, leveraging all of the investments that you've made, but also thinking about cloud and all these other things that are coming down the pike. I think for Vertica it's been really great because, as you've seen, Vertica has gotten its identity back again. And that's something that Micro Focus is very good at. You can look at what Micro Focus did with SUSE, the Linux company, which actually, you know, now just recently spun out of Micro Focus, but letting organizations like Vertica that have this culture, have this product, have this passion, really focus on our market and our customers and doing the right thing by them has been just really great for us, and operating as a software company. The other nice thing is that we do integrate with a lot of other products, some of which came from the HPE side, some of which came from Micro Focus; security products are an example. The other really nice thing is we've been doing this insource thing at Micro Focus, where we open up our source code to some of the other teams in Micro Focus, and they've been contributing now in amazing ways to the product. In ways that we would just never be able to scale, but with 4,000 engineers strong in Micro Focus, we've got a much larger development organization that can actually contribute to the things that Vertica needs to do. And as we go into the cloud and as we do a lot more operational aspects, the experience that these teams have has been incredible, and security's another great example there. So overall it's been great; we've had four different owners of Vertica, and our job is to continue what we do on the innovation side and the culture, but so far Micro Focus has been terrific. >> Well, I'd like to say, you're kind of getting that mojo back, because you guys as an independent company were doing your own thing, and then you did for a while inside of HP, >> We did. >> And that obviously changed, 'cause they wanted more integration, but, and Micro Focus, they know what they're doing, they know how to do acquisitions, they've been very successful. >> It's a very well run company, operationally. >> The SUSE piece was really interesting, spinning that out, because now RHEL is part of IBM, so now you've got SUSE as the lone independent. >> Yeah. >> Yeah. >> But I want to ask you, go back to a technology question: is NoSQL the next Hadoop? Are these databases, it seems to be that the hot fad now is NoSQL, it can do anything. Is the promise overblown? >> I think, I mean, NoSQL has been out almost as long as Hadoop, and we always say, not only SQL, right? Mike's said this from day one, best tool for the job. Nothing is going to do every job well, so I think that there are, whether it's key value stores or other types of NoSQL engines, document DBs, now you have some of these DBs that are running on different chips, >> Graph, yeah. >> there's always, yeah, graph DBs, there's always going to be specialty things. I think one of the things about our analytic platform is we can do a lot; time series is a great example. Vertica's a great time series database. We can compete with specialized time series databases. But we also offer a lot of the other things that you can do with Vertica that you wouldn't be able to do on a database like that.
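An aside on that time-series point: Vertica's SQL includes a TIMESERIES clause for gap-filling and interpolation, which is the kind of built-in capability being referred to. A hedged sketch of such a query follows; the table and columns are hypothetical, and the syntax is recalled from the Vertica documentation, so verify it against your version.

```python
# Hedged sketch of Vertica's native time-series SQL: regularize irregular
# sensor readings onto one-second slices with linear interpolation.
# The table and columns are hypothetical; syntax recalled from the docs.
query = """
SELECT device_id,
       slice_time,
       TS_FIRST_VALUE(temperature, 'linear') AS temp_interpolated
FROM sensor_readings
TIMESERIES slice_time AS '1 second'
    OVER (PARTITION BY device_id ORDER BY reading_ts)
"""
print(query)  # would be passed to a cursor.execute() against Vertica
```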
So, I always think there's going to be specialty products. I also think some of these can do a lot more workloads than you might think, but I don't see as much around the NoSQL movement as, say, I did a few years ago. >> But so, and you mentioned the cloud before as kind of, your position on it I think is a tailwind, not to put words in your mouth, >> Yeah, yeah, it's a great tailwind. >> You're in the Amazon marketplace, I mean they have products that are competitive, right? >> They do, they do. >> But, so how are you differentiating there? >> I think the way we differentiate, whether it's Redshift from Amazon, or BigQuery from Google, or even what Azure DB does, is, first of all, Vertica, I think, from a feature, functionality, and performance standpoint, is ahead. Number one. I think the second thing, and we hear this from a lot of customers, especially at the C-level, is they don't want to be locked into these full stacks of the clouds. Having the ability to take a product and run it across multiple clouds is a big thing, because the stack lock-in now, the full stack lock-in of these clouds, is scary. It's really easy to develop in their ecosystems, but you get very locked into them, and I think a lot of people are concerned about that. So that works really well for Vertica, but I think at the end of the day it's just, it's the robustness of the product. We continue to innovate; when you look at separating compute and storage, believe it or not, a lot of these cloud-native databases don't do that. And so we can actually leverage a lot of the cloud hardware better than the native cloud databases do themselves. So, like I said, we have to keep going; those guys aren't going to stop. And we actually have great relationships with those companies, we work really well with the clouds, they seem to care just as much about their cloud ecosystem as their own database products, and so I think that's going to continue as well. >> Well, Colin, congratulations on all the success >> Yeah, thank you, yeah. >> It's awesome to see you again and really appreciate you coming to >> Oh thank you, it's great, I appreciate the invite, >> MIT. >> it's great to be here. >> All right, keep it right there everybody, Paul and I will be back with our next guest from MIT, you're watching theCUBE. (electronic jingle)

Published Date : Jul 31 2019


Vishwam Annam & Philip Bernick | Dell Boomi World 2018


 

>> Live from Las Vegas, it's theCUBE. Covering Boomi World 2018, brought to you by Dell Boomi. >> Welcome back to theCUBE, I'm Lisa Martin, live at Boomi World 2018 at The Encore in Las Vegas. Been here all day, had a lot of great chats. We're excited to welcome to theCUBE for the first time a couple of gents from Hathority, an implementation partner of Dell Boomi: Philip Bernick, PhD, Principal, and Human-Centered Technologist, aka Technology Wonk. >> I go by both. >> It does say on your card, I think that's fantastic. And Vishwam Annam, MBA and principal technology architect at Hathority. Guys, welcome to theCUBE. >> Yes, thank you. >> Thank you for having us, Lisa. >> So Hathority has been an implementation partner with Dell Boomi for several years now; congratulations yesterday on winning the Innovation Partner of the Year. Philip, you had an opportunity to talk yesterday at the partner summit with CTO Michael Morton. Talk to us a little bit about that and about this Innovation Partner of the Year award, that's a big title. >> It is, and we're really excited to be able to do really interesting things with Boomi. It's more than just an integration platform, it really lets us do a lot of things with devices. IoT is coming to the mainstream because now we have infrastructure that will support it. It's a lot of data, it needs a big, fat pipe. We need gigabit networks in order to move it all around, to get it to the people who need to make decisions, or to get it to systems who are making decisions for us. The Dell Boomi Atom lets us do that, and we've got it running on little tiny devices like Raspberry Pis, and we can put it on other edge devices and routers, so we've done some microservices for cities that are interested in improving their smartness. >> Excellent. >> So yeah, we're excited. >> Vishwam, tell us about, for those of our viewers who haven't heard of Hathority, tell us a little bit about what you guys do, who you are, where you're located. >> Sure, so we're a data integration company, and we work with Dell Boomi in automating a lot of the data integration practices. Our customers are all across the world, and they're serving their different (mumbles), whether it's airlines and healthcare and smart cities, and some are in, you know, the gaming industry. So what we are doing is automating all of their workflows and connecting all of their systems in one place; that's what we are delivering. We're based in the greater Phoenix area, and our employees, some are here in the U.S., some are in India, some are in the U.K., so based on what the customer's needs are, our Dell Boomi consultants would work there. We are 35 in strength so far, our company. >> So about three or four years you've been in business. Dell Boomi, a number of things that came out this morning; I was up to hear the numbers and statistics during the general session, and Chris McNabb, CEO, talked about their adding five new customers every single day. They also were, I was reading this over the weekend, a strong leader for the fifth year in a row in the Gartner Magic Quadrant for iPaaS, but they've come out today and said we are redefining the I in iPaaS. This is more than integration; it's more than integrating applications. You've got to integrate data, new sources, existing sources, you've got to integrate people and processes and trading networks, with this new reimagination of the I to the intelligence.
Philip, I'm curious, what does that signify to you about your partnership with Dell Boomi, and what opportunities are you excited that this is going to open up for you? >> Well, it says to me that they're excited about the same kinds of things that we're excited about. So one of the things that we demonstrated, we have customers who are interested in lots of different technologies; yesterday they talked about how three years ago IoT was the eyeroll, right, don't get a headache. This year it's Blockchain. But one of the demos we brought to Boomi World is a demo where we actually use Dell Boomi to integrate with Hyperledger, a Blockchain application, and on top of that we used Flow to produce the front end, and so we can integrate across a variety of platforms, and now we've integrated into the Blockchain, and our customers want these kinds of things. The Blockchain is interesting because it's immutable, it's auditable, and it's validated by all of the participants in a particular set of nodes in the Blockchain, so, you know, it's an exciting technology. It's exciting because, not because of the tokenization, things like Bitcoin, but because it's a database that you can share, a ledger that we can share. >> Because one of the challenges that a lot of our customers run into is managing data integrity: when somebody sends the data, how reliable is it, and is there any place in the middle where somebody's monitoring the data? Those are the challenges that Blockchain would solve in guaranteeing the data delivery and the quality of it. So those are the kinds of I's he was mentioning, you know, as part of integration and innovation, and more of, you know, the new parts of transformation. >> We're really transforming. >> The data transformation in the digital world these days. >> So Blockchain, I often hear companies that might be integration companies that talk a lot about Blockchain, and I kind of sit back and go, I don't understand what your story is there. Talk to us about, 'cause it's a, you know, crypto, Blockchain, huge buzzwords, talk to us exactly about what you guys do and what Dell Boomi is doing. I think they announced support for Hyperledger Fabric as well as Ethereum, but-- >> Right. >> Help unpack that myth around Blockchain and what integration's role is in it. >> A lot of the confusion around Blockchain comes from things like Bitcoin. So the interesting thing around Bitcoin is it was the first Blockchain, and it's built around this idea of a token, the Bitcoin, right? And so what this ledger is keeping track of are these Bitcoin, but you can keep track of any sort of data on a Blockchain. You can contribute data of any sort to, not the Bitcoin Blockchain, but Ethereum, for example; we can include software, we can include other sorts of data. You can include a healthcare record that is your healthcare record, that you share only with individuals with whom you share part of your private key, right, but you own it and it's yours and it's always yours and you control it. But it's validated by all of the people who are participating in producing that Blockchain, so it's decentralized, but it's immutable and it's auditable, so it guarantees integrity, because unless all of the participants agree that a transaction took place, it didn't. So we ensure data integrity through the Blockchain. That's the interesting thing about it, for us.
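Philip's description, immutable, auditable, validated by all participants, is easiest to see in the hash-chaining that underlies any blockchain. Below is a toy sketch in ordinary Python, with no Hyperledger or Ethereum specifics, showing why a tampered record is immediately detectable on audit; it has no consensus or networking, which real deployments add on top.

```python
# Toy hash-chained ledger: each block commits to the previous block's hash,
# so changing any record breaks every hash after it. Illustrative only; no
# consensus, networking, or Hyperledger specifics.
import hashlib
import json

def block_hash(body: dict) -> str:
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append(chain: list, record: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"prev": prev, "record": record}
    chain.append({**body, "hash": block_hash(body)})

def audit(chain: list) -> bool:
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"prev": block["prev"], "record": block["record"]}
        if block["prev"] != expected_prev or block["hash"] != block_hash(body):
            return False
    return True

chain = []
append(chain, {"vote": "ballot-123", "choice": "A"})
append(chain, {"vote": "ballot-456", "choice": "B"})
print(audit(chain))                     # True
chain[0]["record"]["choice"] = "B"      # tamper with history
print(audit(chain))                     # False: the audit catches it
```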
>> That's a major part for integration companies, because of a lot of the technologies that we hear about. Solace is one of the messaging queuing systems that they presented, so they're guaranteeing the delivery at the same time as reliable messaging transmission, streaming the data, and it's faster, reliable, and manages the full data usage. >> Here's a great use case: today is voting day. Many polling places no longer have paper ballots, so you cast your vote but you have no way to actually see the vote that you cast. If it were on a blockchain, you could inspect your vote, but nobody else could know how you voted. You could ensure that your vote was entered into the blockchain and counted in the way that you wanted it to be. >> That's a great example, and relatable, so thanks for sharing that. So guys, Dell Boomi has, I think they said this morning, Chris McNabb, over 350 partners, and you guys are one of them. They have a broad ecosystem: embedded partners, implementation partners, GSIs. Talk to us about your partnership and how, as Boomi says, we want to be the transformation partner. And it is all about transformation, right? Especially in an enterprise that wasn't born in the cloud. It can't survive without it, as the customer expectation drives: I want to be able to buy something from your physical store, maybe a partner store, online, Amazon, Zappos, whatnot, and I expect as a customer to have a seamless experience. That's hard to do for a company that's maybe 20, 30 years old; to transform, I'm thinking of omni-channel retailers as the example. How is your integration, pun intended, with Dell Boomi really helping customers transform their digital, IT, security, and workforce? What goes on with that opportunity to transform? >> You know, the relationship between Dell Boomi and its partners is really synergistic. I mean, they provide a lot of support. There's really excellent training, there's excellent communication. There's marketing support, we share on projects in a variety of ways, we do jump starts. So we help teach people how to use Boomi, in addition to Boomi folks teaching us how to use the new tools. There's a great community for providing feedback, for getting resources if there's something that we need to do that we don't know how to do. There's a huge community that shares; we all share connectors, right? If we're building an integration and a connector doesn't exist and we create a new connector, not the configuration of the connector itself, we share it. So that collaborative approach to doing business is really important to us, and it reflects our company's ethos, as we hope it also reflects Dell Boomi's ethos. >> We've been working with Boomi since 2012, so over the years, even though we've been certified partners since 2015, we have been contributing to various channels, like the support or, like, the community channel, and contributing to the release planning as well, because we are the first line of defense for the customers; we know what the customers are expecting. So say they got Salesforce to implement. We as a system integrator come in and see what the data points are for the Salesforce. Say, like, user data: they want to build their contacts in there, or any activities or sales data. So there are multiple systems that are feeding into Salesforce in this case. So we are the ones who are contributing to Dell Boomi: okay, these are the features that we could consider.
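The voting use case above rests on one more trick worth sketching: publishing only a salted hash commitment of each vote, so a voter holding the salt can verify their own entry while nobody else can read it. This is a deliberately simplified illustration, not a production voting protocol; the candidate names and values are hypothetical.

```python
import hashlib
import secrets

def commit(vote: str) -> tuple[str, str]:
    # The voter keeps the salt as a private receipt; only the
    # commitment (a salted hash) is published on the ledger.
    salt = secrets.token_hex(16)
    commitment = hashlib.sha256((salt + vote).encode()).hexdigest()
    return salt, commitment

def verify(ledger: set, salt: str, vote: str) -> bool:
    # Recomputing the commitment proves my vote is on the ledger;
    # without my salt, others can't tell which entry is mine or what it says.
    return hashlib.sha256((salt + vote).encode()).hexdigest() in ledger

ledger = set()
my_salt, my_commitment = commit("candidate-a")
ledger.add(my_commitment)
ledger.add(commit("candidate-b")[1])   # someone else's vote

assert verify(ledger, my_salt, "candidate-a")       # my vote was counted
assert not verify(ledger, my_salt, "candidate-b")   # and not altered
```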
So because Salesforce evolved, just like Boomi, they launched different product lines as well. So in Boomi there is a different connector for Salesforce and Service Cloud, and multiple layers in that, so those are the unique cases where we are contributing to Dell, and obviously they take the feedback from partners like us, where they see it, as they work towards delivering on it. So one use case that we are working on with some of our customers who have innovated: we had been asking Dell to build it, like, you know, and they were able to deliver it. They wanted some reporting out of it, so you transmit the data from one system to another, and they wanted to see, okay, which system was the source and which system was the destination and how this data was transmitted. So Boomi gave real time visibility into those. So those are some of the partnering opportunities, all the way from the customer to the product, so we are happy to be in the middle and contributing our part of it. >> That's one of the things that I've heard a lot today, that Boomi is listening. One of the great examples of that on stage this morning was Chris McNabb talking about the Dell Boomi employee onboarding solution. They actually did an internal survey earlier this year and found, whoa, this is really not an optimal process, and implemented an onboarding solution to make it more streamlined, because obviously, you know, you hire someone who's brilliant, you want to be able to get them up and running and innovating as fast as possible. I like that they shared the feedback they got from their own employees and created a solution that they're now able to deliver to the market. >> And there was another piece to that that was really interesting, which is that they utilized their partner network in order to build the solution, right? They didn't build all of it in house. >> You're right, they did talk about that. >> They reach out to partners, they work with partners in a variety of ways, and we really, really appreciate that. >> Yeah, that listening, that synergy that you've both talked about was really apparent. So when we look at certain business initiatives, like onboarding or customer 360 or e-commerce, any favorite joint customer example that you've helped to integrate, that has approached one of those daunting business initiatives and worked with Hathority, and you're laughing, to really transform? >> They're all like that. >> Really interesting, yeah. Do you want to talk about it here? >> Give me one of your favorite examples. >> Share, well, share. >> Okay, so with some of our customers, and especially some at enterprise scale, there are a lot of systems at stake for them, because, you know, they want to have the digital transformation journey. So the major thing Dell Boomi contributes to is connecting all of the systems and giving them visibility, so not only the point-to-point integrations; they also pull in the real time integration capability. So in this case, where the customer goes into a retail store and, say, they want to do something at the point of sale, they want to purchase something, there you have the credit card transaction. I mean, those need to be encrypted; I mean, we cannot wait for 10 minutes to get the data. So that's where, you know, Dell Boomi is scalable, and it's robust in the sense that the response time is pretty quick. So it's on a real time basis.
So in a lot of these cases, you know, with Boomi we are able to deliver it: you know, on the integration side, the API side, and now with the MDH, the Master Data Hub, which is a new product from them within the last two years. We have been working with our customers implementing the Master Data Hub as well as ManyWho, which is Dell Boomi Flow, which is amazing. Some of our customers, you know, with the APIs it's like, can you see the data? But with Flow you can visualize it: this is the exact UI that you are seeing, how your data is getting in on the back end, and then you can throw it out. Because with these enterprise customers, especially on the business side, if they're working with something, they want to try it out, but, you know, they don't want to learn, you know, programming to do that. So that's where Flow is already helping; we are already seeing the value of it with our customers. >> We've heard a little bit about that today as well, Flow in terms of the automation, but also how that will enable customers. There was a cute little video on their website that I saw recently which showed an example of Flow. Somebody bangs their car into a tree, gets out, and takes a photograph of the incident, uploads it to their insurance carrier's app, which then actually initiates the entire claims process, and that was to me a clear example of: you have to go where the data is. Michael Dell says frequently there's a big boom at the edge, but if I'm in that scenario as a customer, I want to know, I don't care what's on the back end, I want to be able to get this initiated quickly, and I thought that was a nice kind of example of how they're able to abstract that so that the customer experience can be superior to the competition. >> Absolutely, so that's where Boomi has something called the runtime engine, which is scalable: you could install it on a smaller device like a Raspberry Pi, which is, you know, just a mini computer, or you could install it on the big switchboard itself, so it's scalable. So earlier, as Michael Dell was mentioning, the edge of computing: you could install it on a gateway, which sits on the-- >> On a tree. >> On a tree. (laughs) So you don't have to send all the data to the cloud for processing, and it's an amazing leap into the next distributed computing, because, as you mentioned, of the fastness of response time, you know. We don't have to wait for the cloud to respond, so all the computations and real time navigations are happening within the edge network itself. We are all on the same page; we have implemented the same solution, which was one of the reasons why we're the winner of the Innovation Partner of the Year award. >> Well, congratulations again for that, gentlemen. Thank you so much for stopping by. >> Thank you. >> And sharing with our viewers a little bit about Hathority and what you guys are, how you're really symbiotically innovating with Dell Boomi. Philip, Vishwam, thanks so much for your time today. >> Thank you for having us. >> Thank you, thank you for having us. >> My pleasure. We want to thank you for watching theCUBE. I'm Lisa Martin, live from Boomi World 2018 in Las Vegas. Stick around, I'll be back with John Furrier and our next guest after a short break. (upbeat music)
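The edge pattern Vishwam describes, running the integration runtime on the device and forwarding only alerts or compact summaries instead of streaming every raw reading to the cloud, can be sketched roughly as follows. This is an illustrative stand-in for what a Boomi Atom flow would be configured to do, not Boomi's actual API; the threshold, window size, and transport are hypothetical.

```python
import random
import statistics
import time

ALERT_THRESHOLD = 90.0   # hypothetical limit, e.g. degrees C
WINDOW = 60              # hypothetical readings per batch

def read_sensor() -> float:
    # Stand-in for the device driver; here we simulate readings.
    return random.gauss(70.0, 8.0)

def process_at_edge(readings):
    # Decide locally: raise an alert immediately on a bad reading,
    # otherwise forward only a compact summary upstream.
    if max(readings) >= ALERT_THRESHOLD:
        return {"type": "alert", "max": max(readings), "ts": time.time()}
    return {"type": "summary", "mean": round(statistics.mean(readings), 2),
            "count": len(readings), "ts": time.time()}

def send_upstream(message):
    # Stand-in for the HTTPS/MQTT call to the cloud endpoint.
    print("->", message)

for _ in range(3):  # in production this loop runs continuously
    batch = [read_sensor() for _ in range(WINDOW)]
    send_upstream(process_at_edge(batch))   # one message per 60 readings
    time.sleep(1)
```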

Published Date : Nov 7 2018

Mandy Dhaliwal, Dell Boomi | Dell Boomi World 2018


 

>> Live, from Las Vegas, it's theCUBE, covering Boomi World 2018. Brought to you by Dell Boomi. >> Welcome back to theCUBE, we are live at Boomi World 2018 at the Encore Las Vegas. I am Lisa Martin with my co-host John Furrier, and we're excited to welcome the CMO, the new CMO of Dell Boomi, Mandy Dhaliwal. Mandy, welcome to theCUBE. >> Thank you Lisa, it's great to be here. >> And thanks for having us here. >> Oh my gosh. >> Second annual Boomi World. >> Yes. >> Doubled in size from last year, moved from San Francisco to Las Vegas. This morning's keynote was action-packed, standing room only, and some of the stats really stuck out at me: five new customers are being added to Dell Boomi every day, over 7,500 customers to date, your Dell Boomi community is over 64,000 strong; there's a lot of momentum. Talk to us about, you're new, it's been seven weeks, what are some of the things that excited you about coming to lead marketing for Dell Boomi? >> Oh my gosh, hard to pinpoint one thing. So many wonderful things about this company. Market-leading technology, Gartner Magic Quadrant leader five years in a row, right? Just a fantastic reputation in the technology landscape. Everybody has very positive things to say about Boomi. The company culture, right? Companies like this don't come around every day. It's fantastic, everybody is very collaborative, we have a winning culture, we put customers first. We don't just talk the talk, we walk the walk, and it's fantastic to be a part of it. Outstanding sales team, outstanding leadership team, I could go on. >> Michael Dell said 80%; sales are booming at Boomi. But as a marketer, as CMO, you have a challenge. You have a successful company that was acquired by Dell eight years ago, incubated, and is part of the puzzle pieces of the Michael Dell strategy. You have all of Dell Technologies' portfolio, but Boomi seems to be one of the key ingredients. You've got VMware, everyone knows what's going on there, Pivotal, and now Dell Boomi, born in the cloud. So you've got product-market fit, check. >> Absolutely, yes. >> Now you've got to get the word out, you've got to drive value, be part of that flagship trio that's Dell Technologies. >> Right, right. >> That's a big task; how are you going to attack that? What's your plan, what's the vision? >> First and foremost, it's awareness, right? We've got to get the word out. We've got so many wonderful customer stories that we just need to share with the world. Our own company, amongst Dell Technologies: day one of the Dell EMC merger, Salesforce was integrated, day one. And guess who did that, what technology was behind the scenes? We drink our own champagne. >> That's impressive, considering I can't even imagine the sheer number of Salesforce instances that came together in a single day. >> Absolutely. Customer service: we're our own best proof point. Dell Technologies is our largest enterprise case study. Customer service across RSA, Secureworks, and Dell Boomi: one point of contact, one phone call. We get notified, and if there's an issue with any one of our customers, we're able to pass that customer request directly through to the company that needs to be dealing with the customer. We don't make the customer hang up and call another number. >> So cloud scale certainly gives you an advantage, we heard that. Product is strong, and data now is becoming much more instrumental across horizontal data sets.
So it's not just the siloed data and doing some integration: you've got cloud native, you've got VMware and the enterprise, you've got Pivotal, Kubernetes, Cloud Foundry, cloud native stuff. How are you guys going to take that data explosion and make it trustable? Is that part of the plan? Is that going to be a key part of it? >> Trustable in terms of privacy and data governance? >> Just leveraging the data, being data driven. You mentioned integrating Salesforce, that's a tough job that has to be done, check. But now how do you get value out of the apps and the workloads that run with that data? >> Well, it's a complex ecosystem that we're a part of, right? And that's Boomi's job: we radically simplify that whole ecosystem, so the value is starting to show. We're about to unleash a Forrester TEI study next week. So we took a composite of five of our top enterprise customers and built this 300 billion dollar business as a scenario, and started to look at the value that Boomi was able to derive in terms of cost reduction, in terms of savings on infrastructure costs, and in terms of innovation potential, as far as speeding up their routes to market. And the ROI, which came back conservative from an innovation potential perspective, because you really can't quantify what you don't know: 300% was the number in terms of the ROI that we're able to deliver as a Boomi-empowered business. >> Which is huge. There were, besides that, a number of other really eye-popping quantitative stats, business outcomes, that the Forrester Total Economic Impact study covered, one of them being that incremental revenue is the biggest benefit that Dell Boomi customers get: $3.4 million of incremental revenue. Here are some other stats that I saw that I thought were really transformative: cutting development times by 70%, freeing up IT resources, being able to reallocate them, helping, ultimately, accelerate the pace of innovation, which we know is critical to transforming and continuing to use data, and, to John's point, establishing that trust, not just with customers and partners, but also internally. >> Absolutely. Every company's a software company, right? We've been hearing that now for years. We practice it, we live it every day; we're empowering these brands to go out and do what they do best and re-imagine their businesses from their customers' perspectives. It's incredibly powerful, it's exciting. >> And you, sorry John, I was going to say you've got, speaking of customers, over 92% of the breakout sessions here have customers and partners, and I know as a marketer how challenging that is to get. And you said about 68 customers are here speaking on your behalf. >> Absolutely. >> That's huge. >> Our community is tremendous. We truly partner with our customers, and it shows. You heard Chris Port on stage, recognizing customers for innovation in various categories. We take our customers and partner with them for them to be successful. The company culture extends beyond the employees, and it's been the secret to our success. We're able to help them unlock the value of their businesses. It starts with the data and the applications, but at the end of the day, we're an enterprise transformation company. And you're going to start to see a lot more of that in the coming months, as far as messaging and the value that we deliver as a platform. >> I want to get your thoughts, Mandy, on a couple things.
One is the technology partner program and the ecosystem, you mentioned that, but also you're starting to see the messaging change around Boomi, Dell Boomi. Integration, certainly we know how hard it is, as a glue layer, to put stuff together, but you guys are talking about connecting businesses. So you're now moving up the value proposition, the more holistic kind of perspective. By design? Is there a rationale for it? Can you explain why this is happening, what's the evolution? >> The market is taking us there, right? The customer need is where we're focused. Digital transformation, right now, today, the stats that we have: only 26% of digital transformations succeed. We've got an awful lot of customers saying, "Hey, we've got to get this figured out." It's on the C-suite agenda, it's on the boardroom agenda. It has to succeed; it's innovate or die. There are stats out there in terms of how many of the Fortune 500 are going to be around 10 years from now, five years from now, right? Boomi is the company that will solve those problems. Michael said it this morning. >> And speed's important too, they've got to get there faster. >> Absolutely, absolutely. >> And that's not what they're used to. (chuckles) >> We have a very simple UI, a very plug-and-play, drag-and-drop platform that helps our customers go deliver. Not to mention the power of the analytics and the AI that we've got behind us. We've got the pattern recognition down. >> Talking about the partner program, (mumbles) some of the announcements. Yesterday was a partner day. What happened yesterday, what's going on today, what's the vibe of the show, the ecosystem, the partner program, what are the new things? >> You know, the bottom line for the partners: we're here to help them extend their businesses. There's tremendous momentum in the market; we're pulling through demand on the integration scenarios. You know, we've got Deloitte and TCS, Accenture, some of our top sponsors here; our sponsorships are sold out, right? Our partners are here in this ecosystem. Dell Technologies, right behind us. It's a tremendous show of force, it's fantastic. And it just shows you the market potential and the need out there. Customers are clamoring for these types of solutions. >> As the CMO, I want to get your take on some of the messaging breakdown. One of the bold messages that came out today is, not only, as you mentioned a minute ago, that Dell Boomi is the transformation partner, but also, "Hey, we're re-imagining the 'i' in iPaaS." iPaaS is a competitive, well-established market. You guys are using your own, upwards of 30 terabytes of anonymized metadata to make the Boomi unified platform smarter, more responsive. As you look to help that 74% of customers who are failing in their digital transformations, how is the re-imagined 'i' in iPaaS going to be a facilitator of that? >> It's putting the user at the center of the experience. Steve Wood, our Chief Product Officer, is going to be on stage tomorrow, doing a demo of this re-imagined user experience. It's driven by the data that we've got, it's driven by the patterns that we've been able to look at as far as business processes and integrations, and it's able to provide a user experience where the customer's at the center. I come with a problem, not a list of technologies that I need to connect. Mandy wants to build EDI for a couple of trading partners, right?
I don't need to tell Boomi that; I need to tell them, "I need this outcome, and I need data to be transferred from here to here," and at the end of the day, I, from my cell phone, want to be able to figure out what's going on as far as my supply chain. I want to know where that boat is, coming for Black Friday. Is my inventory hitting the port when it needs to? I should be able to see that from my phone. That's what we're doing: we're giving the power back to the users, and enabling them to go power their businesses. >> As a new person to Dell, we've known each other, at the last (mumbles) you were at a born-in-the-cloud company. Amazon sets the agenda for a lot of the cloud computing market; you guys are cloud native as a startup, and really kind of nailed that SaaS formula with Boomi. Dell is not restrictive in that sense, but it's got a lot of muscle behind you. Boomi seems to be standing on its own and flying out, like VMware, while it's still 100% owned by Dell. Those trends are big, that's a big wave that you're on. How are you thinking about it as you look at your assignment as the CMO? How are you going to ride that wave? Are you going to hang 10 early, are you going to build it out slowly? What's going on? >> Oh, we're going. We're going for it. We're going to go ride that wave, it's here. If anything, we've got to work better with our Dell Technologies partners, right? We're getting in deeper from a go-to-market standpoint, with a lot of the enterprise reps already in the ecosystem. We're looking at driving customer value. As Michael said, there's always a need for Boomi. We haven't found a single opportunity yet where Boomi isn't needed. >> So you're on a growth curve? >> We're absolutely on a growth curve. It's just, we can't get there fast enough. We're hiring like crazy, we're, you know, we're just doing it. >> What kind of jobs are you guys looking for, what's the hiring, what are your needs? Take a minute to share. >> Technical talent is always priority number one for a company like ours. On the go-to-market side as well, we need salespeople; you know, I've got marketing reqs out already, check our website. There's lots of opportunity from a BD standpoint with partners as well, so tremendous opportunity on the go-to-market side as well as on the R&D side. >> Looks like Boomi is going to be one of those flagships for Dell Technologies. >> I certainly hope so, that's my vision. >> I mean, you've got good company. VMware didn't skip a beat, Pivotal's growing like a weed, Dell Boomi's exploding in a big way; you guys are doing great, congratulations. >> Thank you, thank you. >> And another thing, before we wrap up here, that is impressive: all those companies, those Dell companies that John just mentioned, including Dell Boomi as a business unit, all of them have women at the executive level. There are six CMOs, including yourself, female CMOs in those positions, and that's something that theCUBE has long been a supporter of, women in technology, and I always admire that. It's great; congratulations on your appointment. It's great seeing a strong female leader in the role. And your energy is contagious, so it's a good thing that they've got you on that growth trajectory, 'cause I can feel it. >> It's happening, it's going to be amazing. And thank you for being a part of this journey with us. >> Thanks so much, Mandy, for joining us; we appreciate your time, and have a great time at the rest of the event. We'll see you next year. >> Thank you, thank you. >> For John Furrier, I'm Lisa Martin.
You're watching theCUBE live from Boomi World 2018, John and I will be right back with our next guest. (digital music)
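One way to picture the outcome-driven experience Mandy describes above (state the business outcome and let the platform derive the connectors and mappings from learned patterns) is as an outcome spec that gets resolved into concrete integration steps. The sketch below is purely illustrative; these classes, field names, and the planning stub are hypothetical and are not Boomi's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    # What the user asks for, in business terms.
    goal: str                       # e.g. "EDI with two trading partners"
    source: str                     # where the data lives today
    destination: str                # where it needs to land
    constraints: dict = field(default_factory=dict)

def plan(outcome: Outcome) -> list:
    # A real platform would mine past integration patterns here;
    # this stub just shows the shape of outcome -> concrete steps.
    return [
        f"connect({outcome.source})",
        f"map_fields({outcome.source} -> {outcome.destination})",
        f"apply_constraints({outcome.constraints})",
        f"deploy_listener({outcome.destination})",
    ]

edi = Outcome(
    goal="EDI for two trading partners",
    source="erp.orders",                    # hypothetical system names
    destination="partner.edi_gateway",
    constraints={"visibility": "mobile dashboard", "latency": "near real time"},
)
for step in plan(edi):
    print(step)   # the user stated the outcome; the steps were derived
```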

Published Date : Nov 6 2018

Jason Cook, Accenture | Dell Boomi World 2018


 

>> Live from Las Vegas, it's theCUBE. Covering Boomi World 2018. Brought to you by Dell Boomi. >> Welcome back to theCUBE. We are live at the Encore in Las Vegas, I'm Lisa Martin with John Furrier. We're at Dell Boomi World 2018, the second annual Dell Boomi World, and we're here with one of Dell Boomi and Dell's biggest GSIs. We've got Jason Cook, the Global Client Account Lead at Accenture serving Dell. Jason, thanks for joining John and me today. >> Thank you. >> So, second annual Dell Boomi World, bigger than last year. They were talking today, a lot of interesting numbers: 7,500-plus customers to date, adding five new customers every day. I saw the Gartner Magic Quadrant from earlier this year, and in iPaaS they are right up there in that strong leader category. Talk to us about the relationship that you have with Dell Technologies and the business unit that is Dell Boomi. >> Yeah, yeah, it's an interesting one. So, Accenture has become very big; I think we now have 470,000 global employees, and our brand and presence is technology advisory and delivery, which predominates what we do. What's interesting about Dell and, specifically, Boomi being so central to the technology ecosystem is that there's much opportunity for partnership. Where Dell is present with enterprise clients, we're present too. And we tend to have long-running relationships with those clients; most of our clients are tenured over 15 years. So it gives us an opportunity to have the type of longstanding relationship that Dell has with clients, to advise on technology trends and change, and to bring the best thinking of the marketplace to those clients as they look to solve problems. Of course, Dell is central to that solution set, as Boomi is too. >> And yesterday, they announced a new technology partner program. Dell Boomi has a broad partner ecosystem: technology partners, implementation partners, GSIs. Talk to us about that and maybe the new business opportunities that it will give to Accenture. >> Yeah, so we've enjoyed a relationship over the past several years in Europe working with Boomi. And we incubated a program over there called Accenture Growth Partnerships, where, with emerging companies such as Boomi, we've gone to market, leveraged the Accenture channel, and then brought scale to those technologies to deliver at enterprise level for their expectations. It's been very successful, you know, seen on both sides as a real win. And we're now transferring that into the North American market on the heels of that success. We're looking to formalize some of the things we've been doing internationally in North America. It's a larger market for both of us, and so it's an expanded opportunity in both places. >> Jason, talk about Accenture's own transformation. We've been following you guys; I've been following Accenture since they changed their name. But recently, over the past decade, you guys have invested really early in data science. You guys have been on the public cloud very early. You've been partnering with your customers. And so that's all great, you guys do a good job with that. But what's interesting is you're actually helping them change their business model. >> Yes. >> So how has your own transformation within Accenture been, dealing with Dell, who's been doing a trillion dollars in business, millions and millions of servers sold, and whose customers are changing? You guys are in that business model enablement business; you're helping customers. What's the big business model impact that's happening in the market right now?
>> Well, I think, you know, as it pertains to Accenture, yeah, we've grown. I would say one of the hallmarks of the growth has been around digital, and I think 60% of our revenues are now digitally oriented, which are in the areas you described. So that's become our brand and presence, and the majority of what we do in the marketplace. I think the things that we're doing to serve clients, which are several of the things we've done internally, have been around all sorts of digitally-enabled journeys, whether it's the intelligent enterprise, the connected customer, the adoption of platforms, and the expanded use of as-a-service within enterprises. There are plays within all those spaces where we end up bringing enablement to those clients. You know, examples would be, in the retail space, growth and expansion of omnichannel techniques, so that the same customer experience exists anywhere in retail. Programs around single views of the customer are very, very common for us globally. Then there are traditionally less technical areas of the business, like a supply chain operation that's dominated by manufacturing and fulfillment and brick and mortar in the retail space; the real time visibility challenges that have historically been there are only now able to be solved by technologies, and so there are several different... >> And the cloud certainly is horizontally scaled, so it impacts all industries that you play in, so, good for business. But the challenge that the CIOs we talk to have, that we hear and I want to get your reaction to, is: okay, I love technology scale. I need to have proof points. I've got to have mile markers that are going to be attainable with time-to-value. But the number one thing they say is, I've got to bring a competitive advantage into I.T., in a cloud construct that's horizontally scalable, and work with partners in areas that aren't core. So, leverage supplier relationships, but build core intellectual property or competitive advantage with I.T. How do you guys help them? What are some trends? What are those I.P. moments for your large and medium-sized customers? >> Yeah, I think that because we have the heritage of both advising on and delivering technology, where we tend to work closely with CIOs is around the speed-to-value, delivering on programs. We represent a wealth of experience and work in the marketplace, and those learnings can be brought to different clients, and fundamentally that's what's valuable to them. So I think that when we talk about cloud enablement, it's often a matter, too, of thinking through what are the specific business outcomes that can be delivered from the use of technology. And so, for example, I can think of one client company that has 1,400 legacy applications in a cloud footprint. And yet the business initiatives that come into the IT-- >> They must use containers a lot. >> Yeah, well, exactly. The questions that come into the I.T. organization are often ones around how we can improve our visibility into product line profitability, as an example. And so the use of cloud, the use of integration technologies like Boomi, accelerates the ability to connect information from that disparate environment and deliver outcomes. >> And specifically, more tactically, to get those outcomes, what specific things do you see? Is it cloud native? Is it the role of data? How are CIOs getting down and dirty, saying, okay, I'm going to lock in on this as territory we're going to build around and build on top of?
Data, cloud, and IoT; IoT's new, and everyone knows what IoT is, it's going to be part of it, either physical and/or low-hanging fruit. But what are they building on from an I.T. standpoint? Is it the data, is it the network? Is it the storage? So what do you see there? >> Yeah, I think it is the data. I think that's where we see it; data-led seems to be the thinking in most of these cases, around getting information consistently consumed throughout. 'Cause the world has become so data intensive that access to data is not the problem. It's the integration, and the derivation of value from it, that's-- >> And scale, too, I mean. >> And scale, right, yeah. >> Hello cloud, so cloud and data seem to be. >> And it's become more distributed, too. And so dealing with distributed data sources and normalizing them has been a-- >> That's where Boomi comes in, integrating all that stuff, so cloud and data seem to be the pattern across the board, generically speaking. I mean, obviously certain industries, financial services, oil, and gas, have unique requirements. >> They all have their own cases for it, whether you're a distributed bank, or whether you're a distributed retailer, or whether you're dealing with oil wells in distributed locations; you run into common problems across all industries. >> And integration is so much more, as the iPaaS market has evolved; it's so much more than integrating applications. It's integrating applications, data from existing sources, from new sources; the API economy is essential for that. To enable an organization to create a customer experience that's going to allow them to use that data, and continue to get more customers, more data, and evolve faster than their competition. But transformation is a big challenge, right? And here, well, even at Dell Technologies World, the theme was about making it real: making it real for digital transformation, security transformation, a huge priority, workforce transformation. When Accenture is going in to integrate, whether it's at a retailer or an oil and gas company, how do you help them start? What's the start of a transformation? >> Well, it often is the transformations you were just referring to. Our typical engagement profile ranges from how do I engage my workforce in a new way, or how do I improve visibility across a distributed network of retail stores, or banks, or what have you. And so those are the transformations, and then inevitably, the connection of information across those things becomes the enabling source. If you take, as an example, a customer experience program, let's talk about a government example where they want a single view of a citizen, a taxpayer, whatever it may be. There's so much information on that person in so many disparate places that it has to be brought together in a cohesive way. Not only that, but brought together and then used effectively in serving that person. And that's where you see a lot of value. >> Jason, I want to pick your brain while you're here, 'cause Accenture's always got the smart people who know what's going on. And you've got big customers, big examples. There's a dynamic right now between two kinds of personas, making it generic for the conversation now. Persona one is the business executive who is responsible and chartered to drive the digital transformation with new and improved applications: taking advantage of the legacy, bringing in the new, managing them on their own schedule. And the second persona is the person deploying cloud.
So how are companies organizing around these personas? One's got to be under the hood: I've got to do multicloud, I've got to do Kubernetes, I've got to do all these things. Stateless applications, stateful applications, integrate them all together; I'm deploying it. And then the business persona: hey, take that hill, more apps, more outcomes. So how are companies organizing around these dynamics? What's the best practice? >> Yeah, along the lines you describe. So, specifically, the business functions are becoming aligned with application domains, and those tend to be programmatically managed. And so we see structures around that programmatic management, to be very responsive to business needs, and particularly as clock speeds accelerate on delivery, maintaining that partnership is very, very important. Likewise, on the infrastructural side, we see alignment there too, to take advantage of creating platforms, and enablement, and infrastructure, and delivery capabilities that can deliver on that promise. >> So they're working together on pizza teams, or like agile teams? >> So it's a customer-focused model for the programmatic work, and it's an industrialization and an acceleration on the infrastructural side. And that's, again, where there's a strong fit with some of these-- >> Do you have a favorite example, speaking of that? So many departments, lines of business, need to have access to the same data to be able to develop new products and services, tune things, make things better, faster than their competition. So there's this sort of democratization, and this need to be able to share the information so that the entire business can grow together. Do you have a favorite example of an organization in any industry that you've worked with that you've seen really do that well? So that at the end of the day, everyone's playing well together, because they have to: the business now is connecting customers, vendors, partners, and delivering experiences that are truly differentiating. >> Integration programs, data programs, data lake programs, data science programs often have a governance mechanism out in front of them to prioritize the needs of their business, both in the back, in terms of enablement of different sources of information being accessed, and also the uses on the front end. And so that is a practice that we're seeing grow exponentially. The other thing that's interesting, I think, in terms of best practice, is that as intelligence accelerates and companies become more analytically driven, the traditional process of continuous improvement, which used to be defined in terms of Six Sigma events and other things, where once in a while a function would be evaluated for efficiencies, becomes a continuous capability. So in this governance model, the ability to refine, and tune, and improve things like integration, AI, and analytics on a continuous cycle, as opposed to having it be event-driven, is certainly an emerging trend and a best practice that we see a lot of. >> Well, Jason, thanks so much for joining the program with John and me today, and sharing with us what's new with Accenture and Dell Boomi and how you're helping customers globally truly transform. >> It's a pleasure, thank you for having me. >> And for John Furrier, I'm Lisa Martin. You're watching theCUBE live from Boomi World 2018 in Las Vegas. John and I will be right back with our next guest. (electronic music)
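A footnote on the "single view of a citizen" pattern Jason describes: at its simplest it is record linkage, matching the same person across disparate systems on a shared key and merging the attributes into one profile. Below is a toy sketch with entirely hypothetical systems and field names; real programs also need fuzzy matching and survivorship rules.

```python
from collections import defaultdict

# Disparate source systems, each holding a slice of the citizen's data.
tax_records = [{"ssn": "123-45-6789", "name": "J. Doe", "tax_year": 2018}]
dmv_records = [{"ssn": "123-45-6789", "license": "D1234567", "address": "12 Elm St"}]
benefit_records = [{"ssn": "123-45-6789", "program": "housing"}]

def single_view(*sources):
    # Link on a shared key (here SSN) and merge the attributes
    # into one cohesive profile per person.
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            profiles[record["ssn"]].update(record)
    return dict(profiles)

for ssn, profile in single_view(tax_records, dmv_records, benefit_records).items():
    print(ssn, profile)
```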

Published Date : Nov 6 2018


Rob Bearden, Hortonworks | DataWorks Summit 2018


 

>> Live from San Jose in the heart of Silicon Valley, it's theCUBE, covering DataWorks Summit 2018, brought to you by Hortonworks. >> Welcome back to theCUBE's live coverage of DataWorks Summit here in San Jose, California. I'm your host, Rebecca Knight, along with my co-host, James Kobielus. We're joined by Rob Bearden. He is the CEO of Hortonworks. So thanks so much for coming on theCUBE again, Rob. >> Thank you for having us. >> So you just got off the keynote on the main stage. The big theme is really about modern data architecture. So, this modern data architecture: what is it all about? How do you think about it? What's your approach? And how do you walk customers through this process? >> Well, there are a lot of moving parts in enabling a modern data architecture. One of the first steps, and what we're trying to do, is unlock the siloed transactional applications and get that data into a central architecture, so you can get real time insights around the inclusive dataset. But what we're really trying to accomplish within that modern data architecture is to bring all types of data together, whether it be real time streaming data, sensor data, IoT data, or data that's coming from a connected core across the network, and to be able to bring all that data together in real time and give the enterprise the ability to take best-in-class action, so that you get a very prescriptive outcome of what you want. So we bring that data under management from the point of origination out on the edge, and then have the platforms that move it through its entire lifecycle, and that's our HDF platform. It gives the customer the ability to, after they capture it at the edge, move it, and then process it as an event happens, a condition changes, or various conditions come together, and take the exact action that you want to see performed against it, and then bring it to rest. And that's where our HDP platform comes into play, where all that data can be aggregated so you can have holistic insight and real time interactions on that data. But then it becomes about deploying those datasets and workloads on the tier that's most economically and architecturally pragmatic. So if that's on-prem, we make sure that we are architected for that on-prem deployment, or private cloud, or even across multiple public clouds simultaneously, and give the enterprise the ability to support each of those native environments. And so we think hybrid cloud architecture is really where the vast majority of our customers, today and in the future, are going to want to run and deploy their applications and workloads. And that's where our DataPlane Service offering gives them the ability to have that hybrid architecture and the architectural latitude to move workloads and datasets across each tier, transparently to whatever storage file format they use or wherever that application is, and we provide all the tooling to mask the complexity of doing that. And then we ensure that it has one common security framework, one common governance model through its entire lifecycle, and one management platform to handle that entire lifecycle of data. And that's the modern data architecture: the ability to bring all data, all types of data, under management, manage it in real time through its lifecycle till it comes to rest, and deploy it across whatever architecture tier is most appropriate financially and from a performance standpoint, on cloud or on-prem.
And that's the modern data architecture is to be able to bring all data under management, all types of data under management, and manage that in real time through its lifecycle til it comes at rest and deploy that across whatever architecture tier is most appropriate financially and from a performance on-cloud or prem. >> Rob, this morning at the keynote here in day one at DataWorks San Jose, you presented this whole architecture that you described in the context of what you call hybrid clouds to enable connected communities and with HDP, Hortonworks Data Platform 3.0 is one of the prime announcements, you brought containerization into the story. Could you connect those dots, containerization, connected communities, and HDP 3.0? >> Well, HDP 3.0 is really the foundation for enabling that hybrid architecture natively, and what's it done is it separated the storage from the compute, and so now we have the ability to deploy those workloads via a container strategy across whichever tier makes the most sense, and to move those application and datasets around, and to be able to leverage each tier in the deployment architectures that are most pragmatic. And then what that lets us do then is be able to bring all of the different data types, whether it be customer data, supply chain data, product data. So imagine as an industrial piece of equipment is, an airplane is flying from Atlanta, Georgia to London, and you want to be able to make sure you really understand how well is that each component performing, so that that plane is going to need service when it gets there, it doesn't miss the turnaround and leave 300 passengers stranded or delayed, right? Now with our Connected platform, we have the ability to take every piece of data from every component that's generated and see that in real time, and let the airlines make that real time. >> Delineate essentially. >> And ensure that we know every person that touched it and looked at that data through its entire lifecycle from the ground crew to the pilots to the operations team to the service. Folks on the ground to the reservation agents, and we can prove that if somehow that data has been breached, that we know exactly at what point it was breached and who did or didn't get to see it, and can prevent that because of the security models that we put in place. >> And that relates to compliance and mandates such as the Global Data Protection Regulation GDPR in the EU. At DataWorks Berlin a few months ago, you laid out, Hortonworks laid out, announced a new product called the Data Steward Studio to enable GDPR compliance. Can you give our listeners now who may not have been following the Berlin event a bit of an update on Data Steward Studio, how it relates to the whole data lineage, or set of requirements that you're describing, and then going forward what does Hortonworks's roadmap for supporting the full governance lifecycle for the Connected community, from data lineage through like model governance and so forth. Can you just connect a few dots that will be helpful? >> Absolutely. What's important certainly, driven by GDPR, is the requirement to be able to prove that you understand who's touched that data and who has not had access to it, and that you ensure that you're in compliance with the GDPR regulations which are significant, but essentially what they say is you have to protect the personal data and attributes of that data of the individual. 
And so what's very important is that you've got to have the systems that not just secure the data, but understand who has had access to it at any point in time that you've ever maintained that individual's data. And so it's not just about when you've had a transaction with that individual; it's the rest of the history that you've kept, or the multiple datasets that you may try to correlate to expand the relationship with that customer. You need to make sure that you can ensure not only that you've secured their data, but that you're protecting and governing who has access to it and when. And as importantly, that you can prove, in the event of a breach, that you had control of that data and who did or did not access it, because if you can't prove that it was secure, and that no one who wasn't supposed to see it accessed it, you can be opened up to hundreds of thousands of dollars or even multiple millions of dollars of fines, just because you can't prove that it was not accessed. And that's what the variety of our platforms, you mentioned Data Steward Studio, is part of. DataPlane is one of the capabilities that gives us that ability. The core engine that does it is Atlas, the open source governance platform that we developed through the community, which really drives all the capabilities for governance that move through each of our products, HDP, HDF, and of course DataPlane; and Data Steward Studio takes advantage of that in how it moves and replicates data and manages that process for us. >> One of the things that we were talking about before the cameras were rolling was this idea of data-driven business models, how they are disrupting current contenders, with new rivals coming on the scene all the time. Can you talk a little bit about what you're seeing, and what are some of the most exciting and maybe also some of the most threatening things that you're seeing? >> Sure. The traditional legacy enterprise is very procedurally driven. Think about classic core ERP: it has worked very hard to have a very rigid, very structured, procedural order-to-cash cycle that has not a great deal of flexibility. It takes you through a design process, it builds product, then you sell product to a customer, then you service that customer, and then you learn from that transaction different ways to automate or improve efficiencies in the supply chain. But it's very procedural, very linear. In the new world of connected data models, you want to bring transparency, real time understanding, and connectivity between the enterprise, the customer, the product, and the supply chain, so that you can take real time, best-in-practice action. So, for example, you understand how well your product is performing. Is your customer using it correctly? Are they frustrated with it? Are they using it in the patterns and with the frequency that they should be if they are going to expand their use and buy more? And if they're not, how do we engage in that cycle? How do we understand if they're going through a re-review and another buying cycle for something similar that may not be with you, for a different reason? And when we have real time visibility into our customer's interaction, and understand our product's performance through its entire lifecycle, then we can bring real time efficiency by linking those together with our supply chain into the various relationships we have with our customers.
>> One of the things we were talking about before the cameras were rolling was this idea of data-driven business models: how they're disrupting current contenders, with new rivals coming on the scene all the time. Can you talk a little bit about what you're seeing, and what are some of the most exciting, and maybe also some of the most threatening, things out there? >> Sure. The traditional legacy enterprise is very procedurally driven. Think about classic ERP: it has worked very hard at a very rigid, very structured, procedural order-to-cash cycle that doesn't have a great deal of flexibility. You go through a design process, you build product, you sell that product to a customer, you service that customer, and then you learn from that transaction different ways to automate or improve efficiencies in the supply chain. It's very procedural, very linear. In the new world of connected data models, you want transparency, real-time understanding, and connectivity between the enterprise, the customer, the product, and the supply chain, so that you can take real-time, best-practice action. For example, you understand how well your product is performing. Is your customer using it correctly? Are they frustrated with it? Are they using it with the patterns and frequency they should be if they're going to expand their use and buy more, and if not, how do we engage in that cycle? How do we recognize that they're re-evaluating and may buy something similar that isn't from us, and for what reason? When we have real-time visibility into our customers' interactions and understand our product's performance through its entire lifecycle, we can bring real-time efficiency by linking those together with our supply chain and the various relationships we have with our customers. To do that requires the modern data architecture: bringing data under management from the point it originates, whether from the product, the customer interacting with the company, or the customer interacting with our mutual ecosystem partners, and then letting best-practice supply chain techniques ensure we bring the highest level of service and support to that entire lifecycle. When we bring data under management, manage it through its lifecycle, keep the historical view at rest, and leverage it across every tier, that's when we get this high velocity, deep transparency, and connectivity between each of the constituents in the value chain. And that's what our platforms give them the ability to do.
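As a toy illustration of that usage-pattern signal, the snippet below flags customers whose most recent usage falls well below their historical baseline, the kind of trigger that could start a proactive engagement cycle. The column names, sample figures, and the 50% threshold are illustrative assumptions, not anything from the interview.

```python
# A toy sketch of the usage-pattern idea above: flag customers whose product
# usage frequency has dropped sharply, as a trigger for proactive engagement.
import pandas as pd

# Hypothetical weekly usage events per customer.
events = pd.DataFrame({
    "customer": ["acme", "acme", "acme", "globex", "globex"],
    "week":     [1, 2, 3, 1, 3],
    "uses":     [40, 38, 12, 20, 22],
})

usage = events.pivot_table(index="customer", columns="week",
                           values="uses", fill_value=0)
baseline = usage.iloc[:, :-1].mean(axis=1)   # average of earlier weeks
latest = usage.iloc[:, -1]                   # most recent week
at_risk = latest < 0.5 * baseline            # hypothetical drop threshold

print(usage.index[at_risk].tolist())         # e.g. ['acme']
```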
>> Not only your platforms. You've been in business now for, I think, seven years or so, and you've shifted, in the minds of many and in your own strategy, from being the premier data-at-rest company, in terms of the Hadoop platform, to being one of the premier data-in-motion companies. Is that really where you're going: to be more of a streaming-focused solution provider in a multi-cloud environment? I hear a lot of Kafka in your story now; it's like, oh yeah, that's right, Hortonworks is big on Kafka. Can you give us a quick sense of how you're making that shift toward low-latency, real-time streaming of big data, or small data for that matter, with embedded analytics and machine learning? >> So, we have certainly evolved from being the leader in global data platforms, with all the work we do collaboratively through the community to make Hadoop an enterprise-viable data platform that can run mission-critical workloads and apps at scale, with all the enterprise facilities for security, governance, and management. But you're right, we have expanded our footprint aggressively. We saw the opportunity to create more value for our customers by not making them wait until data is brought under management to gain an insight, because in that case they can only be reactive: post-event, post-transaction. We want to give them the ability to shift their business model to being interactive: pre-event, pre-condition. The way to do that, we learned, was to bring the data under management from the point of origination, which is what we use MiNiFi and NiFi for, and then HDF to move it through its lifecycle. And to your point, we have the intellect, we have the insight, and we then have the ability to process the best-in-class outcome, based on the variables we're trying to solve for, as it's happening. >> And there's that word, ACID, which of course is a transactional data paradigm; I hear it all over your story now in streaming. So what you're saying is that it's a completely enterprise-grade streaming environment, end to end, for the new era of edge computing. Would that be a fair way of-- >> It's very much so. And our model and strategy has always been to bring in the best-in-class engines for what they each do well with their particular datasets. A couple of examples: one you brought up, Kafka; another is Spark. They do what they do really well. But what we do is make sure they fit inside an overall data architecture that gives them access to a much broader central dataset, one that goes from point of origination to point of rest on a common central architecture, and that they benefit from our security, governance, and operations model for managing those engines. What we're trying to do is eliminate silos for our customers, rather than leaving them with siloed datasets that only serve particular functions. We give them the ability to have an enterprise modern data architecture, and we manage what brings that forward, so the enterprise can run modern data-driven business models with the governance, the security, and the operations management to ensure those workflows go from beginning to end seamlessly.
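Here is a hedged sketch of what the consuming end of that data-in-motion pipeline might look like: telemetry landed on a Kafka topic (for instance by MiNiFi/NiFi at the point of origination) gets evaluated as it arrives, rather than after it comes to rest. The broker address, topic, field names, and threshold are assumptions, and the kafka-python client stands in for whichever client a real deployment would use.

```python
# A hedged sketch of the data-in-motion pattern described above: consume edge
# telemetry from a Kafka topic and act on it as it arrives, pre-event, rather
# than after the data comes to rest. All names and values are illustrative.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "component-telemetry",                       # hypothetical topic
    bootstrap_servers="broker.example.com:9092", # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    reading = message.value
    # Flag a component trending out of spec before it fails, rather than
    # reacting after the transaction or flight is complete.
    if reading.get("vibration", 0.0) > 0.8:      # illustrative threshold
        print(f"Service alert for component {reading.get('component_id')}")
```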
>> Do you-- go ahead. >> I was just going to ask about customer concerns. So here you are, you've now given them the ability to make these real-time changes. What's next? What's on their minds now, and what do you see as the future of what you want to deliver? >> First and foremost, we've got to make sure we get this right and really bring this modern data architecture forward: make sure we truly have the governance correct and the security models correct, with one pane of glass to manage it all, and really enable that hybrid data architecture. Let them leverage the cloud tier where it's architecturally and financially pragmatic to do so, and give them the ability to ease into a cloud architecture without the risk of being locked in, or of misunderstanding where the lines of demarcation between workloads or datasets are and not getting the economies or efficiencies they should. We solved that with DataPlane. So we're working very hard with the community and with our ecosystem and strategic partners to make sure we enable bringing each type of data from any source and deploying it across any tier, with a common security, governance, and management framework. What's next is that, now that we have this high velocity of data through its entire lifecycle on one common set of platforms, we can start enabling the modern applications to function. We can go back to some of the legacy technologies that are very procedurally based, dependent on a transaction or event happening before they can run their logic to get an outcome, which leaves the customer stuck in after-the-fact activity. We want to bring that kind of functionality, supply chain functionality for example, to the modern data architecture, so that we can do real-time inventory allocation based on the patterns our customers show: how they're using the product, the frustrations they've had, the successes they've had. And we know through artificial intelligence and machine learning that there's a high probability not only that they will buy, use, or expand their consumption of our product or service, but that they will probably do these other things as well, if we do those things. >> Predictive logic as opposed to procedural. Yes, AI. >> Very much so. So what's next will be the modern applications on top of all this that become very predictive and enabling, versus very procedural and post-transaction. We're a little ways downstream from that; that's looking out. >> That's next year's conference. >> That's probably next year's conference. >> Well, Rob, thank you so much for coming on theCUBE; it's always a pleasure to have you. >> Thank you both for having us, and thank you for being here. Enjoy the summit. >> We're excited. >> Thank you. >> We'll do. >> I'm Rebecca Knight, for Jim Kobielus. We will have more from DataWorks Summit just after this. (upbeat music)

Published Date : Jun 20 2018
