Rohit De Souza, Actian Corporation | CUBE Conversation, December 2018
(light music) >> Hi, I'm Peter Burris and welcome to another CUBE Conversation from our wonderful studios in beautiful Palo Alto, California. Today we're going to be talking about digital transformation, more specifically the tooling that you have to establish or put in place to achieve digital transformation objectives, and to do that, we've got Actian Corporation here today. Rohit De Souza is the President and CEO of Actian. Rohit, welcome to theCUBE. >> Thank you, Peter, I'm glad to be here. >> Well, we're happy to have you 'cause this is a really important topic, but before we get into the actual topic, give us the update on what's going on with Actian. >> Perfect. I'm not sure how much you know about Actian or how much you've followed it, but we've assembled over the years a series of assets that range from Data Management to Data Integration and Data Analytics, really targeted at the next generation of hybrid-data management, that really helps companies manage the digital transformation. >> Alright, so let's jump into that digital transformation because that's where a lot of the conversation about hybrid starts. So our belief, and I want to test this with you, is that there really is a difference between a business and a digital business. And that difference is the degree to which a digital business treats data as an asset. >> Absolutely. >> And, in fact, we think that digital transformation is the process by which you re-institutionalize work, reorganize everything else to achieve the goals of using data as an asset, does that-- >> Absolutely, and it's not just using the data as purely an asset, it's leveraging the data that an enterprise has or has access to, and the quantitative analysis of this, to influence all aspects of that business's functions or processes, the way it deals with customers, the way it deals with internal processes, and so on. >> You have to be able to capture data better, you have to be able to turn that data into value, and then you have to be able to act on it. Where does Actian fit in that kind of virtuous cycle? >> So Actian fits all along that chain. We've got the data management assets to allow you to manage that data effectively, we've got the integration assets to allow you to move data to the compute, or the compute to the data, effectively, and we've got the best high-performing tools to be able to extract insights from that data at scale, and do this all on commodity hardware, so you're doing this at a price-performance level that you can't match elsewhere. >> So it sounds like, sounds really like, you're more focused on creating value out of your data. You might be doing some work at the edge. >> Absolutely. >> And you might have some tooling-- >> Absolutely, absolutely. >> So tell us a little bit about how Actian's vision of data warehousing, data analytics, that whole range of capabilities, is different because of what the base tooling is capable of doing. >> Absolutely. The premise behind Actian is that we're going to supplement an organization's digital strategy or data management strategy. So, we're not talking about having to replace stuff en masse, but we're talking about being able to supplement these things where necessary and giving organizations the flexibility to run things on premise or in the cloud, in multi-clouds, giving them the flexibility to move from one cloud to the other, and so on.
Those capabilities, that capability to manage that data, whether it's in relational systems, whether it's object-oriented systems, whether it's edge systems, to be able to extract the information from those edge systems, move it along to your central systems and then run analytics on it, is what Actian does really, really well. >> So if I can kind of repeat that back to you, the idea here is that we've got data in an analytics function that now has to be much more high-performance than it used to be. >> That's correct. >> So that we can close the loop in almost operational time, moving from the analytics back to the transaction systems, have I got that right? >> Absolutely. If you go back a ways, the whole process was: there are transactional systems that are generating some information, I pull that information out into an enterprise data warehouse, some things happen with that, some results and analytics are driven from it, and those results may or may not make their way back into operations. Today, the business is slightly different. In this era of hyper-personalization, it's no use to me to find out that you were on my website last week, and you were looking at these three products, and you did buy this last year. I want to understand that you're here now, and I want to understand how best we can make use of your presence on our company's website to sell you something else, to give you the next best offer, to know how you're interacting with us at that point and to change the interaction that we have with you. If that's going to take place, that needs to happen while the transaction systems are currently in operation. And so the notion of this operational data warehouse is: I'm interacting, I'm generating analytics, while I've got those transactions in flight. >> I would even say it sounds like it's not just that the transaction systems are operational, but the transaction-- >> That's correct. >> Is open. >> That's correct. >> So you need a high-performance data store, a data manager, that's capable of responding while the transaction's open to shape, guide, and hyper-personalize the characteristics of the transaction. >> Absolutely. >> Alright, so now let's talk about the hybrid part. You mentioned that earlier. Another belief that we have is that we're going to see a lot of data moved up into the cloud, but increasingly, the cloud is going to move to the data. We're going to see the services associated with the cloud be brought down to the data. >> Couldn't agree more and, oh, by the way, this move to the cloud, this is not just a one time move. Most people think movement to the cloud is a one-time affair: I've got my data on premises, so I move it to the cloud. No, it's going to move from cloud to cloud. At some point in time, I'm going to want the ability to run this thing on multiple clouds. I'm not going to risk locking myself in to a particular vendor, so this notion of movement of data from premise to the cloud, perhaps from the cloud back to premise, and from cloud to cloud, is here to stay. So, the notion of creating the platform or the capability to move this data around, to move my compute to that data when I need to, it's here to stay, it's going to be with us for a while. >> Well, it's one of the premises of cloud. The whole motion that data has to be made more fungible, you don't want to go to a bank where your money is contingent upon the definition of money by the bank.
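A quick illustrative aside before the conversation continues: to ground the idea of generating analytics while transactions are still in flight, here is a minimal sketch of a next-best-offer lookup that queries the operational store directly rather than a nightly-loaded warehouse. The DSN, table names, columns, and offer logic are assumptions made up for the example, not anything specific to Actian's products, and SQL dialect details (interval syntax, row limiting) will vary by database.

```python
# Minimal sketch: pick a next-best-offer while the visitor's session is still open.
# DSN, table names, and columns are hypothetical; SQL dialect details will vary.
import pyodbc

def next_best_offer(visitor_id: str):
    conn = pyodbc.connect("DSN=operational_dw")   # hypothetical data source name
    cur = conn.cursor()
    # Join live clickstream rows with purchase history in one query -- no batch
    # ETL step; the analytics run against the same store the application writes to.
    cur.execute(
        """
        SELECT o.offer_code
        FROM   web_events e
        JOIN   offers o       ON o.product_id = e.product_id
        LEFT JOIN purchases p ON p.visitor_id = e.visitor_id
                             AND p.product_id = e.product_id
        WHERE  e.visitor_id = ?
          AND  e.event_time >= CURRENT_TIMESTAMP - INTERVAL '30' MINUTE
          AND  p.product_id IS NULL        -- viewed this session, never purchased
        ORDER BY e.event_time DESC
        FETCH FIRST 1 ROW ONLY
        """,
        visitor_id,
    )
    row = cur.fetchone()
    conn.close()
    return row[0] if row else None
```

The design point it illustrates is simply that the offer decision happens against fresh, in-flight data, not against yesterday's extract.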
>> Absolutely. >> Same thing exists in cloud, so we want that degree of openness, a degree of evolvability. But also, and this is what I'm testing with you, we think increasingly businesses are going to look at their value propositions, what activities are necessary to deliver those value propositions, where those activities are going to be extant or in operation, and what data's going to be necessary to satisfactorily, successfully, and with high quality perform those. So it means, increasingly, you're going to want to move the data closer to the activity, with the right performance, manageability, security, everything else in place. >> That's correct. Or move the compute to the data. >> Or move the compute to the data. So is that kind of the vision that Actian has? Because you've got this family of data managers that can each start to become associated with certain styles of transactions or, better put, certain styles of compute and work, digital work. >> Absolutely, and we take that one step further. There are people who do this today, but many of the approaches that people are using are either cost-prohibitive or don't work. What we've done is actually develop a set of approaches that make these capabilities accessible. Today, the notion of true operational data warehousing, of operational analytics, has been available only to really the large companies that have invested completely in extracting value from their data assets. We're bringing that value all the way down to enterprises without gigantic IT staffs, without necessarily spending an arm and a leg on some of the bespoke data management systems of yesterday. We're looking at leveraging commodity hardware to really move performance up a notch, taking your traditional Hadoop systems, transitioning those from the swamps that they were into really honest-to-goodness operational data stores, operational data warehouses, so I can actually update and delete and manage these things like I would any ordinary database. And I've done this on commodity hardware which is distributed across the enterprise. >> So, it's the commodity hardware that allows us to place the processing wherever we want. >> That's correct. >> And now we can put the manageability of the data, and the creation of value out of the data. >> That's correct. >> Wherever we want. >> Very much so. >> So that we are not constrained by associating the data with the action wherever it needs to be. >> Absolutely. >> So as you look forward, what types of future do you anticipate for the evolving role of transaction systems, operational data stores, and digital business? >> I see them converging. I see there being a convergence of these... Digital business involves the operational data stores and the transaction systems, and I think you're going to see an increasing number of hybrid systems, systems that are good at doing both transactions and analytics out of the same system. We've got one such system, one where we've embedded a very high-speed columnar engine into a traditional relational store. That allows us to do very, very rapid reporting off of existing transactional systems, and get analytics out of these systems without any additional overhead. >> Do you anticipate that customers, I mean, I do, but do you anticipate that enterprises are increasingly going to look at almost a data control plane? How is that likely to evolve, and what role might Actian play in that? >> I think you bring up a very interesting point for the next generation of data management.
There will be a data control plane, and we aspire to play in that data control plane. It's not one plane yet. I don't think that the architecture of that one control plane that manages all your data assets across the company is going to-- >> And there probably never will be. >> --come about very soon, but-- >> Nor is there likely to be one control plane, for the reasons you said, you don't want to get locked in. >> But Actian does play in that control plane to allow enterprises the ability to then move their data selectively from on premise to the cloud, or between the clouds. >> Alright, so, Rohit, you're a CEO, you've been around for a long time, lot of different places. Imagine, put yourself in the seat of another CEO at one of these large companies. What is the message that they need to bring to their senior staff and others in their organization about affecting this core transition with technology, to become more data-oriented, data-friendly, and a culture oriented to data utilization? >> I think if you're looking at it, you've more than likely underestimated the value that's locked in the data that's within your enterprise, either from a customer view or a competitive view, or a view to improving the processes in your organization. If you task your organization with unlocking this, unlocking the value of these data assets, and being able to respond in more real time to some of the customer or the operations requirements, I think that would go a long way. >> And a part (mumbles) of that is, if you've undervalued the data, you're under-investing in the tooling to get value out of the data. >> That's correct. >> Rohit De Souza is the president and CEO of Actian. Once again, Rohit, thanks for being on theCUBE and talking to us about digital transformation and Actian. >> Thank you very much, Peter. >> And once again, I'm Peter Burris and this has been another CUBE Conversation. Thanks very much for listening, until next time. (light music)
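Rohit's point about moving data selectively from on premise to the cloud, or between clouds, can be pictured with a very small sketch. This is not Actian's integration tooling or control plane; it simply copies rows between two SQL endpoints over ODBC, and the connection strings and table name are placeholders for the example.

```python
# Illustrative only: copy one table from an on-prem database to a cloud database
# in batches. Connection strings and the table name are placeholders.
import pyodbc

def copy_table(source_dsn: str, target_dsn: str, table: str, batch_size: int = 1000):
    src = pyodbc.connect(source_dsn)
    dst = pyodbc.connect(target_dsn)
    read_cur, write_cur = src.cursor(), dst.cursor()

    read_cur.execute(f"SELECT * FROM {table}")
    columns = [c[0] for c in read_cur.description]
    placeholders = ", ".join("?" for _ in columns)
    insert_sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"

    while True:
        rows = read_cur.fetchmany(batch_size)
        if not rows:
            break
        write_cur.executemany(insert_sql, [tuple(r) for r in rows])
        dst.commit()

    src.close()
    dst.close()

# Example usage with placeholder DSNs:
# copy_table("DSN=onprem_ingres", "DSN=cloud_warehouse", "customer_orders")
```

A real deployment would add change capture, schema mapping, and retries; the sketch only shows the direction of travel, premise to cloud or cloud to cloud, behind one simple interface.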
Emma McGrattan, Actian | Big Data NYC 2017
>> Announcer: Live from midtown Manhattan it's theCUBE, covering Big Data New York City 2017. Brought to you by Silicon Angle Media and its ecosystem sponsors. (upbeat techno music) >> Hello, everyone. Welcome back to theCUBE's exclusive coverage of Big Data NYC, for all the access. It's our fifth year doing our own event in New York City. The hashtag is BigDataNYC. Also, in conjunction with Strata Hadoop, used to be called Hadoop World, then Strata Hadoop. Now, it's called Strata Data as they try to grope toward where the future's going to be. A lot of hype over there. A lot of action. But here is where we do the intimate interviews and the stories. I'm John Furrier, co-host of theCUBE, with Emma McGrattan who is the Senior Vice President of Engineering at Actian. Great to have you on. >> Thanks for having me. >> We love having everyone from Ireland 'cause the accent gets great traction. So, I appreciate you coming on. Have a beer later at the pub. New York's got a lot of great Irish pubs. In all seriousness, we've had Actian on before. Mike Hoskins has been on. We had Jeff Veis on yesterday giving us the marketing angle of the hybrid data that you guys are doing. What's under the hood? Because Actian has a lot of technology in its portfolio through how you guys executed your growth strategy. But now, as the world wants to bring it together, you're seeing some real critical trends. >> Emma: Right. >> A lot of application development where data's important. Huge amount of security challenges. People are trying to build out and bring security out of IT. And then you've got all this data governance stuff. That's just on the top line. Then you got IOT. So, people are busy. Their plates are full, and data's the center of it. So, what are you guys doing to bring all of Actian together? >> Emma: That's a great question, a perfect question for Actian. So, we have in Actian a number of products in the portfolio, and we believe in best-fit products. So, if you're doing something like graph database, it doesn't make sense to put a Vector in Hadoop solution against that. And we've got the right-fit technology for what we're doing. And for IOT we've got an embedded database that's as small as 30 megs. So, I've got PowerPoint files that are bigger than this database. You put it in a device, set it, it can run for 20 years. You never have to touch it. But all that data that's being generated, typically you're generating it because you want, at some point, to be able to analyze it. And we've got, in the portfolio, Vector in Hadoop, which has the ability to take that data from the IOT sources and perform very high-speed analytics on it. So, the products that we have within the portfolio are focused around data integration, so pulling data into an environment where you're going to perform analysis or otherwise operationalize that data; data management, a lot of our customers are just doing CRM, ERP applications on our product platforms; and then the analytics is where I get really excited, 'cause there's so much happening in the analytics world in terms of new types of applications being built, in terms of real-time requirements, in terms of the security and governance that you're talking about, in reference to your question. And we've got a unique solution that can address all of those areas in our Vector in Hadoop products.
So, it's interesting that we see the name Hadoop coming out of the show this week, because we see the focus on Hadoop kind of moving to the background, and where the real focus is, is around the data and not so much-- >> And the business value. >> I hate to sound cliché about outcomes, but we were joking on theCUBE yesterday and kind of coined the term, "Outcomes as a service." Which is kind of a goof on the whole, "It's about the outcomes." Which is a cliché in tech. But that really is the truth. At the end of the day, you've got a business goal. But the role of data now in real time is key. You're seeing people want real time. Not real time response with old data, they want the real data. So, people are starting to look at data as a really instrumental part of the development process. Similar to what DevOps did with infrastructure as code, people want data to be like code. >> Exactly. >> And that is a hard-- >> Architectural challenge. So, if you go into your customer base, what do you guys tell them? And I was going to the hybrid cloud as the marketing message. But I have a challenge: I'm the CXO. I'm the CDO. I'm the CIO. I'm the CFO, COO, whatever, the person making these huge, sweeping operational cost decisions. What's the architecture? 'Cause that's what people are working on right now. And how do you present that? >> Right. So, we recognize the fact that everybody's got a very distributed environment. And part of the message around hybrid data is that data can be generated pretty much any place. You may be generating data in the cloud with your own custom applications. You may be using salesforce.com or NetSuite or whatever. And you've got your on-premise sources of data generation. And what we provide in Actian is the ability to access all of that data in real time, and make it part of the applications that you're deploying, that are going to be able to react in real time to changes. You don't want to be acting on yesterday's data because things have happened, things have moved on. So, the importance of real time is not lost on Actian. And all of these solutions that we bring together enable that real-time analysis of what's happening in every part of the environment. So, it's hybrid in terms of the type of data that you're working with. It's hybrid in terms of it could be generated in the cloud, in any cloud, or on-premise, and being able to pull all of that together and perform real-time analysis is incredibly important to generate value from the data. >> Emma, I want to get your thoughts on a comment that I heard last night, and then multiple times, but the same pattern: they don't get it. "They" could be the venture capitalists as part of the startup. Or the customer has, "Oh, this is the way we do it." There's definitely things that are out there, silos, legacy things, that are-- Still not going away, and we know that. But how do you go into a customer saying look, there's a whole new way of doing things right now. It's not necessarily radical lift and shift or rip and replace. Whatever word you want to use. There's always a word that, you don't like rip and replace, we'll say lift and shift. It's the same thing, right? >> Right. >> You don't want to do a lot of incremental operational wholesale changes. >> Right. >> But you want to do incremental value now. How do you go in and say, "Look, this is the way you want to think about real time in your architecture."
Because I don't necessarily want to change my operational mindset for the sake of Salesforce and all these different data sources. How do you guys have that conversation? >> So, Actian is unique in that we have a customer base that goes back 20, 30 years. I personally will be at Actian 25 years in December. So, we've got customers that are running what I'd like to call our legacy products, but they're products that are powering their business every day of the week. And we've also got incredibly innovative products where we're on the bleeding edge. And what we've done in our recent release of Actian X is to combine bleeding-edge technology with this more mature and proven technology. So, in Actian X you've got the OLTP database that was Ingres, and has now been rebranded because it's got new capabilities, and then we've taken the engine from the Actian Vector product and brought that into Actian X, so that you can do real-time analysis of your OLTP data and react in real time to changes in the data. And it's interesting that you talk about real time, because it means different things to different people. So, if you're talking to somebody doing risk analysis, real time is milliseconds. If you're talking to some customers, real time is yesterday's data, and that's fine. And what we've done with Actian X is to provide that ability to determine for yourself what real time means to you, and to provide a solution that enables you to respond in real time. Now, bringing analytics into what is a more traditional OLTP database, and kind of demonstrating for them some of the new capabilities it enables, opens up other opportunities, as far as we can have conversations about maybe backing up that data set to the cloud. Somebody that may have been risk-averse and not looking at cloud all of a sudden is looking at cloud, looking at analytics, and then kind of opening up new opportunities for us. And new opportunities for them, 'cause the data, as they say, is the new oil. >> That's great, great. And you guys have a good customer base to draw from. So, you've got to bring in the shiny new toy but make it work with the existing. So, it sounds like you've built like an abstraction layer, building on tech that was very useful and is useful, by decoupling it with new software that adds value. Is it an abstraction layer of sorts? >> We don't think of it as an abstraction layer, but certainly one could think of it that way because it's ... Well, yeah it's-- >> John: It's a product. You basically take the old product and bring new stuff to it. >> Exactly. >> Okay, so I got to ask you about the trend around IOT. Because IOT is one of those things right now that's super hype. And I think it's going to be even more hype. But security has been a big problem, and I hear a lot honestly, certainly IOT's on the agenda. Industrial IOT is kind of the low-hanging fruit. They go to that first. But no one wants to be the next Equifax. So, there's a lot of security stuff that causes concern, plus there's other things going on they got to take care of. How do you guys talk about the security equation, where you can come in and put in a reliable, workable solution and still make the customers feel like they're moving the ball down the field? >> So, that's one of the benefits that we have of being in the industry for as long as we have. We have a very deep understanding as to what security requirements are, in terms of providing capabilities within the product to do things like control who can access what data and to what degree. Can they update it?
Can they only read it? Providing the ability to encrypt the data. So, for many use cases the data is so sensitive that you'd always want to encrypt it when it's stored. You'd want any traffic coming in and out of the environment to be encrypted. Being able to audit everything that's happening in the environment, who's issuing what queries and from where, and to set alarms or something if somebody attempts to access data that they shouldn't be attempting to access. So, taking all of those capabilities together, we're then able to look at things like GDPR. What are the requirements for securing the data? And we've got all the capabilities within the product. And we've got the credibility, 'cause we've been doing this for 30 years, that we can secure these environments. We can conform to the various standards and mandates that are put in place for data security. So, we have a very strong story to tell-- >> John: What is your position on GDPR? Obviously, you've got a super important, I call it the Y2K that actually is real, 'cause you have real compliance issues there. There's a lot of, obviously, political things going on, but this is a real problem, and you've got to move fast on a solution. What do you guys offer there? >> Equifax was a prime example of why GDPR is incredibly important. So, for Actian, and you know, I talked about the capabilities we provide with regard to securing data, and secure access to that data. And when it comes to GDPR, a lot of it is around process. So, what we're doing is guiding our customers and making sure that they have secure processes in place. Putting all of the smarts into the technology, and then having somebody doing an offline backup on a CD that they leave on a seat on the train, which has, in the past, been a source of data breaches, is an issue with process and not with technology. So, we're helping with that. And helping in educating-- >> John: Equifax had some VPN issues but also, I mean, I haven't reported on this yet, but I've also confirmed that there were state actors involved, foreign actors penetrating in through their franchise relationships. So, in partnering in an open internet these days you need to understand who the partners are, even if they're in the network. >> Absolutely. And that's why this whole idea of providing all of the capabilities required for data security, including auditing who's coming in. So, failed attempts to get into the system should be reported as problems. And that's a capability that we have within the database. >> So, you've been at Actian for 25 years, I did not know. That's cool. Good folks over there. I've been to the office a few times. I'm sure you got a good healthy customer base, but for the folks that don't know Actian, what's the pitch from your standpoint? Not the marketing pitch, hybrid data, I get that. I mean, what should they know about you guys? What is the problem that you solve? What do you bring to the table? From an engineering perspective, how do you differentiate? >> So, my primary focus is around high-speed analytics. And so, Actian enables the fastest SQL access to data, on Hadoop and off of Hadoop, proven through benchmarks. So, high-speed analytics is incredibly important. But for Actian, we're unique in having this 30-year history where we understand what it is to run 24/7, mission-critical operational databases. So, Actian's known for products like Ingres, like PSQL, and being able to analyze data that's operationalized, but then also bringing in new data sources.
'Cause that's where things are really going. But people want to choose the best application, whether it's in the cloud or on-premise, it doesn't matter. It's the best application for their need. And being able to pull all of that data together, for operational purposes and for analytics purposes, is incredibly important. And Actian enables all of that. >> And that's where the hybrid is really clever and smart, because you got the consumption side and the creation side, and data integration isn't a project, it's real. It just happens. >> Emma: Right. >> So, you want to enable that. I can see that would be a key benefit. Certainly as these decentralized apps get more traction, you're going to start to see more immutable transactions happening. Blockchain clearly points to that direction of the market, where that's cool. Distributed computing has been around for a while, but now with decentralized we know how to behave there. So, we're seeing some apps that will probably be rewritten for that. But again, if architected properly, that shouldn't be a problem. >> Right, exactly. And we don't want anybody to have to rewrite apps. What we want to be able to do is to provide a platform where the data that you need is available. >> John: Yeah, they're called Dapps, for decentralized apps. It's a whole new wave coming, it's not being talked about here at the show. We are on those trends, obviously, at SiliconANGLE and Wikibon, as we're riding the big wave. Okay, Em, I want to ask you a final question. Kind of take your Actian hat off, put your Irish techie hat on, and let's get down and dirty on what the main problem in the industry is right now. If you look back and kind of go to the balcony, if you will, look at the stage of the industry, obviously Hadoop is now in the background. It's an element of the bigger picture. We're seeing, we were commenting yesterday, that these customers have these tool sheds of all these tools they've bought. They bought a hammer that wants to be a lawnmower, right? It's just that they have all these tool platforms being pitched at them. There's a lot of confusion. What's the main problem that the industry's trying to solve? If you look at it, if you can put the dots together, what is the big problem that needs to be solved, that the industry should be solving? >> So, I think data is every place, right? And there's not a whole lot of discipline around corralling that and putting security around it. Being able to deploy security policies across data regardless of where it's deployed or sourced. So, I think that's probably the biggest challenge: bringing compute to the data and pulling all of that together. And that's the challenge that we're addressing. >> And so, the unification, if you will, people use that word, of unifying all data. What does that actually mean? You guys call it hybrid data, which means you have some flexibility if you need it. >> Emma: Right. >> All right, cool. Emma, thanks so much for coming on theCUBE. Really appreciate it. Congratulations on your success. And again, you guys got to a good spot. You got a broad portfolio, and you're bringing it together with hybrid data. Best of luck. We'll keep in touch. Emma McGrattan here, the Senior Vice President of Engineering at Actian, here on theCUBE. More live coverage here in New York City from theCUBE's coverage of Big Data NYC after this short break. (upbeat techno music)
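To make the access-control and auditing points from the security and GDPR discussion above a bit more concrete, here is a rough sketch of the kind of policy a DBA might apply. The role, table, and audit-log names are invented for the example, and auditing and encryption syntax is product-specific, so treat this as generic SQL issued from Python rather than Actian-specific commands.

```python
# Rough sketch: grant read-only access to an analyst role and record failed
# access attempts. Object names are invented; real deployments would use the
# database's own security auditing and encryption features, not just SQL like this.
import pyodbc

GRANTS = [
    "CREATE ROLE analyst_ro",
    "GRANT SELECT ON customer_pii TO analyst_ro",                 # read-only
    "REVOKE INSERT, UPDATE, DELETE ON customer_pii FROM analyst_ro",
]

def apply_policy(dsn: str):
    conn = pyodbc.connect(dsn)
    cur = conn.cursor()
    for stmt in GRANTS:
        cur.execute(stmt)
    conn.commit()
    conn.close()

def log_failed_attempt(dsn: str, user: str, object_name: str):
    # Application-level audit trail for denied access, so failed attempts
    # surface as reportable events rather than disappearing silently.
    conn = pyodbc.connect(dsn)
    conn.cursor().execute(
        "INSERT INTO security_audit (event_time, user_name, object_name, outcome) "
        "VALUES (CURRENT_TIMESTAMP, ?, ?, 'DENIED')",
        user, object_name,
    )
    conn.commit()
    conn.close()
```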
Jeff Veis, Actian | BigData NYC 2017
>> Live from Midtown Manhattan, it's the Cube. Covering big data, New York City 2017. Brought to you by SiliconANGLE Media and its ecosystem sponsors. >> Okay welcome back everyone, live here in New York City it's the Cube's special annual presentation of BigData NYC. This is our annual event in New York City where we talk to all the thought leaders and experts, CEOs, entrepreneurs and anyone making or shaping the agenda, with the Cube. In conjunction with Strata Data, which was formerly called Strata Hadoop, and Hadoop World before that; BigData NYC, the Cube's NYC event, we like to keep separate from that while we're here. With us is Jeff Veis, who's the chief marketing officer of Actian and a Cube alumni. Formerly with HPE, been on many times. Good to see you. >> Good to see you. >> Well you're a marketing genius, we've talked before at HPE. You got so much experience in data and analytics, you've seen the swath of the spectrum across the board, from classic, I call it classic enterprise, to cutting edge. To now full-on cloud, AI, machine learning, IOT. Lot of stuff going on, on premise seems to be hot still. There's so much going on from the large enterprises dealing with how to better use their analytics. At Actian you're heading up marketing, what's the positioning? What're you doing there? >> Well the shift that we see, and what's unique about Actian, which has just a very differentiated and robust portfolio, is the shift to what we refer to as hybrid data. And it's a shift that people aren't talking about, most of the competition here. They have that next best mousetrap, that one thing. So it's either move your database to the cloud, or buy this appliance, or move to this piece of open source. And it's not that they don't have interesting technologies, but I think they're missing the key point. Which is never before have we seen the creation side of data and the consumption of data becoming more diverse, more dynamic. >> And more in demand too, people want both sides. Before we go any deeper I just want you to take a minute to define what hybrid data actually means. What does that term mean for the people that want to understand this term deeper? >> Well it's understanding that it's not just the location of it. Of course there's hybrid computing, which is premise and cloud. And that's an important part of it. But it's also about where and how that data is created. What time domain is that data going to be consumed and used in, and that's so important. A lot of analytics, a lot of the guys across the street, are kind of thinking about reporting and analytics in that old-world way of: we collect lots of data and then we deliver analytics. But increasingly analytics is being used almost in real time or near real time. Because people are doing things with the data in the moment. Then another dimension of it is ad hoc discovery. Where you can have not one or two or three data scientists but dozens if not hundreds of people, all with copies of Tableau and Qlik, attacking and hitting that data. And of course it's not one data source but multiple, as they find adjacencies with data. A lot of the data may be outside of the four walls. So when you look at consumption and creation of data, the net net is you need not one solution but a collection of best fits. >> So a hybrid between consumption and creation, so that's the two hybrids. I mean hybrid implies, you know, little bit of this, little bit of that. >> That's the bridge that you need to be able to cross. Which is where do I get that data? And then where's that data going? >> Great so let's get into Actian.
Give us the update, obviously Actian has got a huge portfolio. We've covered you guys, we know you well. Been on the Cube many times. They've cobbled together all these solutions that can be very effective for customers. Take us through the value proposition that this hybrid data enables with Actian. >> Well if you decompose it, from our viewpoint there are three pillars that have kind of stood the test of time in one sense. They're critical: the ability to manage the data, the ability to connect the data. In the old days we said integrate, but now I think basically all apps, all kinds of data sources, are connected in some sense. Sometimes very temporal. And then finally the analytics. So you need those three pillars, and you need to be able to orchestrate across them. And what we have is a collection of solutions that span that. They can do transactional data, they can do graph data and object-oriented data. Today we're announcing a new generation of our analytics, specifically on Hadoop. And that's Vector H. I'd love to be able to talk to that today, with the native Spark integration. >> Let's get into the news. Hard news here at BigData NYC is you guys announced the latest support for Apache Spark with Vector H. So, Actian Vector in Hadoop, hence the H. What is it? >> Is Spark glue for hybrid data environments or is it something you layer over different underlying databases? >> Well I think it's fair to say it is becoming the glue. In fact we had a previous technology that did a human's job at doing some of the work. Now we have Spark and that community. The thing though is, if you wanted to take advantage of Spark, it was kind of like the old days of Hadoop: assembly was required. And that is increasingly not what organizations are looking for. They want to adopt the technology, but they want to use it and get on with their day job. What we have done... >> Machine learning, putting algorithms in place, managing software. >> It could be very esoteric things such as predictive machine learning, next generation AI. But for every one of those there's easily a dozen if not a hundred uses of being able to reach and extract data in its native format. Being able to grab a Parquet file and, without any transformation, begin to analyze it. Or being able to talk to an application and being able to interface with that. And being able to do reads and writes with zero penalty. So the ACID compliance component of databases is critical, and a lot of the traditional Hadoop approaches are pretty much read-only vehicles. And that meant they were limited in the use cases they could address. >> Let's talk about the hard news. What specifically was announced? >> Well we have a technology called Vector. Vector, just to establish the baseline here, runs single node, on Windows and Linux, and there's a community edition. So your users can download and use that right now. We have Vector H, which was designed for scale-out on Hadoop, and it takes advantage of YARN. And it allows you to scale out across your Hadoop cluster, petabytes if you like. What we've added to that solution is now native Spark integration, and that native Spark integration gives you three key things. Number one, zero penalty for real-time updates. We're the only ones, to the best of our knowledge, that can do that. In other words you can update the data and you will not slow down your analytics performance. Every other Hadoop-based analytic tool has to, if you will, stop the clock and flush out the new data to be able to do updates.
Because of our architecture and our deep knowledge of transactional processing, you don't slow down. That means you can always be assured you'll have fresh data running. The second thing is Spark-powered direct query access. So we can get at not just Vector formats. We have an optimized data format, which is the fastest you'd find in analytic databases, but what's so important is you can hit ORC, Parquet and other data file formats through Spark and, without any transformation, be able to ingest and analyze that information. The third one, and certainly not the least, is something that I think you're going to be talking a lot more about, which is native Spark data frame support. Data frames. >> What's the impact of that? >> Well, data frames will allow you to be able to talk to Spark SQL and SparkR-based applications. So now you're not just going to the data, you're going to other applications. And that means that you're able to interface directly to the system-of-record applications that are running, using this lingua franca of data frames, which now has hit a maturity point where you're seeing pretty broad adoption. And by doing native integration with that, we've just simplified the ability to connect directly to dozens of enterprise applications and get the information you need. >> Jeff, would you describe what you're offering now as a form of, sort of, a data virtualization layer that sits in front of all these back-end databases, but uses data frames from Spark, or am I misconstruing? >> Well, it's a little less a virtualization layer and maybe more a superhighway, where we're able to say, this analytics tool... You know, in the old days it was one of two things. Either you had to do a formal traditional integration and transform that data, right? So you had to go from French to German, and once it was in German you could read it. Or what you had to do was be able to query and bring in that information, but you had to slow down your performance because that transformation had not occurred. Now what we're able to do is use this Spark native connector. So you can have the best of both worlds and, if you will, it is creating an abstraction layer, but it's really for connectivity as opposed to an overall one. What we're not doing is virtualizing the data. That's the key point; there are some people that are pushing data cataloging and cleansing products and abstracting the entire data from you. You're still aware of where the native format is, you're still able to write to it with zero penalty. And that's critical for performance. When you start to build lots of abstraction layers, truly traditional ones, you simplify some things but usually you pay a performance penalty. And just to make a point, in the benchmarks we're running compared to Hive and Polor, for example, use cases that may take nearly two hours, we can do against Vector H in less than two minutes. And we've been able to uphold that for over a year. That is because Vector in its core technology has columnar capabilities and, this is a mouthful, multi-level in-memory capability. And what does that mean? You ask. >> I was going to ask, but keep going. >> I can imagine the performance latency is probably great. I mean you have in-memory that everyone kind of wants. >> Well, a lot of in-memory, where it is used, is just held at the RAM level. And it's the ability to read data in RAM and take advantage of it. And we do that, and of course that's a positive, but we go down to the cache level.
We get down much, much lower, because we would rather that data be in the CPU if at all possible. And with these high-performance cores it's quite possible. So we have some tricks that are special and unique to Vector so that we actually optimize the in-memory capability. The other, last thing we do is, you know, Hadoop and HDFS are not particularly smart about where they place the data. And the last thing you want is your data rolling across lots of different data nodes. That just kills performance. What we're able to do is think about the co-location of the data, look at the jobs and look at the performance, and we're able to squeeze optimization in there. And that's how we're able to get 50, 100, sometimes in excess of 500 times faster than some of the other well-known SQL-on-Hadoop performers. So that, combined now with this Spark integration, this native Spark integration, means people don't have to do the plumbing; they can get out of the basement and up to the first floor. They can take advantage of open-source innovation, yet get what we're claiming is the fastest analytics database in Hadoop. >> So, I got to ask you. I mean you've been, and I mentioned in the intro, an industry veteran. CMO, chief marketing officer. I mean it's challenging with Actian 'cause there's so many things to focus on. How are you attacking the marketing of Actian, because you have a portfolio, and hybrid data is a good position. I like how you bring that to the forefront, kind of give it a simple positioning. But as you look at Actian's value proposition and engage your customer base and potentially prospective customers, how are you iterating the marketing message, the positioning, and engaging with clients? >> Well, it's a fair question and it is daunting when you have multiple products. And you got to have a simple compelling message; less is more to get signal above noise today. At least that's how I feel. So we're hanging our hats on hybrid data. And we're going to take it to the moon or go down with the ship on that. But we've been getting some pretty good feedback. >> What's been the hit, the number one feedback, on the hybrid data? Because I'm a big fan of hybrid cloud, but I've been saying it's a methodology, it's not a product. On-premise cloud is growing and so is public, so hybrid hangs together in the cloud thing. So with data, you're bridging two worlds. Consumption and creation. >> Well, what's interesting is, when you say hybrid data, people put their own definitions around it, in an unaided way, and they say, you know, with all the technology and all the trends, that's actually, at the end of the day, what nets out my situation. I do have data that's hybrid data and it's becoming increasingly more hybrid. And god knows the people that are demanding and wanting to use it aren't using it or doing it. And the last thing I need, and I'm really convinced of this, is a lot of people talk about platforms, we love to use the P word. Nobody buys a platform, because people are trying to address their use cases. But they don't want to do it in this siloed kind of brick-wall way where I address one use case but it won't function elsewhere. What they are looking for is a collection of best-fit solutions that can cooperate together. The secret sauce for us is we have a cloud control plane. All our technologies, whether it's on premise or in the cloud, touch that. And it allows us to orchestrate and do things together. Sometimes it's very intimate and sometimes it's broader. >> So, what exactly is the control plane?
>> It does everything from administration, it can go down to billing, and it can also be scheduling transactional performance. Now, on one extreme we use it for backup and recovery for our transactional database. And we have a cloud-based backup and recovery service, and it all gets administered through the control plane. So it knows exactly when it's appropriate to back up, because it understands that database, and it takes care of it. It was relatively simple for us to create. In the more intimate sense, we were the first company, and it was called Actian X, which I know we were talking about before. We named our product after X before our friends at Apple did. So I like to think we were pioneers. >> San Francisco had the iPhone, don't get confused there, remember. >> I got to give credit where credit's due. >> And give it up. >> But what Actian X is, and we announced it back in April, is it takes the same Vector technology I just talked about, so it's material, and we combined it with our integrated transactional database, which has over 10,000 users around the world. And what we did is we dropped in this high-performance columnar database for free. I'm going to say that again: for free, in our transactional platform. So every one of our customers, as soon as they upgraded to what is now Actian X, got a rocket ship of a columnar high-performance database inside their transactional database. The data is fresh, it moves over into the columnar format, and the reporting takes off. >> Jeff, to end this segment I'll give you the last word. A lot of people look at Actian, and also the products I mentioned earlier. Is it product leadership that's winning, is it the values of the customer? Where is Actian winning, for the folks that aren't yet customers that you'd like to talk to? What is the Actian success formula? What's the differentiation, where is it, where does it jump off the page? Is it the product, is it the delivery? Where's the action. >> Is it innovation? >> Well, let me tell you, I would answer with two phrases. First is our tag line, our tag line is "activate your data". And that resonated with a lot of people. A lot of people have a lot of data, and we've been in this big data era where people talked about the size of their data. Literally, I have 5 petabytes, you have 6 petabytes. I think people realized that kind of missed the entire picture. Sometimes smaller data, god forbid 1 terabyte, can be amazingly powerful depending on the use case. So it's obviously more than size; what it is about is activating it. Are you actually using that data so it's making a meaningful difference? And you're not putting it in a data pond, puddle or lake to be used someday, like you're storing it in an attic. There's a lot of data getting dusty in attics today because it is not being activated. And that would bring me to, not the tag line, but what I think is driving us and why customers are considering us. They see we are about the technology of the future, but we're very much about innovation that actually works. Because of our heritage, because we have companies that understand, for over 20 years, how to run on data. We get what ACID compliance is, we get what transactional systems are. We get that you need to be able to not just read but write data. And we bring that methodology to our innovation, and so for people, companies, animals, any form of life that is interested in it. >> So it's the product platform that activates, and then the result is how you guys roll with customers.
In the real world today, where you can have real concurrency, real enterprise, great performance, along with the innovation. >> And the hybrid gives them some flexibility, that's the new tag line, that's kind of the main thing. I understand, you're saying hybrid data means basically flexibility for the customer. >> Yeah, it's use the data you need, for what you use it for, and have the systems work for you, rather than you work for the systems. >> Okay, check out Actian. Jeff Veis, friend of the Cube, alumni now, the CMO at Actian; we're following your progress, so congratulations on the new opportunity. More Cube coverage after this short break. I'm John Furrier, James Kobielus here inside the Cube in New York City for our BigData NYC event all week, in conjunction with Strata Data right next door. We'll be right back. (tech music)
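The Spark integration Jeff describes, reading Parquet or ORC files and JDBC-accessible tables into the same data frame world, can be pictured with a few lines of PySpark. The JDBC URL, driver class, paths, and table names below are placeholders rather than Actian's actual connector details, which would come from the Vector H documentation; the point is only that data frames give one common interface across the sources.

```python
# Illustrative PySpark sketch: combine a Parquet file with a table read over JDBC
# into one data frame. URL, driver, paths, and table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hybrid-data-sketch").getOrCreate()

# Native columnar files already sitting on the Hadoop cluster.
clicks = spark.read.parquet("hdfs:///data/clickstream/2017/09/")

# An analytic table exposed through a JDBC endpoint (placeholder connection info).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:vector://analytics-host:7803/db")   # placeholder URL
    .option("dbtable", "orders")
    .option("driver", "com.example.jdbc.Driver")              # placeholder driver
    .load()
)

# Once both are data frames, Spark SQL treats them uniformly.
clicks.createOrReplaceTempView("clicks")
orders.createOrReplaceTempView("orders")
top_products = spark.sql("""
    SELECT o.product_id, COUNT(*) AS purchases
    FROM clicks c JOIN orders o ON c.session_id = o.session_id
    GROUP BY o.product_id
    ORDER BY purchases DESC
    LIMIT 10
""")
top_products.show()
```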
Keynote Analysis | Virtual Vertica BDC 2020
(upbeat music) >> Narrator: It's theCUBE, covering the Virtual Vertica Big Data Conference 2020. Brought to you by Vertica. >> Dave Vellante: Hello everyone, and welcome to theCUBE's exclusive coverage of the Vertica Virtual Big Data Conference. You're watching theCUBE, the leader in digital event tech coverage. And we're broadcasting remotely from our studios in Palo Alto and Boston. And, we're pleased to be covering wall-to-wall this digital event. Now, as you know, originally BDC was scheduled this week at the new Encore Hotel and Casino in Boston. Their theme was "Win big with big data". Oh sorry, "Win big with data". That's right, got it. And, I know the community was really looking forward to that, you know, meet up. But look, we're making the best of it, given these uncertain times. We wish you and your families good health and safety. And this is the way that we're going to broadcast for the next several months. Now, we want to unpack Colin Mahony's keynote, but, before we do that, I want to give a little context on the market. First, theCUBE has covered every BDC since its inception, since the BDC's inception that is. It's a very intimate event, with a heavy emphasis on user content. Now, historically, the data engineers and DBAs in the Vertica community, they comprised the majority of the content at this event. And, that's going to be the same for this virtual, or digital, production. Now, theCUBE is going to be broadcasting for two days. What we're doing, is we're going to be concurrent with the Virtual BDC. We got practitioners that are coming on the show, DBAs, data engineers, database gurus, we got security experts coming on, and really a great line up. And, of course, we'll also be hearing from Vertica execs, Colin Mahony himself right off the keynote, folks from product marketing, partners, and a number of experts, including some from Micro Focus, which is, of course, the owner of Vertica. But I want to take a moment to share a little bit about the history of Vertica. The company, as you know, was founded by Michael Stonebraker. And, Vertica started, really they started out, as a SQL platform for analytics. It was the first, or at least one of the first, to really nail the MPP column store trend. Not only did Vertica have an early mover advantage in MPP, but the efficiency and scale of its software, relative to traditional DBMS, and also other MPP players, is underscored by the fact that Vertica, and the Vertica brand, really thrives to this day. But, I have to tell you, it wasn't without some pain. And, I'll talk a little bit about that, and really talk about how we got here today. So first, you know, you think about traditional transaction databases, like Oracle or IBM DB2, or even enterprise data warehouse platforms like Teradata. They were simply not purpose-built for big data. Vertica was. Along with a whole bunch of other players, like Netezza, which was bought by IBM, Aster Data, which is now Teradata, Actian, ParAccel, which was the basis for Redshift, Amazon's Redshift, Greenplum, which was bought, in the early days, by EMC. And, these companies were really designed to run as massively parallel systems that smoked traditional RDBMS and EDW for particular analytic applications. You know, back in the big data days, I often joked that, like an NFL draft, there was a run on MPP players, like when you see a run on pulling guards. You know, once one goes, they all start to fall. And that's what you saw with the MPP columnar stores, IBM, EMC, and then HP getting into the game.
So, it was like 2011, and Leo Apotheker was the new CEO of HP. Frankly, he had no clue, in my opinion, what to do with Vertica, and totally missed one of the biggest trends of the last decade, the data trend, the big data trend. HP picked up Vertica for a song; the price wasn't disclosed, but my guess is that it was around 200 million. So, rather than build a bunch of smart tooling around Vertica, which I always call the diamond in the rough, Apotheker basically set HP back for years. He kind of ruined HP, in my view, with the 12 billion dollar purchase of Autonomy, which turned out to be one of the biggest disasters in recent M&A history. HP was forced to spin-merge, and ended up selling most of its software to Micro Focus. (laughs) Luckily, during Vertica's time at HP, CEO Meg Whitman was largely distracted with what to do with the mess that she inherited from Apotheker, so Vertica was left alone. Now, the upshot is Colin Mahony, who was then the GM of Vertica, and still is. By the way, he's really the CEO, he just doesn't have the title; I actually think they should give it to him. But anyway, he's been at the helm the whole time. And Colin, as you'll see in our interview, is a rockstar; he's got technical and business chops, and people love him in the community. Vertica's culture is really engineering-driven and they're all about data. Despite the fact that Vertica is a 15-year-old company, they've really kept pace and not been polluted by legacy baggage. Vertica, early on, embraced Hadoop and the whole open-source movement, and that helped give it tailwinds. It leaned heavily into cloud, as we're going to talk about further this week, and they've got a good story around machine intelligence and AI. So, whereas many traditional database players are really getting hurt, and some are getting killed, by cloud database providers, Vertica's actually doing a pretty good job of servicing its installed base, and is in a reasonable position to compete for new workloads. On its last earnings call, the Micro Focus CEO, Stephen Murdoch, said they're investing 70 to 80 million dollars in two key growth areas, security and Vertica. Now, Micro Focus is running its SUSE play on these two parts of its business. What I mean by that is they're investing and allowing them to be semi-autonomous, with their own spending on R&D and go-to-market. And they have no hardware agenda, unlike when Vertica was part of HP, or HPE, I guess HP, before the spin-out. Now, let me come back to the big trend in the market today, because there's something going on around analytic databases in the cloud. You've got companies like Snowflake and AWS with Redshift, and as we've reported numerous times, they're doing quite well; they're gaining share, especially of new workloads that are emerging, particularly in the cloud-native space. They combine scalable compute, storage, and machine learning, and, importantly, they allow customers to scale compute and storage independently of each other. Why is that important? Because you don't have to buy storage in chunks every time you buy compute, or vice versa. So, if you can scale them independently, you've got granularity. Vertica is keeping pace. In talking to customers, Vertica is leaning heavily into the cloud, supporting all the major cloud platforms and, as we heard from Colin earlier today, adding Google.
And, while my research shows that Vertica has some work to do in cloud and cloud native, to simplify the experience, it has a more robust and mature stack, one that supports many different environments, with deep SQL, ACID properties, and a DNA that allows Vertica to compete with these cloud-native database suppliers. Now, Vertica might lose out in some of those native workloads. But, I have to say, in my experience talking with customers, if you're looking for a great MPP column store that scales and runs in the cloud or on-prem, Vertica is in a very strong position. Vertica claims to be the only MPP columnar store that allows customers to scale compute and storage independently, both in the cloud and in hybrid environments, on-prem, and across clouds as well. So, while Vertica may be at a disadvantage in a pure cloud-native bake-off, its more robust and mature stack, combined with its multi-cloud strategy, gives Vertica a compelling set of advantages. We heard a lot of this from Colin Mahony, who announced Vertica 10.0 in his keynote. He really emphasized Vertica's multi-cloud affinity and its Eon Mode, which allows that separation, or scaling, of compute independent of storage, both in the cloud and on-prem. Vertica 10, according to Mahony, is making big bets on in-database machine learning, he talked about that, and AI, along with some advanced regression techniques. He talked about PMML models and Python integration, which was actually something they talked about doing with Uber and some other customers. Now, Mahony also stressed the trend toward object stores. Vertica now supports Eon Mode on S3, in Google Cloud in addition to AWS, and on Pure Storage and HDFS as well. Mahony also stressed, as I mentioned earlier, a big commitment to on-prem and the whole cloud-optionality thing. So 10.0, according to Colin Mahony, is all about really doubling down on these industry waves: enabling native PMML models, running them in Vertica, and really doing all the work that's required around ML and AI; they also announced support for TensorFlow. So, object store optionality is important, and that's what he talked about with Eon Mode, with the news of support for Google Cloud as well as HDFS. And finally, a big focus on deployment flexibility and migration tools, with a critical focus on improving ease of use, something you hear about from a lot of customers. So, these are the critical aspects of Vertica 10.0, an announcement that we're going to be unpacking all week with some of the experts that I talked about. So, I'm going to close with this. My long-time co-host, John Furrier, and I have talked for some time about this new cocktail of innovation. No longer is Moore's Law really the mainspring of innovation. It's now about taking all these data troves, bringing machine learning and AI to that data to extract insights, and then operationalizing those insights at scale, leveraging cloud. And one of the things I always look for from cloud is, if you've got a cloud play, you can attract innovation in the form of startups. It's part of the success equation, certainly for AWS, and I think it's one of the challenges for a lot of the legacy on-prem players. Vertica, I think, has done a pretty good job in this regard, and we're going to look this week for evidence of that innovation. One of the interviews that I'm personally excited about this week is with a new-ish company, I would consider them a startup, called Zebrium.
What they're doing is applying AI to do autonomous log monitoring for IT ops. I'm interviewing Larry Lancaster, who's their CEO, this week, and I'm going to press him on why he chose to run on Vertica and not a cloud database. This guy is a hardcore tech guru and I want to hear his opinion. Okay, so keep it right there, stay with us. We're all over the Vertica Virtual Big Data Conference, covering in-depth interviews and following all the news. theCUBE is going to be interviewing these folks over two days of wall-to-wall coverage, so keep it right there. We're going to be right back with our next guest, right after this short break. This is Dave Vellante and you're watching theCUBE. (upbeat music)
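To make the in-database ML piece from the keynote a bit more concrete, here is a minimal sketch of what scoring a PMML model inside Vertica from Python might look like. It assumes the official vertica-python client and the IMPORT_MODELS / PREDICT_PMML SQL functions described in Vertica 10-era documentation; the host, credentials, table, columns, and model name are hypothetical placeholders, so check the current docs before relying on any of it.

```python
# A minimal sketch, not a tested implementation. It assumes the official
# vertica-python client and the IMPORT_MODELS / PREDICT_PMML functions from
# Vertica 10-era in-database ML; host, credentials, table, columns, and
# model name below are hypothetical placeholders.
import vertica_python

conn_info = {
    "host": "vertica.example.com",  # placeholder host
    "port": 5433,
    "user": "dbadmin",
    "password": "changeme",
    "database": "analytics",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()

    # Register an externally trained PMML model with the database,
    # so scoring can happen where the data already lives.
    cur.execute(
        "SELECT IMPORT_MODELS('/models/churn_model.pmml' "
        "USING PARAMETERS category='PMML')"
    )

    # Score rows in place with plain SQL instead of exporting the data
    # to a separate ML environment.
    cur.execute(
        "SELECT customer_id, "
        "       PREDICT_PMML(tenure, monthly_spend "
        "                    USING PARAMETERS model_name='churn_model') AS churn_score "
        "FROM customers LIMIT 10"
    )
    for customer_id, churn_score in cur.fetchall():
        print(customer_id, churn_score)
```

The point of the sketch is the pattern Mahony was describing: train wherever you like, export the model as PMML, and push the scoring down into the database rather than moving the data out to a separate ML environment.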