Christian Rodatus, Datameer | CUBEConversation, July 2018
(upbeat music) >> Hi, I'm Peter Burris, and welcome to another CUBE Conversation from our wonderful studios in Palo Alto, California. Great conversation today: we've got Christian Rodatus, who is the CEO of Datameer, here to talk about some of the trends within the overall analytics space, one of the most important things happening in technology today. Christian, welcome back to theCUBE! >> Good morning, Peter, thanks for having me today. >> It's great to have you here. Hey, let's start with some of the preliminaries. What's happening at Datameer? >> Well, we've been around for nine years now, which is a lot of time in a very agile technology space. And I actually just came back from an investor offsite that was arranged by one of our biggest investors. And everything is centering around the cloud, right? We were trotting along within the Hadoop ecosystem, the big data ecosystem, over the past couple of years, and over the past 12 to 15 months, the transition in the analytics market, how it's transforming from on premise to the cloud, and in a hybrid way as well, has been stunning, right? And we're faced with the challenge of innovating in those spaces and making our product relevant for on-premise deployments, for cloud deployments on various different cloud platforms, and in a hybrid fashion as well. And we've traditionally been working with customers that have been laggards in terms of cloud adoption, because we do a lot of business in financial services, insurance, healthcare, and telecommunications, but even in those industries, over the past year, it has been stunning how they are accelerating cloud adoption, how they move analytic workloads to the cloud. >> Well, actually, they sometimes sound like leaders in the analytics world, even if they're laggards in the cloud. And there's something of a relationship there. People didn't want to do a lot of their analytics in the cloud because they were doing analytics on some of their most strategic, sensitive data, and they felt pressure not to hand that off to a company, or an industry, that they felt was perhaps a little bit less ready from an infrastructure standpoint. But our research shows pretty strongly that we're seeing a push to adoption, precisely because so much of that ecosystem got wrapped up in the infrastructure and never got to the possible value of analytics. So is that helping to force this along, do you think, the idea of-- >> Absolutely, right. If you look at the key drivers, and there was some other analyst research that I read this week: why are people motivated to move analytic workloads into the cloud? It's really less cost, it's really business agility. How do they become independent from IT and procure services across the organization in a very simple, easy, and fast fashion? And then there are a lot of fears associated with it. It's data governance, it's security, it's data privacy that these industries we predominantly work in are concerned with, right, and we provide a solution framework that actually helps them transition those on-premise analytic workloads into the cloud and still get the enterprise-grade features that they're used to from an on-premise solution deployment. >> Yeah, so in other words, a lot of businesses confuse failure to deal with big data infrastructure with failure to do big data. >> That's correct. >> I want to build on something you've just said, specifically the governance issue, because I think you're absolutely right.
There's an enormous lack of understanding about what really constitutes data governance. It used to be, oh, data governance is what the data administrator does when they do modeling, and who gets to change the model, and who owns the model, and who gets to, all that other stuff. We're talking about something fundamentally different as we embed more deeply some of these analytics directly into high-value business activities that are being utilized or performed by high-cost business executives. >> Absolutely. >> How does data governance play out, and I'm going to ask you specifically, what are you guys doing that makes data governance more accessible, more manageable, within Datameer customers? >> So I think there are two key features to a solution that are important. Number one, we have very much a self-service aspect to it, so we're pushing the ability to model and create views on the big data assets that are persisting in the data lakes toward the business user, right? But we do this in a very governed way, right? We can provide full data lineage, we can audit every single step of how the data's being sourced, how it's being manipulated along the way, and provide an audit trail, which is very important for many of the customers that we work with. And we really bring this into the hands of the business users without much IT interference. They don't have to wait for models to be built and so on and so forth, and this is really what helps them build rapid analytic applications that provide a lot of value and benefits for their business processes. >> So you talked about how you're using governance, or the ability to provide a manageable governance regime, to open up the aperture on the utilization of some of these high-value analytics frameworks by broader numbers of individuals within the organization. That seems to me to be a pretty significant challenge for a lot of businesses. It's not enough to just have an ivory tower group of data scientists be good at crafting data, understanding data, and then advising people what actions to take based on that data. It seems it has to be more broadly diffused within the organization, what do you think? >> So this is clearly the trend, and as these analytics services move to the cloud, you will see this even more so, right? You will have curated data assets, and you provide access control for certain user groups that can see and work with this data, but then you need to provide a solution framework that enables these customers to consume this in a very seamless and easy way. This is basically what we are doing. We push it down to the end user and give them the ability to work on complex analytical problems using our framework in a governed way, in a fast way, in a very iterative analytic workflow. A lot of our customers say they pursue analytic problems that are of an investigative nature, and you cannot do this if you rely on IT to build new models; it delays the process-- >> Or if you only rely on IT. >> And only rely on IT, right? They want to do this on their own and create their own views, depending on their analytic workflow, in a very rapid way. And so we support this in a highly governed way that can be done in a very fast and rapid fashion, and as it moves to the cloud, it provides even more opportunities to do so.
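To make the lineage and audit-trail point concrete, here is a minimal Python sketch of what recording every step of a self-service pipeline can look like. It is illustrative only: the record fields, the fingerprinting approach, and the helper function are assumptions made for the example, not Datameer's actual implementation.

```python
import hashlib
import json
from datetime import datetime, timezone

# Every transformation in a self-service pipeline gets an audit record: who ran
# what, when, on which input, producing which output. Chained together, these
# records are the raw material for data lineage.
audit_log = []

def audited_step(name, user, source, func, dataset):
    """Run one pipeline step and append an audit/lineage record for it."""
    def fingerprint(obj):
        return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

    result = func(dataset)
    audit_log.append({
        "step": name,
        "user": user,
        "source": source,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input_fingerprint": fingerprint(dataset),
        "output_fingerprint": fingerprint(result),
    })
    return result

# A business user filters claims data; the step remains traceable after the fact.
claims = [{"policy": "A-1", "amount": 1200}, {"policy": "B-7", "amount": 90}]
large_claims = audited_step(
    "filter_large_claims", "analyst_jane", "claims_raw",
    lambda rows: [r for r in rows if r["amount"] > 500], claims)
print(json.dumps(audit_log, indent=2))
```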
>> So as CEO of Datameer, you're spending a lot of time with customers. Are there some patterns that you're seeing customers, in addition to buying Datameer, but are there some patterns, in addition to what you just described, that the successful companies are utilizing to facilitate this fusion? Are they training people more? >> Yep. >> Are they embedding this more deeply into other types of applications or workflows? What are some of those patterns of success that you're seeing amongst your customers? >> So that's a very interesting question, right, because a lot of big data initiatives within companies fail for lack of adoption. So they build these big data lakes and ramp up cloud services, and they never really see adoption. And so the successful customers we work with, they have a couple of things they do differently than others. They usually have a centralized organization that facilitates, promotes, and educates people on, number one, the data assets that are available throughout the organization, and on the tool sets being used, and one of them, obviously, is Datameer within our customer base, and they facilitate constant education and experience sharing across the organization for the use of big data assets throughout the organization. And these companies, they see adoption, right? And it spreads throughout the organization. It has increasing momentum and adoption across various business departments for many high-value use cases. >> So we've done a lot of research. I myself have spent a lot of time on questions of technology adoption within large enterprises. And you actually described it as a failure to adopt, and from an adoption standpoint, we'd say they abandon. >> Absolutely true. >> One of the things that often catalyzes whether or not someone continues to adopt, or a group determines to abandon, is a lack of understanding of what the returns are, what kind of returns these changes of behavior are initiating or instantiating. I've always been curious why a lot of these software tools don't do a good job of actually utilizing data about utilization, from a big data standpoint, to improve the adoption of big data. Are you seeing any effort made by companies to use Datameer to help businesses better adopt Datameer? >> Well, I haven't seen that yet. I see this more with our OEM customers. So we've got OEM customers that analyze the cloud consumption of their customers and provide analytics on usage across the organization. I see these things, and from our standpoint, we facilitate this process by providing use case discovery workshops, so we have a services organization that helps our customers to see the light, literally, right, to understand the nature of the data assets available: how they can leverage them for specific use cases, high-value use case implementations, experience sharing on what other customers are doing, what kind of high-value applications they are going after in a specific industry, and things like this. We do lunch and learns with our customers. We just recently did one with a big healthcare provider, and the interest is definitely there. You get 200 people in a room for a lunch and learn meeting, and everyone's interested in how they can make their life easier and make better business decisions based on the data assets that are available throughout the organization. >> That's amazing; when a lunch and learn meeting goes from 20 people to 200 people, it really becomes much more focused on learn.
One of the questions I have related to this is that you've got a lot of experience in the analytics space, more than just big data, and in how the overall analytics space has evolved over the years. We have some research, pretty strong, to suggest that it's time to start thinking about big data not as a thing unto itself, but as part of an aggregate approach to how enterprises should think about analytics. What do you think? How do you think an enterprise should start to refashion its understanding of the role that big data plays in a broader understanding of analytics? >> Back in the earlier days, and my career comes from the EDW world, all the large enterprises had EDWs and they tried to build a centralized repository of data assets-- >> Highly modeled. >> Highly modeled, a lot of work to set up, structured, extremely complex to modify and to service new application requests from business users. And then came the Hadoop data lake-based big data approach. It said, dump the data in, and this is where we became very successful, in providing a tool framework that allows customers to build virtual views into these data assets in a very rapid fashion, driven by the business user community. But to some extent, these data lakes have also had issues in servicing the bread-and-butter BI user community throughout the organization, and the EDW never really went away, right? So now we have EDWs, and we have data lakes, that service different analytic application requirements throughout the organization. >> And new reporting systems. >> And even reporting systems. And now the third wave is coming, moving workloads into the cloud, and if you look into the cloud, the wealth of available solutions for a customer becomes even more complex, as cloud vendors themselves build out tons of different solutions to service different analytical needs. The marketplaces offer hundreds of solutions from third-party vendors, and the customers try to figure out how all these things can be stitched together and provide the right services for the right business user communities throughout the organization. So what we see moving forward will be a hybrid approach that will retain some of the on-premise EDW and data lake services, and those will be combined with multi-cloud services. So there will not always be a single cloud service, and we're already seeing this today. One of our customers is Sprint Pinsight, the advertising business of the Sprint telecommunications company. They have a massive on-premise Hadoop data lake, and they do all the preprocessing of the ATS data from their network with Datameer on premise, and we condense the data assets down from a daily volume of 70 terabytes to eight, and this gets exposed to a cloud-based data warehouse service for BI consumption throughout the organization. So you see these hybrid, very agile services emerging throughout our customer base, and I believe this will be the future-- >> Yeah, one of the things we like about the concept, or the approach, of virtual views is precisely that. It focuses in on the value that the data's creating, and not the underlying implementation, so that you have greater flexibility about whether you treat it as a big data approach or an EDW approach, or whether you put it here or whether you put it there. But by focusing on the outcome that gets delivered, it allows a lot of flexibility in the implementation you employ. >> Absolutely, I agree.
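The Sprint Pinsight example, condensing roughly 70 terabytes of daily network data down to about eight before exposing it to a cloud data warehouse, is essentially a scheduled aggregation job. A rough PySpark sketch of that pattern follows; the paths, column names, and the particular aggregation are assumptions for illustration, not the customer's actual job.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-network-condense").getOrCreate()

# One day of raw event data from the on-premise data lake (placeholder path).
raw = spark.read.parquet("hdfs:///datalake/network_events/dt=2018-07-15")

# Aggregate to the grain the BI users actually query; this is where a daily
# volume can drop by an order of magnitude before it ever leaves the cluster.
condensed = (
    raw.groupBy("subscriber_id", "cell_id", F.window("event_time", "1 hour"))
       .agg(F.count("*").alias("events"),
            F.sum("bytes_up").alias("bytes_up"),
            F.sum("bytes_down").alias("bytes_down"))
)

# Land the condensed slice in cloud object storage, where a cloud data
# warehouse can pick it up and expose it to BI tools.
condensed.write.mode("overwrite").parquet(
    "s3a://analytics-bucket/network_hourly/dt=2018-07-15")
```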
>> Phenomenal. Christian Rodatus, CEO of Datameer, thanks again for being on theCUBE! >> Thanks so much. I appreciate it, thanks, Peter. >> We'll be back.
Christian Rodatus, Datameer & Pooja Palan, Datameer | AWS re:Invent
>> Announcer: Live from Las Vegas, it's theCUBE. Covering AWS re:Invent 2017. Presented by AWS, Intel, and our ecosystem of partners. >> Well, we are back live here at the Sands Expo Center. We're of course in Las Vegas, live at re:Invent. AWS putting on quite a show here. Day one of three days of coverage you'll be seeing right here on theCUBE. I'm John Walls, along with Justin Warren. And we're now joined by a couple of folks from Datameer: Christian Rodatus, who's the CEO of the company, and Pooja Palan, who's a Senior Product Manager. Christian and Pooja, thanks for being with us. Good to have you here on theCUBE. >> Thanks for having us. >> So you were Cube-ing just recently, up in New York, Christian. >> Yeah, absolutely, we were seeing you guys in New York, and we had actually, we've done some work with a couple of customers, probably two weeks ago in Palo Alto, I believe. >> I don't know how we can afford you. I mean, I'm gonna have to look into our budget. >> Christian: Happy to be here again. >> Okay, no, it is great, thanks for taking the time here. I know this is a busy week for you all. First off, let's talk about Datameer in general, just to let the audience at home know, in case they're not familiar, what you're doing from a core competency standpoint. And let's talk about what you're doing here. >> Absolutely. I mean, Datameer was founded eight years ago, right at the onset of the big data wave that started in the 2009 and 2010 time frame. And Datameer was actually the first commercial platform that provided a tool set to enable our customers to consume enterprise-scale Hadoop solutions for their enterprise analytics. So we do everything from ingesting the data into the data lake, to preparing the data for consumption by analytics tools throughout the enterprise. And we just recently also launched our own visualization capabilities for sophisticated analysis against very large data sets. We are also capable of integrating machine learning solutions and preparing data for machine learning throughout the organization. And probably the biggest push is into the cloud. We've been in the cloud for a couple of years now, but we have seen increased momentum from our customers in the marketplace for about 15 months now, I would say. >> So before we dive a little deeper here, I'm just kind of curious about your work in general. It's kind of chicken and the egg, right? You're trying to come up with new products to meet customer demand. So are you producing to give them what you think they need, or are you producing based on what they're telling you that they need? How does that work as far as trying to keep up with-- >> You know, I can kick this off. So it's actually interesting that you ask this, because the customers that did interviews with you guys two weeks ago were part of our customer advisory council. So we get direct feedback from leading customers that do really sophisticated things with Datameer. They are at the forefront of developing really mind-blowing analytical applications for high-value use cases throughout their organizations. And they help us understand where these trends go. And to give you an example: I was recently in a meeting with the Chief Data Officer of a large global bank in London. And they have kicked off 32 Hadoop projects throughout the organization. And what he told me is that just these projects will lead to an expansion of the physical footprint of the data centers in the UK by 30%.
So he said, (mumbles), we are not in the data center business, we don't want this, we need other people to take care of this. And they've launched a massive initiative with Amazon to bring a big chunk of their enterprise analytics into AWS. >> It sounds like you're actually really ahead of the curve in many ways, 'cause of the explosion in machine learning and AI, that data analytics side of things. Yeah, we had big data for a little while, but it's really hitting now, where people are starting to really show some of the amazing things that you can do with data and analysis. So what are you seeing from these customers? What are some of the things that they're saying: actually, this thing here, this is what we really love about Datameer, and this is something that we can do here that we wouldn't be able to do in any other way? >> Shall I take that? So when it comes to the heart of the matter, there are, you know, three things that Datameer hits on really well. In terms of our user personas, we look at all of our users, our analysts, and data engineers. So what we provide them with is that ease of use: being able to take data from anywhere and use multiple analytic capabilities within one tool, without having to jump around in all different UIs. So it's ease of use, a single interface. The second thing that they really like about us is not having to switch between interfaces to get something done. So if they want to ingest data from different sources, it's one place to go to. If they want to access their data, all of it is in a single file browser. If they want to munge their data, prepare data, analyze data, it's all within the same interface. And they don't have to use 10 different tools to be able to do that. It's a very seamless workflow. By the same token, another thing which comes up is collaboration. It enables collaboration across different user groups within the same organization, which means that we are totally enabling the data democratization that all of the self-service tools are trying to promote here, while making IT's job easier. And that's what Datameer enables. So it's kind of a win-win situation between our users and IT. And the third thing that I want to talk about is IT: making their lives easier, but at the same time not letting go of the reins entirely. Enabling governance is a key challenge, and that's where Datameer comes into the picture, providing enterprise-ready governance so it can be deployed across the board in the organization. >> Yeah, that's something that AWS is certainly a leader in, that democratization of access to things, so that individual developers or individual users can go and make use of some of these cloud resources. And we're seeing here at the show, and we've been talking about it today, that this is becoming a much more enterprise-type issue. So being able to do that, have that self-service, but also have some of those enterprise-level controls. We're starting to see a lot of focus on that from enterprises who want to use cloud, but they really want to make sure that they do it properly and they do it securely. So what are some of the things that Datameer is doing that helps customers keep that kind of enterprise-level control, but without getting in the way of people being able to just use the cloud services to do what they want to do? So could you give us some examples of that, maybe?
>> I'll let Pooja comment on the specifics of how we deploy in AWS and other cloud solutions, for that matter. But what you see with on-premise data lakes is that customers are struggling. The stack has become outrageously complicated, and they try to stitch all these various solutions together. The open source community, I believe, now supports 27 different technology platforms, and then there are dozens and dozens of commercial tools that play into that. And what they want, they actually just want this thing to work. They want to deploy what they're used to from enterprise IT: scalability, security, seamlessness across the platforms, appropriate service-level agreements with the end user communities, and so on and so forth. So they really struggle to make this happen on premise. The cloud addresses a lot of these issues and takes a lot of the burden away, and it becomes way more flexible, scalable, and adjustable to whatever they need. And when it comes to the specific deployments, and how we do this and give them enterprise-grade solutions that make sense for them, Pooja, maybe you can comment on that. >> Sure, absolutely, and more specific to cloud, I would love to talk about this. So recently one of our very first financial services customers went to the cloud, and that pretty much brings us over here even more excited about it. And trust me, even before elasticity, their number one requirement is security. And as part of security, it's not just, one-two-three, Amazon takes care of it and it's sorted, or we have security as part of Datameer and once it's deployed it's sorted. That's not enough. So when it comes to security, it's security at multiple levels: it's security for data in motion, it's security for data at rest. So, encryption across the board. And then specifically, right now, while we're at the Amazon conference, we're talking about enabling key management services, being able to have the server-side encryption that Amazon enables, being able to support that. And then besides that, there are a lot of other custom requirements, specifically because it's more of a hybrid architecture. They do have applications on-prem, and they do have cloud infrastructure deployed to do compute in the cloud as it may be needed for any kind of burst workloads. So as part of that, when data moves from within their LAN to the cloud, within that VPC, that connectivity itself has to be secured, and they want to make sure that all of those user passwords, all of that authentication, is also kept secure. So we've enabled a bunch of capabilities around that, specifically for customers who are super keen on having security, taking care of rule number one, even before they go. >> So financial services, I mean, you mentioned that, and both of you are talking about it. That's a pretty big target market for you, right? I mean, you've really made it a point of emphasis. Are there concerns, or, I get it (mumbles), so we understand how treasured that data can be. But do you provide anything different for them? I mean, is their data protected the same way as any other business's, or do you have unique processes, procedures, and treatments in place that give them whatever that additional oomph of comfort is that they need? >> So that's a good question. So in principle, we service a couple of industries that are very demanding: financial services, telecommunications and media, government agencies, insurance companies.
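The encryption-at-rest and key-management capabilities described here map onto standard AWS APIs. A hedged boto3 sketch of writing an object with server-side KMS encryption is below; the bucket, key path, and key alias are placeholders, and this covers only one layer, with in-transit encryption, VPC configuration, and IAM policy sitting around it.

```python
import boto3

s3 = boto3.client("s3")

# Server-side encryption with a customer-managed KMS key: S3 encrypts the
# object at rest and records which key was used.
with open("claims.parquet", "rb") as body:
    s3.put_object(
        Bucket="example-analytics-bucket",          # placeholder bucket
        Key="prepared/claims/2017-11-29.parquet",
        Body=body,
        ServerSideEncryption="aws:kms",
        SSEKMSKeyId="alias/example-data-lake-key",  # placeholder key alias
    )

# Reads decrypt transparently, provided the caller's IAM role may use the key.
obj = s3.get_object(Bucket="example-analytics-bucket",
                    Key="prepared/claims/2017-11-29.parquet")
print(obj["ServerSideEncryption"])  # "aws:kms"
```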
And when you look at the complexities of the stack that I've described, it's very challenging to make security and scalability in these things really happen. You cannot inherit security protocols throughout the stack if you stack a data prep piece together with a BI accelerator and an ingest tool; these things don't make sense. So the big advantage of Datameer is that it's an end-to-end tool. We do everything from ingest and data preparation to enterprise-scale analytics, and provide this out of the box, in a seamless fashion, to our customers. >> It is fascinating how the whole ecosystem has sort of changed in what feels like only a couple of years, and how much customers are taking some of these things and putting them together to create some amazing new products and new ways of doing things. So can you give us a bit of an idea: you were saying earlier that cloud was sort of, it was about two years ago, three years ago. What was it that finally tipped you over and said, you know what, we gotta do this? We're hearing a lot of talk about people wanting hybrid solutions, wanting to be able to do bursting. What was it really that drove you, from the customer perspective, to say, you know what, we have to do this, and we have to go into AWS? >> Did you just catch the entire question? Just repeat the last one. What drove it to the cloud? >> Justin: Yeah, what drove you to the cloud? >> John: What puts you over the top? >> I mean, this is a very interesting question, because Datameer was always innovating ahead of the curve, and this is probably a big piece of the story. If you look back, I think the first cloud solutions were with Microsoft Azure. So first, I think, we did our own cloud solution, and then we moved to Microsoft Azure, and this was already maybe two and a half years ago, or even longer. So we were ahead of the curve. Back then, I would say, it was even too early. You saw some adoption, so we have a couple of great customers: JC Penney, a big retail company, is already operating in the cloud for us, they're actually in AWS. National Instruments works in Microsoft Azure. So there's some good adoption, but now you see this accelerating. And it's related to the complexity of the stack, to the multiple points of failure of on-premise solutions, and to the fact that people really want elasticity. They want flexibility in rolling this out. Interestingly enough, the primary motivator is actually not cost. It's really a breathable solution that allows them to spin up clusters to manage certain workloads, like a compliance report that comes up every quarter. They need another 50 nodes: spin them up, run them for a week or two, and spin them down again. So the customers are really buying elasticity; they're buying elasticity from a technology perspective, and they're buying elasticity from a commercial perspective. But they want enterprise grade. >> Yeah, we certainly hear customers like that flexibility. >> And I think we are now at a tipping point where customers see that they can actually do this in a highly secure and governed way, especially our demanding customers, and that it really makes sense from a commercial and elasticity perspective. >> So you were saying that's what they're buying, but they're buying what you're selling. So congratulations on that, obviously it's working. So good luck, continued success down the road, and thanks for the time here today, we appreciate it. >> Absolutely, thanks for having us. >> John: Always good to have you on theCUBE.
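The elasticity pattern described above, spinning up roughly 50 extra nodes for a quarter-end compliance run and then giving them back, is the transient-cluster pattern. A rough boto3 EMR sketch is below; the release label, instance types and counts, and IAM roles are assumptions for illustration, not a prescribed configuration.

```python
import boto3

emr = boto3.client("emr", region_name="us-east-1")

# Spin up a short-lived cluster sized for the quarter-end compliance workload.
cluster = emr.run_job_flow(
    Name="quarterly-compliance-run",
    ReleaseLabel="emr-5.10.0",                 # assumed release for the period
    Applications=[{"Name": "Spark"}],
    Instances={
        "MasterInstanceType": "m4.xlarge",
        "SlaveInstanceType": "m4.2xlarge",
        "InstanceCount": 50,                   # the "another 50 nodes" case
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
cluster_id = cluster["JobFlowId"]

# ... submit the compliance jobs and wait for them to finish ...

# Then give the capacity back instead of paying for it year-round.
emr.terminate_job_flows(JobFlowIds=[cluster_id])
```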
>> It's cocktail time, thanks for having us. >> It is five o' clock somewhere, here right now. Back with more live coverage from re:Invent. We'll be back here from Las Vegas live in just a bit. (electronic music)
Christian Rodatus, Datameer | BigData NYC 2017
>> Announcer: Live from Midtown Manhattan, it's theCUBE, covering Big Data New York City 2017. Brought to you by SiliconANGLE Media and its ecosystem sponsors. >> Welcome to theCUBE's coverage in New York City for Big Data NYC; the hashtag is BigDataNYC. This is our fifth year doing our own event in conjunction with Strata Hadoop, now called Strata Data, which used to be Hadoop World. It's our eighth year covering the industry; we've been there from the beginning in 2010, the beginning of this revolution. I'm John Furrier, the co-host, with Jim Kobielus, our lead analyst at Wikibon. Our next guest is Christian Rodatus, who is the CEO of Datameer. Datameer, obviously, is one of the startups now evolving, in its, I think, eighth year or so, roughly seven or eight years old. Great customer base, been successful blocking and tackling, just doing good business. Your shirt says "Show me the data." Welcome to theCUBE, Christian, appreciate it. >> So well established, I barely think of you as a startup anymore. >> It's kind of true, and actually, a couple of months ago, after I took on the job, I met Mike Olson, and Datameer and Cloudera were sort of founded the same year, I believe late 2009, early 2010. And he told me, back then there were two open source projects, MapReduce and Hadoop, basically, and Datameer was founded to actually enable customers to do something with it, as an entry platform to help get data in, curate the data, and do something with it. And now, if you walk the show floor, it's a completely different landscape. >> We've had you guys on before; the founder, Stefan, has been on. Interesting migration: we've seen you guys grow from a customer base standpoint. You've come on as the CEO to kind of take it to the next level. Give us an update on what's going on at Datameer. Obviously, the shirt says "Show me the data." Show me the money kind of play there, I get that. That's where the money is; the data is where the action is. Real solutions, not pie in the sky. We're now in our eighth year of this market, so there's not a lot of tolerance for hype, even though there's a lot of AI washing going on. What's going on with you guys? >> I would say, interestingly enough, I met with a customer, a prospective customer, this morning, and this was a very typical organization. This is a customer that is an insurance company, and they're just about to spin up their first Hadoop cluster to actually work on customer management applications. And they are overwhelmed with what the market offers now. There are 27 open source projects, and there are dozens and dozens of other tools that basically try best-of-breed approaches at certain layers of the stack for specific applications, and they don't really know how to stitch this all together. And if I reflect on a customer meeting at a Canadian bank recently, one that has very successfully deployed applications on the data lake, like fraud management and compliance applications and things like this, they still struggle to basically replicate the same performance and the service-level agreements that they were used to from their old EDW, which they still have in production. And so, everybody's now going out there and trying to figure out how to get value out of the data lake for the business users, right? There are a lot of approaches that these companies are trying. There's SQL-on-Hadoop, which supposedly doesn't perform properly.
There are other solutions, like OLAP on Hadoop, that try to emulate what they've been used to from the EDWs, and we believe these are the wrong approaches. We want to stay true to the stack and be native to the stack, and offer a platform that really operates end-to-end, from ingesting the data into the data lake, to curation and preparation of the data, and ultimately to building the data pipelines for the business users, and this is certainly something-- >> Here's more of a play for the business users now, not the data scientists and statistical modelers. I thought the data scientists were your core market. Is that not true? >> So, our primary user base at Datameer, until last week, has been the data engineers in the companies, basically the people that built the data lake, that curated the data and built these data pipelines for the business user community, no matter what tool they were using. >> Jim, I want to get your thoughts on this, for Christian's interest. Last year... (so these guys can fix your microphone; I think you guys fixed the microphone for us, his earpiece there) ...but I want to get a question to Chris, and I'll ask it redirected through you. Gartner, another analyst firm. >> Jim: I've heard of 'em. >> Not a big fan personally, but you know. >> Jim: They're still in business? >> The magic quadrant, they use that tool. Anyway, they had a good intro stat. Last year, they predicted that through 2017, 60% of big data projects would fail. So, the question for both you guys is: did that actually happen? I don't think it did; I'm not hearing that 60% have failed, but we are seeing the struggle around analytics and scaling analytics in a way that's like a DevOps mentality. So, thoughts on this "60% of data projects fail." >> I don't know whether it's 60%; there was another statistic that said only 14% of Hadoop deployments are in production, or something-- >> They said 60, six zero. >> Or whatever. >> Define failure. I mean, you've built a data lake, and maybe you're not using it immediately for any particular application. Does that mean you've failed, or does it simply mean you haven't found the killer application for it yet? I don't know, your thoughts. >> I agree with you, it's probably not a failure to that extent. It's more like: they dump the data into it, right, they build the infrastructure, and now it's about the next step, data lake 2.0, to figure out how do I get value out of the data, how do I go after the right applications, how do I build a platform and tools that basically promote the use of that data throughout the business community in a meaningful way. >> Okay, so what's going on with you guys from a product standpoint? You guys have some announcements. Let's get to some of the latest and greatest. >> Absolutely. I think we were very strong in data curation, data preparation, and the entire data governance around it, and as a user interface we are using this spreadsheet-like interface called a workbook. It really looks like Excel, but it's not; it operates at a completely different scale. It's basically an Excel spreadsheet on steroids. Our customers build data pipelines with it, so this is the data engineers that we discussed before, but we also have a relatively small power user community in our client base that uses that spreadsheet for deep data exploration.
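The workbook idea, an Excel-like sheet of named, repeatable steps rather than hand-written ETL code, can be pictured with a small pandas sketch like the one below. The file names, columns, and rules are invented for illustration; the actual product executes equivalent steps natively against the cluster rather than in pandas.

```python
import pandas as pd

# Step 1: ingest -- in the workbook this is a point-and-click connector.
accounts = pd.read_csv("accounts.csv")       # e.g. account_id, region
txns = pd.read_csv("transactions.csv")       # e.g. account_id, amount

# Step 2: prepare -- each "formula" becomes a new named column.
txns["is_large"] = txns["amount"].abs() > 10_000

# Step 3: blend -- join the two sources, the way a VLOOKUP would, but at scale.
joined = txns.merge(accounts, on="account_id", how="left")

# Step 4: analyze -- an aggregate view a downstream BI layer can consume.
by_region = (joined.groupby("region")
                   .agg(total_amount=("amount", "sum"),
                        large_txns=("is_large", "sum"))
                   .reset_index())
print(by_region.head())
```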
Now, we are lifting this to the next level, and we have put a visualization layer on top of it that runs natively in the stack, and what you get is basically a visual experience not only in the data curation process but also in deep data exploration. This is combined with two platform technologies that we use: it's based on highly scalable, distributed search in the backend engine of our product, number one; and we have also adopted a columnar data store, Parquet, for our file system now. In this combination, the data exploration capabilities we bring to the market will allow power analysts to really dig deep into the data, so there are literally no limits in terms of the breadth and depth of the data. It could be billions of rows, it could be thousands of different attributes and columns that you are looking at, and you will get sub-second response times, as we create indices on demand while we run this through the analytic process. >> With these fast queries and visualization, do you also have the ability to do semantic data virtualization roll-ups across multi-cloud or multi-cluster? >> Yeah, absolutely. Also, there's a second trend that we discussed right before we started the live transmission here. Things are also moving into the cloud, so what we are seeing right now is that the EDW is not going away, the on-prem data lakes prevail, right, and now they are thinking about moving certain workload types into the cloud. We understand ourselves as a platform play that builds a data fabric that really ties all these data assets together and enables the business. >> On the trends, and we weren't on camera so we'll bring it up here: the impact of cloud on the data world. You've seen this movie before; you have extensive experience in this space going back to the origination, you'd say, at Teradata, when it was the classic, old-school data warehouse. And then, great purpose, great growth, massive value creation. Enter the Hadoop kind of disruption. Hadoop evolved from batch to do ranking stuff, and then tried to, it was a hammer that turned into a lawnmower, right? Then they started going down the path, and really, it wasn't workable for what people were looking at, but everyone was still trying to be the Teradata of whatever. Fast forward: things have evolved and things are starting to shake out, same picture of data warehouse-like stuff, and now you've got cloud. It seems to be changing the nature of what it will become in the future. What's your perspective on that evolution? What's different about now, and what's the same as the old days? What are the similarities with the old school, and what's different that people are missing? >> I think it's a lot related to cloud, just in general. It is extremely important for fast adoption throughout the organization, to get performance and service-level agreements with our customers. This is where we clearly can help, and we give them a user experience that is meaningful and that resembles what they were used to from the old EDW world, right? That's number one. Number two, and this comes back to the question of why 60% fail, or why it is failing or working: I think there are a lot of really interesting projects out there, and our customers are betting big time on their data lake projects, whether on premise or in the cloud. And we work with HSBC, for instance, in the United Kingdom. They've got 32 data lake projects throughout the organization, and I spoke to one of these-- >> Not 32 data lakes, 32 projects that involve tapping into the data lake.
>> 32 projects that involve various data lakes. >> Okay. (chuckling) >> And I spoke to one of the chief data officers there, and they said their data center infrastructure, just by having kick-started these projects, will explode. And they're not in the business of operating all the hardware and things like this, and so a major bank like them, they made an announcement recently, a public announcement, you can read about it, and started moving the data assets into the cloud. This is clearly happening at a rapid pace, and it will change the paradigm in terms of breathability and being able to satisfy peak workload requirements as they come up, when you run a compliance report at quarter end or something like this, so this will certainly help with adoption and creating business value for our customers. >> We talk all the time about real-time, and there are so many examples of how data science has changed the game. I mean, from a cyber perspective, how data science helped capture Bin Laden, to how I can get increased sales, to better user experience on devices. Having real-time access to data, and putting some quick data science around things, really helps things at the edge. What's your view on real-time? Obviously, that's super important; you've got to kind of get your house in order in terms of base data hygiene and foundational work, building blocks. At the end of the day, real-time seems to be super hot right now. >> Real-time is a relative term, right? There are certainly applications, like IoT applications or machine data that you analyze, that require real-time access. I would call it right-time: what's the increment of data load that is required for certain applications? We are certainly not a real-time application yet. We can load data through Kafka and stream data through Kafka, but in general, we are still a batch-oriented platform. We can do-- >> Which, by the way, is not going away any time soon. It's like super important. >> No, it's not going away at all, right. We can do many batches at relatively frequent increments, which is usually enough for what our customers demand from our platform today, but we're certainly looking at more streaming types of capability as we move this forward. >> What do the customer architectures look like? Because you brought up a good point, and we talk about this all the time: batch versus real-time. They're not mutually exclusive; obviously, good architectures would argue that you decouple them, and obviously you'll have good software elements all through the life cycle of the data. >> Through the stack. >> And have the stack, and the stack's only going to get more robust. Your customers, what's the main value that you guys provide them, the problem that you're solving today and the benefits to them? >> Absolutely. So our true value is that there are no breakages in the stack. We can basically satisfy all requirements, from ingesting the data, to blending and integrating the data, preparing the data, building the data pipelines, and analyzing the data. And all this we do in a highly secure and governed environment, so as a customer you can stitch it together. The customer this morning asked me, "Whom do you compete with?" I keep getting this question all the time, and we really compete with two things. We compete with build-your-own, which customers still opt to do nowadays, while our things are really point-and-click and highly automated, and we compete with a combination of different products.
You need to have at least three to four different products to be able to do what we do, but then you get security breaks, you get a lack of data lineage and data governance through the process, and this is the biggest value that we can bring to the table. And secondly, now with visual exploration, we offer a capability that literally nobody has in the marketplace, where we give power users the ability to explore billions of rows of data with blazing-fast response times, in a very free-form type of exploration process. >> Are there more power users now than there were when you started as a company? It seems like tools like Datameer have brought people into the sort of power user camp, simply by virtue of having access to your tool. What are your thoughts there? >> Absolutely, it's definitely growing, and you also see different companies exploiting the capability in different ways. You might find insurance or financial services customers that have built a very sophisticated capability in that area, and you might see 1,000 to 2,000 users that do deep data exploration, while other companies are starting out with a couple of dozen and then evolving it as they go. >> Christian, I've got to ask you, as the new CEO of Datameer, obviously going to the next level, you guys have been successful. We were commenting yesterday on theCUBE, we've been covering this for eight years in depth in terms of CUBE coverage, and we've seen the waves of hype come and go, but now there's not a lot of tolerance for hype. You guys are one of the companies, I will say, that stuck to your knitting; you didn't overplay your hand. You've certainly ridden the hype like everyone else did, but your solution is very specific on value, and so you didn't overplay your hand; the company didn't really overplay its hand, in my opinion. But now, really, the hand is value. >> Absolutely. >> As the new CEO, you've got to kind of put a little shiny new toy on there, and, you know, keep the car lookin' shiny and everything looking good with cutting-edge stuff, while at the same time scaling up what's been working. The question is: what are you doubling down on, and what are you investing in to keep that innovation going? >> There are really three things, and you're very much right, so this has become a mature company. We've grown with our customer base, and our enterprise features and capabilities are second to none in the marketplace; this is what our customers achieve. And now, the three investment areas where we are doubling down are: first, visual exploration, as I outlined before. Number two, hybrid cloud architectures. We don't believe customers will move their entire stack right into the cloud. There are a few that are going to do this and that are looking into these things, but we believe they will still have their EDW, their on-premise data lake, and some workload capabilities in the cloud, which will be growing; so this is investment area number two. Number three is the entire concept of data curation for machine learning. This is something where we released a plug-in earlier in the year for TensorFlow, so we can basically build data pipelines for machine learning applications. This is still very small; we see some interest from customers, but it's growing interest. >> It's a directionally correct kind of vector. You're looking at it and saying, it's a good sign, let's kick the tires on that and play around. >> Absolutely. >> 'Cause machine learning's got to learn, too.
You've got to learn from somewhere. >> And quite frankly, deep learning and machine learning tools for the rest of us: there aren't really all that many for the rest of us power users, and they're going to have to come along and get really super visual in terms of enabling visual, modular development and tuning of these models. What are your thoughts there, in terms of going forward, on a visualization layer to make machine learning and deep learning developers more productive? >> That is an area where we will not engage, in a way. We will stick with our platform play, where we focus on building the data pipelines into those tools. >> Jim: Gotcha. >> The last area where we invest is ecosystem integration, so we think our Visual Explorer backend, which is built on search and on a Parquet file format, a columnar store, is really a key differentiator in feeding, or building data pipelines into, the incumbent BI ecosystems and accelerating those as well. We currently have prototypes running where we can basically give the same performance and depth of analytic capability to some of the existing BI tools that are out there. >> What are some of the ecosystem partners you guys have? I know partnering is a big part of what you guys have done. Can you name a few? >> I mean, the biggest one-- >> Everybody, Switzerland. >> No, not really. We are focused on staying true to our stack and how we can provide value to our customers, so we work actively, and this is very important, with Microsoft and Amazon AWS in evolving our cloud strategy. We've started working with various BI vendors that you know about, right, and we definitely also have a play with some of the big SIs, and IBM is a more popular one. >> So, BI guys mostly on the tool and visualization side. You said you were a pipeline. >> On the tool and visualization side, right. We have very effective integration of our data pipelines into the BI tools; today we support TDE for Tableau, we have a native integration. >> Why compete there? Just be a service provider. >> Absolutely, and we have more and better technology coming up to accelerate those tools as well on our big data stack. >> You're focused, you're scaling. Final word, I'll give it to you for the segment. Share with the folks that are a Datameer customer, or have not yet become a customer: what's the outlook, what does the new Datameer look like under your leadership? What should they expect? >> Yeah, absolutely. I think they can expect utmost predictability in the way we roll out the vision and how we build our product over the next couple of releases. The next five, six months are critical for us. We have launched Visual Explorer here at the conference, and we're going to launch our native cloud solution, probably in the middle of November, to the customer base. So these are the big milestones that will help us into our next fiscal year and provide really great value to our customers, and that's what they can expect: predictability, a very solid product, all the enterprise-grade features they need and require for what they do. And if you look at it, we are really an enterprise play, and the customer base that we have is very demanding and challenging, and we want to keep up and deliver capability that is relevant for them and helps them create value from their data lakes. >> Christian Rodatus, technology enthusiast, passionate, now CEO of Datameer. Great to have you on theCUBE, thanks for sharing. >> Thanks so much. >> And we'll be following your progress.
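Returning to the Visual Explorer backend described above, a columnar Parquet store plus on-demand indices works because a columnar reader only touches the columns and row groups a query actually needs. A small pyarrow sketch of that column-pruning and predicate-pushdown behavior is below; the file path and column names are assumptions, and this is a generic stand-in for the idea, not a description of the product's own engine.

```python
import pyarrow.parquet as pq

# Reading a wide table but asking for only two columns and only the rows that
# match a predicate: with a columnar format, most of the file is never read.
table = pq.read_table(
    "events.parquet",                      # placeholder path
    columns=["customer_id", "amount"],     # column pruning
    filters=[("amount", ">=", 1000)],      # predicate pushed down to row groups
)
print(table.num_rows)
```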
Datameer here inside theCUBE live coverage, hashtag BigDataNYC, our fifth year doing our own event here in conjunction with Strata Data, formerly Strata Hadoop, Hadoop World, eight years covering this space. I'm John Furrier with Jim Kobielus here inside theCUBE. More after this short break. >> Christian: Thank you. (upbeat electronic music)
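On the data-curation-for-machine-learning point, the hand-off from a prepared, governed data set into a framework like TensorFlow typically ends in an input pipeline such as tf.data. A hedged sketch is below; the file, feature names, and label column are invented for illustration, and the Datameer TensorFlow plug-in itself is not shown.

```python
import pandas as pd
import tensorflow as tf

# A curated, governed training set lands as a flat file (placeholder name).
df = pd.read_csv("curated_training_set.csv")
labels = df.pop("churned")                  # assumed label column

# Wrap it as a tf.data pipeline that a model can train on in mini-batches.
dataset = (tf.data.Dataset.from_tensor_slices((dict(df), labels.values))
             .shuffle(buffer_size=10_000)
             .batch(256))

for features, y in dataset.take(1):
    print({name: tensor.shape for name, tensor in features.items()}, y.shape)
```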
Joel Horwitz, IBM | IBM CDO Summit Spring 2018
(techno music) >> Announcer: Live, from downtown San Francisco, it's theCUBE. Covering IBM Chief Data Officer Strategy Summit 2018. Brought to you by IBM. >> Welcome back to San Francisco everybody, this is theCUBE, the leader in live tech coverage. We're here at the Parc 55 in San Francisco covering the IBM CDO Strategy Summit. I'm here with Joel Horwitz who's the Vice President of Digital Partnerships & Offerings at IBM. Good to see you again Joel. >> Thanks, great to be here, thanks for having me. >> So I was just, you're very welcome- It was just, let's see, was it last month, at Think? >> Yeah, it's hard to keep track, right. >> And we were talking about your new role- >> It's been a busy year. >> the importance of partnerships. One of the things I want to, well let's talk about your role, but I really want to get into, it's innovation. And we talked about this at Think, because it's so critical, in my opinion anyway, that you can attract partnerships, innovation partnerships, startups, established companies, et cetera. >> Joel: Yeah. >> To really help drive that innovation, it takes a team of people, IBM can't do it on its own. >> Yeah, I mean look, IBM is the leader in innovation, as we all know. We're the market leader for patents, that we put out each year, and how you get that technology in the hands of the real innovators, the developers, the longtail ISVs, our partners out there, that's the challenging part at times, and so what we've been up to is really looking at how we make it easier for partners to partner with IBM. How we make it easier for developers to work with IBM. So we have a number of areas that we've been adding, so for example, we've added a whole IBM Code portal, so if you go to developer.ibm.com/code you can actually see hundreds of code patterns that we've created to help really any client, any partner, get started using IBM's technology, and to innovate. >> Yeah, and that's critical, I mean you're right, because to me innovation is a combination of invention, which is what you guys do really, and then it's adoption, which is what your customers are all about. You come from the data science world. We're here at the Chief Data Officer Summit, what's the intersection between data science and CDOs? What are you seeing there? >> Yeah, so when I was here last, it was about two years ago in 2015, actually, maybe three years ago, man, time flies when you're having fun. >> Dave: Yeah, the Spark Summit- >> Yeah Spark Technology Center and the Spark Summit, and we were here, I was here at the Chief Data Officer Summit. And it was great, and at that time, I think a lot of the conversation was really not that different than what I'm seeing today. Which is, how do you manage all of your data assets? I think a big part of doing good data science, which is my kind of background, is really having a good understanding of what your data governance is, what your data catalog is, so, you know we introduced the Watson Studio at Think, and actually, what's nice about that, is it brings a lot of this together. So if you look in the market, in the data market, today, you know we used to segment it by a few things, like data gravity, data movement, data science, and data governance. And those are kind of the four themes that I continue to see. 
And so outside of IBM, I would contend that those are relatively separate kind of tools that are disconnected, in fact Dinesh Nirmal, who's our engineer on the analytic side, Head of Development there, he wrote a great blog just recently, about how you can have some great machine learning, you have some great data, but if you can't operationalize that, then really you can't put it to use. And so it's funny to me because we've been focused on this challenge, and IBM is making the right steps, in my, I'm obviously biased, but we're making some great strides toward unifying the, this tool chain. Which is data management, to data science, to operationalizing, you know, machine learning. So that's what we're starting to see with Watson Studio. >> Well, I always push Dinesh on this and like okay, you've got a collection of tools, but are you bringing those together? And he flat-out says no, we developed this, a lot of this from scratch. Yes, we bring in the best of the knowledge that we have there, but we're not trying to just cobble together a bunch of disparate tools with a UI layer. >> Right, right. >> It's really a fundamental foundation that you're trying to build. >> Well, what's really interesting about that, that piece, is that yeah, I think a lot of folks have cobbled together a UI layer, so we formed a partnership, coming back to the partnership view, with a company called Lightbend, who's based here in San Francisco, as well as in Europe, and the reason why we did that, wasn't just because of the fact that Reactive development, if you're not familiar with Reactive, it's essentially Scala, Akka, Play, this whole framework, that basically allows developers to write once, and it kind of scales up with demand. In fact, Verizon actually used our platform with Lightbend to launch the iPhone 10. And they show dramatic improvements. Now what's exciting about Lightbend, is the fact that application developers are developing with Reactive, but if you turn around, you'll also now be able to operationalize models with Reactive as well. Because it's basically a single platform to move between these two worlds. So what we've continued to see is data science kind of separate from the application world. Really kind of, AI and cloud as different universes. The reality is that for any enterprise, or any company, to really innovate, you have to find a way to bring those two worlds together, to get the most use out of it. >> Fourier always says "Data is the new development kit". He said this I think five or six years ago, and it's barely becoming true. You guys have tried to make an attempt, and have done a pretty good job, of trying to bring those worlds together in a single platform, what do you call it? The Watson Data Platform? >> Yeah, Watson Data Platform, now Watson Studio, and I think the other, so one side of it is, us trying to, not really trying, but us actually bringing together these disparate systems. I mean we are kind of a systems company, we're IT. But not only that, but bringing our trained algorithms, and our trained models to the developers. So for example, we also did a partnership with Unity, at the end of last year, that's now just reaching some pretty good growth, in terms of bringing the Watson SDK to game developers on the Unity platform. So again, it's this idea of bringing the game developer, the application developer, in closer contact with these trained models, and these trained algorithms. And that's where you're seeing incredible things happen. 
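Horwitz's argument here, that a trained model only becomes valuable once an application can call it, is the "operationalizing" gap he credits Dinesh Nirmal's blog with describing. A deliberately generic sketch of that train-persist-serve pattern follows; it is not Watson Studio or Lightbend code, it uses scikit-learn and joblib, and the usage features and churn label are invented:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
import joblib

# Toy training data: two invented usage features and a churn label.
X = [[12, 0.3], [45, 0.9], [3, 0.1], [60, 0.95], [8, 0.2], [50, 0.7]]
y = [0, 1, 0, 1, 0, 1]

# Train offline, as a data science team would.
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)

# "Operationalize": persist the fitted pipeline so an application
# process can load it and score new records without retraining.
joblib.dump(model, "churn_model.joblib")

scorer = joblib.load("churn_model.joblib")
print(scorer.predict_proba([[30, 0.5]])[0, 1])  # probability for one new record
```

In practice the loading side would sit behind a service or a streaming job, which is exactly the bridge between the data science world and the application world that the conversation keeps returning to.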
So for example, Star Trek Bridge Crew, which I don't know how many Trekkies we have here at the CDO Summit. >> A few over here probably. >> Yeah, a couple? They're using our SDK in Unity, to basically allow a gamer to use voice commands through the headset, through a VR headset, to talk to other players in the virtual game. So we're going to see more, I can't really disclose too much what we're doing there, but there's some cool stuff coming out of that partnership. >> Real immersive experience driving a lot of data. Now you're part of the Digital Business Group. I like the term digital business, because we talk about it all the time. Digital business, what's the difference between a digital business and a business? What's the, how they use data. >> Joel: Yeah. >> You're a data person, what does that mean? That you're part of the Digital Business Group? Is that an internal facing thing? An external facing thing? Both? >> It's really both. So our Chief Digital Officer, Bob Lord, he has a presentation that he'll give, where he starts out, and he goes, when I tell people I'm the Chief Digital Officer they usually think I just manage the website. You know, if I tell people I'm a Chief Data Officer, it means I manage our data, in governance over here. The reality is that I think these Chief Digital Officer, Chief Data Officer, they're really responsible for business transformation. And so, if you actually look at what we're doing, I think on both sides is we're using data, we're using marketing technology, martech, like Optimizely, like Segment, like some of these great partners of ours, to really look at how we can quickly A/B test, get user feedback, to look at how we actually test different offerings and market. And so really what we're doing is we're setting up a testing platform, to bring not only our traditional offers to market, like DB2, Mainframe, et cetera, but also bring new offers to market, like blockchain, and quantum, and others, and actually figure out how we get better product-market fit. What actually, one thing, one story that comes to mind, is if you've seen the movie Hidden Figures- >> Oh yeah. >> There's this scene where Kevin Costner, I know this is going to look not great for IBM, but I'm going to say it anyways, which is Kevin Costner has like a sledgehammer, and he's like trying to break down the wall to get the mainframe in the room. That's what it feels like sometimes, 'cause we create the best technology, but we forget sometimes about the last mile. You know like, we got to break down the wall. >> Where am I going to put it? >> You know, to get it in the room! So, honestly I think that's a lot of what we're doing. We're bridging that last mile, between these different audiences. So between developers, between ISVs, between commercial buyers. Like how do we actually make this technology, not just accessible to large enterprise, which are our main clients, but also to the other ecosystems, and other audiences out there. >> Well so that's interesting Joel, because as a potential partner of IBM, they want, obviously your go-to-market, your massive company, and great distribution channel. But at the same time, you want more than that. You know you want to have a closer, IBM always focuses on partnerships that have intrinsic value. So you talked about offerings, you talked about quantum, blockchain, off-camera talking about cloud containers. >> Joel: Yeah. 
>> I'd say cloud and containers may be a little closer than those others, but those others are going to take a lot of market development. So what are the offerings that you guys are bringing? How do they get into the hands of your partners? >> I mean, the commonality with all of these, all the emerging offerings, if you ask me, is the distributed nature of the offering. So if you look at blockchain, it's a distributed ledger. It's a distributed transaction chain that's secure. If you look at data, really and we can hark back to say, Hadoop, right before object storage, it's distributed storage, so it's not just storing on your hard drive locally, it's storing on a distributed network of servers that are all over the world and data centers. If you look at cloud, and containers, what you're really doing is not running your application on an individual server that can go down. You're using containers because you want to distribute that application over a large network of servers, so that if one server goes down, you're not going to be hosed. And so I think the fundamental shift that you're seeing is this distributed nature, which in essence is cloud. So I think cloud is just kind of a synonym, in my opinion, for distributed nature of our business. >> That's interesting and that brings up, you're right, cloud and Big Data/Hadoop, we don't talk about Hadoop much anymore, but it kind of got it all started, with that notion of leave the data where it is. And it's the same thing with cloud. You can't just stuff your business into the public cloud. You got to bring the cloud to your data. >> Joel: That's right. >> But that brings up a whole new set of challenges, which obviously, you're in a position just to help solve. Performance, latency, physics come into play. >> Physics is a rough one. It's kind of hard to avoid that one. >> I hear your best people are working on it though. Some other partnerships that you want to sort of, elucidate. >> Yeah, no, I mean we have some really great, so I think the key kind of partnership, I would say area, that I would allude to is, one of the things, and you kind of referenced this, is a lot of our partners, big or small, want to work with our top clients. So they want to work with our top banking clients. They want, 'cause these are, if you look at for example, MaRisk and what we're doing with them around blockchain, and frankly, talk about innovation, they're innovating containers for real, not virtual containers- >> And that's a joint venture right? >> Yeah, it is, and so it's exciting because, what we're bringing to market is, I also lead our startup programs, called the Global Entrepreneurship Program, and so what I'm focused on doing, and you'll probably see more to come this quarter, is how do we actually bridge that end-to-end? How do you, if you're startup or a small business, ultimately reach that kind of global business partner level? And so kind of bridging that, that end-to-end. So we're starting to bring out a number of different incentives for partners, like co-marketing, so I'll help startups when they're early, figure out product-market fit. We'll give you free credits to use our innovative technology, and we'll also bring you into a number of clients, to basically help you not burn all of your cash on creating your own marketing channel. God knows I did that when I was at a start-up. So I think we're doing a lot to kind of bridge that end-to-end, and help any partner kind of come in, and then grow with IBM. I think that's where we're headed. 
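Horwitz characterizes blockchain as a distributed, secure transaction chain. Leaving the distribution across peers aside, the tamper-evidence half of that idea can be shown with a toy hash chain. This is not IBM's blockchain (production ledgers add consensus, peer networks, and smart contracts on top), and the container events are invented:

```python
import hashlib
import json

def add_block(chain, payload):
    """Append a block whose hash covers its payload and the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain):
    """Recompute every hash; any edited block breaks the links after it."""
    prev_hash = "0" * 64
    for block in chain:
        body = {"payload": block["payload"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev_hash or block["hash"] != expected:
            return False
        prev_hash = block["hash"]
    return True

ledger = []
add_block(ledger, {"container": "ABC123", "event": "loaded"})
add_block(ledger, {"container": "ABC123", "event": "shipped"})
print(verify(ledger))                      # True
ledger[0]["payload"]["event"] = "lost"
print(verify(ledger))                      # False: the edit is exposed
```

Because each block's hash covers the previous one, rewriting any past entry invalidates everything after it, which is the property that makes a shared ledger auditable.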
>> I think that's a critical part of your job. Because I mean, obviously IBM is known for its Global 2000, big enterprise presence, but startups, again, fuel that innovation fire. So being able to attract them, which you're proving you can, providing whatever it is, access, early access to cloud services, or like you say, these other offerings that you're producing, in addition to that go-to-market, 'cause it's funny, we always talk about how efficient, capital efficient, software is, but then you have these companies raising hundreds of millions of dollars, why? Because they got to do promotion, marketing, sales, you know, go-to-market. >> Yeah, it's really expensive. I mean, you look at most startups, like their biggest ticket item is usually marketing and sales. And building channels, and so yeah, if you're, you know we're talking to a number of partners who want to work with us because of the fact that, it's not just like, the direct kind of channel, it's also, as you kind of mentioned, there's other challenges that you have to overcome when you're working with a larger company. for example, security is a big one, GDPR compliance now, is a big one, and just making sure that things don't fall over, is a big one. And so a lot of partners work with us because ultimately, a number of the decision makers in these larger enterprises are going, well, I trust IBM, and if IBM says you're good, then I believe you. And so that's where we're kind of starting to pull partners in, and pull an ecosystem towards us. Because of the fact that we can take them through that level of certification. So we have a number of free online courses. So if you go to partners, excuse me, ibm.com/partners/learn there's a number of blockchain courses that you can learn today, and will actually give you a digital certificate, that's actually certified on our own blockchain, which we're actually a first of a kind to do that, which I think is pretty slick, and it's accredited at some of the universities. So I think that's where people are looking to IBM, and other leaders in this industry, is to help them become experts in their, in this technology, and especially in this emerging technology. >> I love that blockchain actually, because it's such a growing, and interesting, and innovative field. But it needs players like IBM, that can bring credibility, enterprise-grade, whether it's security, or just, as I say, credibility. 'Cause you know, this is, so much of negative connotations associated with blockchain and crypto, but companies like IBM coming to the table, enterprise companies, and building that ecosystem out is in my view, crucial. >> Yeah, no, it takes a village. I mean, there's a lot of folks, I mean that's a big reason why I came to IBM, three, four years ago, was because when I was in start-up land, I used to work for H20, I worked for Alpine Data Labs, Datameer, back in the Hadoop days, and what I realized was that, it's an opportunity cost. So you can't really drive true global innovation, transformation, in some of these bigger companies because there's only so much that you can really kind of bite off. And so you know at IBM it's been a really rewarding experience because we have done things like for example, we partnered with Girls Who Code, Treehouse, Udacity. So there's a number of early educators that we've partnered with, to bring code to, to bring technology to, that frankly, would never have access to some of this stuff. 
Some of this technology, if we didn't form these alliances, and if we didn't join these partnerships. So I'm very excited about the future of IBM, and I'm very excited about the future of what our partners are doing with IBM, because, geez, you know the cloud, and everything that we're doing to make this accessible, is bar none, I mean, it's great. >> I can tell you're excited. You know, spring in your step. Always a lot of energy Joel, really appreciate you coming onto theCUBE. >> Joel: My pleasure. >> Great to see you again. >> Yeah, thanks Dave. >> You're welcome. Alright keep it right there, everybody. We'll be back. We're at the IBM CDO Strategy Summit in San Francisco. You're watching theCUBE. (techno music) (touch-tone phone beeps)
SUMMARY :
Brought to you by IBM. At the IBM CDO Strategy Summit at the Parc 55 in San Francisco, theCUBE talks with Joel Horwitz, Vice President of Digital Partnerships & Offerings at IBM, about making IBM easier to partner and build with: the code patterns at developer.ibm.com/code, Watson Studio's pull toward unifying data governance and data science, partnerships with Lightbend and Unity that bring trained models closer to application and game developers, the distributed nature common to blockchain, cloud, and containers, and the startup programs, courses, and certifications that help smaller partners reach IBM's enterprise clients.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Joel | PERSON | 0.99+ |
Joel Horwitz | PERSON | 0.99+ |
Europe | LOCATION | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
Kevin Costner | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
Dinesh Nirmal | PERSON | 0.99+ |
Alpine Data Labs | ORGANIZATION | 0.99+ |
Lightbend | ORGANIZATION | 0.99+ |
Verizon | ORGANIZATION | 0.99+ |
San Francisco | LOCATION | 0.99+ |
Hidden Figures | TITLE | 0.99+ |
Bob Lord | PERSON | 0.99+ |
Both | QUANTITY | 0.99+ |
MaRisk | ORGANIZATION | 0.99+ |
both | QUANTITY | 0.99+ |
iPhone 10 | COMMERCIAL_ITEM | 0.99+ |
2015 | DATE | 0.99+ |
Datameer | ORGANIZATION | 0.99+ |
both sides | QUANTITY | 0.99+ |
one story | QUANTITY | 0.99+ |
Think | ORGANIZATION | 0.99+ |
five | DATE | 0.99+ |
hundreds | QUANTITY | 0.99+ |
Treehouse | ORGANIZATION | 0.99+ |
three years ago | DATE | 0.99+ |
developer.ibm.com/code | OTHER | 0.99+ |
Unity | ORGANIZATION | 0.98+ |
two worlds | QUANTITY | 0.98+ |
Reactive | ORGANIZATION | 0.98+ |
GDPR | TITLE | 0.98+ |
one side | QUANTITY | 0.98+ |
Digital Business Group | ORGANIZATION | 0.98+ |
today | DATE | 0.98+ |
Udacity | ORGANIZATION | 0.98+ |
ibm.com/partners/learn | OTHER | 0.98+ |
last month | DATE | 0.98+ |
Watson Studio | ORGANIZATION | 0.98+ |
each year | QUANTITY | 0.97+ |
three | DATE | 0.97+ |
single platform | QUANTITY | 0.97+ |
Girls Who Code | ORGANIZATION | 0.97+ |
Parc 55 | LOCATION | 0.97+ |
one thing | QUANTITY | 0.97+ |
four themes | QUANTITY | 0.97+ |
Spark Technology Center | ORGANIZATION | 0.97+ |
six years ago | DATE | 0.97+ |
H20 | ORGANIZATION | 0.97+ |
four years ago | DATE | 0.97+ |
martech | ORGANIZATION | 0.97+ |
Unity | TITLE | 0.96+ |
hundreds of millions of dollars | QUANTITY | 0.94+ |
Watson Studio | TITLE | 0.94+ |
Dinesh | PERSON | 0.93+ |
one server | QUANTITY | 0.93+ |
Attila Bayrak, Akbank | Customer Journey
(cheery xylophone music) >> Welcome back everybody Jeff Frick here with theCUBE. We're in the Palo Alto studios today to talk really about the customer journey. We're excited to have our guest today who flew in all the way from Istanbul, Turkey which is a very long flight. It's Attila Bayrak he's the Chief Analytics Officer for Akbank, welcome. >> Hi, hello. >> So first of all I hope you get some time to catch up on your sleep before you turn around and fly all the way back. >> Yeah it's a little bit quick to speak about finance and banking, but it's good to be here. >> Well we're glad you made the trip. And so before we jump in, for people that aren't familiar. Give us a little bit about Akbank, and the history of the bank. >> Yeah sure, sure, Akbank is one of the leading private bank in Turkey. And it's almost 70 years old, and we have nearly 14,000 employees and with the 850 branches around 4,000 ATMs and probably half a million merchant point of sales. We can say that we have a good footprint in Turkey. And also we are keen on to be a leading digital bank in Turkey. And just a brief information about Turkey. The Turkish market is quite young. And 50% of the population is under the age 29. >> Jeff: 50% is under the age of 29, okay. >> It's huge and the total population is around 80 million. >> Jeff: Okay. >> So Turkish economy is quite performing very well for the last 10, nine years. So that's why being digital leader is quite a crucial issue for us. So with these numbers we're performing around probably the best or the second in many KPIs. >> Jeff: Okay. >> We can say that we nominated, we are nominated many times as the best bank in Turkey with the bank in Europe from some of the companies. >> Okay and how long have you been there? >> So I've been there in 11 years. >> 11 years, but you said before that you were at some other banks. You've been in the banking industry for a while. >> Yes, yes I've been banking industry for almost 20 years. So I used to work two other competitors of Akbank. >> Okay so I'm curious especially with that large percentage of younger people, how many of those people ever come into a branch or go to an ATM? As opposed to using their phone. >> So they should prefer doing business in phone because it's quicker, faster, and easy. And the experience is quite much more under control in the phone. And we have, we can say that we have 80, 85% of younger people preferring the digital business rather than the classical ways. >> It's just fascinating to me, especially in banking, 'cause in banking you know, it was that trusted facility on the corner right in every town that you knew was stable, and it was always there, and you went into the branch, and you knew some of the people that worked there. And now almost the entire experience between the bank and its customers is a digital interaction, especially for the young people. They've never been to a branch. They don't hardly ever go to an ATM, in fact the whole concept of cash is kind of funny to them. You know it's a very different world. So digital transformation in banking is so so important. >> Yes they're going in hand in hand. You know the millennials are living in the digital world. And after the millennials they born in the digital world. So it's always that, the business are transformed itself into the digital way, and to deliver the products and needs in the way of doing things with the digital processes. 
>> So as Chief Analytics Officer, with that move in the millennials, of course there's always regulation and other things that are driving you know your KPIs, but how has that migration to younger people interacting in a digital way, impacted your job and what you measure what you have to do every day. >> They directly impacted my job. (laughing) I used to lead the customer relationship management initiative for 10 years which covers the sales and marketing automation, and the analytics and the design of the processes in the sales. A year ago, one and a half year ago, we transformed the role into the analytics office, and we are keen on to the deep dive in the customer behave, and define what are the needs of the customer, and how is evolving in the digital era. And we are trying to position the bank's products and the communication skills in the digital world with the customers. So it is similar in the old days in the subjects, but it's really different in details. So the story begins to understand the customer, and then segmenting the customer, for sure for probably more than 30, 50 years. But in the digital world, the footprint of the customer and the digital footprint is quite diversifying the thoughts in the corporate side. So we have around 50 million customers, and 90% is a retail one in the new ages. So we need to optimize the banking let's say, the cost structure of the bank, and for sure the digital business gives us the enablement of the optimizing the customer service. >> Right, right. >> So the segmenting the customers, not for the value basis, the behavioral and the other perspectives, and creating a very well defined segments is the initial step. And we are redefining ourselves in serving in this era. >> So I'm just curious, you know 20 years ago, I won't go back to 30, but 20 years ago how many segments did you use to segment your customers? I mean how many kind of classes and how has that changed today? >> Well 20 years ago we have three to five segments. >> Jeff: Three to five segments, that's what I thought. >> So it's like the big ones and the small ones. And if you have the analytic capability you have the mid ones. >> Jeff: Right, right. >> For nowadays we have 80, 85 different perspectives for the customers. So we created that platform to enhance these segmentation capability to serve our specified problems of the bank. I mean problems with the missions of the marketing-- >> Right. >> Let's say so we are considering now the life stage, the life style, and some spending behaviors, and some investment behaviors, some credit risk behaviors also as well. And the potentials of the economic size. >> Jeff: Right. >> And we can say that now we have more than hundreds, but the optimal point of the segmentation is so there is no meaning to create some segments that you do not take some actions-- >> Right, right. >> The action ability of the segment is quite coming forward in this topic. So we created the platform to enhance the capability, to create dynamic segments and dynamic targets to each marketing event. >> Right, and I was gonna say and hand in hand with that, and you just mentioned a bunch of different variables, how many variables fed that segmentation before versus how many variables today feed that segmentation analysis. >> So it increases probably hundred times. So we used to I don't know analyze couple of hundreds of dimensions and variables in older days. It's more than 10,000 today. 
>> More than 10,000 variables to segment into hundreds of classifications of customers? >> Yeah why not. >> Wow, well there's a good opportunity for an analytics executive. (laughing) So how are you addressing that challenge? So obviously you're here as a Datameer customer. How did you do it in the past? What were the things you couldn't do? And what forced you to go with kind of a new platform and a new approach? >> So we can say that we have a quite well defined analytic architecture in the Akbank. And we are using different types of technologies in different types of solution areas. Datameer is positioned in the measuring of our marketing campaigns. And as we mentioned we have more than millions of customers and we have quite, we can say that in a given period of time we have more than hundreds of campaigns. So we need to speed up the measurement of the campaigns and the results in a business perspective. And once we come across with the Datameer and the capabilities of the technologies much more related with the Hadoop structure and integration of different data sources in one place. So we think that we can optimize our ETL type of measurement data load technologies transformed into the Hadoop structure. And it seems it worked. So we reduced the time to transform the data into a single platform from diversified places. And we created easy to use measurement platform to give some feedbacks before the things are happen. >> Right right 'cause there's a lot of elements to it. Just on the data side, there's the ingest as you said, now you have many many variables so you gotta pull from multiple sources, you gotta get it into a single place, you gotta get it into kind of a single format that then you can drive the analytics on it. Then you got to enable more people to have the power. And I'm curious how that piece of your business has evolved where before probably very few people had access to the data, very few people had access to the tools and the training to use them, but to really get the power out of this effort you need to let a lot of people have access to that data, access to the tools to design these hundreds of campaigns. So how has that evolved over time? >> To be frankly speaking, there are thousands of variables are related to the predictive part of the analytics. But the other critical point is so the results are how are things are going on in the business side. So banking let's say culture of Akbank, so we are keen on to put the business value on the front and then think with that mind and design each and every process in that way. So that's an other perspective to get support to change the classical data load and upload and transform the data and analyze the data to see the results. That's the old way. And we were good to be frankly. But we transformed that into a much more dynamic structure. And the knowledge as you mentioned is a critical point in the team. So the easy to use, the usage of easy to use of the technology is quite another critical point to create that type of thing into the place. So at the end of the day, you are measuring hundreds of marketing actions just in a single month. And if there's something happening that doesn't plan, so you need some time to re-think on this issue and redesign it so we think that we are at the door of this stage. We can say that we can use the output of the predictive analytics much more in an efficient way by understanding the results in much more frequently and speedly I'd say. 
>> Right, right, and would you say this effort has really been offensive in terms of you trying to get ahead of the competition to be aggressive. Or has it been defensive and you know, if you're not playing this game, you're not really in the game anymore. >> So it depends on the prior subject. If it's retention action, it can be defensive. It seems like defensive. But if it's let's say op selection it can be offensive. So there's no chance to choose one of them because we have variety of products and variety of businesses in Turkey that we are operating. And at the end of the day we need to serve each and every action. >> And I think it was very insightful too that you said that you don't do it just for the sake of doing it and because you can do it. That if there's no action that can come from it, or if it's not actionable, what's the point, it's a wasted effort. >> Yeah, sure at the end of the day we are doing banking business. So we are not doing the analytics business. >> Right, right. >> That's the point. >> Yeah exactly. So as you look back kind of, what has been the high level result of this effort if you're reporting to your boss or the board of using this type of approach. And then secondly, where do you go next? We're almost at the end of 2017. What are some of your objectives and kind of priorities for 2018? >> So we are creating, we are now just nowadays, seeing the results of the new system. And we can say that in some actions we've started to increase the results 10 to 15%. >> Jeff: 10 to 15%? >> Yes it's in the result phase. And it gives us some courage to design new use cases. So the new use cases are much more related with the visualizing of the results in real time, these type of things. Basically I can say that we are trying to get everything in real time. And the modeling in real time. Measuring in real time. Visualizing in real time. So we are trying to push each and every action in the analytics to the closer. We do not want to work in the offline phase. >> Yeah it's fascinating to me to think that we used to make decisions based on a sampling of things that happened in the past. Now we want to make decisions on all the data that's happening now. It's a very different approach. >> Yeah. >> Alright, great well Attila thank you for stopping by and sharing your insights. >> It's a pleasure to share. >> Alright absolutely, alright so he's Attila Bayrak, I'm Jeff Frick, you're watching theCUBe. Thanks for watching we'll see you next time. (electronic music)
SUMMARY :
Jeff Frick talks with Attila Bayrak, Chief Analytics Officer at Akbank, one of Turkey's leading private banks, about serving a young customer base that overwhelmingly prefers digital channels. Bayrak describes how segmentation has grown from three to five segments twenty years ago to dozens of perspectives built on more than 10,000 variables, why segments only matter if they are actionable, and how Datameer on Hadoop pulls data from diversified sources into a single platform so hundreds of marketing campaigns can be measured quickly; some actions have already improved 10 to 15%, and the next step is pushing modeling, measurement, and visualization toward real time.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Jeff Frick | PERSON | 0.99+ |
2018 | DATE | 0.99+ |
Attila Bayrak | PERSON | 0.99+ |
Akbank | ORGANIZATION | 0.99+ |
Europe | LOCATION | 0.99+ |
Turkey | LOCATION | 0.99+ |
Jeff | PERSON | 0.99+ |
Attila | PERSON | 0.99+ |
10 | QUANTITY | 0.99+ |
10 years | QUANTITY | 0.99+ |
850 branches | QUANTITY | 0.99+ |
Three | QUANTITY | 0.99+ |
90% | QUANTITY | 0.99+ |
hundreds | QUANTITY | 0.99+ |
hundred times | QUANTITY | 0.99+ |
five segments | QUANTITY | 0.99+ |
50% | QUANTITY | 0.99+ |
three | QUANTITY | 0.99+ |
11 years | QUANTITY | 0.99+ |
Datameer | ORGANIZATION | 0.99+ |
A year ago | DATE | 0.99+ |
thousands | QUANTITY | 0.99+ |
15% | QUANTITY | 0.99+ |
Palo Alto | LOCATION | 0.99+ |
five segments | QUANTITY | 0.99+ |
more than 10,000 | QUANTITY | 0.99+ |
More than 10,000 variables | QUANTITY | 0.99+ |
half a million | QUANTITY | 0.99+ |
Istanbul, Turkey | LOCATION | 0.98+ |
second | QUANTITY | 0.98+ |
around 4,000 ATMs | QUANTITY | 0.98+ |
single format | QUANTITY | 0.98+ |
today | DATE | 0.98+ |
20 years ago | DATE | 0.98+ |
nearly 14,000 employees | QUANTITY | 0.98+ |
almost 20 years | QUANTITY | 0.97+ |
around 50 million customers | QUANTITY | 0.97+ |
almost 70 years old | QUANTITY | 0.96+ |
one and a half year ago | DATE | 0.96+ |
single platform | QUANTITY | 0.96+ |
each | QUANTITY | 0.96+ |
single place | QUANTITY | 0.96+ |
end of 2017 | DATE | 0.95+ |
around 80 million | QUANTITY | 0.95+ |
one place | QUANTITY | 0.94+ |
30 | QUANTITY | 0.93+ |
more than 30, 50 years | QUANTITY | 0.93+ |
80, 85 different perspectives | QUANTITY | 0.92+ |
secondly | QUANTITY | 0.92+ |
more than hundreds | QUANTITY | 0.92+ |
more than hundreds of campaigns | QUANTITY | 0.91+ |
29 | QUANTITY | 0.9+ |
more than millions of customers | QUANTITY | 0.9+ |
age | QUANTITY | 0.9+ |
one | QUANTITY | 0.88+ |
two other competitors | QUANTITY | 0.87+ |
each marketing event | QUANTITY | 0.82+ |
80, 85% | QUANTITY | 0.81+ |
one of | QUANTITY | 0.81+ |
hundreds of classifications | QUANTITY | 0.81+ |
theCUBE | ORGANIZATION | 0.79+ |
hundreds of campaigns | QUANTITY | 0.78+ |
a single month | QUANTITY | 0.77+ |
very few people | QUANTITY | 0.76+ |
every | QUANTITY | 0.73+ |
first | QUANTITY | 0.71+ |
Miki Seltzer & Raul Olvera, Vivint | Customer Journey
>> Hey, welcome back, everybody. Jeff Frick, here with The Cube. We're in the Palo Alto studio, talking about customer journeys. We're really excited to have our next guest on, from Vivint. We have Miki Seltzer, she's a data scientist. Welcome, Miki. >> Thank you. >> And with her, also, is Raul Olvera, a senior data engineer at Vivint. First off, welcome. >> Thank you. >> So, for people that aren't familiar with Vivint, what is Vivint? >> So, we are a home security and home automation company. >> Okay. >> We've been around for 20 years. We like to make people's homes safer and smarter, and we're trying to do that in a way that customers can just use their home as they normally would, and we learn from what they do, and make their home smarter. >> Okay, so, I won't call you Nest of security, but probably a lot of people say Nest of security, because we always think of Nest, right, as that first smart home appliance that learns about what's going on. So what does that mean when you say that we learn about what you do and how you move about your house, probably your patterns? What does that really mean, when you talk about learning about a person in their house? >> Well, we have a lot of different devices in the user's house, and we can tell when they come home, how they like their thermostat set, and so all of those things, you know, sometimes you have to do that manually. You know, sometimes people have to come home, and they set their thermostat to 72, and when they go to bed, sometimes they have to set it cooler, because they want to save money when they sleep. >> Jeff: Right. >> But with Vivint, you can set all those controls to happen automatically, and Vivint can detect patterns and know you tend to like your home cooler at night. >> Jeff: Okay. >> And you want to save money during the day, because a lot of times, people aren't home during the day, and so, they don't want to run their air conditioning and cool down a house that's not occupied. >> Right, right. >> So we like to use all those patterns, and just make your home smarter, so that it knows how to save you money, and how to make you safer. >> So, that's a lot of data ingest. So, what are the types of sensors, appliances, inputs that you leverage to feed the front end of that process? >> We have motion detectors, there's locks, there's the main panel that you use to interact with the system, the thermostat, the cameras. >> Miki: We've got smoke alarms, carbon monoxide detectors. >> Oh, a whole host of things. >> We've got a whole host of things, yeah. >> Yeah, and then when people put Vivint in, do they usually want to put it in because of that whole array of stuff, or do they usually start with the doorbell camera, or a thermostat, or a carbon monoxide detector? How does that engagement work, and does it grow over time? >> Well, I think the thing that's really important about Vivint is that we're kind of a one-stop shop solution, so a lot of these products are coming out where you can get a thermostat on its own, and you can get a doorbell camera on its own, and you can get a security system on its own, but the good thing about Vivint is that everything is integrated, and an installer will come to your house, and do everything for you. >> Okay. >> And, so, there's not configuration that has to be done. It's kind of, we come in, we set everything up. >> Okay. >> And you're good to go. >> Okay. 
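Seltzer's thermostat example, learning that a household likes the house cooler at night and not cooling an empty home, can be caricatured with a tiny pattern-learning sketch. This is not Vivint's algorithm; the logged setpoints, the 62-degree setback, and the nearest-hour fallback are all invented for illustration:

```python
from collections import defaultdict
from statistics import median

# Invented history of (hour_of_day, setpoint_F) pairs a thermostat might log.
history = [
    (7, 70), (8, 70), (18, 72), (19, 72), (22, 66), (23, 66),
    (7, 71), (18, 73), (22, 65), (23, 66),
]

# Learn a per-hour preference from what the household actually chose.
by_hour = defaultdict(list)
for hour, setpoint in history:
    by_hour[hour].append(setpoint)

schedule = {hour: median(points) for hour, points in sorted(by_hour.items())}
print(schedule)  # {7: 70.5, 8: 70, 18: 72.5, 19: 72, 22: 65.5, 23: 66}

def suggested_setpoint(hour, someone_home):
    """Fall back to an energy-saving setback when nobody is home."""
    if not someone_home:
        return 62
    # Use the nearest learned hour if this one has no history yet.
    nearest = min(schedule, key=lambda h: abs(h - hour))
    return schedule[nearest]

print(suggested_setpoint(22, someone_home=True))   # learned night-time preference
print(suggested_setpoint(13, someone_home=False))  # setback for an empty house
```

A real system would also weigh weather, home size, and how long the house takes to heat or cool, data sources the conversation turns to a little later.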
>> And a lot of times, people will sign up just for security, and then find out that we have all these great products, and all these smarts that go behind it, and it just makes the product that much more valuable to customers. >> Right, because I would imagine the more of the pieces that you integrate, the more value you get out of the whole system. >> Absolutely. >> One and one makes three type of scenario. And then what's the business model? Do they buy the gear, kind of the classic security, you buy the gear and then you have some type of monthly subscription for the service, or how does the business model work? >> So right now, we are moving more towards a you buy everything up front, and then you just pay a monitoring fee, going forward. >> Jeff: Okay, okay. >> So, you will own all of your equipment. >> Okay, great. So, that's on the data collection side. Now you guys are pulling this back in. You both are data scientists, data engineers, so then what are some of the challenges you have, pulling all this for data? I guess the good news is it's all coming from your own systems, right? Or are you pulling data from other systems, as well? >> It's a lot of the sensor data that we have, and I think a lot of the challenge in that is understanding the data, how it behaves, and creating the metrics out of billions and billions of rows of data. >> Jeff: Right. >> For all the customers that we have, so that's one of our challenges, and we do have other sources from CRM, data sources, to NPS, and other systems that we use, that we combine with all of our data from the sensors, just to get a better view of the customer and understand them better. >> Okay, what's NPS? You said NPS. >> Miki: Net Promoter Score. >> Net Promoter Score. >> Net Promoter Score. Okay, good, and then do you use other external stuff like the weather? I would imagine there's other external factors, public dataset, set impact, whether you turn the furnace up or down. >> Yeah, absolutely. We have a whole host of data sources that we use, in order to power the smarts behind. >> Jeff: Okay. >> Behind our products, and weather is absolutely right. That's one of them. We also need information on peoples' homes in order to figure out how long it's gonna take to heat or cool their house. >> Jeff: Okay. >> Because somebody who lives in maybe a condo, it's gonna take a shorter amount of time to heat up their house than somebody that lives in a 3000 square foot house. >> Right, right. Okay, so then you guys get the data, you can analyze the data, you're both smart people. You both are data scientists. How do you package that up in a way for the consumer? Because I would imagine the consumer interface clearly doesn't have billions of rows of data, and doesn't incorporate that, so how have you guys, I don't wanna say dumbed it down, but dumbed it down to the consumer, so they've got a much easier engagement with the system? >> I think we basically work with each business or person, and from their request, we start working with them, understand what they wanna measure, and usually, as with big data happens, you kind of create a story with metrics for them, so we start with that. It's mostly on a request basis. >> Jeff: Okay. >> And we have some automations, just to keep track of some metrics that we like to keep historical measurements. >> Jeff: Right. >> But it's mostly we talk with the business people to see what they want to track, and kind of create our own story with the data that we have. 
>> Okay, and then I would imagine over time, the objective would be for the system to take over a lot more the control, without engagement with the consumer in their home, right? Ultimately, you wanna learn what they do and start adapting your patterns to how they act, so that their direct engagement with the system decreases over time. >> Yeah, so that's the ultimate goal, is that we can infer all of these data points without having to confirm with the customer that, yes, I'm not home, or yes, I do want my home to be cooler. >> Jeff: Right. >> So that is something that we're working towards. >> Okay. So, you've been at it for a while. 20 years, the company's been around. That's pretty amazing. How have the challenges changed over that course of time? Are you looking at things differently? Are you pulling in more data sources? Or has it changed very much in the last 20 years, or have you just added more to the portfolio, which adds more data input, which is probably a good thing? >> Well, the journey that we've been on really started in about 2014. >> Jeff: Okay. >> When we launched our own platform for security and home automation, because at that point, that's when we started getting the whole fire hose of data. >> Jeff: Okay. >> And so at that point, that was the beginning of our data journey, and when that happened, we kind of had to harness all of that data and figure out what do people want to know? Like, what does our business need to know about how people are using the system? >> Jeff: Right. >> And so at the very beginning, it was simpler questions, but now that we've kind of evolved more, we can answer the more complex questions that don't necessarily have straightforward answers. So, it's kind of evolved from 2014, when we were able to get all of that rich data. >> Jeff: Right. >> From the platform, and it's evolved to now, where we can use all of that data to inform the smarts for our products. >> And I love the way you said that there's not necessarily an answer. >> Mmhmm. >> Right, it's very nuance, right? >> Right. >> Everything's got some type of a score variable or some type of a trade-off, so have you created your own scoring and trade-off tools internally, to help make those value decisions? >> Yeah, so it's really all driven by context. >> Okay. >> So a lot of our data, without any context, it doesn't matter, it doesn't provide any use. We're in a unique situation, where we define our own success metrics. So a lot of times we'll monitor things like what percentage of the time is a camera connected to the internet? Because if it's not connected to the internet, then you can't view it from your phone or from your computer. >> Jeff: Right. >> So... >> So, a tight relationship with Comcast, hopefully? (laughter) We're all together. >> Yeah. >> Okay, so there's that, and then, again, how much of that stuff do you display back to the customer? How much control do they have? How much control do they want? You know, those are all, kind of, squishy decisions, as well. All right, so you're here on behalf of Datameer. So, you chose them. So what was it that attracted you to the Datameer solution? 
>> I think it's the fact that just interacting with your big data is way simpler than going to, even if it's on a scale environment like HIVE, it takes a longer process to get your data out, and it's more visual, so you're seeing the transformations that you're doing in there, and I think it allows people with a more analytical skill set to get in to the data, and go through the whole journey from knowing the data from almost raw, to getting their own metrics, which I think it adds value for the end product and metrics reports. >> So more value for the people who have the knowledge and the data science jobs. >> Yeah. >> And how many hardcore data scientists do you have in your team? >> On our team, I think we have about five or six hardcore data scientists. >> Five or six? Okay. >> We're kind of split into two different teams. One teams does real time streaming analytics, and our team does more batch analytics. >> Jeff: Okay. >> So we're all using a whole host of different machine learning and data science techniques. >> Jeff: Okay. >> But on the batch side, we use Datameer a lot to be able to transform and pull insights out of that raw data that would be really difficult otherwise. >> Right, and then what about for the people that aren't in your core team? You know, that aren't the more hardcore data scientists. What's been the impact of Datameer and this type of a tool to enable them to see the data, play with the data, create reports, ask for more specific data? What's been the impact for them to be able to actually engage with this data without being a data scientist, per se? >> They can go into Datameer and get answers quicker than, like I mentioned, just writing something that will take longer time, and we also feed data to them because we have more access to historical data, and aggregations, like probabilities, and those type of metrics, we can create for them, and they can utilize that in their more real time environments, and use probably these metrics for creating or, I forget that one... >> Miki: Predicting. >> Predicting, yes. >> Right, right. >> Predicting actions the customer are going to take. >> Right, right, and I wonder if you could speak a little bit about how the two groups work together between the batch and the real time, because a lot of talk about real time, it's the hot, sexy topic right now, but the two go hand in hand, right? They're not either or. So how do you see the relationship between the two groups working? How do you leverage each other? What's the business benefit that you deliver versus the real time people? How does that work out? >> So when you're doing real time and streaming analytics, you really need to have your analytics based in something that's already happened. So we inform our real time analytics by looking at past behaviors, and that helps us develop methodologies that'll be able to go real quick (snaps fingers) in real time. So using past insights to inform our real time analytics is really important to us. >> Which is a big part of the MLPs, right? The machine learning. You build a model based on the past, you take the data that's streaming in now, make the adjustment to continue to modify it. I'm just curious to get your take on the evolution of machine learning and artificial intelligence, and how your guys are leveraging that to get more value out of the data, out of your platform, deliver more value to your customer. Here's an interesting little example. 
I always joke with people, they think these big things, I'm like, well how about when Google reads your email and puts your flight information on your calendar? I think that's pretty cool. That's a pretty cool application. I mean, are there some cool little ones that you can highlight that may not seem that big to the outside world, but in fact they're really high value things? >> Well, I think one of the biggest challenges for Vivint is something simple like knowing whether there's somebody home. So occupancy has been a big challenge for us because we have all these sensors, and we can easily tell when somebody's home, because they'll have a motion detector, and we'll be able to see that there's somebody moving around the house. However, knowing that somebody is not home is the bigger challenge because the lack of motion in the house doesn't mean that somebody not home. They could be taking a nap, they could be in a room that doesn't have a motion sensor, and so using machine learning algorithms and data science to figure those problems out, it's been really interesting, and it seems like it's a relatively simple problem, but when you break it down, it gets a little more complicated. >> Check their Instagram feed probably, you get a starting point. >> Right. >> Or if the dog is running around, setting off the motion sensors, I'd imagine is another interesting challenge. >> Yeah, that's also a big challenge. >> All right, so as you look forward to 2018, I can't believe this year's already over, what are some of your priorities? What are some of the things that you're working on? If we were to sit down a year from now, what would we be talking about? >> I think create something that is more approachable, as in people can get their own value from it, rather than doing one of timed requests, is when we're moving from on our data journey. >> Right, so basically democratizing the data, democratizing the tools, letting more people engage with it to get their own solutions. >> Yeah, because like Miki said, the data that we're getting, it wasn't available to us until like 2014. So people are just realizing that we have this amount of data, and first the questions come, and they're kind of specific, and eventually you start getting similar requests to the point that, to speed development on other reports, we want to be able to provide some of the more important metrics that we have received in the past years to a more automated way, so that we can keep track of them historically and for people that need to know those metrics. >> Jeff: Miki? >> Yeah, as Raul said, we're trying to move more toward self service. In the past, since our data is constantly evolving, there are not many people who know the context and the nuance of all of our data, so it's been really important for us to work with our business stake holders, so that we know that they're getting the right data with the right context, and so moving towards having them be able to pull their own data is a really big opportunity for us. >> With that context overlay. >> Absolutely. >> So they know what they're actually looking at. It feels so under reported the importance of context to anything, right? Without the context, is it big, is it small, what are we comparing it to? >> Exactly. >> Well, Miki and Raul, thanks for taking a few minutes of your time and sharing your story. 
Fascinating little look into more about Vivint, and I guess you just have to get more motion sensors around the house, under the bed, keep an eye on that Instagram account, are they taking pictures? >> Let's not be creepy. (laughs) >> Well that's a great line, right? Data science done great is magic, and data science not done well is creepy. So there's a fine line. So thanks again for sharing your story, really appreciate it. >> Thanks for having us. >> And I'm Jeff Frick, and you're watching The Cube. Thanks for tuning in, and we'll catch you next time. Thanks for watching.
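As an aside on the occupancy problem Raul and Miki describe above, here is a minimal sketch of the general approach: learn from historical, batch sensor data, then score new readings as they arrive, so that past insights inform the real-time side. The feature set, the training rows, and the library choice are assumptions made for illustration only; they are not Vivint's actual model or pipeline.

```python
# Minimal sketch: learn occupancy from historical sensor aggregates (batch),
# then score new observations as they arrive (the "real time" side).
# All feature names and data below are hypothetical, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical windows: [motion_events_last_hour, door_opens_last_hour,
#                      thermostat_changes_last_hour, hour_of_day]
X_hist = np.array([
    [12, 2, 1, 19],   # evening, lots of motion         -> home
    [0,  0, 0, 14],   # afternoon, no activity          -> away
    [1,  0, 0,  2],   # night, single motion (pet? nap) -> home
    [0,  1, 0,  8],   # morning door open, then nothing -> away
    [6,  0, 1, 21],   # evening activity                -> home
    [0,  0, 0, 11],   # late morning, quiet             -> away
])
y_hist = np.array([1, 0, 1, 0, 1, 0])  # 1 = somebody home, 0 = nobody home

model = LogisticRegression().fit(X_hist, y_hist)

def score_occupancy(window):
    """Return P(somebody is home) for one new aggregated sensor window."""
    return float(model.predict_proba(np.array([window]))[0, 1])

# A quiet afternoon window: no motion does not necessarily mean nobody home,
# so the model weighs the other signals instead of a hard "no motion" rule.
print(round(score_occupancy([0, 0, 1, 15]), 2))
```

The point of the sketch is the shape of the split described in the conversation: the expensive learning happens offline on history, and the real-time path only has to evaluate a cheap, already-trained model.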
Jeff Weidner, Director Information Management | Customer Journey
>> Welcome back everybody. Jeff Frick here with theCube. We're in the Palo Alto studio talking about customer journeys today. And we're really excited to have a professional who's been doing this for a long time, he's Jeff Weidner, he's an Information Management Professional at this moment in time, and still, in the past and future, Jeff, welcome. >> Well thank you for having me.
>> Welcome back everybody. Jeff Frick here with theCube. We're in the Palo Alto studio talking about customer journeys today. And we're really excited to have professional, who's been doing this for a long time, he's Jeff Weidener, he's an Information Management Professional at this moment in time, and still, in the past and future, Jeff Welcome. >> Well thank you for having me. >> So you've been playing in the spheres for a very long time, and we talked a little bit before we turned the cameras on, about one of the great topics that I love in this area is, the customer, the 360 view of the customer. And that the Nirvana that everyone says you know, we're there, we're pulling in all these data sets, we know exactly what's going on, the person calls into the call center and they can pull up all their records, and there's this great vision that we're all striving for. How close are we to that? >> I think we're several years away from that perfect vision that we've talked about, for the last, I would say, 10, 10 to 15 years, that I've dealt with, from folks who were doing catalogs, like Sears catalogs, all the way to today, where we're trying to mix and match all this information, but most companies are not turning that into actionable data, or actionable information, in any way that's reasonable. And it's just because of the historic kind of Silo, nature of all these different systems, I mean, you know, I keep hearing about, we're gonna do it, all these things can tie together, we can dump all the data in a single data lake and pull it out, what are some of the inhibitors and what are some of the approaches to try to break some of those down? >> Most has been around getting that data lake, in order to put the data in its spot, basically try and make sure that, do I have the environment to work in? Many times a traditional enterprise warehouse doesn't have the right processing power, for you, the individual, who wants to do the work, or, doesn't have the capacity that'll allow you to just bring all the data in, try to ratify it. That's really just trying to do the data cleansing, and trying to just make some sense of it, cause many times, there aren't those domain experts. So I usually work in marketing, and on our Customer 360 exercise, was around, direct mail, email, all the interactions from our Salesmaker, and alike. So, when we look at the data, we go, I don't understand why the Salesmaker is forgetting X, of that behavior that we want to roll together. >> Right. >> But really it's finding that environment, second is the harmonization, is I have Bob Smith and Robert Smith, and Master Data Management Systems, are perhaps few and far between, of being real services that I can call as a data scientist, or as a data worker, to be able to say, how do I line these together? How can I make sure that all these customer touchpoints are really talking about the same individual, the company, or maybe just the consumer? >> Right. >> And finally, it is in those Customer 360 projects getting those teams to want to play together, getting that crowdsourcing, either to change the data, such as, I have data, as you mentioned around Chat, and I want you to tell me more about it, or I want you to tell me how I can break it down. >> Right, right. >> And if I wanna make changes to it, you go, we'll wait, where's your money, in order to make that change. >> Right, right. >> And there's so many aspects to it, right. 
So there's kind of the classic, you know, ingest: you gotta get the data, you gotta run it through the processes, as you said, to harmonize it and bring it together, and then you gotta present it to the person who's in a position, at the moment of truth, to do something with it. And those are three very, very different challenges. They've been the same challenges forever, but now we're adding all this new stuff to it, like, are you pulling data from other sources outside of the system of record, are you pulling social data, are you pulling other system data that's not necessarily part of the transactional system. So, we're making the job harder, and at the same time, we're trying to give more power to more people and not just the data scientists. But as you said, I think, the data worker, so how's that transformation taking place where we're enabling more kind of data workers, if you will, that aren't necessarily data scientists, to have the power that's available with the analytics, and an aggregated data set behind them? >> Right. Well we are creating or have created the wild west, we gave them tools, and said, go forth and make, make something out of it. Oh okay. Then we started having this decentralization of all the tools, and when we finally gave them the big tools, the, quote unquote, big data tools that process billions of records, that still is the wild west, but at least we've got them centralized with certain tools. So we were able to at least standardize on the tool set, standardize on the data environment, so that at least when they're working in that space, we get to go, well, what are you working on? How are you working on that? What type of data are you working with? And how do we bring that back as a process, so that we can say, you did something on chat data? Great! Bob over here, he likes to work with that chat data. So there's that exposure and transparency because of this centralization of data. Now, new tools are being added on top of that, data catalogs, and tools that make it so that you can actually capture that known information in one wiki-like interface. So we're trying to add more around putting the right permissions on top of that data, cataloging it in some way, with either these worksheets or these information management tools, so that, if you're starting to deal with privacy data, you've got a flag from ingest all the way to the end. >> Right. >> But more controls are being seen as a way that a business is improving its maturity.
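The "flag from ingest all the way to the end" idea can be sketched very simply: tag sensitive columns once when the data lands, and let every derived dataset inherit those tags so downstream users and catalogs can see what still needs governed handling. This is a generic illustration only; the column names, tag values, and the tiny Dataset class are invented and are not Datameer's or any vendor's actual metadata model.

```python
# Minimal sketch of column-level privacy tags that travel with derived datasets.
# Hypothetical columns and tags; not any vendor's actual catalog schema.
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    columns: dict = field(default_factory=dict)   # column -> set of tags

    def derive(self, new_name, keep_columns):
        """A derived dataset inherits the tags of every column it keeps."""
        kept = {c: set(self.columns[c]) for c in keep_columns}
        return Dataset(new_name, kept)

    def flagged(self, tag):
        return [c for c, tags in self.columns.items() if tag in tags]

# Tag once, at ingest.
raw = Dataset("chat_events_raw", {
    "customer_email": {"PII"},
    "chat_text":      {"PII", "free_text"},
    "page_url":       set(),
    "timestamp":      set(),
})

# A downstream view built by a business user keeps only some columns,
# but the governance flags come along automatically.
report = raw.derive("chat_volume_by_page", ["page_url", "timestamp", "chat_text"])
print(report.flagged("PII"))   # ['chat_text'] -> still needs governed handling
```

Real catalogs add ownership, audit trails, and access policies on top, but the inheritance step is the part that keeps a privacy flag attached from raw ingest through every downstream view.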
>> Well in my data science world, I've said, hey, give me some more data, keep on going, and when I have to put on the data sheriff hat, I'm now having to ask the executives, and our stakeholders, why streaming? Why do you really need to have all of this? >> It's the newest shiny toy. >> New shiny toy! So, when you talk to a stakeholder and you say, you need a shiny toy, great. I can get you that shiny toy. But I need an outcome. I need a, a value. And that helps me in tempering the next statement I give to them: you want streaming, or you want real time data, it's gonna cost you three X. Are you gonna pay for it? Great. Here's my shiny toy. But yes, with the influx of all of this data, you're having to change the architecture, and many times IT traditionally hasn't been able to make that, that rapid transition, which lends itself to shadow IT, or other folks trying to cobble something together to make that happen. >> And then there's this other pesky little thing that gets in the way, in the form of governance, and security. >> Compliance, privacy and finally marketability. I wanna give you a, I want you to feel that you're trusting me in handling your data, but also that when I respond back to you, I'm giving you a good customer experience, the so-called "don't be creepy." >> Right, right. >> Lately, the new compliance rule in Europe, GDPR, a policy that comes with a, well, a shotgun, says that if there are violations of this policy, which involves privacy, or the ability for me to be forgotten, of the information that a corporation collects, it can mean four percent of a company's total revenue. >> Right. >> And that's on every instance; that's generating a lot of motivation for information governance today. >> Right. >> That risk, but the rules are around trying to be able to say, where did the data come from? How did the data flow through the system? Who's touched that data? And those information management tools are mostly the human interaction: hey, what are you guys working on? How are you guys working on it? What type of assets are you actually driving, so that we can bring it together for that privacy, that compliance, and workflow, and then later on top of that, that deliverability. How do you want to be contacted? How do you, what are the areas that you feel are the ways that we should engage with you? And of course, everything that gets missed in any optimization exercise, the feedback loop. I get feedback from you that says, you're interested in puppies, but your data set says you're interested in cats. How do I make that go into a Customer 360 product? So, privacy, and being, and coming at, saying, oh, here's an advertisement for, for hippos and you go, what do you know about me that I don't know? >> Wrong browser. >> So you chose Datameer, along the journey, why did you choose them, how did you implement them, and how did they address some of these issues that we've just been discussing? >> Datameer was chosen primarily to take on that self-service data preparation layer from the beginning. Dealing with large amounts of online data, we moved from taking the digital intelligence tools that are out there, knowing about browser activities, the cookies that you have, to get your identity, and said, we want the entire feed. We want all of that information, because we wanna make that actionable. I don't wanna just give it to a BI report, I wanna turn it into marketing automation.
So we got the entire feed of data, and we worked on that with the usual SQL tools, but after a while, it wasn't manageable, either by all of the 450 to 950 columns of data, or the fact that there are multiple teams working on it, and I had no idea what they were able to do. So I couldn't share in that value, I couldn't reuse the insights that they could have. So Datameer allowed for a visual interface, that was not in a coding language, that allowed people to start putting all of their work inside one interface, so they didn't have to worry about saving it up to the server; it was all being done inside one environment. So that it could take not only the digital data, but the Salesforce CRM data, marry them together and let people work with it. And it broadened out to other areas, again allowing that crowdsourcing of other people's analytics. Why? Mostly because of the state we are in around IT, which is an inability to change rapidly, at least for us, in our field. >> Right. >> The biggest problem we had was there wasn't a scheduler. We didn't have the ability to get value out of our work without having someone press the button and run it, and if they ran it, it took eight hours, they walked away, it would fail. And you had no, you had to go back and do it all over again. >> Oh yeah. >> So Datameer allows us to have that self-service interface, that had management that IT could agree upon, to let us have our own lab environment, and execute our work. >> So what were the results, when you suddenly give people access to this tool? I mean, were they receptive, did you have to train them a lot, did some people just get it and some people just don't, they don't wanna act on data, what was kind of the real-world results of rolling this out, within the population? >> The real-world results allowed us to get ten million dollars in uplift in our marketing activities across multiple channels. >> Ten million dollars in uplift? How did you measure that? >> That was measured, one, through the operating expenses of not sending that work outside; some of the management of the data was being sent outside, and that team builds their own models off of it, and we said, we should be able to drink our own champagne. Second, it was on the uplift of a direct mail and email campaign, so having a better response rate, and generally not sending out a bunch of app store messages that we weren't needing to. And then turning that into a list that could be sent out to our email and direct mail vendors, to say, this is what we believe this account or contact is engaged with on the site, give those a little bit more context. So we add that in, so that we were hopefully getting a message that resonated better. >> Right. >> And where did you start? What was the easiest way to provide an opportunity for people new to this type of tooling to have success? >> Mostly it was taking pre-doctored worksheets, or already pre-packaged output, and one of the challenges that we had was people saying, well, I don't wanna work in a visual language; while they're users of tools like Tableau or Qlik, and others that are happy to drag-and-drop in their data, many of the data workers, the tried-and-true, are saying, I wanna write it in SQL. >> Mm hm. >> So, we had to give at least that last-mile analytical data set to them, and say, okay.
Yeah, go ahead and move it over to your SQL environment, move it over into the space that you feel comfortable in and feel confident to control, but let's come on back and we'll translate it back to this tool; we'll show you how easy it was to go from working with IT, which would take months, to doing it yourself, which would take weeks, and the processing and the cost of your siloed, shadow IT environment will go down in days. We're able to show them that, that acceleration of time to market of their data. >> What was your biggest surprise? An individual user, an individual use case, something that really you just didn't see coming, that's kind of a pleasant, you know, the law of unintended consequences on the positive side. >> That was such a wide adoption, I mean honestly, beginning back from the data science background, we thought it would just be, bring your data in, throw it on out there, and we're done. We went from maybe about 20 large datasets of AdTech and MarTech information, advertising technology, marketing technology data, to CRM information, order activity, and many other categories, just within marketing alone, and I think perhaps the other big ah-ha moment was, since we brought in other divisions' data, those teams came in and said, hey, we can use this too. >> Right. >> The adoption really surprised me, that it would, you would have people that say, oh I can work with this, I have this freedom to work with this data. >> Right right. >> Well we see it time and time again, it's a recurring theme of all the things we cover, which is, you know, a really big piece of the innovation story is giving, you know, more people access to more data, and the tools to actually manipulate it. So that you can unlock that brain power, as opposed to keeping it with the data scientists on Mahogany Row, and the super-big brain. So, sounds like that really validates that whole hypothesis. >> I went through reviewing hands-on 11 different tools when I chose Datameer. This was everything from big name companies to small start-up companies that have wild artificial intelligence slogans in their marketing material, and we chose it mostly because it had the right fit as an end-to-end approach. It had the scheduler, it had the visual interface, it had enough management and other capabilities that IT would leave us alone. Some of the other products that we were looking at gave you Pig-El-Lee to work with data, or would allow you to schedule data, but they never came all together. And for the value we get out of it, we needed to have something all together. >> Right. Well Jeff, thanks for taking a few minutes and sharing your story, really appreciate it, and it sounds like it was a really successful project. >> Was! >> All right. He's Jeff Weidner, I'm Jeff Frick, you're watching theCube from Palo Alto. Thanks for watching.
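As a back-of-the-envelope illustration of how an uplift figure like the ten million dollars mentioned above can be measured, the sketch below compares the response rate of a targeted list against a random holdout and converts the incremental responses into revenue. Every number in it is hypothetical, and the real measurement also included the operating-expense savings Jeff describes, which this ignores.

```python
# Minimal sketch: incremental (uplift) value of a targeted email/direct-mail list
# versus a random holdout. All numbers are hypothetical.

def uplift_value(treated_n, treated_responses,
                 holdout_n, holdout_responses,
                 revenue_per_response):
    treated_rate = treated_responses / treated_n
    holdout_rate = holdout_responses / holdout_n
    incremental_responses = (treated_rate - holdout_rate) * treated_n
    return incremental_responses * revenue_per_response

# 200k contacts scored from the behavioral data, 20k held out at random.
value = uplift_value(treated_n=200_000, treated_responses=6_000,
                     holdout_n=20_000, holdout_responses=400,
                     revenue_per_response=1_500)
print(f"Estimated incremental revenue: ${value:,.0f}")
```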