Joe Nolte, Allegis Group & Torsten Grabs, Snowflake | Snowflake Summit 2022
>>Hey everyone. Welcome back to theCUBE. Lisa Martin here with Dave Vellante. We're in Las Vegas with Snowflake at Snowflake Summit 22. This is the fourth annual summit; there are close to 10,000 people here and lots going on: customers, partners, analysts, press, media, everyone talking about all of this news. We've got a couple of guests joining us, and we're going to unpack Snowpark. Torsten Grabs is the director of product management at Snowflake, and Joe Nolte is the AI and MDM architect at Allegis Group. Guys, welcome to the program.
>>Thank you so much for having us.
>>Isn't it great to be back in person?
>>It is.
>>Oh, wonderful. Yes, it is, indeed. Joe, talk to us a little bit about Allegis Group. What do you do? And then tell us a little bit about your role specifically.
>>Well, Allegis Group is a collection of opcos, operating companies, that do staffing. We're one of the biggest staffing companies in North America, and we have a presence in EMEA and in the APAC region. So we work to find people jobs, we help get them staffed, we help companies find people, and we help individuals find people.
>>Incredibly important these days, excuse me, incredibly important these days.
>>It is. It very much is.
>>Right. Tell me a little bit about your role. You are the AI and MDM architect; you wear a lot of hats.
>>Okay. So I'm an architect, and I support both of those verticals within the company. I have a set of engineers and data scientists that work with me on the AI side, and we build data science models and solutions that help support what the company wants to do. We build them to make business processes faster and more streamlined, and we really see Snowpark and Python helping us to accelerate that delivery. So we're very excited about it.
>>Explain Snowpark for people. I mean, I look at it as this wonderful sandbox; you can bring your own developer tools in. But explain, in your words, what it is.
>>Yeah. So we got interested in Snowpark because, increasingly, the feedback was that not everybody wants to interact with Snowflake only through SQL; there are other languages that they would prefer to use, including Java, Scala, and of course Python. So that led to our work on Snowpark, where we're building an infrastructure that allows us to host other languages natively on the Snowflake compute platform. And what we just announced here is Snowpark for Python in public preview. So now you have the ability to natively run Python code on Snowflake and benefit from the thousands of packages and libraries that the open source community around Python has contributed over the years. That's a huge benefit for data scientists, ML practitioners, and data engineers, because those are the languages and packages that are popular with them. So we very much look forward to working with the likes of you and other data scientists and data engineers around the Python ecosystem.
>>Yeah. And Snowpark helps reduce the architectural footprint, and it makes the data pipelines a little easier and less complex. We had a pipeline that works on DMV data, and we converted that entire pipeline from Python running on a VM to running directly down on Snowflake. We were able to eliminate code because you don't have to worry about multithreading; we can just set the warehouse size through a task. No more multithreading, throw that code away, don't need to do it anymore. We get the same results, but the architecture to run that pipeline gets immensely easier, because it's a stored procedure that's already there, and implementing the call to that stored procedure is very easy. The architecture we use today takes six different components just to be able to run that Python code on a VM within our ecosystem, make sure it runs on time, and keep it scheduled. But with Snowflake, with Snowpark and Snowflake Python, it's two components: the stored procedure and our ETL tool calling it.
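[Editor's note: below is a minimal, hypothetical sketch of the pattern Joe describes, collapsing a multithreaded Python job on a VM into a Snowpark for Python stored procedure plus a scheduled task, with scale coming from the warehouse size rather than from threads. The connection settings, table, stage, warehouse, and column names are all invented for illustration; this is not Allegis's actual pipeline.]

```python
# Hypothetical sketch only: table, stage, warehouse, and credential names are invented.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, count
from snowflake.snowpark.types import StringType


def refresh_dmv_pipeline(session: Session) -> str:
    # All of the heavy lifting runs inside Snowflake; parallelism comes from the
    # warehouse size, not from Python threads.
    registrations = session.table("RAW.DMV.REGISTRATIONS")
    summary = (
        registrations
        .filter(col("STATE").is_not_null())
        .group_by("STATE", "MODEL_YEAR")
        .agg(count(col("VIN")).alias("VEHICLE_COUNT"))
    )
    summary.write.save_as_table("ANALYTICS.DMV.REGISTRATION_SUMMARY", mode="overwrite")
    return "ok"


if __name__ == "__main__":
    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "role": "ETL_ROLE", "warehouse": "ETL_WH",
        "database": "ANALYTICS", "schema": "DMV",
    }).create()

    # Register the function as a permanent stored procedure the ETL tool can CALL.
    session.sproc.register(
        func=refresh_dmv_pipeline,
        name="REFRESH_DMV_PIPELINE",
        return_type=StringType(),
        input_types=[],
        packages=["snowflake-snowpark-python"],
        is_permanent=True,
        stage_location="@ANALYTICS.DMV.CODE_STAGE",
        replace=True,
    )

    # Scheduling and sizing live in a task, so there is no threading code left to maintain.
    session.sql("""
        CREATE OR REPLACE TASK ANALYTICS.DMV.REFRESH_DMV_TASK
          WAREHOUSE = ETL_WH
          SCHEDULE = 'USING CRON 0 6 * * * UTC'
        AS CALL REFRESH_DMV_PIPELINE()
    """).collect()
    session.sql("ALTER TASK ANALYTICS.DMV.REFRESH_DMV_TASK RESUME").collect()
```

In this sketch the ETL tool (or the task) only needs to issue CALL REFRESH_DMV_PIPELINE(), and capacity is tuned by resizing ETL_WH rather than by managing threads in application code.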
>>Okay. So you've simplified that stack.
>>Yes.
>>And eliminated all the other stuff that you had to do, which Snowflake's now doing, am I correct? Are you actually taking the application development stack and the analytics stack and bringing them together? Are they merging?
>>I don't know. I'm not really sure how I would answer that question, to be quite honest. I think with Streamlit, there's a little bit of application that's going to be down there, so you could maybe start to say that. I'd have to see how that carries out, and what we do and produce, to really give you an answer, but yeah, maybe in a little bit.
>>Well, the reason I ask is because we always talk about injecting data into apps, injecting machine intelligence and ML and AI into apps, but they're two separate stacks today, aren't they?
>>Certainly the two are getting closer.
>>Through Python?
>>Python. It gets a little better.
>>Explain that. Explain how.
>>Just like in the keynote the other day, when she showed her sample application, you can start to see that. You can do some data pipelining and data building, then throw that into a training module within Python, right down inside Snowflake, and have it sitting there. Then you can use something like Streamlit to expose it to your users. We were talking about that the other day: how do you get ML and AI, after you have it running, in front of people? We have a model right now that is a predictive and prescriptive model of one of our top KPIs, and right now we can show it to everybody in the company, but it's through a Jupyter notebook. How do I deliver it? How do I get it in front of people so they can use it? What we saw was Streamlit; it's a perfect match. And then we can compile it, and it's right down there on Snowflake. It's a much easier time to delivery to production, because since it's already part of Snowflake, there's no architectural review. As long as the code passes code review, and it's not poorly written code and isn't using a library that's dangerous, it's a simple deployment to production. So because it's encapsulated inside of that Snowflake environment, we have approval to just use it however we see fit.
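[Editor's note: a small, hypothetical Streamlit sketch of the delivery pattern Joe describes, exposing a model's scored output that already sits in a Snowflake table to business users instead of a Jupyter notebook. The table name, columns, and secrets layout are assumptions, and this is a standalone Streamlit app reading from Snowflake via Snowpark, not anything Allegis actually deployed.]

```python
# kpi_app.py; hypothetical table, columns, and secrets layout.
import streamlit as st
from snowflake.snowpark import Session


@st.cache_resource
def get_session() -> Session:
    # A [snowflake] section in .streamlit/secrets.toml is assumed here;
    # swap in whatever credential store you actually use.
    return Session.builder.configs(dict(st.secrets["snowflake"])).create()


st.title("Top KPI: predictive and prescriptive view")

session = get_session()
scores = session.table("ANALYTICS.ML.KPI_PREDICTIONS").to_pandas()

region = st.selectbox("Region", sorted(scores["REGION"].unique()))
view = scores[scores["REGION"] == region]

st.metric("Predicted KPI (next period)", round(view["PREDICTED_KPI"].mean(), 2))
st.bar_chart(view.set_index("SEGMENT")["PREDICTED_KPI"])
```

Run locally with `streamlit run kpi_app.py`; the same idea is what the Streamlit integration announced at the event is meant to host directly on Snowflake.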
>>So that code review has to occur irrespective of whatever you're running it on. Okay, I get that. But it's a frictionless environment, you're saying. What would you have had to do prior to Snowflake that you don't have to do now?
>>Well, one, it's a longer review process to allow me to push the solution into production, because I have to explain it to my InfoSec people. My other...
>>It's not trusted.
>>Well, don't use that word. No. There are checks and balances in everything that we do.
>>It has to be verified.
>>And that's all part of what I like to call the good bureaucracy. Those processes are in place to help all of us stay protected.
>>It's the checklist.
>>Yeah, that you've got to go through.
>>That's all it is. It's like flying on a plane.
>>But that checklist gets smaller, and sometimes it's just one box now, with Python through Snowpark running down on the Snowflake platform. And that's the real advantage, because we can do things faster and easier. We're doing some mathematical data science right now and we're doing it through SQL, but Python will open that up much more easily and allow us to deliver faster and more accurate results. Not to mention, we're going to try to bolt hybrid tables onto that afterwards.
>>Oh, we have to talk about that. So, and I don't need an exact metric, but when you say faster, are we talking 10% faster, 20% faster, 50% faster?
>>It really depends on the solution.
>>Well, give me a range: worst case, best case.
>>I really don't have that. I wish I did. I wish I had that for you, but I really don't have it.
>>I mean, obviously it's meaningful.
>>It is meaningful.
>>It has a business impact.
>>I think what it will do is speed up our work inside of our iterations, so we can look at the code sooner, evaluate it sooner, measure it sooner, measure it faster.
>>So is it fair to say that, as a result, you can do more?
>>We'll be able to do more, and it will enable more of our people, because they're used to working in Python.
>>Can you talk a little bit about enablement? Let's go up the stack to the folks at Allegis who are on the front lines, helping people get jobs. What are some of the benefits of having Snowpark for Python under the hood? How does it facilitate them getting access to data to deliver what they need to their clients?
>>Well, I think where we would use Snowflake and Python there is when we're building tools to let them know whether or not a user or a piece of talent is already within our system, things like that. That's how we would leverage it. But again, it's also new; we're still figuring out which solutions we would move to Python. We have some targeted: I have developers who are waiting for this, they're in private preview now, they're playing around with it, and they're ready to start using it. They're ready to start doing some analytical work on it, to get some of our analytical work out of GCP, because that's where it is right now, but all the data's in Snowflake. We need to move that down now; the data wasn't in Snowflake before. So the dashboards are up in GCP, but now that we've moved all of that data down into Snowflake, the team that built those analytical dashboards wants to use Python, because that's the way it's written right now. So it's an easier transformation, an easier migration off of GCP, and it gets us to doing everything in Snowflake, which is what we want.
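[Editor's note: a hypothetical before-and-after of the migration Joe describes, repointing a dashboard's data pull from BigQuery in GCP to Snowflake once the data lives there. The project, dataset, and table names are invented; the point is only that the Python stays and the source swaps.]

```python
# Hypothetical before/after: project, dataset, and table names are invented.

# Before: the dashboard's numbers are computed in GCP and pulled from BigQuery.
from google.cloud import bigquery

def load_kpis_from_bigquery():
    client = bigquery.Client(project="example-gcp-project")
    sql = """
        SELECT region, kpi_value
        FROM analytics.kpi_daily
        WHERE kpi_date = CURRENT_DATE()
    """
    return client.query(sql).to_dataframe()

# After: same result shape, read directly from Snowflake where the data now lives.
from snowflake.snowpark import Session

def load_kpis_from_snowflake(session: Session):
    sql = """
        SELECT REGION, KPI_VALUE
        FROM ANALYTICS.REPORTING.KPI_DAILY
        WHERE KPI_DATE = CURRENT_DATE()
    """
    return session.sql(sql).to_pandas()
```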
>>So you're saying you're doing the visualization in GCP, is that right?
>>It's just some dashboarding, that's all.
>>Not even visualization? You won't even give me that. Okay, okay. But...
>>Because it's not visualization. It's just some dashboarding of numbers and percentages and things like that. There are no graphics.
>>And it doesn't make sense to run that in GCP; you could just move it into AWS, or...
>>No. What we'll be able to do now is this: all that data before was in GCP, and all that Python code was running in GCP. We've moved all that data out of GCP, and now it's in Snowflake. Now we're going to work on those Python scripts that we thought we were going to have to rewrite differently. Now that Python's available, we have an easier way of getting those dashboards back out to our people.
>>Okay, but you're taking it out of GCP and putting it into Snowflake. Where, anywhere?
>>Well, we'll build those dashboards, and they'll actually be displayed through Tableau, which is our enterprise tool for that.
>>Yeah, sure. Okay. And then, when you operationalize it, it'll go.
>>But the idea is that it's an easier pathway for us to migrate our existing code, which is in Python, down into Snowflake and have it run against Snowflake, because all the data's there.
>>Because it's not going out and coming back in; it's all integrated.
>>We want our people working on the data in Snowflake. That's our data platform; that's where we want our analytics done. We don't want them done in other places. Over our data cloud journey, we've worked really hard to move all of the data we use out of existing systems on-prem, and now we're attacking the data that's in GCP and making sure it's down. It's not a lot of data, and we fixed it with one data pipeline that exposes all that data down in Snowflake now. We're just migrating our code down to work against the Snowflake platform, which is what we want.
>>Why are you excited about hybrid tables? What's the potential?
>>Hybrid tables I'm excited about because some of the data science that we do inside of Snowflake produces a set of results, and they're recommendations. We have to get those recommendations back to our people, back into our talent management system, and there are just some delays; there's about an hour delay in delivering that data back to that team. With hybrid tables, I can just write to the hybrid table, and that hybrid table can be directly accessed from our talent management system by the recruiters and the hiring managers, so they can see those recommendations in near real time. And that's the value.
>>Yep. We've learned in recent years that access to real-time data is no longer a nice-to-have; it's a huge competitive differentiator for every industry, including yours. Guys, thank you for joining Dave and me on the program, talking about Snowpark for Python, what that announcement means, and how Allegis is leveraging the technology. We look forward to hearing what comes when it's GA.
>>Yeah, we're looking forward to it.
>>Nice. Great. All right, guys, thank you to our guests and Dave Vellante. I'm Lisa Martin. You're watching theCUBE's coverage of Snowflake Summit 22. Stick around; we'll be right back with our next guest.
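[Editor's aside, not part of the interview: a rough sketch of the hybrid-table hand-off Joe describes above, writing batch-scored recommendations into a row-oriented table that an operational system can hit with point lookups. Unistore hybrid tables were still in preview at the time, and every schema, table, and column name here is invented, so treat this as illustrative only.]

```python
# Hypothetical schema; hybrid tables (Unistore) were in preview at the time.
from snowflake.snowpark import Session


def publish_recommendations(session: Session) -> None:
    # Row-store table with a primary key, built for point lookups from the
    # talent-management system rather than for analytical scans.
    session.sql("""
        CREATE HYBRID TABLE IF NOT EXISTS TALENT.APP.CANDIDATE_RECOMMENDATIONS (
            CANDIDATE_ID   NUMBER PRIMARY KEY,
            REQUISITION_ID NUMBER,
            SCORE          FLOAT,
            GENERATED_AT   TIMESTAMP_NTZ
        )
    """).collect()

    # Refresh with the latest batch-scored output instead of waiting on an
    # hourly hand-off to a downstream system.
    session.sql("DELETE FROM TALENT.APP.CANDIDATE_RECOMMENDATIONS").collect()
    session.sql("""
        INSERT INTO TALENT.APP.CANDIDATE_RECOMMENDATIONS
        SELECT CANDIDATE_ID, REQUISITION_ID, SCORE, CURRENT_TIMESTAMP()
        FROM ANALYTICS.ML.LATEST_CANDIDATE_SCORES
    """).collect()
```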
Manish Sood, CTO & Co Founder, Reltio ***Incorrect Version
(upbeat music) >> It's my pleasure, to be one of the hosts of theCUBE on cloud and the startup showcase brought to you by AWS. This is Dave Vellante and for years theCUBE has been following the trail of data. And with the relentless match of data growth this idea of a single version of the truth has become more and more elusive. Moreover, data has become the lifeblood of a digital business. And if there's one thing that we've learned throughout the pandemic, if you're not digital, you're in trouble. So we've seen firsthand, the critical importance of reliable and trusted data. And with me to talk about his company and the trends in the market is Manish Sood the CTO and co-founder of Reltio. Manish, welcome to the program. >> Thank you, Dave. It's a pleasure to be here. >> Okay, let's start with, let's go back to you and your co-founders when you started Reltio it was back in the early days of the big data movement, cloud was kind of just starting to take off, but what problems did you see then and what are enterprises struggling with today, especially with data as a source of digital innovation. >> Dave, if you look at the changes that have taken place in the landscape over the course of the last 10 years, when we started Reltio in 2011 there were a few secular trends that were coming to life. One was a cloud compute type of capabilities being provided by vendors like AWS. It was starting to pick up steam where making compute capabilities available at scale to solve large data problems was becoming real and possible. The second thing that we saw was this big trend of you know, you can not have a wall to wall, one single application that solves your entire business problem. Those visions have come and gone and we are seeing more of the best of breed application type of a landscape where even if you look within a specific function let's say sales or marketing, you have more than a dozen applications that any company is using today. And that trend was starting to emerge where we knew very well that the number of systems that we would have to work with would continue to increase. And that created a problem of where would you get the single source of truth or the single best origin of a customer, a supplier, a product that you're trying to sell, those types of critical pieces of information that are core to any business that's out there today. And, you know, that created the opportunity for us at Reltio to think about the problem at scale for every company out there, every business who needed this kind of capability and for us to provide this capability in the cloud as a software, as a service offering. So that's where, you know, the foundation of Reltio started. And the core problem that we wanted to solve was to bridge the gap that was created by all these data silos, and create a unified view of the core critical information that these companies run on. >> Yeah, the cloud is this giant, you know hyper distributed system, data by its very nature is distributed. It's interesting what you were sort of implying about you know, the days of the monolithic app are gone, but my business partner years ago John Furrier at theCUBE said, data is going to become the new development kit. And we've certainly seen that with the pandemic but tell us more about Reltio and how you help customers deal with that notion of data silos, data fragmentation, how do you solve that problem? >> So data fragmentation is what exists today. 
And, with the Reltio software as a service offering that we provide, we allow customers to stitch together and unify the data coming from these different fragmented siloed applications or data sources that they have within their enterprise. At the same time, there's a lot of dependence on the third party data. You know, when you think about different problems that you're trying to solve, you have for B2B type of information that in Bradstreet type of data providers, in life sciences you have IQVIA type of data providers. You know, as you look at other verticals that is a specialized third party data provider for any and every kind of information that most of the enterprise businesses want to combine with their in-house data or first party data to get the best view of who they're dealing with, who are they working with, you know who are the customers that they're serving and use that information also as a starting point for the digital transformation that they want to get to. And that's where Reltio fits in as the only platform that can help stitch together this kind of information and create a 360 degree view that spans all the data silos and provides that for real-time use, for BI and analytics to benefit from, for data science to benefit from, and then this emerging notion of data in itself is a, you know, key starting point that is used by us in order to make any decisions. Just like we go, you know, if I they wanted to look at information about you, I would go to places like LinkedIn, look up the information, and then on my next set of decisions with that information. If somebody wanted to look up information on Reltio they would go to, let's say crunchbase as an example and look up, who are the investors? How much money have we raised? All those details that are available. It's not a CRM system by itself but it is an information application that can aid and assist in the decision-making process as a starting point. And that user experience on top of the data becomes an important vehicle for us to provide as a part of the Reltio platform capabilities. >> Awesome, thank you. And I want to get into the tech, but before we do maybe we just cut to the chase and maybe you can talk about some of the examples of Reltio and action, some of the customers that you can talk about, maybe the industries that are really adopting this. What can you tell us there Manish? >> We work across a few different verticals some of the key verticals that we work in are life sciences and travel and hospitality and financial services, insurance retail, as an example. Those are some of the key verticals for us. But to give you some examples of the type of problems that customers are solving with Reltio as the data unification platform, let's take CarMax as an example,. CarMax is a customer who's in the business of buying used cars, selling used cars servicing those used cars. And then, you know, you as a customer don't just transact with them once, you know, you've had a car for three years you go back and look at what can you trade in that car for? But in order for CarMax to provide a service to you that goes across all the different touch points whether you are visiting them at their store location trying to test drive a car or viewing information about the various vehicles on their website, or just you know, punching in the registration number of your car just to see what is the appraisal from them in terms of how much will they pay for your car. 
This requires a lot of data behind the scenes for them to provide a seamless journey across all touch points. And the type of information that they use relative for aggregating, unifying, and then making available across all these touch points, is all of the information about the customers, all of the information about the household, you know, the understanding that they are trying to achieve because life events can be buying signals for consumers like you and I, as well as who was the associate who helped you either in the selling of a car, buying of a car, because their business is all about building relationships for the longer term, lifetime value that they want to capture. And in that process, making sure that they're providing continuity of relationship, they need to keep track of that data. And then the vehicle itself, the vehicle that you buy yourself, there is a lot of information in order to price it right, that needs to be gathered from multiple sources. So the continuum of data all the way from consumer to the vehicle is aggregated from multiple sources, unified inside Reltio and then made available through APIs or through other methods and means to the various applications, can be either built on top of that information, or can consume that information in order to better aid and assist the processes, business processes that those applications have to run and to end. >> Well, sounds like we come along, (indistinct). >> I was just going to say that's one example and, you know across other verticals, that are other similar examples of how companies are leveraging, Reltio >> Yeah, so as you say, we've come a long way from simple linear clickstream analysis of a website. I mean, you're talking about really rich information and you know happy to dig into some other examples, but I wonder how does it work? I mean, what's the magic behind it? What's the tech look like? I mean, obviously leveraging AWS, maybe you could talk about how, so, and maybe some of the services there and some of your unique IP. >> Yeah, you know, so the unique opportunity for us when we started in 2011 was really to leverage the power of the cloud. We started building out this capability on top of AWS back in 2011. And, you know, if you think about the problem itself, the problem has been around as long as you have had more than one system to run your business, but the magnitude of the problem has expanded several fold. You know, for example, I have been in this area was responsible for creating some of the previous generation capabilities and most of the friction in those previous generation MDM or master data management type of solutions as the you know, the technical term that is used to refer to this area, was that those systems could not keep pace with the increasing number of sources or the depth and breadth of the information that customers want to capture, whether it is, you know, about a patient or a product or let's say a supplier that you're working with, there is always additional information that you can capture and you know use to better inform the decisions for the next engagement. And that kind of model where the number of sources we're always going to increase the depth and breadth of information was always going to increase. The previous generation systems were not geared to handle that. 
So we decided that not only would we use at-scale compute capabilities in the cloud, with products like AWS as the backbone, but also solve some of the core problems around how more sources of information can be unified at scale. And then there is the last mile, which is the ability to consume such rich information; just locking it in a data warehouse has been sort of the problem in the past, and you talked about the clickstream analysis. Analytics has a place, but most of the analytics is a rearview-mirror picture of the work that you have to do, whereas everybody we talk to as a potential customer wanted to solve the problem of: what can we do at the point of engagement? How can we influence decisions? So I'll give you an example. I think everybody's familiar with Quicken Loans as the mortgage lender, and in the mortgage lending business, Quicken Loans is the customer who's using Reltio as the customer data unification platform behind the scenes. For every interaction that takes place, their goal is that they have a very narrow time window, anywhere from 10 minutes to about an hour, where if somebody expresses an interest in refinancing or getting a mortgage, they have to close that business within that window. The conversion ratios are exponentially better in that window versus waiting 48 hours to come back with the answer of what you'll be able to refinance your mortgage at. They've been able to use this notion of real-time data where, as soon as you come in through the website, or if you come in through the Rocket Mortgage app, or you're talking to a broker by calling the 1-800 number, they are able to triangulate that it's the same person coming from any of these different channels and respond to that person with an offer ASAP, so that there is no opportunity for the competition to get in and present you with a better offer. So those are the types of things where the time to conversion or the time to action is being looked at, and everybody's trying to shrink that time down. That ability to respond in real time was sort of the last mile missing out of this equation, which didn't exist with previous-generation capabilities, and now customers are able to benefit from that.
>> Yeah, the clean sheet of paper is the luxury that we have. You know, having seen this movie before having, you know looked at solving this problem with previous generation technologies, it was really the opportunity to start with a clean sheet of paper and define a cloud native architecture for solving the problem at scale. So just to give you an example, you know, across all of our customers, we are today managing about 6.5 billion consolidated profiles of people, organizations, product, locations, you know, assets, those kinds of details. And these are the types of crown jewels of the business that every business runs on. You know, for example, if you wanted to let's say you're a large company, like, you know, Ford and you wanted to figure out how much business are you doing, whether, you know another large company, because the other large company could be a global organization, could be spread across multiple geographies, could have multiple subsidiaries associated with it. It's been a very difficult to answer to understand what is the total book of business that they have with that other big customer. And, you know, being able to have the right, unified, relevant, ready clean information as the starting point that gives you visibility to that data, and then allows you to run precise analytics on top of that data, or, you know drive any kind of conclusions out of the data science type of algorithms or MLAI algorithms that you're trying to run. You have to have that foundation of clean data to work with in order to get to those answers. >> Nice, and then I had questions on just analysis, it's a SAS model I presume, how is it priced? Do you have a freemium? How do I get started? Maybe you could give us some color on that. >> Yeah, we are a SAS provider. We do everything in the cloud, offer it as a SAS offering for customers to leverage and benefit from. Our pricing is based on the volume of consolidated profiles, and I use the word profiles because this is not the traditional data model, where you have rows, columns, foreign keys. This is a profile of a customer, regardless of attribution or any other details that you want to capture. And you know, that just as an example is what we consider as a profile. So number of consolidated profiles under management is the key vector of pricing. Customers can start small and they can grow from there. We have customers who manage anywhere from a few hundred thousand profiles, you know, off these different types of data domains, customer, patient, provider, product, asset, those types of details, but then they grow and some of the customers HPInc, as a customer, is managing close to 1.5 billion profiles of B2B businesses at a global scale of B2C consumers at global scale. And they continue to expand that footprint as they look at other opportunities to use, the single source of truth capabilities provided by Reltio. >> And, and your relationship with AWS, you're obviously building on top of AWS, you're taking advantage of the cloud native capabilities. Are you in the AWS marketplace? Maybe you could talk about AWS relationship a bit. >> Yeah, AWS has been a key partner for us since the very beginning. We are now on the marketplace. 
Customers can start with the free version of the product and start to play with the product, understand it better and then move into the paid tier, you know as they bring in more data into Reltio and, you know be also have the partnership with AWS where, you know customers can benefit from the relationship where they are able to use the spend against Reltio to offset the commitment credits that they have for AWS, you know, as a cloud provider. So, you know, we are working closely with AWS on key verticals, like life sciences, travel and hospitality as a starting point. >> Nice, love those credits. Company update, you know, head count, funding, revenue trajectory what kind of metrics are you comfortable sharing? >> So we are currently at about, you know, slightly not at 300 people overall at Reltio. We will grow from 300 to about 400 people this year itself we are, you know, we just put out a press release where we mentioned some of the subscription ARR we finished last year at about $74 million in ARR. And we are looking at crossing the hundred million dollar ARR threshold later this year. So we are on a great growth trajectory and the business is performing really well. And we are looking at working with more customers and helping them solve this, you know, data silo, fragmentation of data problem by having them leverage the Reltio capability at scale across their enterprise. >> That's some impressive growth, congratulations. We're, I'm sure adding hundred people you're hiring all over the place, but where we are some of your priorities? >> So, you know, the, as the business is growing we are spending equally, both on the R and D side of the house investing more there, but at the same time also on our go to market so that we can extend our reach, make sure that more people know about Reltio and can start leveraging the benefit of the technology that we have built on top of AWS. >> Yeah, I mean it sounds like you've obviously nailed product market fit and now you're, you know, scaling the grip, go to market. You moved from CEO into the CTO role. Maybe you could talk about that a little bit. Why, what was prompted that move? >> Problems of luxury, you know, as I like to call them once you know that you're in a great growth trajectory, and the business is performing well, it's all about figuring out ways of, you know making sure that you can drive harder and faster towards that growth milestones that you want to achieve. And, you know, for us, the story is no different. The team has done a wonderful job of making sure that we can build the right platform, you know work towards this opportunity that we see, which by the way they've just to share with you, MDM or Master Data Management has always been underestimated as a, you know, yes there is a problem that needs to be solved but the market sizing was in a, not as clear but some of the most recent estimates from analysts like Gartner, but the, you know, sort of the new incarnation of data unification and Master Data Management at about a $30 billion, yeah, TAM for this market. So with that comes the responsibility that we have to really make sure that we are able to bring this capability to a wide array of customers. And with that, I looked at, you know how could we scale the business faster and have the right team to work help us maximize the opportunity. 
And that's why, you know, we decided that it was the right point in time for me to bring in somebody who's worked at the stretch of, you know taking a company from just a hundred million dollars in ARR to, you know, half a billion dollars in ARR and doing it at a global scale. So Chris Highland, you know, has had that experience and having him take on the CEO role really puts us on a tremendous path or path to tremendous growth and achieving that with the right team. >> Yeah, and I think I appreciate your comments on the TAM. I love to look at the TAM and to do a lot of TAM analysis. And I think a lot of times when you define the the future TAM based on sort of historical categories, you sometimes under count them. I mean, to me you guys are in the digital business. I mean, the data transformation the company transformation business, I mean that could be order of magnitude even bigger. So I think the future is bright for your company Reltio, Manish and thank you so much for coming on the program. Really appreciate it. >> Well, thanks for having me, really enjoyed it. Thank you. >> Okay, thank you for watching. You're watching theCUBEs Startup Showcase. We'll be right back. (upbeat music)
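[Editor's aside, not part of the interview: a toy illustration of the consolidation idea Manish describes, where several source records resolve to one profile and a survivorship rule picks the values that win. The fields are invented and the match rules deliberately naive; a real MDM platform such as Reltio applies far richer matching, survivorship, and governance than this.]

```python
# Toy example with invented fields and deliberately naive rules.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SourceRecord:
    source: str
    email: Optional[str]
    phone: Optional[str]
    name: str
    updated_at: str  # ISO date; newest record wins name survivorship in this sketch


def same_person(a: SourceRecord, b: SourceRecord) -> bool:
    # Deterministic match: shared normalized email or shared phone digits.
    if a.email and b.email and a.email.lower() == b.email.lower():
        return True
    digits = lambda s: "".join(ch for ch in (s or "") if ch.isdigit())
    return bool(digits(a.phone)) and digits(a.phone) == digits(b.phone)


def consolidate(records: list) -> list:
    clusters = []
    for rec in records:
        home = next((c for c in clusters if any(same_person(rec, r) for r in c)), None)
        if home is None:
            clusters.append([rec])
        else:
            home.append(rec)

    golden = []
    for cluster in clusters:
        newest = max(cluster, key=lambda r: r.updated_at)
        golden.append({
            "name": newest.name,
            "email": next((r.email for r in cluster if r.email), None),
            "phone": next((r.phone for r in cluster if r.phone), None),
            "sources": sorted({r.source for r in cluster}),
        })
    return golden


if __name__ == "__main__":
    records = [
        SourceRecord("crm",  "dana@example.com", "555-010-2000",   "Dana K.",  "2022-03-01"),
        SourceRecord("web",  "DANA@example.com", None,             "Dana Kim", "2022-04-15"),
        SourceRecord("call", None,               "(555) 010 2000", "D. Kim",   "2022-02-10"),
    ]
    print(consolidate(records))  # one consolidated profile built from three source records
```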
Victoria Stasiewicz, Harley-Davidson Motor Company | IBM DataOps 2020
From theCUBE studios in Palo Alto and in Boston, connecting with thought leaders all around the world, this is a CUBE conversation.
>>Hi everybody, this is Dave Vellante, and welcome to this special digital CUBE presentation sponsored by IBM. We're going to focus in on DataOps, DataOps in action. A lot of practitioners tell us that they really have challenges operationalizing and infusing AI into the data pipeline, so we're going to talk to some practitioners and really understand how they're solving this problem. I'm really pleased to bring in Victoria Stasiewicz, who is the Global Information Systems Manager for information management at Harley-Davidson. Vik, thanks for coming to theCUBE. Great to see you; I wish we were face to face, but I really appreciate your coming on in this manner.
>>That's okay. That's why technology's great, right?
>>So you are steeped in a data role at Harley-Davidson. Can you describe a little bit about what you're doing and what that role is like?
>>Definitely. So, obviously, I'm a manager of information management and governance at Harley-Davidson, and my team is charged with building out data governance at an enterprise level, as well as supporting the AI and machine learning technologies within my function. I have a portfolio that includes data and AI and governance, and also our master data, reference data, and data quality function, if you're familiar with the DAMA wheel. What I can tell you is that my team did an excellent job this last year, in 2019, standing up the infrastructure: the technologies specific to governance, as well as the newer, more modern warehouse-on-cloud technologies and cloud object storage, which also included Watson Studio and Watson Explorer. So many of the IBMers of the world might hear about, obviously, IBM ISEE, or work on it directly; we stood that up in the cloud, as well as Db2 Warehouse on Cloud and, like I said, cloud object storage. We spent about the first five months of last year standing that infrastructure up, working on the workflow, and ensuring that access and security management were all set up within the platform. What we did the last half of the year was really start to collect the metadata, as well as the data itself, bring the metadata into our metadata repository, which is our metadata base within (indistinct), and then also bring the data into our Db2 Warehouse on Cloud environment. So we were able to start with what we would consider our dealer domain for Harley-Davidson and bring those dimensions into Db2 Warehouse on Cloud, which was never done before. A lot of the information that we were collecting and bringing together for the analytics team lived in disparate data sources throughout the enterprise, so the goal was to stop with redundant data across the enterprise, eliminate some of those disparate source data resources, and bring it into a centralized repository for reporting.
>>Wow, we've got a lot to unpack here, Victoria, but let me start with the macro picture. I mean, years ago data was this thing that had to be managed, and it still does, but it was largely a cost, a liability. Governance was sort of front and center, and sometimes it was the tail that wagged the value dog. Then the whole big data movement comes in and everybody wants to be data-driven, and so you saw some pretty big changes in the way people looked at data: they wanted to mine that data and make it an asset versus just a straight liability. So what are the changes that you discerned in data and in your organization over, let's say, the last half decade?
>>To tell you the truth, we started looking at access management and the ability to allow some of our users to do some rapid prototyping that they could never do before. What more and more we're seeing from data citizens or data scientists, or even analysts throughout most enterprises, is that they want access to the information, they want it now, and they want speed to insight at this moment, using pretty much a minimum viable product. They may not need the entire data set, and they don't want to have to go through leaps and bounds just to get access to that information, or to bring that information into a centralized location. So while I talk about our Db2 Warehouse on Cloud, and that's an excellent example of where we actually need to model data, we know that this is data that we trust, that's going to be called upon many, many times by many, many analysts. There's other information out there that people are collecting because there's so much big data, and there are so many ways to enrich your data within your organization for your customer reporting; people are really trying to tap into those third-party data sets. So what my team has done, and the change we're seeing throughout the industry, is that a lot of enterprises are asking, as technologists, how can we enable our scientists and our analysts to access data virtually? Instead of recreating redundant data sources, we're actually enabling data virtualization at Harley-Davidson, and we've been doing that first by working with our Db2 Warehouse on Cloud and connecting it to some of the other trusted data warehouses that we have throughout the enterprise, that being our dealer warehouse as well, to enable analysts to do some quick reporting without having to bring all that data together. That is a big change I see, and the fact that we were able to tackle it has allowed technology to get back ahead. Because, to be frank, most organizations have given IT a bad rap: it takes too long to get what we need, my technologists cannot give me my data at my fingertips in a timely manner, there's no speed to insight, and business questions aren't answered at the point of delivery. We've supplied data to our analysts, and they're able to calculate and aggregate the reporting metrics to get those answers back to the business, but they're a week or two weeks too late; the information is no longer relevant. So data virtualization, through DataOps, is one of the ways we've been able to speed that up and act as a catalyst for data delivery. What we've also done, and I see this quite a bit as well, is recognize that that's excellent, but we still need to start classifying our information and labeling it at the system level. Most enterprises (I worked at Blue Cross as well, with the IBM tools, and they had the same struggle) are trying to eliminate their technology debt, reduce their spend, and reduce the time it takes for resources working on technologies to maintain those technologies. They want to reduce the portfolio of IT assets and capabilities they license today. So what do they do? It's time to start taking a look at which systems should be classified as essential systems versus those that are disparate and could be eliminated, and that starts with data governance, right?
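[Editor's aside, not part of the interview: a sketch of what the virtual access Victoria describes can look like from an analyst's seat once the platform team has configured Db2 data virtualization or federation. The DSN, credentials, and the DEALER.VW_DEALER_360 view are invented; the point is that one query spans sources without copying the data into yet another store.]

```python
# Hypothetical DSN and view name; assumes the virtualization layer already exists.
import pandas as pd
import ibm_db_dbi

DSN = (
    "DATABASE=BLUDB;"
    "HOSTNAME=db2w-example.cloud.ibm.com;"
    "PORT=50001;"
    "PROTOCOL=TCPIP;"
    "SECURITY=SSL;"
    "UID=analyst;"
    "PWD=********;"
)

conn = ibm_db_dbi.connect(DSN, "", "")

# DEALER.VW_DEALER_360 is assumed to be a virtual view federating the cloud
# warehouse with the on-prem dealer warehouse, so nothing is replicated here.
df = pd.read_sql(
    """
    SELECT DEALER_ID, REGION, TOTAL_UNITS_SOLD, LAST_SERVICE_DATE
    FROM DEALER.VW_DEALER_360
    WHERE REGION = 'MIDWEST'
    """,
    conn,
)
print(df.head())
```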
>>Okay, so your main focus is on governance, and you talked about how people want answers now; they don't want to have to wait, and they don't want to go through a big waterfall process. So what would you say were some of the top challenges in terms of just operationalizing your data pipelines and getting to the point you're at today?
>>I have to be quite honest: standing up the governance framework and the methodology behind it, getting data owners, data stewards, and a catalog established, that was not necessarily the heavy lifting. The heavy lifting really came with standing up a brand new infrastructure in the cloud. We partnered with IBM and said, you know what, we're going to the cloud, and these tools had never been implemented in the cloud before; we were kind of the first to do it. So some of the struggles we took on were really around standing up the infrastructure: security and access management, network pipeline access, VPN issues, things of that nature. Those, I would say, were some of the initial roadblocks we went through, but after we overcame those challenges, with the help of IBM and the patience of both the Harley and IBM teams, it became quite easy to roll out these technologies to other users. The nice thing is that we at Harley-Davidson have been taking the time to educate our users. Today, for example, we had what we call Data Bytes, a lunch and learn, and in that lunch and learn we took our entire GIS team, our Global Information Services team, which is all of IT, through these new technologies. It was a forum of over 250 people, with our CIO and CTO on, taking them through how we use these tools, what the purposes of these tools are, why we need governance to maintain these tools, and why metadata management is important to the organization. That piece of it seems to be much easier than the initial standing it up, so it's good enough to start letting users in.
>>Well, it sounds like you had real sponsorship from leadership and input from leadership, and they were kind of leaning into the whole process. First of all, is that true? And how important is that for success?
>>Oh, it's essential. We often asked, when we were first standing up the tools, to be quite honest: does our CIO really understand what it is we're standing up? Does our CIO really understand governance? Because we didn't have the time to really get that face-to-face interaction with our leadership. So I made it a mandate, having done this previously at Blue Cross, to get in front of my CIO and my CTO and educate them on exactly what we were standing up. Once we did that, it was very easy to get an executive steering committee, as well as an executive membership council, on board with our governance council, and now they're the champions of it. It's never easy, though: selling governance to leadership, and the ROI, is never easy, because it's not something you can easily calculate. It has to show its return on investment over time, and that means you're bringing dashboards, you're educating your CIO and CTO on how you're bringing people together, and how groups are now talking about solutions and technologies in a domain-like environment, where you have people at an international level. We have people from Asia, from Europe, from China who join calls every Thursday to talk about the data quality issues specific to dealer, for example, what systems we're using, and what solutions are on the horizon to solve them.
So now, instead of having people from other countries who work for Harley, as well as people within the US, creating one-off solutions that answer the same business questions using the same data, building multiple solutions to solve the same problem, we're bringing them together, solving together, and prioritizing together. For the return on investment down the line, you can then show: instead of this spinning into five projects, we've now turned it into one, and instead of implementing four systems, we've now implemented one. And guess what: we have the business rules and the classifications tied to this system, so that your CIO or CTO can now go in and reference this information in a glossary, a user interface, something that a C-level can read, interpret, understand quickly, and dissect for their own needs, without having to take the long, lengthy time to talk to a technologist about what this information means and how to use it.
>>You know, what's interesting, as a takeaway based on what you just said: Harley-Davidson is an iconic brand, a cool company making motorcycles, but you came out of an insurance background, a regulated industry where governance is sort of de rigueur; I mean, it's table stakes. So how were you able, at Harley, to balance the tension between governance and business flexibility?
>>So there are different levers, I would call them. Obviously, within healthcare and insurance, the importance becomes compliance and risk and regulatory; those are the big pushes: gosh, I don't want to pay millions of dollars in fines, so start classifying this information, enabling security, reducing risk, all that good stuff. For Harley-Davidson it was much different. It was more or less: we have a mission; we want to invest in our technologies, yet we want to save money. How do we cut down the technologies we have today and reduce our technology spend, yet enable our users to have access to more information in a timely manner? That's not an easy path, right? So what I did is marry governance to our TIME model, and our TIME model is specifically: are we going to tolerate an application, are we going to invest in an application, are we going to migrate an application, or are we going to eliminate it? So I talked to my CIO and said, we can use governance to classify these systems and act as a catalyst when we start to implement what we're doing with our technologies: which technologies are we going to eliminate tomorrow? We as IT cannot decide that unless we discuss some sort of business impact, unless you look at a system and ask how many users are using it, what reports are essential to the business teams, do they need this system, is this something that's critical for users today, or is this duplicative? We have many systems solving the same capability. That is how I sold it to my CIO, and it made it important to the rest of the organization. They knew we had a mandate in front of us; we had to reduce technology spend, and that really made it quite easy for me when talking to other technologists, as well as business users, about why governance is important and why it's going to help Harley-Davidson and its mission to save money going forward. I will tell you, though, that the biggest value to the business is the fact that they now own the data; they're more likely to use your master data management systems.
Like I said, I'm the owner of our MDM services today, as well as our customer knowledge center, and business users are more likely to access and reference those systems if they feel that they built the rules and they own the rules in those systems. That's another big value add, because many business users will say: okay, you think I need access to this system? I don't know, I'm not sure; I don't know what the data looks like within it, whether it's easily accessible, or whether it's going to give me the reporting metrics that I need. That's where governance helps them. For example, take our data scientist team using a catalog: you can browse your metadata, you can look at your server, your database, your tables, your fields, understand what those mean, and understand the classifications and the formulas within them; they're all documented in a glossary, versus having to go and ask for access to six different systems throughout the enterprise, hoping that Sally next to you, who told you you needed access to those systems, was right, just to find out that you don't need the access, and hence it took you three days to get the access anyway. That's why a glossary is really a catalyst for a lot of that.
>>Well, it's really interesting what you just said: you went through essentially an application rationalization exercise, which saved your organization money. That's not always easy, because even though IT may be spending money on these systems, businesses don't want to give them up. But it sounds like you were able to use data to actually inform which applications you should invest in versus sunset, and it sounds like you were giving the business a real incentive to go through this exercise, because they ended up, as you said, owning the data.
>>Well, that's great, right? Who wants to keep owning the old car and driving the old car if they can truly own a new car for a cheaper price? Nobody wants to do that. I've even looked at Teslas: I can buy a Tesla for the same price as a minivan these days, so I think I might buy the Tesla. But what I will say is that we also built out a capabilities model with our enterprise architecture team, and in building that capabilities model we started to bucket our technologies within those capabilities: AI and machine learning, warehouse-on-cloud or even warehousing technologies, governance technologies, those types of classifications, integration technologies, reporting technologies. By grouping all of those into a capabilities matrix, it was easy for us to start identifying: all right, we're the system owners for these technologies; who are the business users for them? Based on that, let's go talk to this team, the dealer management team, about access to this new profiling capability within IBM, or this new catalog within IBM, that they can use today, versus the SharePoint and Excel spreadsheets they were using for their metadata management, or the profiling tools that were ten years old, some of our SAP tools that they were using before. Let's sell them on the new tools and start migrating them.
buy-in is pretty quick it's pretty easy to sell somebody on something shiny and it's much easier to use than some of the older technologies let's talk about the business impact in my understanding is you were trying to increase the improve the effectiveness of the dealers not not just go out and brute force sign up more dealers were you able to achieve that outcome and what does it meant for your business yes actually we were so right now what we did is we slipped something called a CDR and that's our consumer dealer and development repository right that's where a lot of our dealer information resides today it's actually argue ler warehouse we had some other systems that we're collecting that information Kalinin like speed for example we were able to bring all that reporting man to one location sunset some of those other technologies but then also enable for that centralized reporting layer which we've also used data virtualization to start to marry submit information to db2 warehouse on cloud for users so we're allowing basically those that want to access CDR and our db2 warehouse and called dealer information to do that within one reporting layer um in doing so we were able to create something called a dealer harmonized ID really which is our version of we have so many dealers today right and some of those dealers actually sell bytes some of those dealers sell just apparel material some of those dealers just sell parts of those dealers right can we have certain you IDs kind of a golden record mastered information if you will right bought back in reporting so that we can accurately assess the dealer performance up to two years ago right it was really hard to do that we had information spread out all over it was really hard to get a good handle on what dealers were performing and what dealers weren't because was it was tough right for our analysts to wrangle that information and bring it together it took time many times we you would get multiple answers to one business question which is never good right one one question should have one answer if it's accurate um that is what we worked on within us last year and that's where really our CEO so the value at is now we can start to act on what dealers are performing at an optimal level versus what dealers are struggling and that's allowed even our account reps or field steel fields that right to go work with those struggling dealers and start to share with them the information of you know these are what some of our stronger dealer performing dealers are doing today that is making them more affecting it inside sorry effective is selling bikes you know these are some of the best practices you can implement that's where we make right our field staff smarter and our dealers smarter we're not looking to shut down dealers we just want to educate them on how to do better well and to your point about a single version of the truth if you will the the lines of business kind of owning their own data that's critical because you're not spending all your time you know pointing at fingers trying to understand the data if the if the users own it then they own it I and so how does self-service fit in were you able to achieve you know some level of self-service how far could you and you go there we were we did use some other tools I'll be quite honest aside from just the IBM tools today that's enabled some of that self-service analytics si PSAC was one of them Alteryx is another big one that we like to that our analyst team likes to use today to wrangle and 
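The "dealer harmonized ID" Victoria describes a moment ago is essentially a golden-record build: cluster the overlapping dealer records, then let one surviving record represent each dealer. A minimal pandas sketch of that idea, with hypothetical column names, a deliberately naive match key and a simple recency-based survivorship rule (none of which are Harley-Davidson's actual logic), might look like this:

```python
# Sketch only: hypothetical file, columns and survivorship rule.
import pandas as pd

# assumed columns: source, dealer_name, city, updated_at, plus attribute fields
dealers = pd.read_csv("dealer_records.csv", parse_dates=["updated_at"])

# Cluster records that refer to the same dealer (naively, by normalized name + city).
dealers["match_key"] = (dealers["dealer_name"].str.lower().str.strip()
                        + "|" + dealers["city"].str.lower().str.strip())
dealers["harmonized_id"] = dealers.groupby("match_key").ngroup()

# Build one golden record per harmonized ID, preferring the most recently
# updated non-null value for each attribute.
golden = (dealers.sort_values("updated_at")
                 .groupby("harmonized_id")
                 .last()
                 .reset_index())
```

In practice the match key would come from fuzzier rules or a matching engine, but the shape of the result, one mastered row per dealer that reporting can join against, is the same.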
That really allowed our analysts and our reporting teams to start to build their own derivations and transformations for reporting themselves, because those tools are more user-interface based, versus going into the back-end systems and having to write straight SQL queries and things of that nature, which usually takes time and requires a deeper level of knowledge than what we'd like our analysts to need today. I can say the same thing for the data scientist team. They use a lot of R and Python coding today, and what we've tried to do is make sure the tools are available so that they can do everything they need to do without us really having to touch anything. And I will be quite honest, we have not had to touch much of anything; we have a very skilled data scientist team. So I will tell you that the tools we put in place today, Watson Explorer and some of the other tools as well, have enabled the data scientists to really quickly do what they need to do for reporting, and even in cases where maybe Watson or Explorer may not be the optimal technology for them to use, we've also allowed them to use some of our other resources, open source resources, to build some of the models that they were looking to build. >> I'm glad you brought that up, Victoria, because IBM makes a big deal out of being open, and so you're kind of confirming that you can use third-party tools, and if you like tool vendor ABC, you can use them as part of this framework. >> Yeah, it's really about TCO. Take a look at what you have today: if it's giving you at least 80% of what you need for the business, or for your data scientists or reporting analysts to do what they need to do, to me it's good enough; it's giving you what you need, and it's pretty hard to find anything that's exactly 100 percent. It's about being open, though, to when your scientists or your analysts find another reporting tool that requires minimal maintenance, or let's just say a data science flow that requires minimal maintenance and is free because it's open source. IBM can integrate with that, and we can enable that to be a quicker way for them to do what they need to do, versus telling them, no, you can't use the other technologies or the other open source options out there, you've got to use just these tools. That's pretty tough to do, and I think it would shut most IT shops down pretty quick within larger enterprises, because it would really act as a roadblock to most of our teams doing the reporting they need to do. >> Well, last question. A big part of this DataOps trend, borrowing from DevOps, is continuous integration and continuous improvement, kind of ongoing, raising the bar if you will. What do you see going on from here? >> Oh, I definitely see a world where we're allowing for that rapid prototyping like I was talking about earlier. I see a very big change in the data industry. You said it yourself: we are on the brink of big data and it's only going to get bigger. There are organizations right now that have literally understood how much of an asset their data really is, and they're starting to sell their data to others in similar or smaller industries, similar vendors within the industry, similar spaces, so they can make money off of it, because data truly is an asset. Now the key to that is
obviously making sure that it's curated, that it's cleansed, that it's trusted, so that when you are selling it back you can really make money off of it. What I really see on the horizon is the ability to vet that data. In the past, what have we been doing for the past decade? Just buying big data sets and trusting that they're good information. Most organizations aren't doing a lot of profiling, so you're going to pay top dollar, you're going to receive this third-party data set, and you're not going to be able to use it the way you need to. What I see on the horizon is us being able to do that. We're building data lakehouses, if you will, really those Hadoop-like environments, those data lakes, where we can land information, quickly access it, and quickly profile it with tools, where it used to take an analyst hours of writing a bunch of queries to understand what the profile of that data looks like. We did that recently at Harley-Davidson: we bought some third-party data and evaluated it quickly through our agile scrum team, and within a week we determined that the data was not as good as the vendor selling it had pretty much sold it to be. So we told the vendor, we want our money back, the data is not what we thought it would be, please take the data sets back. Now, that's just one use case, but to me that was golden. It's a way to save money and start vetting the data that we're buying. Otherwise, what I've seen in the past is many organizations just buying up big third-party data sets and saying, okay, it's good enough, we think that just because it comes from, say, the motorcycle industry council it's good enough. It may not be. It's up to us to start vetting that, and that's where technology is going to change, data is going to change, analytics is going to change. >> It's a great example. You're really on the cutting edge of this whole DataOps trend. Really appreciate you coming on theCUBE and sharing your insights, and there's more in the CrowdChat. Thank you, Victoria, for coming on theCUBE. >> Well, thank you, Dave, nice to meet you, it was a pleasure speaking with you. >> Yeah, really, the pleasure was all ours, and thank you for watching, everybody. As I say, check out the CrowdChat on DataOps for more detail and more Q&A. This is Dave Vellante for theCUBE. Keep it right there, we'll be right back after this short break. [Music]
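The quick vetting pass Victoria describes, profiling a purchased data set before trusting it, can be a few lines of pandas. The file name, columns and acceptance thresholds below are assumptions for illustration only, not Harley-Davidson's actual checks:

```python
# Illustrative profiling pass over a purchased third-party data set.
import pandas as pd

df = pd.read_csv("third_party_dealer_data.csv")  # assumed file name

# Per-column profile: type, percentage of nulls, number of distinct values.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(1),
    "distinct": df.nunique(),
})
print(profile.sort_values("null_pct", ascending=False))

# Simple acceptance gate: flag the purchase if key fields are mostly empty
# or a large share of rows are exact duplicates (thresholds are arbitrary).
suspect = (profile["null_pct"] > 40).any() or df.duplicated().mean() > 0.10
print("Send it back?", suspect)
```

A profile like this is what turns "we think it's good because of who sold it" into an evidence-based decision within a sprint.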
ENTITIES
Entity | Category | Confidence |
---|---|---|
Dave Volante | PERSON | 0.99+ |
Asia | LOCATION | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
five projects | QUANTITY | 0.99+ |
Victoria Stasiewicz | PERSON | 0.99+ |
China | LOCATION | 0.99+ |
Tesla | ORGANIZATION | 0.99+ |
Victoria | PERSON | 0.99+ |
Harley | ORGANIZATION | 0.99+ |
Harley Davidson | ORGANIZATION | 0.99+ |
Palo Alto | LOCATION | 0.99+ |
Blue Cross | ORGANIZATION | 0.99+ |
Blue Cross | ORGANIZATION | 0.99+ |
Europe | LOCATION | 0.99+ |
Dave | PERSON | 0.99+ |
US | LOCATION | 0.99+ |
Harley-Davidson Motor Company | ORGANIZATION | 0.99+ |
harley-davidson | PERSON | 0.99+ |
six different systems | QUANTITY | 0.99+ |
Dave Volante | PERSON | 0.99+ |
last year | DATE | 0.99+ |
over 250 people | QUANTITY | 0.99+ |
today | DATE | 0.99+ |
three days | QUANTITY | 0.99+ |
100 percent | QUANTITY | 0.99+ |
IG | ORGANIZATION | 0.99+ |
Watson | TITLE | 0.99+ |
Boston | LOCATION | 0.99+ |
tomorrow | DATE | 0.98+ |
one business question | QUANTITY | 0.98+ |
first | QUANTITY | 0.98+ |
ABC | ORGANIZATION | 0.98+ |
one answer | QUANTITY | 0.97+ |
four systems | QUANTITY | 0.97+ |
one | QUANTITY | 0.97+ |
Victoria stayshia | PERSON | 0.96+ |
Watson Explorer | TITLE | 0.96+ |
Explorer | TITLE | 0.96+ |
2019 | DATE | 0.96+ |
agile | ORGANIZATION | 0.95+ |
Vik | PERSON | 0.95+ |
two years ago | DATE | 0.95+ |
one question | QUANTITY | 0.95+ |
two weeks | QUANTITY | 0.94+ |
both | QUANTITY | 0.93+ |
excel | TITLE | 0.93+ |
Sally | PERSON | 0.92+ |
a week | QUANTITY | 0.92+ |
harley | ORGANIZATION | 0.91+ |
Watson Studio | TITLE | 0.91+ |
last half of the year | DATE | 0.89+ |
Alteryx | ORGANIZATION | 0.88+ |
millions of dollars | QUANTITY | 0.87+ |
single version | QUANTITY | 0.86+ |
every Thursday | QUANTITY | 0.86+ |
R | TITLE | 0.85+ |
Amit Walia, Informatica | CUBE Conversation, May 2020
>> Presenter: From theCUBE Studios in Palo Alto and Boston, connecting with thought leaders all around the world. This is a CUBE conversation. >> Everyone, welcome to theCUBE studio here in Palo Alto. I'm John Furrier, host of theCUBE. We're here with our quarantine crew. We've been here for three months quarantining but we're getting the stories out. We're talking to all of our favorite guests and the most important stories in technology here remotely, and we have a great conversation in store for you today with Amit Walia, CEO of Informatica. Cube alumni, frequent guest of theCUBE, now the CEO of Informatica. Amit, great to see you. Thank you for coming on this CUBE conversation. >> Good to see you, John. It's different to be doing this like this versus being in the studio with you, but I'm glad that we could leverage technology to still talk to each other. >> You're usually right here, right next to me, but I'm glad to get you remotely at least and I really appreciate you. You always have some great commentary and insights. And Amit, before we get into the real meaty stuff that I'd love around the data, I want to get your thoughts on this COVID-19 crisis. It's a new reality, and it's highlighted, as we've been reporting on SiliconANGLE for the past few months, the at-scale problems that people are facing, but it's also an opportunity. People are sheltered in place, there's a lot of anxiety about what their work environment is going to look like, but the world still runs. Your thoughts on the current crisis and how you're looking at it, how you're navigating it as a leader. >> No doubt, it is a very unique situation we all live in. We've never all faced something like this. So I think first of all, I'll begin by expressing my prayers for anyone out there who has been impacted by it and of course, a huge round of thank you to all the heroes out there at the front lines, the healthcare workers, the doctors, the nurses (mumbles), so we can't forget that. These are very unique situations, but as you said, let's not forget that this is a health crisis first and then it becomes an economic crisis. And then, as you said, there is a tremendous amount of disruption, and (mumbles) I think all of them will go through some phases. I think you can see already, while there is disruption in front of us, that the digitally ready organizations, the ones who were ready for that, have definitely faced it a lot better, but obviously the ones that have been on somewhat previous-generation, let's just say, business models or technology models are struggling through it. So there is a lot of change. I think they're still learning. We're absolutely still learning and we will continue to learn till the end of this year, and we'll come out very different for the next decade for sure. >> If anyone who's watching goes to YouTube on the SiliconANGLE CUBE and looks at your videos over the years, we've been talking about big data and these transformational things. It's been an inside-the-industry kind of discussion, board room for your clients and your business and Informatica, but I think this is now showing the world this digital transformation. The future has been pulled forward faster than people have been expecting it, and innovation strategy has been on paper, maybe some execution, but now I think it's apparent to everyone that the innovation strategy needs to start now because of this business model impact the economic crisis has exposed.
The scale of opportunities and challenges: there will be winners and losers, and projects still need to get done, or reset, or reinvented to come out of this with growth. So this is going to be the number one conversation. What are your thoughts around this? >> No, so I've talked to hundreds of customers across the globe and we see the same thing. In fact, in some ways as we went through this, something very profound dawned on me. We, John, talked about digital transformation for the last few years, and clearly digital transformation will accelerate, but as I was talking to customers, I came to this realization that we actually haven't digitally transformed. To be honest, what happened in the last three to four years was more digital modernization. A few apps got tweaked, a few front-ends got tweaked, but if you realize, it was more digital modernization, not transformation, because in my opinion there are four aspects to digital transformation. You think of new products and services, you think of new models of engaging with your customers, you think of absolutely new operating models and you think of fundamentally new business models. That's a whole rewrite of an organization, which is not just creating a new application out there; it's fundamental end-to-end transformation. My belief, our belief, is that now starts a whole new era of transformation, digital transformation. We've just gone through digital modernization. >> Well, that's a great point, and the business model impacts create... And in times of these inflection points, and again, you're a student of history in the tech industry, the PC revolution, TCP/IP, these are big points in time. They're not transitions. The big players tend to win the transitions. When you have a transformation, it's a Cambrian explosion of new kinds of capabilities. I would agree with your point, but I think it's going to be a Cambrian explosion because the business model forcing function is there. How do you see it playing out, 'cause you're in the middle of all this, 'cause you guys are the control plane for data in the industry as a company. You enable these new apps. Could you share your-- >> So, we see a lot of that, and I think the way to think about it, first of all, you said it right: this is a step function, changing orbit. This is a whole new... You get to a new curve, you go to a different model. It's a whole new equation for the curve you're going to be on; it's not just changing the gradient of the curve you've been on. This is going to be a whole journey. And when we think of the new world of digital transformation, there are four elements to it. First of all, it has to be strategic. It has to be board, CEO, executive top-down, fundamentally across the whole organization, across every function of an organization. Second one, you talked about scale. I believe this is all about innovating at scale. It's not about, hey, let me go put a new application in some far-off part of my business. You've got to innovate at scale; end-to-end change does not happen in bits and pieces. Third one, this is cloud native, absolutely cloud native. If there was any minuscule doubt, this is taking it away. Cloud nativity is the fundamental differentiator. And last but not least is being digitally native, which is where everybody wants to go: become a digitally transformed company that is data-led. You've got to make data-led decisions.
So four components, a strategic mindset, innovation at scale, cloud nativity and being data-led, are going to define digital transformation. >> I think that encapsulates absolutely innovation strategy. I agree with you 100%, that's really insightful. I want to also get your thoughts on some things that you're talking about, and you have always had some really kind of high-level conversations around this, and theCUBE has been a very social organization; we'd love to be that social construct between companies and audiences. But you use a term: the soul of digital transformation is data 4.0. This idea of having a soul is interesting, because the apps all have personalization built in. You have CLAIRE, you've been doing CLAIRE AI for a while. So this idea of social organizations, a soul, is kind of an interesting piece of metadata you're putting out in the messaging. What do you mean by that? How can digital transformation have a soul? >> I think we talked about it a lot, and it just came to me that, look, at the end of the day, any transformation is so fundamental to anything that anybody does, and if you think about it, you can't go through a fundamental transformation that is just qualitative; it's qualitative and quantitative. It's about a human body transforming itself, and if something doesn't have a soul, John, it does not have life. It cannot truly move to the next paradigm. So I believe that any transformation has to have a soul, and the digital world is all about data. So obviously, we believe that we're walking into a data-for-data world where, as I said, the four pillars of digital transformation will be data-led, and I believe data is the soul of that transformation. And data itself is moving into a new paradigm. You've heard us talk about 1.0, 2.0, 3.0, and this is the new world of 4.0, data 4.0, which basically is all about cloud nativity, intelligent automation, being AI-powered, focusing on data trust, data ethics and operations, and innovation at scale. When you bring these elements together, that enables digital transformation to happen on the shoulders of data 4.0, which in my opinion is the soul of digital transformation. >> All right, so just rewind on data 4.0 for a minute. Pretend I'm a CIO, I'm super busy. I don't have time to read up about it. Give me the bottom line: what is data 4.0? Describe it to me in basic terms. Is it just an advancement, an acceleration? What's the quick elevator pitch on data 4.0? >> Very simple. We're all walking into a world where we're going to be digital. Digital means that we're basically going to be creating tons of data. And by the way, data is everywhere; it's not just within our four walls. It's basically what I call transaction and interaction, and with the scale and volume of data increasing, and the complexity of it increasing, we want to make decisions. I say, tomorrow's decision, today, and with data that was available to us yesterday, so I can be better at that decision. So we need intelligence, we need automation, we need flexibility, which is where AI comes in. These are all very fundamental rewrites of the technology stack to enable a fundamental business transformation. So in that world, data is front and center, and when you look at the amount of data we are going to collect, the whole concept of data ethics and data trust becomes very important: not just good governance, governance is important, but data privacy and data trust become very important.
Then we're going to do things like contact tracing; it's very important for society, but the ethics, trust and privacy of what you and I will give to the government are going to become very important. So to me, in that world that we're going into, every enterprise has to think data first, data-led, build an infrastructure to support the business in that context, and then, as I said, the soul, which is data, will give life to digital transformation. >> That's awesome. Love the personalization and the soul angle on it. I always believed that you guys had that intelligent automation fabric, and to me, you said earlier, cloud native is apparent to everyone now. I think out of all this crisis, the one thing that's not going to be debated anymore is that cloud native is the operating model. I think that's pretty much a done deal at this point. So having this horizontally scalable data, you know I've been on this rant for years, I think that's the killer app. I think having horizontally scalable data is going to enable a lot, souls and more life. So I've got to ask you the real, the billion dollar question. I'm a customer of yours, or a prospect, or a large enterprise. I'm seeing what's happening at scale, provisioning of VPNs for 100% of employees at home, except for the most needed workers. I now see all the things I need to either pause, the things I need to cancel, and the projects to double down on. I still have to go out and build my competitive advantage. I still have to run my business. So I need to really start deploying right out of the gate data-centric, data-first, virtual-first, whatever you want to call it, the new reality first, this inflection point. What do I do? What are the things that you see as projects or playbook recipes that people could implement? >> First of all, we see a very fundamental re-evaluation of the entire business model. In fact, we have this term that we're using now, that we have to think of business as a business 360. And if I think about it in this new world, the businesses that stood the test are the ones that had basically what I call a digital supply chain, a very digital, scalable way of interacting with their customers, being able to engage with their customers; a digital fabric, making sure that they can bring their products and services to the customers very quickly, or in some cases, if they were creating new products and services, they had the ability for a whole new supply chain to reach that end customer; and of course, a business model that is flexible so they can cater to the needs of their customers. So in all of these worlds, customers are building digital, scalable data platforms, and when I say platforms, it's not about some monolithic platform. These are, as you and I have talked about, very modular, microservices-based platforms that reside on what we call metadata. Data has to be the soul of the digital enterprise. Metadata is the nervous system that makes it all work. That's the left brain, right brain that makes it all work, which is where we put AI on top. AI that works for the customers, and then they leverage it; AI applied to that metadata allows them to be very flexible and nimble and make these decisions very rapidly, whether they are doing analytics for tomorrow's offering to be brought in front of a customer, or understanding the customer better to give them something that appeals to them in changing times, or to protect the customer's data, or to provide governance on top of it.
Anything that you would like to do has to ride on top of what I call an AI-led, metadata-driven platform that can scale horizontally. >> Okay, so I've got to go to the next level on this, which is, okay, you got me on that. I hear what you're saying, I agree, great. But I've got to put my developers to work, and I've got insight, I've got analytics teams, I've got competencies, but Amit, my complexities don't go away. I still have compliance at scale, I've got governance at scale, but I've also got my developers, not just to get analytical insight; there are great dashboards and there's great analytic data out there, you guys do a good job there. I've got to get my developers coding so I can get that agility of the data into the apps, for visualization in the app or as a key ingredient of the software. How do I do that? What's your answer to that one? >> So, that's a critical use case. If you think about it, for a developer, one of the biggest challenges for an analytics project is how do I bring all the data that is in silos across the enterprise so that I can put it in any kind of visualization or analytics tool, and things are happening at scale. An enterprise is spread across the globe; there are so many different data sources available everywhere. Again, what we've done is that as part of the data platform, when you focus on the metadata, that allows you to go to one place where you can have full access to all of the data assets that are available across (mumbles). Do you remember, at theCUBE years ago, we unveiled the launch of our enterprise data catalog, which as I said was the Google for enterprise data, through metadata. Now developers don't have to start wasting their time trying to find where the data is (mumbles); through the catalog, with CLAIRE built in, they have access to it. They can start putting that to work and figuring out, how do I take different kinds of data, how do I put it in some data science tool, for which we have the in-built integrations, and do what I call the valuable last-mile work, which is where the intelligence is needed from them, versus spending their energy trying to figure out where the good data, the clean data, all kinds of data sets are. We have eliminated all of that complexity with the help of the metadata platform and CLAIRE, to let the developers do what I call value-added, productive work. >> Amit, final question for you. I know you talk to customers a lot, you're always on the road, you've got a great product background, that's where you came from, a good mix and understanding of the business, but now your customers and prospects are trying to put the fires out. No one's going to talk about their kitchen appliances when the house is burning down, and in some cases on the business model side, or if it's a growth strategy, they're going to put all their energies where the action is. So getting mindshare with them is going to be very difficult. How are you as a leader, and how is Informatica, getting in front of these folks and saying, "Look, I know things are tough, but we're an important supplier for you"? How do you differentiate? How are you going to get that mindshare? What are some of those conversations? 'Cause this is really the psychology of the marketplace right now, the buyer and the customer. >> Well, first of all, obviously we had to adapt to reach our customers in a different way, virtually, just like you and I are chatting right now, and to be candid, our teams were fantastic in being able to do it. We've actually already had multiple pretty big virtual events.
In fact, the first week before we started (mumbles), we had set up the MDM and Data Governance Summit in New York, and we expected thousands of customers to come there; as it went (mumbles) virtual, we did it virtually, and we had three times more people attend the virtual event. It was much easier for people who attended from the confines of their living room. So we've gone 100% virtual, and the good news is that our customers are heavily engaged. We've actually had more participation from customers coming and attending our events, and we've obviously had our customers speaking, talking about how they've created value. In light of that, next week we have the big event which we're calling CLAIREview, named after the CLAIRE AI engine. It's basically a beautiful, Netflix-style experience: we'll have a keynote, we'll have seasons and episodes, and people can do bite-sized viewing at their own leisure. We'll talk about all kinds of transformation. In fact, we have Scott Guthrie, who runs all of Azure and cloud at Microsoft, as a part of my keynote. We have two great customers, the CDO at XXL and the CEO of GDR, a nonprofit that does (mumbles) diabetes work, talking about their data journeys. We have Martin Byer from Gartner. So we've been able to pivot, and our customers are heavily engaged, because data is a P-zero or P-one activity for them to invest in. So we haven't seen any drop-off in customer engagement with us, and we've been very blessed that we have a very loyal and very high-retention customer base.
So I see, over the next couple of years, heavy IoT and edge computing. Just like the shift from the mainframe to the client-server model, where the PC became like a mainframe in terms of compute capacity, I guarantee that desktop-class compute capacity will go down to the edge, and we're going to see that happen in the next five years or so. >> The edge is the new data center. I always say, the LAN is the WAN and the WAN is the LAN. Amit, great to see you and thanks for sharing, and I'm sorry we can't do it in person, but this has been like a fireside chat meets CUBE interview, remote. Thanks for spending the time and sharing your insights, and we've always had great interviews at your events, virtual again this year. We're going to spread it out over time, good call. Thanks for coming on, I appreciate it. >> Thanks, John, take care. >> Okay, Amit, CEO of Informatica, always great to get the conversation and updates from him on the industry and on what Informatica is doing at the center of the value proposition, data 4.0. This is really the new transformation, not transition: data science, data, data engineering, all happening. theCUBE, with our remote interviews, bringing you all the coverage here from our Palo Alto studios. I'm John Furrier. Thanks for watching. (gentle music)
ENTITIES
Entity | Category | Confidence |
---|---|---|
Amit | PERSON | 0.99+ |
John | PERSON | 0.99+ |
John Furrier | PERSON | 0.99+ |
Informatica | ORGANIZATION | 0.99+ |
Scott Guthrie | PERSON | 0.99+ |
New York | LOCATION | 0.99+ |
Amit Walia | PERSON | 0.99+ |
Palo Alto | LOCATION | 0.99+ |
GDR | ORGANIZATION | 0.99+ |
Boston | LOCATION | 0.99+ |
100% | QUANTITY | 0.99+ |
May 2020 | DATE | 0.99+ |
yesterday | DATE | 0.99+ |
three months | QUANTITY | 0.99+ |
tomorrow | DATE | 0.99+ |
today | DATE | 0.99+ |
theCUBE | ORGANIZATION | 0.99+ |
Martin Byer | PERSON | 0.99+ |
ORGANIZATION | 0.99+ | |
Microsoft | ORGANIZATION | 0.99+ |
YouTube | ORGANIZATION | 0.99+ |
Cube | ORGANIZATION | 0.99+ |
next decade | DATE | 0.98+ |
CLAIRE | PERSON | 0.98+ |
MDM | EVENT | 0.98+ |
hundreds of customers | QUANTITY | 0.98+ |
XXL | ORGANIZATION | 0.98+ |
theCUBE Studios | ORGANIZATION | 0.97+ |
Data Governance Summit | EVENT | 0.97+ |
next week | DATE | 0.97+ |
this year | DATE | 0.96+ |
Third one | QUANTITY | 0.96+ |
CUBE | ORGANIZATION | 0.95+ |
billion dollar | QUANTITY | 0.95+ |
Second one | QUANTITY | 0.94+ |
thousands of customers | QUANTITY | 0.94+ |
four years | QUANTITY | 0.93+ |
One more final question | QUANTITY | 0.93+ |
one place | QUANTITY | 0.91+ |
First | QUANTITY | 0.91+ |
first week | DATE | 0.91+ |
four years ago | DATE | 0.91+ |
one | QUANTITY | 0.91+ |
one thing | QUANTITY | 0.9+ |
Azure | TITLE | 0.9+ |
next five years | DATE | 0.89+ |
first | QUANTITY | 0.89+ |
next couple of years | DATE | 0.89+ |
Gardner | ORGANIZATION | 0.88+ |
4.0 | QUANTITY | 0.88+ |
end of this year | DATE | 0.85+ |
years ago | DATE | 0.85+ |
four aspects | QUANTITY | 0.81+ |
two great customers | QUANTITY | 0.8+ |
360 | QUANTITY | 0.8+ |
years | QUANTITY | 0.79+ |
four elements | QUANTITY | 0.78+ |
CLAIREview | ORGANIZATION | 0.78+ |
past few months | DATE | 0.77+ |
three times more people | QUANTITY | 0.74+ |
Cloud | TITLE | 0.71+ |
four pillars | QUANTITY | 0.71+ |
SiliconANGLE | ORGANIZATION | 0.67+ |
CDO | ORGANIZATION | 0.64+ |
ClAIRE | ORGANIZATION | 0.64+ |
Michael Stonebraker, TAMR | MIT CDOIQ 2019
>> From Cambridge, Massachusetts, it's theCUBE, covering the MIT Chief Data Officer and Information Quality Symposium 2019. Brought to you by SiliconANGLE Media. >> Welcome back to Cambridge, Massachusetts, everybody. You're watching theCUBE, the leader in live tech coverage, and we're covering the MIT CDOIQ conference. My name is Dave Vellante and I'm here with my co-host, Paul Gillin. Mike Stonebraker is here; the legend is founder and CTO of Tamr, as well as many other companies, and an inventor. Michael, thanks for coming back on theCUBE, good to see you again. >> Nice to be here. >> So this is kind of a repeat pattern for all of us. We kind of gather here in August at the CDO conference, and you're always the highlight of the show. You gave a talk this week on the top 10 big data mistakes. You're one of the few people who still use the term big data. I happen to like it; sad that it's out of vogue already, but people associate it with Hadoop and it's kind of waning. But regardless, so welcome. How'd the talk go? What were you talking about? >> So I talked to a lot of people who are doing analytics or doing operational data at scale, and most of them make a collection of bad mistakes. And so the talk was a litany of the blunders that I've seen people make, and the audience could relate to the blunders; most of the enterprises represented make a bunch of the blunders. So I think the number one blunder is not planning on moving most everything to the cloud. >> So that's interesting, because a lot of people would love to debate that, and I would imagine you probably could have given this talk 10 years ago and a lot of the blunders would be the same, but that's one that wouldn't have been there. But I tend to agree. I was one of the two hands that went up this morning when the speaker asked, is the cloud cheaper? For us it is, anyway. But so what? Why should everybody move everything to the cloud? Aren't there laws of physics, laws of economics, laws of the land that suggest maybe you
You put up the data center every 10 years on dhe. You do it on raised flooring in Cambridge. So sooner or later, the cloud guys are gonna be a lot cheaper. And the only thing that isn't gonna the only thing that will change that equation is For example, my lab is up the street with Frank Gehry building, and we have we have an I t i t department who runs servers in Cambridge. Uh, and they claim they're cheaper than the cloud. And they don't pay rent for square footage and they don't pay for electricity. So yeah, if if think externalities, If there are no externalities, the cloud is assuredly going to be cheaper. And then the other thing is that most everybody tonight that I talk thio including me, has very skewed resource demands. So in the cloud finding three servers, except for the last day of the month on the last day of the month. I need 20 servers. I just do it. If I'm doing on Prem, I've got a provision for peak load. And so again, I'm just way more expensive. So I think sooner or later these combinations of effects was going to send everybody to the cloud for most everything, >> and my point about the operating margins is difference in price and cost. I think James Hamilton's right on it. If he If you look at the actual cost of deploying, it's even lower than the price with the market allows them to their growing at 40 plus percent a year and a 35 $40,000,000,000 run rate company sooner, Sooner or >> later, it's gonna be a race to the lot of you >> and the only guys are gonna win. You have guys have the best cost structure. A >> couple other highlights from your talk. >> Sure, I think 2nd 2nd thing like Thio Thio, no stress is that machine learning is going to be a game is going to be a game changer for essentially everybody. And not only is it going to be autonomous vehicles. It's gonna be automatic. Check out. It's going to be drone delivery of most everything. Uh, and so you can, either. And it's gonna affect essentially everybody gonna concert of, say, categorically. Any job that is easy to understand is going to get automated. And I think that's it's gonna be majorly impactful to most everybody. So if you're in Enterprise, you have two choices. You can be a disrupt or or you could be a disruptive. And so you can either be a taxi company or you can be you over, and it's gonna be a I machine learning that's going going to be determined which side of that equation you're on. So I was a big blunder that I see people not taking ml incredibly seriously. >> Do you see that? In fact, everyone I talked who seems to be bought in that this is we've got to get on the bandwagon. Yeah, >> I'm just pointing out the obvious. Yeah, yeah, I think, But one that's not quite so obvious you're is a lot of a lot of people I talked to say, uh, I'm on top of data science. I've hired a group of of 10 data scientists, and they're doing great. And when I talked, one vignette that's kind of fun is I talked to a data scientist from iRobot, which is the guys that have the vacuum cleaner that runs around your living room. So, uh, she said, I spend 90% of my time locating the data. I want to analyze getting my hands on it and cleaning it, leaving the 10% to do data science job for which I was hired. Of the 10% I spend 90% fixing the data cleaning errors in my data so that my models work. So she spends 99% of her time on what you call data preparation 1% of her time doing the job for which he was hired. So data science is not about data science. It's about data integration, data cleaning, data, discovery. 
>> But your new latest venture, >> so tamer does that sort of stuff. And so that's But that's the rial data science problem. And a lot of people don't realize that yet, And, uh, you know they will. I >> want to ask you because you've been involved in this by my count and starting up at least a dozen companies. Um, 99 Okay, It's a lot. >> It's not overstated. You estimated high fall. How do you How >> do you >> decide what challenge to move on? Because they're really not. You're not solving the same problems. You're You're moving on to new problems. How do you decide? What's the next thing that interests you? Enough to actually start a company. Okay, >> that's really easy. You know, I'm on the faculty of M i t. My job is to think of news new ship and investigate it, and I come up. No, I'm paid to come up with new ideas, some of which have commercial value, some of which don't and the ones that have commercial value, like, commercialized on. So it's whatever I'm doing at the time on. And that's why all the things I've commercialized, you're different >> s so going back to tamer data integration platform is a lot of companies out there claim to do it day to get integration right now. What did you see? What? That was the deficit in the market that you could address. >> Okay, great question. So there's the traditional data. Integration is extract transforming load systems and so called Master Data management systems brought to you by IBM in from Attica. Talent that class of folks. So a dirty little secret is that that technology does not scale Okay, in the following sense that it's all well, e t l doesn't scale for a different reason with an m d l e t l doesn't scale because e t. L is based on the premise that somebody really smart comes up with a global data model For all the data sources you want put together. You then send a human out to interview each business unit to figure out exactly what data they've got and then how to transform it into the global data model. How to load it into your data warehouse. That's very human intensive. And it doesn't scale because it's so human intensive. So I've never talked to a data warehouse operator who who says I integrate the average I talk to says they they integrate less than 10 data sources. Some people 20. If you twist my arm hard, I'll give you 50. So a Here. Here's a real world problem, which is Toyota Motor Europe. I want you right now. They have a distributor in Spain, another distributor in France. They have a country by country distributor, sometimes canton by Canton. Distribute distribution. So if you buy a Toyota and Spain and move to France, Toyota develops amnesia. The French French guys know nothing about you. So they've got 250 separate customer databases with 40,000,000 total records in 50 languages. And they're in the process of integrating that. It was single customer database so that they can Duke custom. They could do the customer service we expect when you cross cross and you boundary. I've never seen an e t l system capable of dealing with that kind of scale. E t l dozen scale to this level of problem. >> So how do you solve that problem? >> I'll tell you that they're a tamer customer. I'll tell you all about it. Let me first tell you why MGM doesn't scare. >> Okay. Great. >> So e t l says I now have all your data in one place in the same format, but now you've got following problems. You've got a d duplicated because if if I if I bought it, I bought a Toyota in Spain, I bought another Toyota in France. I'm both databases. 
So if you want to avoid double counting customers, you got a dupe. Uh, you know, got Duke 30,000,000 records. And so MGM says Okay, you write some rules. It's a rule based technology. So you write a rule. That's so, for example, my favorite example of a rule. I don't know if you guys like to downhill downhill skiing, All right? I love downhill skiing. So ski areas, Aaron, all kinds of public databases assemble those all together. Now you gotta figure out which ones are the same the same ski area, and they're called different names in different addresses and so forth. However, a vertical drop from bottom to the top is the same. Chances are they're the same ski area. So that's a rule that says how to how to put how to put data together in clusters. And so I now have a cluster for mount sanity, and I have a problem which is, uh, one address says something rather another address as something else. Which one is right or both? Right, so now you want. Now you have a gold. Let's call the golden Record problem to basically decide which, which, which data elements among a variety that maybe all associated with the same entity are in fact correct. So again, MDM, that's a rule's a rule based system. So it's a rule based technology and rule systems don't scale the best example I can give you for why Rules systems don't scale. His tamer has another customer. General Electric probably heard of them, and G wanted to do spend analytics, and so they had 20,000,000 spend transactions. Frank the year before last and spend transaction is I paid $12 to take a cab from here here to the airport, and I charged it to cost center X Y Z 20,000,000 of those so G has a pre built classification system for spend, so they have parts and underneath parts or computers underneath computers and memory and so forth. So pre existing preexisting class classifications for spend they want to simply classified 20,000,000 spent transactions into this pre existing hierarchy. So the traditional technology is, well, let's write some rules. So G wrote 500 rules, which is about the most any single human I can get there, their arms around so that classified 2,000,000 of the 20,000,000 transactions. You've now got 18 to go and another 500 rules is not going to give you 2,000,000 more. It's gonna give you love diminishing returns, right? So you have to write a huge number of rules and no one can possibly understand. So the technology simply doesn't scale, right? So in the case of G, uh, they had tamer health. Um, solve this. Solved this classification problem. Tamer used their 2,000,000 rule based, uh, tag records as training data. They used an ML model, then work off the training data classifies remaining 18,000,000. So the answer is machine learning. If you don't use machine learning, you're absolutely toast. So the answer to MDM the answer to MGM doesn't scale. You've got to use them. L The answer to each yell doesn't scale. You gotta You're putting together disparate records can. The answer is ml So you've got to replace humans by machine learning. And so that's that seems, at least in this conference, that seems to be resonating, which is people are understanding that at scale tradition, traditional data integration, technology's just don't work >> well and you got you got a great shot out on yesterday from the former G S K Mark Grams, a leader Mark Ramsay. Exactly. Guys. And how they solve their problem. He basically laid it out. BTW didn't work and GM didn't work, All right. 
I mean, kick it, kick the can top down data modelling, didn't work, kicked the candid governance That's not going to solve the problem. And But Tamer did, along with some other tooling. Obviously, of course, >> the Well, the other thing is No. One technology. There's no silver bullet here. It's going to be a bunch of technologies working together, right? Mark Ramsay is a great example. He used his stream sets and a bunch of other a bunch of other startup technology operating together and that traditional guys >> Okay, we're good >> question. I want to show we have time. >> So with traditional vendors by and large or 10 years behind the times, And if you want cutting edge stuff, you've got to go to start ups. >> I want to jump. It's a different topic, but I know that you in the past were critic of know of the no sequel movement, and no sequel isn't going away. It seems to be a uh uh, it seems to be actually gaining steam right now. What what are the flaws in no sequel? It has your opinion changed >> all? No. So so no sequel originally meant no sequel. Don't use it then. Then the marketing message changed to not only sequel, So sequel is fine, but no sequel does others. >> Now it's all sequel, right? >> And my point of view is now. No sequel means not yet sequel because high level language, high level data languages, air good. Mongo is inventing one Cassandra's inventing one. Those unless you squint, look like sequel. And so I think the answer is no sequel. Guys are drifting towards sequel. Meanwhile, Jason is That's a great idea. If you've got your regular data sequel, guys were saying, Sure, let's have Jason is the data type, and I think the only place where this a fair amount of argument is schema later versus schema first, and I pretty much think schema later is a bad idea because schema later really means you're creating a data swamp exactly on. So if you >> have to fix it and then you get a feel of >> salary, so you're storing employees and salaries. So, Paul salaries recorded as dollars per month. Uh, Dave, salary is in euros per week with a lunch allowance minds. So if you if you don't, If you don't deal with irregularities up front on data that you care about, you're gonna create a mess. >> No scheme on right. Was convenient of larger store, a lot of data cheaply. But then what? Hard to get value out of it created. >> So So I think the I'm not opposed to scheme later. As long as you realize that you were kicking the can down the road and you're just you're just going to give your successor a big mess. >> Yeah, right. Michael, we gotta jump. But thank you so much. Sure appreciate it. All right. Keep it right there, everybody. We'll be back with our next guest right into the short break. You watching the cue from M i t cdo Ike, you right back
ENTITIES
Entity | Category | Confidence |
---|---|---|
Michael | PERSON | 0.99+ |
James | PERSON | 0.99+ |
Mark Ramsay | PERSON | 0.99+ |
James Hamilton | PERSON | 0.99+ |
Paul Galen | PERSON | 0.99+ |
Dave DeWitt | PERSON | 0.99+ |
Toyota | ORGANIZATION | 0.99+ |
David Monty | PERSON | 0.99+ |
General Electric | ORGANIZATION | 0.99+ |
2,000,000 | QUANTITY | 0.99+ |
France | LOCATION | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
20,000,000 | QUANTITY | 0.99+ |
10% | QUANTITY | 0.99+ |
Michael Stonebraker | PERSON | 0.99+ |
Cambridge | LOCATION | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
50 | QUANTITY | 0.99+ |
$12 | QUANTITY | 0.99+ |
Spain | LOCATION | 0.99+ |
18,000,000 | QUANTITY | 0.99+ |
25% | QUANTITY | 0.99+ |
20 servers | QUANTITY | 0.99+ |
90% | QUANTITY | 0.99+ |
Columbia River Valley | LOCATION | 0.99+ |
99% | QUANTITY | 0.99+ |
18 | QUANTITY | 0.99+ |
Aaron | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
August | DATE | 0.99+ |
Silicon Angle Media | ORGANIZATION | 0.99+ |
three servers | QUANTITY | 0.99+ |
35 $40,000,000,000 | QUANTITY | 0.99+ |
50 languages | QUANTITY | 0.99+ |
500 rules | QUANTITY | 0.99+ |
22 things | QUANTITY | 0.99+ |
10 data scientists | QUANTITY | 0.99+ |
Mike Stone | PERSON | 0.99+ |
Cambridge, Massachusetts | LOCATION | 0.99+ |
MGM | ORGANIZATION | 0.99+ |
less than 10 data sources | QUANTITY | 0.99+ |
Ian | PERSON | 0.99+ |
Paul | PERSON | 0.99+ |
1% | QUANTITY | 0.99+ |
both | QUANTITY | 0.99+ |
Toyota Motor Europe | ORGANIZATION | 0.99+ |
Of Tamer | ORGANIZATION | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
one | QUANTITY | 0.99+ |
single | QUANTITY | 0.99+ |
Attica | ORGANIZATION | 0.99+ |
10 years ago | DATE | 0.99+ |
yesterday | DATE | 0.99+ |
iRobot | ORGANIZATION | 0.99+ |
Mark Grams | PERSON | 0.99+ |
TAMR | PERSON | 0.99+ |
10 years | QUANTITY | 0.99+ |
20 | QUANTITY | 0.98+ |
1/4 | QUANTITY | 0.98+ |
250 separate customer databases | QUANTITY | 0.98+ |
Cassandra | PERSON | 0.98+ |
First thing | QUANTITY | 0.98+ |
30,000,000 records | QUANTITY | 0.98+ |
both databases | QUANTITY | 0.98+ |
18 months ago | DATE | 0.98+ |
first | QUANTITY | 0.98+ |
M I t CDO | EVENT | 0.98+ |
One blunder | QUANTITY | 0.98+ |
Tamer | PERSON | 0.98+ |
one place | QUANTITY | 0.98+ |
second | QUANTITY | 0.97+ |
two choices | QUANTITY | 0.97+ |
tonight | DATE | 0.97+ |
each business unit | QUANTITY | 0.97+ |
Thio Thio | PERSON | 0.97+ |
two hands | QUANTITY | 0.96+ |
this week | DATE | 0.96+ |
Frank | PERSON | 0.95+ |
Duke | ORGANIZATION | 0.95+ |
John Lieto, Wolters Kluwer | Informatica World 2019
(upbeat music) >> Live from Las Vegas, it's theCUBE! Covering Informatica World 2019. Brought to you by Informatica. >> Welcome back everyone to theCUBE's live coverage of Informatica World here in Las Vegas. I am your host, Rebecca Knight. We are joined by John Lieto. He is the Director, Data Management at Wolters Kluwer. Thank you so much for coming on the show. >> Very welcome. >> So, Wolters Kluwer is a global provider of professional information, software solutions, tax information. Tell our viewers a little bit more about the company and about your role at the company. >> Yeah, so Wolters Kluwer, I would say probably 20 years ago, was a typical holding company. It has a very long history of publishing in Europe; it's over 185 years old in Europe. But it went on a journey to acquire businesses that were in the services business with a focus on legal, but there are also big concentrations in health divisions, tax and accounting, really a professional company. Very, very, very big in print. What happened over the last 10, 15 years though, it's completely flipped over to digital. In fact, it's been one of the more successful transformations. So now we're mostly in the digital space and electronic space. So where I come in, and my business unit comes in, CT Corporation is a 126-year-old company. Number one player in registered agent services. Legal information, helping companies like Informatica stay in compliance. The United States is 50 states with 50 sets of rules, plus international. So typically, companies of any size get a provider. Sometimes their law firms will do it, but a lot of times, it's going to be CT Corporation, things like that. My role in the company, I've been there 19 years, I've had a mix of roles, mostly in the business but a little technical. I'm the Director of Data Management, I am basically in charge of managing governance and data quality for the business. It is focused on the customer right now and all things related to customer, but we're expanding into other domains like vendors, products, suppliers and supporting a pretty large digital transformation. >> So I'm sure in your role you have a lot of practical insights for MDM practitioners but before we go there, I want to hear from you about the customer mindset, I mean, this is a moment for data governance and security... >> Sure >> and privacy, a real inflection point, and like Wolters Kluwer, so many companies undergoing their own digital transformations. How would you describe the customer mindset about all of this? How are customers wrapping their brains around it? >> So for us, we're not in a very regulated business. We touch customers that are heavily regulated, but we're not, we're a service company, right? Most of the stuff, the data we deal with is public knowledge, right? A company's data is public knowledge, you can go in any state website and find out when Informatica was formed, who the board of directors are, so it's all public. But customers are extremely sensitive about where their data is, and what we're doing with it, so we were on top of that, especially for our foreign customers. Internally at CT and Wolters Kluwer we have to be very, very, very customer-focused 'cause it's a very direct service, right? So it's all about the customer. How we got to this point of using Informatica MDM, Master Data Management, is trying to get close to the customer, trying to understand the customer. 
Our customers go from J P Morgan to these big, big, big companies that have investments in companies that you wouldn't even know they're related to that customer. So they rely on us to help them stay compliant. How do I deal with these diverse businesses that are under my portfolio, and how do I keep them compliant in the States? So we have all this data and we help our customers understand it, and know what to do next, almost anticipate where they're going to fall out of compliance in the State. >> So what is your advice for the people who are really starting, for the executives starting at square one, trying to think about a master data management solution? >> Yeah, great question. And it's really where the heart of my devotion has been the last year. I would say the most important thing is start with a business case. Understand where your business is going. Make it about what outcomes you are looking for. Really thoroughly understand that. Also take the systems or the subjects that are important to you, your company, and profile it. Understand that data. You can come to an MDM project, a master data management project, with so much knowledge first. Don't just say, well everybody is doing master data management, we should do it too. I mean, it might be true, but you're really not going to get the outcomes. And then focus your project to hit those business goals, 'cause MDM is a process and a tool, it's not an answer. You need to use that tool to get to where you are, so for us the number one thing was reduce duplication, okay, MDM tools do that, so we're trying to get to the golden record, okay. Data quality, I don't have good phone numbers, I have bad email addresses, oh, master data management does that too. So, again, it's going for the outcomes you're driving for, and MDM happens to be a good tool for that. >> So it's really about defining the objectives before you even jump in. >> Absolutely. >> Do you recommend experiments? What's the approach you... >> Wonderful question. In data we call it profiling, right? And you want to go in small wins, because one of the things that will happen to anyone in this space is the business is really not sure about this investment. These days, data is becoming so huge that it's becoming a lot easier for guys like me to win a business case, but two years ago it was pretty hard. I'm sorry I just lost my train of thought. >> But that's an interesting point, just talking about overcoming the skepticism within these companies to latch on to this idea, and as you were saying, announcing the small wins, really getting everyone on board. >> Thank you. What we did is, we had profiled, found a problem, oh, we have definite duplication, we've got email addresses that are completely bogus. Let's just take those two. And we did small little pilots. We'd use tools we had, completely manual ad-hoc, let's fix 200 records, let's take a really important customer that we're trying to onboard, or expand, and let's fix that data, and then show the outcomes. Go for the quick wins. Communicate, communicate, communicate. Once we did that, and we did a series of, I want to say, 30 or 40 of these. That built our requirement set. We built the requirement set by doing. It was so easy that way to show victories, but also to really get the requirements to a point where we could build the system. We happened to fall on that method, from prior learnings of not doing well on projects that had nothing to do with MDM. 
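The profiling step John describes above, find the duplicates and the bogus email addresses first, then build the business case around those quick wins, can be sketched in a few lines of pandas. The column names and the customer sample below are hypothetical; this is only an illustration of the kind of ad-hoc profiling pass he's describing, not Wolters Kluwer's actual tooling.

```python
# Hypothetical profiling pass: surface duplicate customers and bogus emails,
# the two quick wins described above. Column names and sample rows are invented.
import pandas as pd

customers = pd.DataFrame([
    {"name": "Acme Corp",  "state": "DE", "email": "legal@acme.com"},
    {"name": "ACME Corp.", "state": "DE", "email": "none"},
    {"name": "Globex LLC", "state": "NY", "email": "ops@globex"},
])

# Crude duplicate check: same normalized name within the same state.
customers["name_key"] = (customers["name"].str.lower()
                         .str.replace(r"[^a-z0-9]", "", regex=True))
dupes = customers[customers.duplicated(subset=["name_key", "state"], keep=False)]

# Crude email validity check -- good enough for profiling, not for production use.
bad_emails = customers[~customers["email"].str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")]

print(f"{len(dupes)} likely duplicate records")
print(f"{len(bad_emails)} records with bogus email addresses")
```

Counts like these are exactly the kind of evidence a small pilot can put in front of the business before any platform is bought.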
So for this one, I think the other piece of advice that I would give folks, is we built a data management team of business analysts that know our business and data. It is really critical that you keep this function out of IT. IT is your supporter and your partner. This does not go to IT. So we know our data. I have a guy on my team that's 45 years in the company, a woman who's 28 years in the company, just for example. So we can do a lot without a tool, and what's happening is now we are live for going on eight months now, and we're staying on top, making sure the tool's delivering what it's supposed to deliver, based on our deep knowledge. >> And I think that what you're talking about really, is introducing this technology and this new way of thinking, and it's really all about change management. >> It truly is. >> One of the things that we're talking a lot about here on theCUBE is the skills gap, and this is a problem throughout the technology industry. How big a problem is it for you at Wolters Kluwer? And what are you doing to make sure that you have the right technical talent on your team, and as we're saying, not just the technical talent but also the understanding of the business? >> One thing to understand is Wolters Kluwer is a fairly big company, and we as a company are just starting this journey. I have a small data management team in one business unit at Wolters Kluwer. There's another business unit within our health division that has data management, and that's all that I know of that has formal data management. That's pretty small, so it's just beginning. What we're doing, we're trying to communicate, communicate, communicate. I am having some success because in our next huge journey, which is a digital transformation, a six-year project, data now is central. I've been asked to actually be the business sponsor for the data track, which, two years ago, that would not have happened. So I take that as a win, but you make a fair point, skills and understanding, both at the business and technical level is always a challenge, and it's justifying bringing in that skill set: "No, we can just outsource that," or "we'll just use a consultant." I'm right now fighting a battle to bring in a data architect, full-time, they don't understand that... >> Just that role. >> You have to architect things. We've now done that, so what you have, because I'm doing the data governance piece right now, and what I'm finding is, it's not the Wild West, but you can't always know what the parts of the organization are doing, and a lack of an architect is not keeping all the plumbing centralized. So, as I build this data governance, I'm going to centralize data definitions and a data glossary, data catalog, but I'm going to be looking around and going, okay, how do I actually have the technology piece architected correctly? And that's the piece I'm really trying to push, so hopefully, with this data layer we're building, my goal is to prove to the business that you need to fill this role. It's not me, it's going to be someone who really is deep, deep, deep in architecture. >> Hire a contractor, get that small win. >> That's what we're doing. (laughing) >> And then, the proof. I learned that from you, John. >> I'm actually in the process of just doing that. >> Excellent! >> One of those vendors is here. >> Well, we'll look forward to talking to you next year and hearing an update. >> Yeah, there you go. >> John Lieto, thank you so much for coming on theCUBE. >> You're very welcome, thank you. 
>> I'm Rebecca Knight, we will have more of theCUBE's live coverage of Informatica World. Stay tuned! (upbeat music)
SUMMARY :
Brought to you by Informatica. He is the Director, Data Management about the company and about It is focused on the customer right now about the customer mindset, I mean, this is How would you describe the customer mindset Most of the stuff, the data we deal with in the State. to get to where you are, so for us So it's really about defining the objectives What's the approach you... because one of the things that will happen just talking about the overcoming It is really critical that you keep this function And I think that what you're talking about One of the things that we're talking a lot So I take that as a win, but you make it's not the Wild West, but you can't That's what we're doing. I learned that from you, John. Well, we'll look forward to talking to you John Lieto, thank you so much I'm Rebecca Knight, we will have more
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Rebecca Knight | PERSON | 0.99+ |
John Lieto | PERSON | 0.99+ |
Europe | LOCATION | 0.99+ |
Informatica | ORGANIZATION | 0.99+ |
J P Morgan | ORGANIZATION | 0.99+ |
Wolters Kluwer | ORGANIZATION | 0.99+ |
John | PERSON | 0.99+ |
eight months | QUANTITY | 0.99+ |
30 | QUANTITY | 0.99+ |
200 records | QUANTITY | 0.99+ |
45 years | QUANTITY | 0.99+ |
40 | QUANTITY | 0.99+ |
28 years | QUANTITY | 0.99+ |
six-year | QUANTITY | 0.99+ |
19 years | QUANTITY | 0.99+ |
last year | DATE | 0.99+ |
two | QUANTITY | 0.99+ |
next year | DATE | 0.99+ |
Las Vegas | LOCATION | 0.99+ |
50 sets | QUANTITY | 0.99+ |
United States | LOCATION | 0.99+ |
CT Corporations | ORGANIZATION | 0.99+ |
50 states | QUANTITY | 0.99+ |
both | QUANTITY | 0.98+ |
two years ago | DATE | 0.98+ |
One | QUANTITY | 0.98+ |
CT Corporation | ORGANIZATION | 0.97+ |
126-year-old | QUANTITY | 0.97+ |
CT | ORGANIZATION | 0.97+ |
one | QUANTITY | 0.97+ |
Massive Data Management | ORGANIZATION | 0.94+ |
20 years ago | DATE | 0.93+ |
Informatica MDM | ORGANIZATION | 0.91+ |
rules | QUANTITY | 0.9+ |
over 185 years old | QUANTITY | 0.9+ |
one business unit | QUANTITY | 0.88+ |
2019 | DATE | 0.87+ |
theCUBE | ORGANIZATION | 0.85+ |
One thing | QUANTITY | 0.83+ |
World | TITLE | 0.79+ |
Informatica World | EVENT | 0.74+ |
10 | QUANTITY | 0.71+ |
Informatica World | ORGANIZATION | 0.6+ |
Number one player | QUANTITY | 0.6+ |
15 years | QUANTITY | 0.57+ |
Wild West | LOCATION | 0.55+ |
Abhiman Matlapudi & Rajeev Krishnan, Deloitte | Informatica World 2019
>> Live from Las Vegas. It's theCUBE. Covering Informatica World 2019, brought to you by Informatica. >> Welcome back everyone to theCUBE's live coverage of Informatica World. I am your host, Rebecca Knight, along with co-host, John Furrier. We have two guests for this segment. We have Abhiman Matlapudi. He is the Product Master at Deloitte. Welcome. >> Thanks for having us. >> And we have Kubalahm Rajeev Krishnan, Specialist Leader at Deloitte. Thank you both so much for coming on theCUBE. >> Thanks Rebecca, John. It's always good to be back on theCUBE. >> Love the new logos here, what's the pins? What's the new take on those? >> It looks like a honeycomb! >> Yeah, so interesting that you ask, so this is our joined Deloitte- Informatica label pin. You can see the Deloitte green colors, >> Nice! They're beautiful. >> And the Informatica colors. This shows the collaboration, the great collaboration that we've had over, you know, the past few years and plans, for the future as well. Well that's what we're here to talk about. So why don't you start the conversation by telling us a little bit about the history of the collaboration, and what you're planning ahead for the future. Yeah. So, you know, if we go like you know, ten years back the collaboration between Deloitte and Informatica has not always been that, that strong and specifically because Deloitte is a huge place to navigate, and you know, in order to have those meaningful collaborations. But over the past few years, we've... built solid relationships with Informatica and vise versa. I think we seek great value. The clear leaders in the Data Management Space. It's easy for us to kind of advise clients in terms of different facets of data management. You know, because no other company actually pulls together you know, the whole ecosystem this well. >> Well you're being polite. In reality, you know where it's weak and where it's real. I mean, the reality is there's a lot of fun out there, a lot of noise, and so, I got to ask you, cause this is the real question, because there's no one environment that's the same. Customers want to get to the truth faster, like, where's the deal? What's the real deal with data? What's gettable? What's attainable? What's aspirational? Because you could say "Hey, well I make data, data-driven organization, Sass apps everywhere." >> Yeah. Yeah absolutely. I mean every, every company wants to be more agile. Business agility is what's driving companies to kind of move all of their business apps to the Cloud. The uh, problem with that is that, is that people don't realize that you also need to have your data management governance house in order, right, so according to a recent Gartner study, they say by next year, 75% of companies who have moved their business apps to the Cloud, is going to, you know, unless they have their data management and data assets under control, they have some kind of information governance, that has, you know, context, or purview over all of these business apps, 50% of their data assets are going to erode in value. So, absolutely the need of the hour. So we've seen that great demand from our clients as well, and that's what we've been advising them as well. >> What's a modern MDM approach? Because this is really the heart of the conversation, we're here at Informatica World. What's- What does it look like? What is it? >> So I mean, there are different facets or functionalities within MDM that actually make up what is the holistic modern MDM, right. 
In the past, we've seen companies doing MDM to get to that 360-degree view. Somewhere along the line, the ball gets dropped. That 360 view doesn't get combined with your data warehouse and all of the transaction information, right, and, you know, your business uses don't get the value that they were looking for while they invested in that MDM platform. So in today's world, MDM needs to provide front office users with the agility that they need. It's not about someone at the back office doing some data stewardship. It's all about empowering the front office users as well. There's an aspect of AIML from a data stewardship perspective. I mean everyone wants cost take out, right, I mean there's fewer resources and more data coming in. So how how do you manage all of the data? Absolutely you need to have AIML. So Informatica's CLAIRE product helps with suggestions and recommendations for algorithms, matching those algorithms. Deloitte has our own MDM elevate solution that embeds AIML for data stewardship. So it learns from human data inputs, and you know, cuts through the mass of data records that have to be managed. >> You know Rajeev, it was interesting, last year we were talking, the big conversation was moving data around is really hard. Now there's solutions for that. Move the data integrity on premise, on Cloud. Give us an update on what's going on there, because there seems to be a lot of movement, positive movement, around that. In terms of, you know, quality, end to end. We heard Google up here earlier saying "Look, we can go into end to end all you want". This has been a big thing. How are you guys handling this? >> Yeah absolutely, so in today's key note you heard Anil Chakravarthy and Thomas Green up on the stage and Anil announced MDM on GCP, so that's an offering that Deloitte is hosting and managing. So it's going to be an absolutely white-glove service that gives you everything from advice to implement to operate, all hosted on GCP. So it's a three-way ecosystem offering between Deloitte, Informatica, and GCP. >> Well just something about GCP, just as a side note before you get there, is that they are really clever. They're using Sequel as a way to abstract all the under the hood kind of configuration stuff. Smart move, because there's a ton of Sequel people out there! >> Exactly. >> I mean, it's not structured query language for structured data. It's lingua franca for data. They've been changing the game on that. >> Exactly, it should be part of their Cloud journey. So organizations, when they start thinking about Cloud, first of all, what they need to do is they have to understand where all the data assets are and they read the data feeds coming in, where are the data lakes, and once they understand where their datas are, it's not always wise, or necessary to move all their data to the Cloud. So, Deloitte's approach or recommendation is to have a hybrid approach. So that they can keep some of their legacy datas, data assets, in the on premise and some in the Cloud applications. So, Informatica, MDM, and GCP, powered by Deloitte, so it acts as an MDM nimble hub. In respect of where your data assets are, it can give you the quick access to the data and it can enrich the data, it can do the master data, and also it can protect your data. And it's all done by Informatica. >> Describe what a nimble hub is real quick. What does a nimble hub mean? What does that mean? 
>> So it means that, in respect of wherever your data is coming in and going out, so it gives you a very light feeling that the client wouldn't know. All we- Informatica, MDM, on GCP powered by Deloitte, what we are saying is we are asking clients to just give the data. And everything, as Rajeev said, it's a white-glove approach. It's that from engagement, to the operation, they will just feel a seamless support from Deloitte. >> Yeah, and just to address the nimbleness factor right, so we see clients that suddenly need to get into new market, or they want to say, introduce a new product, so they need the nimbleness from a business perspective. Which means that, well suddenly you've got to like scale up and down your data workloads as well, right? And that's not just transactional data, but master data as well. And that's where the Cloud approach, you know, gives them a positive advantage. >> I want to get back to something Abhiman said about how it's not always wise or necessary to move to the Cloud. And this is a debate about where do you keep stuff. Should it be on on prem, and you said that Deloitte recommends a hybrid approach and I'm sure that's a data-driven recommendation. I'm wondering what evidence you have and what- why that recommendation? >> So, especially when it depends on the applications you're putting on for MDM, and the sources and data is what you are trying to get, for the Informatica MDM to work. So, it's not- some of your social systems are already tied up with so many other applications within your on premise, and they don't want to give every other data. And some might have concerns of sending this data to the Cloud. So that's when you want to keep those old world legacy systems, who doesn't want to get upgrades, to your on premise, and who are all Cloud-savy and they can all starting new. So they can think of what, and which, need a lot of compute power, and storage. And so those are the systems we want to recommend to the Cloud. So that's why we say, think where you want to move your data bases. >> And some of it is also driven by regulation, right, like GDPR, and where, you know, which providers offer in what countries. And there's also companies that want to say "Oh well my product strategy and my pricing around products, I don't want to give that away to someone." Especially in the high tech field, right. Your provider is going to be a confidere. >> Rajeev, one of the things I'm seeing here in this show, is clearly that the importance of the Cloud should not be understated. You see, and you guys, you mentioned you get the servers at Google. This is changing not just the customers opportunity, but your ability to service them. You got a white-glove service, I'm sure there's a ton more head room. Where do you guys see the Cloud going next? Obviously it's not going away, and the on premise isn't going away. But certainly, the importance of the Cloud should not be understated. That's what I'm hearing clearly. You see Amazon, Azure, Google, all big names with Informatica. But with respect to you guys, as you guys go out and do your services. This is good for business. For you guys, helping customers. >> Yeah absolutely, I think there's value for us, there's value for our clients. You know, it's not just the apps that are kind of going to the Cloud, right? I mean you see all data platforms that are going to the Cloud. For example, Cloudera. They just launched CDP. Being GA by July- August. 
You know, Snowflake's on the Cloud doing great, getting good traction in the market. So eventually what were seeing is, whether it's business applications or data platforms, they're all moving to the Cloud. Now the key things to look out for in the future is, how do we help our clients navigate a multi Cloud environment, for example, because sooner or later, they wouldn't want to have all of their eggs invested in one basket, right? So, how do we help navigate that? How do we make that seamless to the business user? Those are the challenges that we're thinking about. >> What's interesting about Databricks and Snowflake, you mentioned them, is that it really is a tell sign that start-ups can break through and crack the enterprise with Cloud and the ecosystem. And you're starting to see companies that have a Sass-like mindset with technology. Coming into an enterprise marketed with these ecosystems, it's a tough crowd believe me, you know the enterprise. It's not easy to break into the enterprise, so for Databricks and Snowflake, that's a huge tell sign. What's your reaction to that because it's great for Informatica because it's validation for them, but also the start-ups are now growing very fast. I mean, I wouldn't call Snowflake 3 billion dollar start-up their unicorn but, times three. But it's a tell sign. It's just something new we haven't seen. We've seen Cloudera break in. They kind of ramped their way in there with a lot of raise and they had a big field sales force. But Data Bear and Snowflake, they don't have a huge set in the sales force. >> Yeah, I think it's all about clients and understanding, what is the true value that someone provides. Is it someone that we can rely on to keep our data safe? Do they have the capacity to scale? If you can crack those things, then you'll be in the market. >> Who are you attracting to the MDM on Google Cloud? What's the early data look like? You don't have to name names, but whats some of the huge cases that get the white glove service from Deloitte on the Google Cloud? Tell us about that. Give us more data on that. >> So we've just announced that, here at Informatica World, we've got about three to four mid to large enterprises. One large enterprise and about three mid-size companies that are interested in it. So we've been in talks with them in terms of- and that how we want to do it. We don't want to open the flood gates. We'd like to make sure it's all stable, you know, clients are happy and there's word of mouth around. >> I'm sure the end to end management piece of it, that's probably attractive. The end to end... >> Exactly. I mean, Deloitte's clearly the leader in the data analytics space, according to Gartner Reports. Informatica is the leader in their space. GCP has great growth plans, so the three of them coming together is going to be a winner. >> One of the most pressing challenges facing the technology industry is the skills gap and the difficulty in finding talent. Surveys show that I.T. managers can't find qualified candidates for open Cloud roles. What are Deloitte's thought on this and also, what are you doing as a company to address it? >> I mean, this is absolutely a good problem to have, for us. Right, which means that there is a demand. But unless we beat that demand, it's a problem. So we've been taking some creative ways, in terms of addressing that. 
An example would be our analytics foundry offering, where we provide a pod of people that go from data engineers you know, with Python and Sparks skills, to, you know, Java associates, to front end developers. So a whole stack of developers, a full stack, we provide that full pod so that they can go and address a particular business analytics problem or some kind of visualization issues, in terms of what they want to get from the data. So, we teach Leverate that pod, across multiple clients, I think that's been helping us. >> If you could get an automated, full time employee, that would be great. >> Yeah, and this digital FD concept is something that we'd be looking at, as well. >> I would like to add on that, as well. So, earlier- with the data disruption, Informatica's so busy and Informatica's so busy that Deloitte is so busy. Now, earlier we used plain Informatica folks and then, later on because of the Cloud disruption, so we are training them on the Cloud concepts. Now what the organizations have to think, or the universities to think is that having the curriculum, the Cloud concepts in their universities and their curriculum so that they get all their Cloud skills and after, once they have their Cloud skills, we can train them on the Informatica skills. And Informatica has full training on that. >> I think it's a great opportunity for you guys. We were talking with Sally Jenkins to the team earlier, and the CEO. I was saying that it reminds me of early days of VMware, with virtualization you saw the shift. Certainly the economics. You replaced servers, do a virtual change to the economics. With the data, although not directly, it's a similar concept where there's new operational opportunities, whether it's using leverage in Google Cloud for say, high-end, modern data warehousing to whatever. The community is going to respond. That's going to be a great ecosystem money making opportunity. The ability to add new services, give you guys more capabilities with customers to really move the needle on creating value. >> Yeah, and it's interesting you mention VMware because I actually helped, as VMware stood up there, VMCA, AW's and NSA's offerings on the Cloud. We actually helped them get ready for that GA and their data strategy, in terms of support, both for data and analytics friendliness. So we see a lot of such tech companies who are moving to a flexible consumption service. I mean, the challenges are different and we've got a whole practice around that flex consumption. >> I'm sure Informatica would love the VMware valuation. Maybe not worry for Dell technology. >> We all would love that. >> Rajeem, Abhiman, thank you so much for joining us on theCube today. >> Thank you very much. Good talking to you. >> I'm Rebecca Knight for John Furrier. We will have more from Informatica World tomorrow.
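The AI-assisted stewardship Rajeev describes earlier in this interview, CLAIRE-style suggestions that score likely duplicate customer records so a human steward only reviews the borderline cases, can be illustrated with a small sketch. This is not Informatica's or Deloitte's actual matching logic; the fields, weights, and thresholds below are hypothetical, using only Python's standard library, and real platforms use trained models over far more features.

```python
# Hypothetical match-scoring sketch for MDM-style duplicate detection.
# Weights and thresholds are invented; this only shows the shape of the workflow.
from difflib import SequenceMatcher
from itertools import combinations

records = [
    {"id": 1, "name": "Jane Doe",  "city": "Hartford"},
    {"id": 2, "name": "Jane  Doe", "city": "Hartford"},
    {"id": 3, "name": "J. Smith",  "city": "Boston"},
]

def norm(s: str) -> str:
    return " ".join(s.lower().split())

def match_score(r1: dict, r2: dict) -> float:
    name_sim = SequenceMatcher(None, norm(r1["name"]), norm(r2["name"])).ratio()
    city_sim = 1.0 if norm(r1["city"]) == norm(r2["city"]) else 0.0
    return 0.7 * name_sim + 0.3 * city_sim  # hypothetical weights

AUTO_MERGE, REVIEW = 0.90, 0.75  # hypothetical thresholds

for r1, r2 in combinations(records, 2):
    score = match_score(r1, r2)
    if score >= AUTO_MERGE:
        verdict = "auto-merge into golden record"
    elif score >= REVIEW:
        verdict = "route to data steward for review"
    else:
        verdict = "keep separate"
    print(r1["id"], r2["id"], round(score, 2), verdict)
```

The value of the ML layer in real tools is in tuning those scores and thresholds from steward feedback, so fewer records ever reach the manual-review bucket.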
SUMMARY :
brought to you by Informatica. He is the Product Master at Deloitte. Thank you both so much for coming on theCUBE. It's always good to be back on theCUBE. Yeah, so interesting that you ask, They're beautiful. to navigate, and you know, I mean, the reality is there's a lot of fun out there, is that people don't realize that you also need What does it look like? and all of the transaction information, right, "Look, we can go into end to end all you want". So it's going to be an absolutely white-glove service just as a side note before you get there, They've been changing the game on that. and it can enrich the data, What does that mean? It's that from engagement, to the operation, And that's where the Cloud approach, you know, and you said that Deloitte recommends a hybrid approach think where you want to move your data bases. right, like GDPR, and where, you know, is clearly that the importance of the Cloud Now the key things to look out for in the future is, and crack the enterprise with Cloud and the ecosystem. Do they have the capacity to scale? What's the early data look like? We'd like to make sure it's all stable, you know, I'm sure the end to end management piece of it, the data analytics space, according to Gartner Reports. One of the most pressing challenges facing the I mean, this is absolutely a good problem to have, for us. If you could get an automated, full time employee, Yeah, and this digital FD concept is something that the Cloud concepts in their universities and their and the CEO. Yeah, and it's interesting you mention VMware because I'm sure Informatica would love the VMware valuation. thank you so much for joining us on theCube today. Thank you very much. I'm Rebecca Knight for John Furrier.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Stephane Monoboisset | PERSON | 0.99+ |
Anthony | PERSON | 0.99+ |
Teresa | PERSON | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
Rebecca | PERSON | 0.99+ |
Informatica | ORGANIZATION | 0.99+ |
Jeff | PERSON | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
Teresa Tung | PERSON | 0.99+ |
Keith Townsend | PERSON | 0.99+ |
Jeff Frick | PERSON | 0.99+ |
Peter Burris | PERSON | 0.99+ |
Rebecca Knight | PERSON | 0.99+ |
Mark | PERSON | 0.99+ |
Samsung | ORGANIZATION | 0.99+ |
Deloitte | ORGANIZATION | 0.99+ |
Jamie | PERSON | 0.99+ |
John Furrier | PERSON | 0.99+ |
Jamie Sharath | PERSON | 0.99+ |
Rajeev | PERSON | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
Jeremy | PERSON | 0.99+ |
Ramin Sayar | PERSON | 0.99+ |
Holland | LOCATION | 0.99+ |
Abhiman Matlapudi | PERSON | 0.99+ |
2014 | DATE | 0.99+ |
Rajeem | PERSON | 0.99+ |
Jeff Rick | PERSON | 0.99+ |
Savannah | PERSON | 0.99+ |
Rajeev Krishnan | PERSON | 0.99+ |
three | QUANTITY | 0.99+ |
Savannah Peterson | PERSON | 0.99+ |
France | LOCATION | 0.99+ |
Sally Jenkins | PERSON | 0.99+ |
George | PERSON | 0.99+ |
Stephane | PERSON | 0.99+ |
John Farer | PERSON | 0.99+ |
Jamaica | LOCATION | 0.99+ |
Europe | LOCATION | 0.99+ |
Abhiman | PERSON | 0.99+ |
Yahoo | ORGANIZATION | 0.99+ |
130% | QUANTITY | 0.99+ |
Amazon Web Services | ORGANIZATION | 0.99+ |
2018 | DATE | 0.99+ |
30 days | QUANTITY | 0.99+ |
Cloudera | ORGANIZATION | 0.99+ |
ORGANIZATION | 0.99+ | |
183% | QUANTITY | 0.99+ |
14 million | QUANTITY | 0.99+ |
Asia | LOCATION | 0.99+ |
38% | QUANTITY | 0.99+ |
Tom | PERSON | 0.99+ |
24 million | QUANTITY | 0.99+ |
Theresa | PERSON | 0.99+ |
Accenture | ORGANIZATION | 0.99+ |
Accelize | ORGANIZATION | 0.99+ |
32 million | QUANTITY | 0.99+ |
Sudhir Hasbe, Google Cloud | Informatica World 2019
>> Live from Las Vegas, it's theCUBE. Covering Informatica World 2019. Brought to you by Informatica. >> Welcome back, everyone to theCUBE's live coverage of Informatica World 2019 I'm your host, Rebecca Knight, along with my cohost, John Furrier. We are joined by Sudhir Hasbe. He is the director of product management at Google Cloud. Thank you so much for coming on theCUBE. >> Thank you for inviting me. (laughing) >> So, this morning we saw Thomas Kurian up on the main stage to announce the expanded partnership. Big story in Wall Street Journal. Google Cloud and Informatica Team Up to Tame Data. Tell us more about this partnership. >> So if you take a look at the whole journey of data within organizations, lot of data is still siloed in different systems within different environments. Could be a hybrid on-prem. It could be multi-cloud and all. And customers need this whole end-to-end experience where you can go ahead and take that data, move it to Cloud, do data cleansing on it, do data preparation. You want to be able to go ahead and govern the data, know what data you have, like a catalog. Informatica provides all of those capabilities. And if you look at Google Cloud, we have some highly differentiated services like Google BigQuery, which customers love across the globe, to go ahead and use for analytics. We can do large scale analytics. We have customers from few terabytes to 100-plus petabytes, and storing that amount of data in BigQuery, analyzing, getting value out of it. And from there, all the A.I. capabilities that we have built on top of it. This whole journey of taking data from wherever it is, moving it, cleansing it, and then actually getting value out of it with Big Query, as with our A.I. capabilities. That whole end-to-end experience is what customers need. And with this partnership, I think we are bringing all the key components our customers need together for a perfect fit. >> Sadhir, first of all, great to see you. Since Google Next, we just had a great event by the way this year, congratulations. >> Thanks. >> A lot of great momentum in the enterprise. Explain for a minute. What is the relationship, what is the partnership? Just take a quick minute to describe what it is with Informatica that you're doing. >> Yeah, that's great. I think if you take a look at it, you can bring two key areas together in this partnership. There's data management. How do you get data into Cloud, how do you govern it, manage it, understand it. And then there is analyze the data and AI. So the main thing that we're bring together is these two capabilities. What do I mean by that? The two key components that will be available for our customers is the Intelligent Cloud services from Informatica, which will be available on GCP, will run on GCP. This will basically make sure that the whole end-to-end capability for that platform, like data pipelines and data cleansing and preparation, everything is now available natively on GCP. That's one thing. What that will also do is, Informatica team has actually optimized the execution as part of this migration. What that means is, now you'll be able to use products like Data Cloud, Dataproc. You'll be able to use some of the AI capabilities in BigQuery to actually go do the data cleansing and preparation and process-- >> So when you say "execute", you mean "running." >> Yeah, just running software. >> Not executing, go to market, but executing software. >> Executing software. 
If you have a data pipeline, you can literally layer this Dataproc underneath to go ahead and run some of the key processes. >> And so the value to the customer is seamless-- >> Seamless integration. >> Okay, so as you guys get more enterprise savvy, and it's clear you guys are doing good work, and obviously Thomas has got the chops there. We've covered that on theCUBE many times. As you go forward, this Cloud formula seems to be taking shape. Amazon, Azure, Google, coming in, providing onboarding to Cloud and vice-versa, so those relationships. The customers are scratching their heads, going, "Okay, where do I fit in that?" So, when you talk to customers, how do you explain that? Because, unlike the old days in computer science and the computer industry, there was known practices. You built a data center, you provisioned some servers, you did some things. It was the general-purpose formula. But every company is different. Their journey's different. Their software legacy make-up's different. Could be born in the cloud with on-prem compliance needs. So, how do customers figure this out? What's the playbook? >> I think the big thing is this: There's a trend in the industry, across the board, to go ahead and be more data-driven, build a culture that is data-driven culture. And as customers are looking at it, what they are seeing is, "Hey, traditionally I was doing a lot of stuff. "Managing infrastructure. Let me go build a data center. "Let me buy machines." That is not adding that much value. It is because. "I need to go do that." That's why they did that. But the real value is, if I can get the data, I can go analyze it, I can get better decisions from it. If I can use machine learning to differentiate my services, that's where the value is. So, most customers are looking at it and saying, "Hey, I know what I need to do in the industry now, "is basically go ahead and focus more on insights "and less on infrastructure." But as doing this, the most important thing is, data is still, as you mentioned, siloed. It's different applications, different data centers, still sitting in different places. So, I think what is happening with what we announced today is making it easy to get that data into Google Cloud and then leveraging that to go ahead and get insights. That's where the focus is for us. And as you get more of these capabilities in the cloud as native services, from Infomatica and Google, customers can now focus more on how to derive value from the data. Putting the data into Cloud, cleansing it, and data preparation, and all of that, that becomes easier. >> Okay, so that brings the solution question to the table. With the solutions that you see with Infomatica, because again, they have a broad space, a horizontal, on-prem and cloud, and they have a huge customer base with enterprise, 25 years, and big data is their thing. What us case is their low-hanging fruit right now? Where are people putting their toe in the water? Where are they jumping full in? Where do you see that spectrum of solutions? >> Great question. There are two or three key scenarios that I see across the board with talking to a lot of customers. Even today, I spoke to a lot of customers at this show. And the first main thing I hear is this whole thing, modedernization of the data warehousing and analytics infrastructure. Lot of data is still siloed and stuck into these different data systems that are there within organizations. 
And, if you want to go ahead and leverage that data to build on top of the data, democratize it with everybody within the organization, or to leverage AI and machine learning on top of it, you need to unwind what you've done and just take that data and put into Cloud and all. I think modernization of data warehouses and analytics infrastructure is one key play across the IT systems and IT operations. >> Before you go on to the next one, I just want to drill down on that. Because one of the things we're hearing, obviously here and all of the places, is that if you constrain the data, machine learning and AI application ultimately fails. >> Yes. >> So, legacy silos. You mentioned that. But also regulatory things. I got to have privacy now, forget my customer, GDPR first-year anniversary, new regulatory things around, all kinds of data, nevermind outside the United States. But the cloud is appealing, of just throwing it in there as one thing. It's an agility lag issue. Because lagging is not good for AI. You want real-time data. You need to have it fast. How does a customer do that? Is it best to store it in the cloud first, on-premise, with mechanisms? What's your take on this? >> I think it's different in different scenarios. I talk a lot of customers on this. Not all data is restricted from going anywhere. I think there are some data sets you want to have good governance in place. For example, if you have PII data, if you have important customer information, you want to make sure that you take the right steps to govern it. You want to anonymize it. You want to make sure that the right amount of data, per the policies within the organization, only gets into the right systems. And I think this is where, also, the partnership is helpful, because with Infomatica, the tooling that they're provided, or as you mentioned over 25 years, allows customers to understand what these data sets are, what value they're providing. And so, you can do anonymization of data before it lands into Cloud and all of that. So I think one thing is the tooling around that, which is critical. And the second thing is, if you can identify data sets that are real-time, and they don't have business-critical or PII-critical data, that you're fine as a business process to be there, then you can derive a lot of data in real time from all the data sets. >> Tell me about Google's big capabilities, because you guys have a lot of internal power platform features. BigQuery is one of them. Is BigQuery the secret weapon? Is that the big power source for managing the data? >> I would just say: Our customers love BigQuery, primarily because of the capability it provides. There are different capabilities. Let me just list a few. One is: We can do analytics at scale. So as organizations grow, even if data sets are small within organization, what I have seen is, over a period of time, when you derive a lot of value from data, you will start collecting more data within organization. And so, you have to think about scale, whether you are starting with one terabyte or one petabyte or 100 petabytes, it doesn't matter. Analyzing data at scale is what we're really good at, at different types of scale. Second is: democratizing data. We have done a good job of making data available through different tooling, existing tooling that customers have invested in and our tooling, to make it available to everybody. AirAsia is a good example. 
They have been able to go ahead and give right insights to everybody within the organization, which has helped them go save 5 to 10% in their operational costs. So that's one great example of democratizing access to insights. The third big thing is machine learning and AI. We all know there are just lack of resources to do, at once, analytics with AI and machine learning in the industry. So our goal has been democratize it. Make it easy within an organization. So investments that we have done with BigQuery ML, where you can do machine learning with just simple SQL statements or AutoML tables, which basically allows you to just, within the UI, map and say, "That's table in BigQuery, here's a column that I want to predict, and just automatically figure out what model you want to create, and then we can use neural networks to go do that. I think that kind of investments is what customers love about it from the platform side. >> What about the partnership from a particular functional part of the company, marketing? There's the old adage: 50% of my marketing budget is wasted. I just don't know which one. This one could really change that. >> Exactly right. >> So talk a little bit about the impact of it on marketing. >> I think the main thing is, if you think about the biggest challenge that CMOs have within organizations is how do you better marketing analytics and optimize the spend? So, one of the thing that we're doing with the partnership is not just breaking the silos, getting the data in BigQuery, all of that side and data governance. But another thing is with master data management capability that Infomatica brings to table. Now you can have all of your data in BigQuery. You leverage the Customer 360 that MDM provides and now CMOs can actually say, "Hey, I have a complete view of my customer. "I can do better segmentation. I can do better targeting. "I can give them better service." So that is actually going to derive lot of value with our customers. >> I want to just touch on that once, see if I can get this right. What you just said, I think might be the question I was just about to ask, which is: What is unique about Google's analytical portfolio with Infomatica specifically? Because there's other cloud deals they have. They have Azure and AWS. What's unique about you guys and Infomatica? Was it that piece? >> Yeah, I think there are a few things. One is the whole end-to-end experience of basically getting the data, breaking the silos, doing data governance, this tight integration between our product portfolio, where now you can get a great experience within the native GCP environment. That's one. And then on the other side, Cloud for Marketing is a big, big initiative for us. We work with hundreds of thousand of customers across the globe on their marketing spend and optimizing their marketing. And this is one of the areas where we can work together to go ahead and help those CMOs to get more value from their marketing investments. >> One of the conversations we're having here on theCUBE, and really that we're having in the technology industry, is about the skills gap. I want to hear what you're doing at Google to tackle this problem. >> I think one of the big things that we're doing is just trying to-- I have this team internally. In planning, I use "radical simplicity." And radical simplicity is: How do we take things that we are doing today and make it extremely simple for the next generation of innovation that we're doing? 
All the investments and BigQuery ML, you SQL for mostly everything. One of the other things that we announced at Next was SQL for data flow, SQL pipelines. What that means is, instead of writing Beam or Java code to build data flow pipelines, now you can write SQL commands to go ahead and create a whole pipeline. Similarly, machine learning with SQL. This whole aspect of simplifying capabilities so that you can use SQL and then AutoML, that's one part of it. And the second, of course, we are working with different partners to go ahead and have a lot of training that is available online, where customers don't have to go take classes, like traditional classes, but just go online. All the assets are available, examples are available. One of the big things in BigQuery we have is we have 70-plus public data sets, where you can go, with BigQuery sandbox, without credit card, you can start using it. You can start trying it out. You can use 70-plus data sets that already available and start learning the product. So I think that should help drive more-- >> Google's a real cultural tech company, so you guys obviously based that from Stanford. Very academic field, so you do hire a lot of smart people. But there's a lot of people graduating middle school, high school, college. Berkeley just graduated their first, inaugural class in data science and analytics. What's the skills, specifically, that young kids or people who are either retraining should either reboot, hone, or dial up? Is there any things that you see from people that are successful inside Google? I mean, sometimes you don't have to have that traditional math background or computer science, although math does help; it's key. But what is your observation? What's your personal view on this? >> I think the biggest thing I've noticed is the passion for data. I fundamentally believe that, in the next three to five years, most organizations will be driven with data and insights. Machine learning and AI is going to become more and more important. So this understanding and having the passion for understanding data, answering questions based on data is the first thing that you need to have. And then you can learn the technologies and everything else. They will become simpler and easier to use. But the key thing is this passion for data and having this data-driven decision-making is the biggest thing, so my recommendation to everybody who is going to college today and learning is: Go learn more about how to make better decisions with data. Learn more about tooling around data. Focus on data, and then-- >> It's like an athlete. If you're not at the gym shooting hoops, if you don't love it, if you're not living it, you're probably not going to be any-- (laughing) It's kind of like that. >> Sudhir, thank you so much for coming on theCUBE. It's a pleasure talking to you. >> Thank you. Thanks a lot for having me. >> I'm Rebecca Knight for John Furrier. You are watching theCUBE. (techno music)
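To make the "machine learning with just simple SQL statements" point from this interview concrete, here is a rough sketch of the BigQuery ML workflow Sudhir describes, driven from Python with the google-cloud-bigquery client. The project, dataset, table, and column names are placeholders and the model choice is arbitrary; treat it as an outline of the pattern under those assumptions, not a recommended setup.

```python
# Rough sketch: train and query a BigQuery ML model using plain SQL.
# All identifiers (project, dataset, tables, columns) are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

train_sql = """
CREATE OR REPLACE MODEL `my_dataset.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT tenure_months, monthly_spend, support_tickets, churned
FROM `my_dataset.customers`
"""
client.query(train_sql).result()  # waits for the training job to finish

predict_sql = """
SELECT customer_id, predicted_churned
FROM ML.PREDICT(MODEL `my_dataset.churn_model`,
                (SELECT customer_id, tenure_months, monthly_spend, support_tickets
                 FROM `my_dataset.new_customers`))
"""
for row in client.query(predict_sql).result():
    print(row.customer_id, row.predicted_churned)
```

Everything model-related happens in the two SQL statements, which is the "democratize it" argument: an analyst who already writes SQL can train and apply the model without a separate ML stack.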
SUMMARY :
Brought to you by Informatica. He is the director of product management at Google Cloud. Thank you for inviting me. Google Cloud and Informatica Team Up to Tame Data. at the whole journey of data within organizations, by the way this year, congratulations. What is the relationship, what is the partnership? the AI capabilities in BigQuery to actually go do If you have a data pipeline, you can literally layer and the computer industry, there was known practices. data is still, as you mentioned, siloed. Okay, so that brings the solution question to the table. And the first main thing I hear is obviously here and all of the places, is that all kinds of data, nevermind outside the United States. And the second thing is, if you can identify Is that the big power source for managing the data? And so, you have to think about scale, What about the partnership from a particular So, one of the thing that we're doing with the partnership the question I was just about to ask, which is: One is the whole end-to-end experience One of the conversations we're having here on theCUBE, One of the big things in BigQuery we have I mean, sometimes you don't have to have is the first thing that you need to have. if you don't love it, Sudhir, thank you so much for coming on theCUBE. Thanks a lot for having me. You are watching theCUBE.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Rebecca Knight | PERSON | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
two | QUANTITY | 0.99+ |
John Furrier | PERSON | 0.99+ |
ORGANIZATION | 0.99+ | |
Sadhir | PERSON | 0.99+ |
Sudhir | PERSON | 0.99+ |
50% | QUANTITY | 0.99+ |
Sudhir Hasbe | PERSON | 0.99+ |
Informatica | ORGANIZATION | 0.99+ |
Thomas Kurian | PERSON | 0.99+ |
one petabyte | QUANTITY | 0.99+ |
first | QUANTITY | 0.99+ |
5 | QUANTITY | 0.99+ |
100 petabytes | QUANTITY | 0.99+ |
one terabyte | QUANTITY | 0.99+ |
SQL | TITLE | 0.99+ |
25 years | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
United States | LOCATION | 0.99+ |
third | QUANTITY | 0.99+ |
Las Vegas | LOCATION | 0.99+ |
Thomas | PERSON | 0.99+ |
Java | TITLE | 0.99+ |
over 25 years | QUANTITY | 0.99+ |
One | QUANTITY | 0.99+ |
Second | QUANTITY | 0.99+ |
BigQuery | TITLE | 0.99+ |
today | DATE | 0.99+ |
Infomatica | ORGANIZATION | 0.99+ |
one | QUANTITY | 0.99+ |
second | QUANTITY | 0.98+ |
one part | QUANTITY | 0.98+ |
Beam | TITLE | 0.98+ |
one thing | QUANTITY | 0.98+ |
Azure | ORGANIZATION | 0.98+ |
70-plus | QUANTITY | 0.97+ |
10% | QUANTITY | 0.97+ |
theCUBE | ORGANIZATION | 0.97+ |
three key scenarios | QUANTITY | 0.97+ |
second thing | QUANTITY | 0.97+ |
AirAsia | ORGANIZATION | 0.96+ |
Dataproc | ORGANIZATION | 0.96+ |
two key components | QUANTITY | 0.96+ |
five years | QUANTITY | 0.95+ |
BigQuery ML | TITLE | 0.95+ |
this year | DATE | 0.95+ |
first-year | QUANTITY | 0.94+ |
Cloud | TITLE | 0.94+ |
two capabilities | QUANTITY | 0.94+ |
first thing | QUANTITY | 0.94+ |
Informatica World 2019 | EVENT | 0.93+ |
hundreds of thousand of customers | QUANTITY | 0.93+ |
two key areas | QUANTITY | 0.91+ |
Google Cloud | ORGANIZATION | 0.91+ |
GCP | TITLE | 0.91+ |
Ryan O’Connor, Splunk & Jon Moore, UConn | Splunk .conf18
>> Live from Orlando, Florida, it's theCUBE, covering .conf18. Brought to you by Splunk. >> Welcome back to .conf 2018, this is theCUBE, the leader in live tech coverage. My name is Dave Vellante, I'm here with my co-host Stu Miniman. We're going to start the day, we're going to talk to some customers, we love that. Jon Moore here is the MIS program director at UConn, the Huskies. Welcome to theCUBE, good to see you. And he's joined by Ryan O'Connor, who's a senior advisory engineer at Splunk. He's got the cool hat on. Gents, welcome to theCUBE, great to have you. >> Thank you. >> Thank you for having us. >> So kind of a cool setting this morning. This is Stu's first .conf, and I said, you know, when you see this it's kind of crazy, we're all shaking our phones. We had the horse race this morning, we won, so that was kind of... >> Orange, yeah, Team Orange as well. >> That's great, you're on Team Orange. So we're in the media section, and the media guys were like sitting on their hands, but Stu and I were getting into it. >> Good job. >> Nice and easy. So Jon, let's start with you, we always love to start with the customer perspective. Maybe you describe your role and we'll get into it. >> Sure, so as you mentioned, I'm the director of our undergrad program, MIS, management information systems, business technology. We're in the school of business, under the operations and information management department, the acronym OPIM. >> Okay, cool. And Ryan, tell us about your role, explain the hat. >> Absolutely, yeah. So I'm an honorary member of the SplunkTrust now. I recently joined Splunk about a month ago, back in August, and outside of my full-time job working at Splunk I'm also an adjunct professor at the University of Connecticut, and so I help Jon in teaching, and that's kind of my role and where our worlds sort of meet. >> So Jon, Stu and I were talking about the sort of evolution of Splunk, the company that was just, you know, okay, log file analysis, kind of on-prem, perpetual license model, and it's really evolved and it's permeating throughout many organizations. But maybe you could take us through sort of the early days at UConn. What was life like before Splunk, what prompted you to start playing around with Splunk, and where have you taken it? What's your journey look like? >> So about three years ago we started looking at it through kind of an educational lens, started to think of how we could tie it into the curriculum. We started talking to a lot of the recruiters and companies that many of our students go into, saying what skill sets are you looking for, and Splunk was definitely one of those. So academia takes a while to change the curriculum, make that pendulum swing, so it was, how can we get this into students' hands as quickly as possible and also make it applicable? So we developed this initiative in our department called OPIM Innovate, which was all based around bringing emerging technology skills to students outside of the general curriculum. We built an innovation space, a research lab, and really focused on bringing students in, in classes, and incorporating it that way. We started kind of slowly, different parts of some early classes about three years ago, different data analytics, predictive analytics courses, and then that really built into, we did a few workshops with our Innovate initiative, which Ryan taught, and then from there it kind of exploded. We started doing projects, and our latest one was with the Splunk mobile team. >> Okay, you guys had some hard news around that, well, today, right? Yeah, maybe 
>> Absolutely, sure, yeah, I'll take that. We teach a course on industrial IoT at the University of Connecticut, and we heard about the mobile project; basically they were doing a beta of the mobile application, so we partnered with them this summer and they came in. We have a Splunk Enterprise license through Splunk for Good, so we're able to actually ingest data, and as part of that course we can ingest IoT data and use Splunk mobile to visualize it. >> Right. Maybe you could explain to our audience that might not know Splunk for Good. >> Absolutely, yeah. Splunk for Good is a great initiative. They offer what they call a Splunk Pledge license to higher education institutions and research initiatives, so we have a 10 gig license for free that we can use to run our own Splunk Enterprise, and students can actually get hands-on experience with it. In addition to that, they also get free training, so they can take Splunk Fundamentals 1 and 2 and actually come out of school with hands-on experience and certifications when they go into the job market.
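The ingestion side of a classroom project like this can be quite small. As a rough, hedged sketch (not the actual UConn code), a sensor reading could be pushed into a Splunk Enterprise instance through the HTTP Event Collector along these lines; the hostname, token, index, and field names below are hypothetical placeholders.

```python
import json
import requests  # third-party: pip install requests

# Hypothetical endpoint and token for a Splunk Enterprise instance
# that has the HTTP Event Collector (HEC) enabled.
HEC_URL = "https://splunk.example.edu:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def send_reading(sensor_id, temperature_c, door_open):
    """Forward one IoT reading to Splunk as a JSON event over HEC."""
    payload = {
        "event": {"sensor": sensor_id, "temperature_c": temperature_c, "door_open": door_open},
        "sourcetype": "farm:sensor",
        "index": "iot",  # assumes an 'iot' index has been created
    }
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=json.dumps(payload),
        verify=False,  # lab-style setup with a self-signed certificate
        timeout=5,
    )
    resp.raise_for_status()

send_reading("greenhouse-door-1", temperature_c=21.4, door_open=False)
```

A microcontroller or gateway in the field could call something like this on a timer, which is roughly the shape of pipeline a student project can stand up on the free pledge license.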
>> Jon, we talk so much about the important role of data, and the tools change a lot. When we talk about the next generation of jobs, you're right at that intersection. What are the students looking for, and what are the people who are looking to hire them hoping they come out of school with? >> Yeah, you have two different types of students, I would say: those that know what they're looking for, and those that don't but really have the curiosity and want to learn. We tried to build this initiative around both, including those that maybe are afraid of the technology and the skills: how do we bring them in, how do we make a very immersive environment and have that aha moment quickly? We have a series of services around that. We have what are called tech kits: the students come in, they're able to do something applicable right away, and it sparks an interest. And then we also developed another path for those that were more interested in doing projects or had that higher-level skill set, where we also wanted to cultivate an environment in which they could learn more. So a lot of it is being able to scaffold the learning environment based on the different students coming in. >> It's interesting; my son's a junior in college at GW, and he's very excited, he's playing around with data. He says, I'm learning R, I'm learning Tableau. I'm like, great, what about Splunk? And he said, what's that? So it's a little off-center from some of the more traditional visualization tools, for example. So it's interesting and impressive that you guys identified that need and actually brought it to students. How did that all come about? Was it an epiphany, or was it demand from the students? >> It was a combination of a lot of things. We were lucky that Ryan and I have known each other for a long time. As the director of the program, I'm trying to figure out what classes we should bring in and how to build out the curriculum. We have our core classes, but then we have the liberty to build out special topics, things that we think are relevant and up-and-coming. We can try it out once; if it's good, maybe teach it a few more times, and maybe it becomes a permanent class. That's where we were able to pull Ryan in. He had been doing consulting for Splunk for a number of years, and I said, I think this is an important skill set; is it something you could help bring to the students? >> Sure, yeah. One of the big courses we looked at was a data analytics course, and we were already teaching it with a separate piece of software, not going to name names. Essentially I looked at it one for one: what key benefits does this piece of software have, what are the students trying to get out of it, and compared it one for one to Splunk. Could Splunk actually give them the same learning components? It could, and with the Splunk for Good license we could give them the hands-on experience and augment our teaching with that free training, so when they come out of school they have something tangible they can point to. That kind of snowballed: once that course worked, we could integrate it into multiple other courses. >> So you were able to essentially replicate the value to the students of the legacy software, but also have a modern platform. >> Exactly, exactly, yep. Doug was talking about that, making jokes about MDM and codifying business processes, and yeah, it's a bit more of an antiquated piece of software, essentially. I mean, it was nice, it did a great job, but when we were talking to recruiters it wasn't a piece of software they were actually looking for. We were hearing Splunk over and over again, so why not just bring it into the classroom and give them that? >> So in the keynote this morning they started to lay out a vision, I believe they call it Splunk Next, and mobile and things like augmented reality are fitting in. How do you look at things like this? How is mobile going to impact you especially, I would think? >> So when we came up with our initiative, we identified five tracks, skill sets we believed the students needed and that companies were looking for. A lot of that came from our students going into internships and saying, hey, beyond the skills we're learning, they're asking us to do all this other work in AWS and drones and VR. So again, part of this was identifying, let's start small: five tracks. We started with 3D printing, virtual reality, microcontrollers, IoT, and then analytics, tying that all together. We had already been building an environment to incorporate that, and when we started working with the Splunk mobile team there were all these different components we wanted to tie not only into the class but into the larger initiative. The goal of the class is not just to get these students interested in the skills, but to spread that awareness. The augmented reality part is just another feature, the next piece we're looking into to build activities around, and it had this great synergy of coming in at the right time: hey, look at this sensor that we built, and now you can look at the data in AR. It's a really powerful thing to most people.
Yeah, they showed that screenshot of AR during the keynote, and one of the challenges we have at the farm, this is the latest course we're talking about, on industrial IoT, is that we don't have a desk, we don't have a laptop, but we do have a phone in our pocket, and we can put a QR code or NFC tag anywhere inside that facility. So we can have students go around, put an iPhone up to a sensor or scan a QR code, and see actual live, real-time data of what those sensors are doing, which is an invaluable tool inside the classroom and in an environment like that, for sure. >> So you're able to do things you never would have been able to do before. I want to come back to mobile. As you were just saying, it's something that people have wanted for a long time, and it took a while. Presumably it's not trivial to take all this data and present it in a mobile format that's simple, number one. And number two, a lot of Splunk users are at the command center, right? They're on the grid. So maybe that worked to your advantage a little bit. And you look at how quickly mobile apps become obsolete. So is that why it took so long, because it was so complicated and you had a user profile that was largely stationary? And how has that changed? >> Honestly, I'm not sure of the full history of the mobile app. I know there was previously an old mobile app, and this new one though is-- >> You use the new one? >> Yes. So when we're talking about augmented reality, and we may not have been clear on that, augmented reality is actually part of its features, and in addition there's the Apple TV app. In our lab we have a dashboard displayed on a monitor, so not only can we teach this class and have students setting up sensors and all of this, but we can live-display it for everyone to come in and look at all the time, and we know it's a secure connection to our back end. People walk into the lab and the first thing they see is this live dashboard of Splunk data on the Apple TV, based off a project we've been working on. >> What's that? >> Well, that's a live feed from a farm five miles off campus giving us all these data points, and it's a talking point: people are like, wow, how did you do that? And it kind of goes from there. The farm managers are actively looking at it too, so they can see when the doors to the facility are open and closed, or when the temperature gets too high. All these metrics are actually used by them; that was the important part, to actually solve a business problem for them. We built a proof of concept for the class so the students could see it, and the students are now replicating it in another final project in the class, the class is still ongoing, where they have to build out a sensor kit for plants. It's the same type of sensor kit, but for more stationary plant systems, and then they have to figure out how to take that data, put it into Splunk, and make sense of it. So there are all these different components.
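A dashboard, Apple TV view, or AR view like the one described here is ultimately just running searches over that sensor index. Below is a hedged sketch of pulling the same door and temperature metrics back out through Splunk's REST search export endpoint; the host, credentials, index, and field names are again placeholders rather than details from this project.

```python
import requests  # third-party: pip install requests

SPLUNK_HOST = "https://splunk.example.edu:8089"  # management port
AUTH = ("admin", "changeme")  # placeholder credentials

# Latest temperature and door state per sensor over the last hour.
search = (
    'search index=iot sourcetype="farm:sensor" earliest=-1h '
    "| stats latest(temperature_c) AS temp latest(door_open) AS door BY sensor"
)

resp = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs/export",
    auth=AUTH,
    data={"search": search, "output_mode": "json"},
    verify=False,  # lab instance with a self-signed certificate
    stream=True,
    timeout=30,
)
resp.raise_for_status()

# The export endpoint streams one JSON object per result line as it arrives.
for line in resp.iter_lines():
    if line:
        print(line.decode("utf-8"))
```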
>> And the students get the glam factor; you can put it in a fishbowl, have the Apple TV up there. >> Exactly, and that's part of it. When we started to think about this initiative, it was recruitment: how do we get students beyond that fear of technology, especially coming into a business school? But it really went well beyond that. We aligned it with the launch of our analytics minor, which was open to anyone, so now we're getting students from outside the school, liberal arts students, creating very diverse teams. Even in the class itself we have engineering, business, psychology, and history students who are all looking to understand data and platforms to be able to make decisions. >> So is there essentially one Splunk class today, a Splunk 101? >> This semester there are a couple of classes actually using Splunk inside the classroom, and it depends on the semester how many we have going on. There are three this semester; sorry, I misspoke. We have another professor as well who's also utilizing it, so we have three classes that are essentially relying on Splunk to teach different components. >> Help us understand: is it almost exclusively part of the analytics curriculum, or does it permeate into other MIS and computer science courses? >> Right now it's within our MIS purview, and we're trying to build out partners within the university. The classes are not solely about Splunk; Splunk is a component, the tool. So for example, the particular industrial IoT course is about understanding microcontrollers, understanding aquaponics and sustainability, understanding how to look at data and clean data, and then using Splunk as a tool to help bring that all together. It's kind of the backbone. >> Love it. >> And in addition, I just wanted to mention that we've had students already go out into the field, which is great, and come back and tell us, hey, we went out to a job and we mentioned that we knew Splunk, and we were a shoo-in for certain things. Once it goes up on their LinkedIn profile, they start getting-- >> Yeah, I would think it's right up there. In our day it was SPSS; now it's R, and Tableau obviously for the vis, everybody's kind of playing around with those. But Splunk is a very specific capability that not everybody has, yet every IT department on the planet uses it. >> Exactly. Coming out of school, it takes you a little bit deeper; you find that out. >> Cool. Well, great work, guys. Really, thank you guys for coming on theCUBE. It was great to meet you. I appreciate it. >> You're welcome. >> All right, keep it right there, everybody. Stu and I will be right back after this. This is day one of .conf18 from Splunk. This is theCUBE. (music)
Matthew Cox, McAfee | Informatica World 2018
(techy music) >> Announcer: Live from Las Vegas, it's theCUBE, covering Informatica World 2018. Brought to you by Informatica. >> Hello, and welcome back to theCUBE. We are broadcasting from Informatica World 2018, The Venetian in Las Vegas. I'm Peter Burris, once again, my cohost is Jim Kobielus, Wikibon/SiliconANGLE. And at this segment, we're joined by Matthew Cox, who's the director of Data & Technology Services in McAfee. Welcome to theCUBE, Matthew. >> Thank you very much. Glad to be here. >> So, you're a user, so you're on the practitioner side. Tell us a little bit about what you're doing in McAfee then. >> So, from a technology standpoint, my role, per se, is to create and deliver an end-to-end vision and strategy for data, data platforms and services around those, but always identifying a line to measurable business outcomes. So my goal is to leverage data and bring meaning of data to the business and help them leverage more data-driven decisions, more toward business outcomes and business goals. >> So you're working both with the people who are managing the data or administering the data, but also the consumers of the data, and trying to arbitrate and match. >> Absolutely, absolutely. So, the first part of my career, I was in IT for many years, and then I moved into the business. So for probably the last 10 years, I've been in sales and marketing in various roles, so it gives me kind of a unique perspective in that I've lived their life and, probably more importantly, I understand the language of business, and I think too often, with our IT roles, we get into an IT-speak, and we aren't translating that into the world of the business, and I have been able to do that. So I'm really acting as a liaison, kind of bringing what I've seen of the business to IT, and helping us deliver solutions that drive business outcomes and goals. >> What strategic initiatives are you working on at McAfee that involve data? >> Well, we have a handful. Number one, I would say that our first goal is to build out our hub-and-spoke model with MDM, and really delivering our-- >> Jim: Master data management? >> Our master data management, that's correct. And really delivering our, because at MDM, that is where we define our accounts, our contacts, we build our upward-linking parents and our account hierarchies, and we create that customer master. That's the one lens that we want to see, our customers across all of our ecosystem. So we're finishing out that hub-and-spoke model, which is kind of an industry best practice, but for both realtime and batch-type integrations. But on top of that, MDM is a great platform, and it gives you that, but the end-to-end data flow is another area that we've really put a priority on, and making sure that as we move data throughout the ecosystem, we are looking at the transformations, we are looking at the data quality, we're looking at governance, to make sure that what started on one end of the spectrum look the same, or, appropriately, it was transformed by the time it gets to the other side as well. I'll say data quality three times: Data quality, data quality, data quality. For us, it's really about mastering the domain of data quality, and then looking at other areas of compliance, and the GDPR just being one. There's a number of areas of compliance areas around data, but GDPR's the most relevant one at this time. >> There's compliance, there's data quality, but also, there must be operational analytical insights to be gained from using MDM. 
Can you describe how McAfee, what kind of insights you're gaining from utilization of that technology in your organization? >> Sure, well, and MDM's a piece part of that, so I can talk how the account hierarchy gives us a full view. Now you've got other products, like data quality, that bolt on, that allow us to filter through and make sure that that data looks correct, and is augmented and appended correctly, but MDM gives us that wonderful foundation of understanding the lens of an account, no matter what landscape or platform we're leveraging. So if I'm looking at reporting, if I'm looking at my CRM system, if I'm looking at my marketing automation platform, I can see Account A consistently. What that allows me to do is not only have analytics built that I can have the same answers, because if I get a different number for Company A at every platform, we've got problem. What I should be seeing, the same information across the landscape, but importantly, it also drives the conversation between the different business units, so I can have marketing talk to sales, talk to operations, about Company A, and they all know who we're talking about. Historically, that's been a problem for a lot of companies because a source system would have Company A a little bit differently, or would have the data around it differently, or see it differently from one spectrum to the next. And we're trying to make that one lens consistent. >> So MDM allows you to have one consistent lens, based on the customer, but McAfee, I'm sure, is also in the midst of finding new ways, sources of data and new ways of using data, like product information, how it's being used, improving products, improving service quality. How is it, how is that hub-and-spoke approach able to accommodate some of the evolving challenges or evolving definitions and needs of data, since so much of that data often is localized to specific activities after they're performed? >> In business, there is a lot of data that happens very specific to that silo. So I have certain data within, say, marketing, that really is only marketing data, so one of the things that we do is we differentiate data. This kind of goes to governance, even saying there's some data as an organization is kind of our treasure that we want to make sure we manage consistently across the landscape of the ecosystem. There's some data that's very specific to a business function, that doesn't need to proliferate around. So we don't necessarily have the type of governance that would necessitate the level of governance that an ecosystem level data attribute would. So MDM provides, in that hub-and-spoke, what's really powerful for that as it relates to that account domain, because you're talking about product. Products is another area we may go look at at some point, adding a product domain into MDM, but today with our customer domain, and kind of our partners as well, it gives us the ability to, with this hub-and-spoke topology, to do realtime and batch, whereas before, it may have been a latency as we moved information around, and things could get either out of sync or there'd be a delay. With that hub-and-spoke, we're able to now have a realtime integration, a realtime interaction, so I can see changes made-- >> At the spoke? >> Peter: At the spoke, right. 
So the spoke pops back to the hub, hub delivers that back out again, so I can have something happening in marketing, translate that to sales, very quickly, translate that out to service and support, and that gives me the ability to have clarity, consistency, and timeliness across my ecosystem. And the hub-and-spoke helps drive that. >> Tell us about, you just alluded to it, sales and marketing, how is customer data, as an asset that you manage through your MDM environment, how is that driving better engagement with your customers? >> Well, it drives better engagement, first of all, you said an important thing, which is asset. We are very keen on doing data as an asset. I mean, systems come and go, platforms come and go. It's CRM tool today, CRM tool number two tomorrow, but data always is. Some of the things we've done is try to house and put a label on data as an asset, something that needs to be managed, that needs to be maintained, that needs to-- >> Governed. >> have an investment to. Right, governed, because if you don't, then it's going to decline in value over time, just like a physical asset, like a building. If you don't maintain and invest, it deteriorates. It's the same with data. What's really important about getting data from a customer's standpoint is the more we can align quality data, again, looking at that, not all data. Trying to govern all data is very difficult, but there's a treasure of data that helps us make decisions about our customers, but having that data align consistently to a lens of an account that's driven by MDM proliferate across your ecosystem so that everyone knows how to act and react accordingly, regardless of their function, gives us a very powerful process that we can gauge our customers, so that customer experience becomes consistent as well. If I'm talking to someone in sales and they understand me differently, then I'm talking to someone in support, versus talking to someone in marketing or another organization, it creates a differentiating customer experience. So if I can house that customer data, aligned to one lens of the customer, that provides that ubiquity and a consistency from a view in dealing with our customers. >> Talk to us about governance and stewardship with the data. Who owns the customer data? Is it sales, is it marketing, or is there another specified data steward who manages that data? >> Well, there's several different roles that you've going to hit through. Stewardship, we have, within my data technology services organization, we have a stewardship function. So, we steward data, act on data, but there's processes that we put in place, that's you're default process, and that's how we steward data and augment data over time. We do take very specific requests from sales and marketing. More likely, when it comes to an account from marketing, sorry, from sales, whose sales will guide, you know, move this, change this, alter that. So from a domain perspective, one of the things we're working through right now is data domains, and who has, I don't know if you're familiar with racing models, but who is responsible, who is accountable, who is consulted, who just receives an interest or gets information about it. But understanding how those data domains play against data is very, very important. We're working through some of that now, but typically, from a customer data, we align more toward sales, because they have that direct engagement. Part of it, also, is that differentiated view. 
Who has the most authority, the most knowledge about the top 500, top 1,000, top 2,000 customers is different than the people who handle customer 10,000. So you usually have different audiences in play that help us govern and steward that data. >> So, one of the tensions that's been in place for years, as we tried to codify and capture information about engagement, was who put the data in and what was the level of quality that got in there, and in many respects the whole CRM thing took a long time to work precisely because what we did is we moved data entry jobs from administrators onto salespeople, and they rebelled. So as you think about the role that quality plays, and how you guide your organization to become active participants in data quality, what types of challenges do you face in communicating with the business, how to go about doing that, and then having your systems reflect what is practical and real in the rest of your organization? >> Well, it's a number of things. First of all, you have to make data relevant. If the data that these people are entering is not relevant and isn't meaningful to them, the quality isn't going to be there, because they haven't had a purpose or a reason to engage. So the first thing is to help make the data relevant to the people who are your data creators, right? And that also goes for your business leaders. You want the business leaders coming to you and talking about data, not just systems, and that's one of the things we're working toward as well. But part of that is giving them tools to ease the data-create process. If I can go to my CRM tool, instead of having to type in a new account, and just click on a tool and say, hey, send to CRM, or add to CRM, it's really more of a click and an action that moves data, so I ensure that I have a good-quality source that moves into my data store. That removes that person from being in the middle and making those typing mistakes, those error mistakes. So it's really about the data-create process and putting a standard there, which is very important, but also then having your cleansing tools and capabilities in your back end, like MDM or a data stewardship function. >> So by making the activity valuable, you create incentive for them to stay very close to quality considerations? >> Absolutely, because at the end of the day, they use that old term, garbage in, garbage out, and we try to be very clear with them: listen, someday you're going to want to see this data, and you probably should take the time to put the quality effort in to begin with. >> Got it. One last quick question. If you think out about five years, how is your role going to change? 30 seconds. >> I think the role's going to change in going from an IT-centric view, where I'm looking at tools and systems, to driving business outcomes and addressing business goals, and really talking to the business about how they leverage data as a meaningful asset to move their business forward, versus just how am I deploying stewardship, governance, and systems and tools. >> Excellent. Matthew Cox, McAfee, data quality and utilization. >> Absolutely. >> Once again, you're watching theCUBE. We'll be back in a second. (techy music)
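To make the hub-and-spoke and "one lens of the customer" ideas above concrete, here is a minimal, illustrative sketch of consolidating a customer master from several spoke systems with a simple survivorship policy. The source names, trust ranking, and fields are hypothetical and stand in for the richer trust and cleansing rules a real MDM hub would apply.

```python
from datetime import date

# Hypothetical records describing the same real-world account,
# as they might arrive from different spoke systems.
records = [
    {"source": "crm", "name": "Acme Corp.", "phone": None, "updated": date(2018, 3, 1)},
    {"source": "marketing", "name": "ACME Corp", "phone": "555-0100", "updated": date(2018, 4, 12)},
    {"source": "support", "name": "Acme Corporation", "phone": "555-0100", "updated": date(2018, 1, 20)},
]

# Simple survivorship policy: prefer more trusted sources, then recency,
# and skip missing values.
SOURCE_TRUST = {"crm": 3, "support": 2, "marketing": 1}

def golden_record(recs, fields=("name", "phone")):
    """Pick, per field, the value from the most trusted and most recently
    updated source that actually has one."""
    ranked = sorted(recs, key=lambda r: (SOURCE_TRUST[r["source"]], r["updated"]), reverse=True)
    return {f: next((r[f] for r in ranked if r[f]), None) for f in fields}

print(golden_record(records))  # {'name': 'Acme Corp.', 'phone': '555-0100'}
```

In a hub-and-spoke deployment, the consolidated record would then be published back out so every spoke, and every report, sees the same customer.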
Suresh Menon, Informatica | Informatica World 2018
>> Announcer: Live from Las Vegas, it's theCUBE! Covering Informatica World 2018. Brought to you by Informatica. >> Welcome back everyone. This is theCUBE's exclusive coverage of Informatica World 2018, live here in Las Vegas at the Venetian Hotel. I'm John Furrier, co-host with Peter Burris, here for the next two days of wall to wall coverage. Our next guest is Suresh Menon, Senior Vice President and General Manager of the Master Data Management group within Informatica. He's got the keys to the kingdom, literally. Welcome back, good to see you. >> Thank you for having me. >> The key of all this, pun intended, is the data. And the cataloging's looking good. There's a lot of buzz around cataloging, what you guys have as a core product. Your customers love the product. The world's changing. Where are we, what's the update? >> Catalog is extremely important, not just to enterprise data and the entire landscape by itself, but it's equally very exciting for MDM, because what it has the potential to do is transform how quickly people can get value out of MDM. Because a combination of metadata and artificial intelligence through machine learning is what can create self-configuring, self-operating, even self-maintaining Master Data Management. And that's extremely important because in today's world, the digital world that we live in, the explosion of data, the explosion of data sources, the new kinds of data that MDM is being asked to master, correlate and link is becoming so huge that it's not humanly going to be possible to manage and curate this data. And you need to have AI and ML, and the underlying metadata awareness that the catalog brings, in order to solve these new problems. >> So Suresh, after you came onto theCUBE last year, you left and I said, there's a question I should've asked him. I'm going to put you on the spot. If you could do it, if you could create a new term for this Master Data Management and where it's going, what would you call it? >> Yeah. You know, Master Data Management has been around not for very long, about eight or nine years, and it doesn't begin to describe the kind of problem that we're trying to solve here today. The only one that I can think of is 360s. It's more about getting the complete holistic view of all the business critical entities that you as an organization need to know. And 360 has traditionally been used around customer, but it's not only about the customer. You need to understand what products the customer owns, and engineer a 360 around the product. You need to understand how those customers interact with employees; you need an employee 360. You need an asset 360. How can you even begin to do householding if you don't do a location 360?
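The 360 idea described here amounts to following a mastered identifier out to every system that knows about the entity and assembling one consolidated view. A toy sketch of that lookup, with invented system names, IDs, and fields standing in for a real cross-reference:

```python
# Hypothetical cross-reference: master customer ID -> local IDs per source system.
xref = {
    "CUST-001": {"crm": "A-17", "support": "T-9042", "orders": "C-233"},
}

# Hypothetical slices of data held in each spoke system.
sources = {
    "crm": {"A-17": {"name": "Jane Doe", "segment": "Enterprise"}},
    "support": {"T-9042": {"open_tickets": 2}},
    "orders": {"C-233": {"lifetime_value": 48500}},
}

def customer_360(master_id):
    """Assemble one view of a customer by following the cross-reference
    out to every system that knows about them."""
    view = {"master_id": master_id}
    for system, local_id in xref[master_id].items():
        view.update(sources[system].get(local_id, {}))
    return view

print(customer_360("CUST-001"))
```

The same pattern generalizes to product, employee, asset, or location 360s: the master ID is the join key that keeps every downstream view consistent.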
>> I want to build on that. In many respects it's the ability to sustain the context of data for different personas, for different applications, for different utilizations. So in many respects, Master Data Management really is the contextual framework by which an organization consumes data. Have I got that right? >> Absolutely. Another way to describe it would be that it is what delivers the consistent, authoritative description where you have the semantics being described completely differently in all of these cloud applications. We've gone very far away from the days, maybe ten years ago, where you had a handful of CRM and ERP applications across which you needed to disambiguate this information. Today, I think I was reading this morning, an organization on average has 1,050 different cloud applications, and three quarters of them are not connected to anything. And describing, creating, and authoring information around all these business critical entities, MDM is becoming the center of this ultra-connected universe, is another way that I would look at it. >> It's also a key part of making data addressable. We talked about this last year, but something that I have observed happening since last year: the storage vendors have been radically changing their view. They're going to have storage, but their data layer is sitting in all the clouds. That's interesting. That means they're seeing that there's a data abstraction kind of underneath Informatica, if you will. If that happens, then you have to be working across all the clouds. Are customers seeing that? Are they coming to you saying that? Or are you guys getting out front? How do you view that dynamic? >> Customers are seeing that, and have been seeing that for the last two to three years, as they have started taking these monolithic, very comprehensive, on-premise applications to a fragmented set of applications in the cloud. Where do they keep a layer where they have all this business critical data in one place? They're beginning to realize that as they move these things to the cloud, as these applications move to the cloud, it's going from one to a couple of hundred. Master data is being seen as that layer that basically connects all these pieces of information together. And very importantly, for a lot of these organizations, it's data that's proprietary to them, that they don't necessarily want locked up in an application that may or may not be there a couple of years down the road. >> The value is shifting away from the storage commodity. I was talking last week with the guys from NetApp about a great solid state drive they're going to have, but that value is up top, where the data is. And they have the data stored, so why not facilitate? And you guys can take it and integrate it into the applications, into the workloads. How is that going with respect to, say, catalog or the edge, for instance? How should a customer think about MDM? If they have to architect it out, what's the playbook? >> The number one thing, where the catalog comes in, is first of all trying to identify, in this highly fragmented universe you now have, where all your fragments of master data reside. This is where the catalog comes in: it tells you, in one Google-like text search, where all the customer master attributes are residing across the landscape, third party, on premise, in the cloud. The catalog will also tell you what the relative quality of those attributes is. And then by applying AI and ML to it, you're able to figure out how those pieces of data can be transformed, cleansed, enriched, and brought into MDM. The catalog has a role to play within MDM: what are the most appropriate matching and linking rules, what are the most appropriate survivorship and trust rules that you need to apply, and how do you secure all that data that's now sitting in MDM? Because it's now in the cloud, and data security and protection is top of mind for most-- >> Talk about AI over at MDM. Because last year CLAIRE was announced, and we've seen certainly with GDPR that AI will play a role. Machine learning and AI, it's all coming together. The relationship between MDM and AI seems natural to me. How do you guys see the fit between AI and MDM? >> It is fundamental to MDM. And where we've begun our investment in AI and ML is in one of the most core capabilities around MDM, which is being able to recognize potential duplicates, or detect non-obvious relationships across this vast set of master data that's coming in. What we've applied, and we'll see a demo of that tomorrow, here in Vegas, is machine learning on top of the world's best matching algorithms, in order to infer the most appropriate strategies to link and discover these entities and build a relationship graph, without a human having to introspect the data.
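The matching and linking step referred to here can be pictured as scoring candidate record pairs and proposing the high-scoring ones as the same entity. The sketch below uses a hand-weighted similarity as a stand-in for the learned matching models being described; the records, weights, and threshold are illustrative assumptions only.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical inbound customer rows that may describe the same entity.
rows = [
    {"id": 1, "name": "Jon Smith", "email": "jon.smith@example.com"},
    {"id": 2, "name": "Jonathan Smith", "email": "jon.smith@example.com"},
    {"id": 3, "name": "Maria Garcia", "email": "m.garcia@example.com"},
]

def similarity(a, b):
    """Blend fuzzy name similarity with an exact-match signal on email."""
    name_score = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    email_score = 1.0 if a["email"] == b["email"] else 0.0
    return 0.5 * name_score + 0.5 * email_score

# Pairs scoring above the threshold are proposed as duplicates. In the kind
# of system described above, the weights and threshold would be learned from
# steward feedback rather than hard-coded.
THRESHOLD = 0.75
matches = [(a["id"], b["id"]) for a, b in combinations(rows, 2) if similarity(a, b) >= THRESHOLD]
print(matches)  # [(1, 2)]
```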
>> One of our predictions is that over the course of the next few years companies are actually going to start thinking about networks of data, that data is going to get the network formation treatment that devices, and pages, and identities, and services have gotten in the past. It does seem as though MDM could play a very, very important role in, as you said, identifying patterns in the data and utilization of the data. What constitutes a data node? What constitutes an edge? There are a number of different ways of thinking about it. Is that the direction that you see? First of all, do you agree with that notion of networks of data? And is that the direction you see MDM playing in the future? >> Absolutely. Because up until now MDM was used to solve the problem of creating a distinct node of data, where we absolutely had to ensure that whatever it is that node was describing is actually the entire, complete, comprehensive entity. Now the next step, the new frontier for MDM, is about trying to understand the relationships across those nodes. And absolutely, MDM is both about that curation and governance, which is very important for GDPR and all of the other initiatives out there, but equally importantly it's now about being able to understand how these entities are related across the graph of all of those nodes. >> Weave in the role that security's going to play. Because MDM can... well, we'll step back. Everybody has historically figured that either data is secure or it's not, largely because it was focused on a device; and if you have a device and secure the device, all the data on that device got equally secured. Nowadays data is much more in flight. It's all over the place. It comes from a lot of different sources. The role that security plays in crafting the node, in privatizing data and turning it into an asset, is really important, but it could really use the information that's in MDM to ensure that we are applying the appropriate levels of security, and types of security. Do you see an evolving role between MDM and data security? >> I would actually describe it differently. I would say that security is now the core design principle for MDM. It has to be baked into everything that we do around designing MDM for the future. Because, like you said, we've again gone away from some handful of sources bringing data into MDM in a highly protected, on-premise environment with a very limited number of consumers. Now we have thousands of applications delivering that data to MDM, and you've got thousands of business users, tens of thousands of them, and applications all leveraging that master data in the context of those interfaces. Security has never been more important for MDM. This is again another aspect of security, and I want to bring catalog back in again: the catalog is going to automatically tell the MDM configuration developer that these are pieces of data that should be protected. This is PII data, this is health data, this is credit data.
That security is implicit in the design of those MDM initiatives. >> I think that's huge; with cloud and the connected edge, the network, that is critical. I've got to ask you, I know we're tight on time, and I want to get one more question in. Define intelligent MDM. I've heard that term. What does that mean to you? You mentioned security by design in the beginning, and I get what that is. But I've heard the term intelligent MDM. What is the definition of that? What does it mean? >> It really means MDM that is built for three new imperatives. One is being able to scale, what I would call digital scale; it's no longer enterprise scale. It is about being able to make sense of interactions and relationships, and being able to use the power of the catalog, and AI and ML, in order to connect all of these dots, because connecting these dots is what's going to deliver immense business value to those organizations. Then, facilitate the rise of the business user and their requirements: intuitive interfaces that allow them to perform their day-to-day interaction with MDM. And finally, time to value. Intelligent MDM should be up and running not in months or years, but in weeks if not days. And this is where the power of the catalog and the power of machine learning can make this a reality. >> That's a great clip. I'm going to clip that. That's awesome. And then putting it into action, that's the key to success. Suresh, thanks for coming on. Great to see you. >> Thank you very much. >> As always, you've got the keys to the kingdom, literally. MDM is at the center of it all, the things going on with data from cloud, edge computing, all connected. I'm John Furrier with Peter Burris, bringing you all the action here at Informatica World 2018. We'll be back with more after this short break.
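The catalog behavior described at the end of this conversation, automatically flagging attributes that should be protected, can be approximated with a small rule-based pass over column names and sample values. This is only an illustrative stand-in for a real catalog's classifiers; the columns, patterns, and labels are assumptions for the example.

```python
import re

# Hypothetical column names and sample values, as a catalog scan might see them.
columns = {
    "customer_name": ["Jane Doe", "Jon Smith"],
    "email": ["jane@example.com", "jon@example.com"],
    "card_number": ["4111 1111 1111 1111"],
    "region": ["EMEA", "APAC"],
}

# Tiny rule set standing in for real data-classification models.
VALUE_RULES = {
    "EMAIL": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "CREDIT_CARD": re.compile(r"(?:\d[ -]?){13,16}"),
}
NAME_HINTS = {"name": "PERSON_NAME", "ssn": "NATIONAL_ID", "dob": "DATE_OF_BIRTH"}

def classify(column, values):
    """Tag a column as sensitive based on its name or its sample values."""
    tags = {label for hint, label in NAME_HINTS.items() if hint in column.lower()}
    for label, pattern in VALUE_RULES.items():
        if any(pattern.fullmatch(v.strip()) for v in values):
            tags.add(label)
    return tags or {"NON_SENSITIVE"}

for column, values in columns.items():
    print(column, classify(column, values))
```

Columns tagged this way could then drive masking, access control, or the kind of design-time warnings to the MDM configuration developer mentioned above.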
Suresh Menon, Informatica - Informatica World 2017 - #INFA17 - #theCUBE
>> Narrator: Live from San Francisco, it's theCUBE, covering Informatica World 2017, brought to you by Informatica. (driving techno music) >> Hey, welcome back everyone. Live here in San Francisco, Informatica World 2017, this is theCUBE's exclusive coverage from SiliconANGLE Media. I'm John Furrier, host of theCUBE, with my co-host Peter Burris, head of research at SiliconANGLE Media, also General Manager of wikibon.com, doing all the cutting edge research on data, data value, what's it mean, cloud, etc. Check it out at wikibon.com. Next guest is Suresh Menon, who's the SVP and General Manager of Master Data Management Informatica. The key to success, the central brains. MDM, great, hot area. Suresh, thanks for coming on theCUBE. Appreciate it. >> Thank you for having me. >> So, MDM has been in almost all the conversations we've had, some overtly and some kind of implied through... Take a minute to describe what you're managing and what the role is in that data fabric, in that Data 3.0 vision, why Master Data Management is so important. >> Right, if you think about Master Data Management, there are two ways to look at it. The first one would be in terms of MDM, let's follow the definition. Master Data is really about all the business critical entities that any organization is, you know, should be concerned about. So if you think about customers and products, that's the two most critical ones, and that's really where Master Data Management began. But then you should also think about employees, locations and channels, suppliers, as all being the business critical entities that every organization should care about. Master Data Management is about making sure that you have the most trusted, authoritative and consistent data about these entities, which can then fuel the rest of your enterprise. MDM has been used in the past to fulfill certain specific business objectives or outcomes, such as improving customer centricity, making sure that you're onboarding suppliers with a minimal amount of risk, and also to make sure that your products as being described and syndicated out to the web are done in the most efficient manner. >> You guys have the Industry Perspective Monday night. What was the insight from the industry? I mean, how was the industry... I know Peter's got a perspective on this. He thinks there's opportunity, big time, to reposition kind of how this is thought, but what's the industry reaction to MDM? >> The industry reaction is renewed excitement in MDM. MDM started off about 10 years ago. A lot of early adopters were there. And as is usual with a lot of early adopters, there was a quick dip into the cycle of disillusionment. What you've seen over the last couple of years and the excitement from Monday is the resurgence about MDM, and looking at MDM as being a force of disruption for the digital transformation that most organizations are going through, and actually being at the center of that disruption. >> Well it's interesting, I almost liken this to... I'm not a physicist, I wish I was, perhaps... Physics encounters a problem, and then people look at this problem and they say "Oh my goodness, that's, how are we going to solve that?" And then somebody says "Oh, I remember a math technique that I can apply to solve this problem and it works beautifully." I see MDM almost in the same situation. Oh, we've got this enormous amount of data. It's coming from a lot of different sources. How do we reconcile those all those sources? Oh, what a... oh, wait a minute. 
We had this MDM thing a number of years ago. How about if we took that MDM and tried to apply it to this problem, would it work? And it seems to fit pretty nicely now. Do you agree with that? >> I agree with that. There's also a re-defninition of MDM. Because sometimes when you look at what people think about, "Oh, that was MDM from seven years ago. How does that apply to the problems I'm dealing with today, with IoT data, social network data, interaction data that I need to make sense of. Wasn't MDM for the structured world and how does it apply for the new world?" And this is really the third phase of MDM, going from batch analytics, fueling old real-time applications, whether it was marketing, customer service and so on. And now, providing the context that is necessary to connect dots across this billions and billions of data that is coming in, and being able to provide that insight and the outcome that organizations are hoping to achieve by bringing all this together. >> You mentioned... I just want to jump in for a second, cause you mentioned unstructured data and also the speed of data, getting the value. So data as a service, these trends are happening, right? The role of data isn't just, okay, unstructured, now deal with it. You've got to be ready for any data injection to an application being available. >> Suresh: Yes. >> I mean, that's a big fact too, isn't it? >> Absolutely, and organizations are looking at what used to be a batch process that could run overnight, to now saying "I'm getting this data in real time and I need to be able to act on it right now." This could be organizations saying, "I'm using MDM to connect all of this interaction data that's coming in, and being able to make the right offer to that customer before my competition can." Shortening that time between getting a signal to actually going out and making the most relevant offer, has become crucial. And it also applies to other things such as, you identify risk across any part of your organization, being able to act upon that in real time as opposed to find out later and pay the expense. >> I know this is not a perfect way of thinking about it, but perhaps it will be a nice metaphor for introducing what I'm going to say. I've always thought about MDM as the system of record for data. >> Suresh: Yes. >> Right? And as we think about digital business, and we think about going after new opportunities and new types of customers, new classes of products, we now have to think about how we're going to introduce and translate the concepts of design into data. So we can literally envision what that new system of record for data is going to look like. What will be the role of MDM as we start introducing more design principles into data? Here's where we are, here's where we need to be, here's how we're going to move, and MDM being part of that change process. Is that something you foresee for MDM? >> Absolutely, and also, the definition of... MDM in the past used to be considered as, let's take a small collection of slowly changing attributes, and that's what we master for through the course of time. Instead now, MDM is becoming in this digital age, as you're bringing in tens of thousands of attributes even about a customer and a supplier, MDM being part of that process that can grow, and at the same time, those small collection of attributes important as a kernel inside of this information, it's that kernel that provides the connection, the missing link, if you will, across all of these. 
And absolutely, it's a journey that MDM can fuel. >> We think that's crucially important. So for example, what we like to say is we can demarcate the industry, we think we're in the middle of a demarcation point, I guess I should say, where for the first 50 years we had known process, unknown technology. Now we're looking at known technology, generally speaking, but extremely unknown process. Let me explain what I mean by that. We used to have very stylized, as you said, structured data. Accounting is a stylized data form: slow-moving changes, etc. And that's what MDM was originally built for, to capture that system of record for those things. Now we're talking about trying to create digital twins of real-world things that behave inconsistently, that behave unpredictably, especially human beings. And now we're trying to capture more data about them and bring them into the system. Highly unstructured, highly uncertain, learning and training. So help us connect this notion of machine learning and artificial intelligence back to MDM. How do you see MDM evolving to be able to take these massive, new, and uncertain types of data, but turn them into assets very quickly? >> Absolutely. It's a crucial part of what MDM is all about today and going forward into the future. It is the combination of both the metadata understanding about what it is that these data sets are going to be about, and then applying artificial intelligence through machine learning on top of it. MDM was always about well-curated data. How can you curate data by human curation, how is that possible, when you've got these real-time transactions coming in at such high speed and such high volume? This is where artificial intelligence can detect those streams, be able to infer the relationships across these different streams, and then be able to allow for that kind of relationship exploration and persistence, which is key to all of this. Completely new algorithms that are being built now, it augments... >> Does it enhance master data, or abstract it away? What's the impact, like CLAIRE, for instance. What's the impact to MDM? More relevant, less relevant? >> Even more relevant, and in three key areas of relevance. Number one is about automating the initial putting together of MDM, and then also automating the ongoing maintenance: reacting to changes both within the organization and outside the organization, and being able to learn from previous such interactions, making MDM self-configuring. The second part of it is stewardship. If you think about MDM, in the past you always had stewards, a small number of stewards in an organization who would go out and curate this data. We now have tens of thousands of business users across the organization saying, "I want to interact with this master data, I have a role to play here." For those business users now, you have tens of thousands of them, and then thousands and thousands of attributes. Machine learning is the only way that you can stop this data explosion from causing a human explosion in terms of how you manage this. >> John: Yeah, a meltdown.
>> So let me ask you a question. >> If we can boil this down really simply... >> John: He's excited about MDM. >> Look, I'm excited about data, this is so... If we kind of think about this, we had an accounting system, well let me step back. In the world where we were talking about hard assets, we had an accounting system that had a fixed asset module. So we put all our assets in there, we put depreciation schedules on it, we said, "Okay, who's got what? Who owns it, who owns the other things?" Has MDM really become the data asset system within the business? Is that too far a leap for you? >> I don't think so. I mean, if you think about it, master data was all about making sure of the business critical data, everything that the organization runs on, the business is running on, and now if you think of that, that's the data that's going to fuel, um, enable this digital disruption that these organizations want to do with that data, MDM's at the heart of that. And finally, the last piece I think, your point about the artificial intelligence, the third part of where MDM increases its relevance is, you have the insight now. The data is being put together, we've curated that data, we've discovered those relationships through machine learning. What next? What's next is really about not just putting that data in the hands of a user or inside of a consuming application, but instead, recommending what that application or user needs to do with that data. Predict what the next product is that a customer is going to buy, and make that next best offer recommendation to a system or a user. >> Suresh, you're the GM now, you've got the view of the landscape, you've got a business to run. Charge customers for the product, subscription, cloud, on-premise license, evolving. You've got a new CMO. You've got to now snap into the storyline. What's your role in the storyline? Obviously, the story's got to be coherent around one big message and there's got to be the new logo we see behind here. What's your contribution to the story, and how are you guys keeping in cadence with the new marketing mission? >> This has been a very closely run project, this entire re-branding. It's not just a new logo and a new font for the company's name. This has been a process that began many, many months ago. It started from a look at what the direction of our products are across MDM. We worked very closely with Sally and her team to... >> John: So you've been involved. >> Absolutely, yes. >> The board certainly has. >> Both board members said they were actively involved as well. >> Yeah, this has been a... >> What do you think about it, are you excited? >> It's fantastic. >> I think it's one of those once-in-a-generation opportunities that we get where we've got such a broad breadth of capabilities across the company, and now to be able to tell that story in a way that we've never been able to before. >> It's going to help pull you into the wind that's blowing at your back. You guys have great momentum on the product side, congratulations. Now you got the... the brand is going to be building. >> Fantastic, yes. >> Okay, so what's the final question? Outlook for next year? How's the business going, you excited by things? >> Very much so. MDM has been across the board for Informatica, and I'm sure you've seen here at the conference, the interest in MDM, the success stories with MDM, large organizations like Coca-Cola and GE redoing the way they do business all powered through MDM. MDM has never been more relevant than it is now.
>> And the data tsunami is here and coming and not stopping, the waves are hitting. IoT. Machine learning. >> Suresh: Right. >> Batching. >> Batching, absolutely. >> With federated MDM enabled, we'll be able to do this on a global scale, and master class... >> We'll have to have you come into our studio and do an MDM session. You guys are like, this is a great topic. Suresh, thank you so much for coming on theCUBE, really appreciate it. General Manager of the MDM Business for Informatica Master Data Management. Was once a cottage industry, now full blown, part of the data fabric at Informatica. Thanks so much for sharing on theCUBE. We're bringing you all the master CUBE interviews here in San Francisco for theCUBE's coverage of Informatica World. Back after this short break, stay with us. (techno music)
Albrecht Powell, Accenture Analytics - Informatica World 2017 - #INFA17 - #theCUBE
>> Narrator: Live from San Francisco, it's the Cube. Covering Informatica World 2017. Brought to you by Informatica. (futuristic electronic music) >> Welcome back, everyone. We're here live in San Francisco. This is the Cube's exclusive coverage of Informatica World 2017. I'm John Furrier with SiliconANGLE, the Cube. My co-host, Peter Burris, head of research for SiliconANGLE Media, also general manager of Wikibon.com. Our next guest is Albrecht Powell, who's the enterprise information management global lead at Accenture Analytics. Welcome to the Cube. >> Thanks very much. Good to be here today. >> John: See you're sporting the sideways A, not to be confused with the SiliconANGLE red A, which is the other way around. Great to have you on. >> That would be the accent on the future. (laughing) Our moniker. >> So, um. Great to have you on. Accenture Analytics. A lot of people may or may not know-- huge investment in data science. You guys are doing a lot of work, and integrating in with customers. Not just on the management consulting side, but, you know, a lot of the architecture, a lot of the delivery-- You essentially manage services across the board. >> Albrecht: Oh yeah. >> There's a lot of architecture going on, so I got to ask you about the data powered enterprise vision that you have, because that's the theme that you guys have. What does that mean, first of all? And how does it relate to Informatica World, and ultimately the customers just trying to get to the Cloud, lower their costs, increase their top line. What's the digital transformation connection? >> Boy, lots of questions in there. So, you know, to us, in the digital revolution that's happening right now, the expectations on companies are just growing exponentially. You've got customers, you've got shareholders, business partners. You've got stockholders that all have so much more insight on companies. They want more, and they're putting so many demands on companies today. So, it's causing disruption in the industry. We all know about the Ubers. We all know about going from print media to digital media. But you've got companies like John Deere; they sell tractors, right? But they're moving toward a platform based company now, where they're now working with farmers, they're working with agriculture, helping to support. So, when you've got that as a different business model, you've got that coupled with the explosion in data. So, you know, the statistics-- Amazon, I think it took six years to get their first trillion. Now it's, you know, the next trillion they got in one year. By the year, I think 2020, 1.7 megabytes of data is going to be created per person per second. These are staggering numbers. And when you put those two together, I personally think that the next big wave, the next big value proposition for clients, is going to be data, and harnessing the power of that. When I look back over my 28 year career, I go back to the ERP days. That was the big wave. Right? You had to be on Oracle or SAP or PeopleSoft or JD Edwards. I think right now, we're just starting in this phenomenal wave of opportunity. >> You mentioned re-platforming, or platform approach. The word re-platforming is an industry buzzword. But that really is an impact to IT, business operations, and personnel, and ultimately the business model! I mean, this is like a serious impact. >> It really is, and that is where this data powered enterprise comes in.
We're trying to work with our clients to figure out how to harness this value proposition, unlock the data that they've got stuck in their systems, the dark data wherever it may be, and unleash that and try to gain business insights from that. >> Alright. Take us through the playbook, because okay-- I buy it. I see the train coming down the tracks that is really high speed. I bet I got to move to the new model. You look at Amazon, it's a great proof point. Hockey sticks since 2010. No doubt about it. Just one tell sign. I want to move. Now, I got to be careful, if I move too fast I get over my skis, or over-rotate-- whatever metaphor you want to use, but how do I get there? What are you guys doing with clients and what's the strategy? Playbook. >> You know, the biggest thing we try and do is the relationships we have with clients are long term, trust based relationships. And when we go in, we're not selling a product. We're trying to help them drive business value. So, what we typically do around the data space is help them figure out what's the strategy, what's the vision, where do they want to go? They may think they need a data quality solution, an MDM solution. But you know, we come in and we talk to them and we realize: what are you trying to get out of it? Where do you want to go? And lay out a vision, a set of guiding principles. And that framework oftentimes helps them drive, within the next one-two years, a much more sustainable set of growth as opposed to trying to do a point solution. So typically, we'll start there. But, you know, we'll also come in if they're hemorrhaging, if they're bleeding, if they've got major problems. Or, if they're trying to hit a strategic objective, procurement spend analytics, or growth, or disruption in the market. Those are the type of things that we'll come in and talk to them about to start with. >> Is there a mindset-- obviously, there's a mindset shift. But given that, certainly if certain rooms are on fire, you take care of those first. I get the critical piece of it, 'cause sometimes it is mission critical right out of the gate. But, is there an architectural mindset? Is it a building blocks approach? Has there been a shift in how to deploy and iterate through, in an agile way, that you've seen a pattern that's emerged? >> I mean obviously Cloud is big with everybody today, and the hype out there is everybody's moving everything to Cloud. And in reality, a lot of our clients-- They've invested a lot in these data centers, so they're reticent to make the leap. So, we're working with them to help, and Informatica has been phenomenal with some of the tools and solutions that they have to help them pull over to, you know, Cloud based solutions. And you know, most of our clients right now, they have a hybrid architecture. They're moving in that way. They've got some stuff that they want to keep close and tight, they've got some stuff that they want to move. But between open source and the new subscription models-- for instance, what Informatica has. It's a game changer for our clients. Because now, they're able to get solutions up faster, quicker, and we do a lot of work with our liquid studios to help them pilot those types of solutions. >> But it's still got to be in service to some outcome, or to some idea? >> Albrecht: Absolutely. >> So, that suggests that one of the challenges that people have been having in the big data universe is this disconnect between what we want to do, and implementing Hadoop on a cluster.
And that notion of how do we actually introduce some of the concepts of design into that process so that we can see realistically, and practically, and in a way that can be executed, a process to go from the idea down to the actual implementation? So, use cases are a big issue. Getting developers more involved and active is a big issue. But, what is the role of design in this process? >> So one of the things that we've shifted to is we have a set of innovation centers, where we'll bring clients in, and we might start with a workshop or two, right? To talk to them about the capabilities. But very quickly we evolve that into design thinking sessions, to really draw out what's the real challenge they're trying to find. Because half the time, they think they know what the problem is, but they really don't, and we help them uncover that. And then, from a design standpoint, we do a lot more prototyping now, where we'll go through and actually build, in a matter of weeks, a real time capability that they can go take and run with. We have this thing called the Accenture Insights Platform, where we've negotiated with a lot of partners, such as Informatica, to have their tools, their software, in a hot, ready Cloud-based environment, where again, in the matter of a couple of weeks, we can stand something up, and they can see it, they can touch it. It's no longer the big capital investments to go start these types of projects. >> But it has to again, be something that people can touch and can play with. >> Albrecht: Exactly. >> And start themselves, to start saying, "Well, yes, it works here. It doesn't work here." So they can start iterating on it. It's a way of increasing the degree to which iteration is the dominant feature of how things roll out. Ties back to the use case. As you guys think about the tooling that's available, from Informatica and elsewhere, how does the tooling-- Is the tooling robust enough at this point to really support that process, or are there still some holes we have to fill? >> Yeah, you know, I almost feel like the technology is there, right? We can do so much. The challenge that I run into when I meet with the C-suite-- I always ask the question, "What's your holy grail question?" If you knew this piece of information, how would that be a game changer? Eight times out of ten, I hear, "If I knew sales by quarter by region, and that it was accurate, I could really do something." It's like, that's not your question. The question should be: Who should I acquire? When is a customer going to walk out of the store? What's the weather going to be? What's the minimum amount of water I need to put in a plant for it to grow? You know, in a drought situation. And those are the kind of questions that we are trying to draw out from our clients. And again, these design thinking sessions help us drive to that. >> John: Are the liquid studios and the innovation centers the same thing? You mentioned liquid studios. What is that? Real quick. >> They are. So, again the whole idea behind these studios is that instead of, you know, starting with a massive project, or driving a massive five year RFP for a program, again, get it in a liquid fashion; very agile, very prototypical, you know, build something. >> John: Very fluid. (laughs) >> Exactly right. And so that they can see, touch, feel, and manipulate these things. And then from there, they may want to scale that up. And you know, they may do it themselves. Oftentimes, they'll partner with us to do it.
>> You're partnering in the real time requirements definition of what they're trying to do. >> Albrecht: Correct. >> Well, it must be organized. I saw on Twitter that Accenture received the Informatica Ecosystem Impact Award last evening. Congratulations. >> Albrecht: Thank you very much, I appreciate that. Very excited. >> Where did that come from, and why is it important to you guys? Obviously, the recognition with Informatica, you guys are doing well with them. >> Now, Informatica is a very strong strategic partner of ours. I mean, we've worked with them for the last 18 or so years. I've personally been involved with them the whole time. The company has vision, you know, when you talk to Anil, you talk to Amit, who was just on-- The vision that they have for their products, they know where they want to go. The reinvention that they've done here with the new branding, and the new marketing-- A lot of our clients had traditionally thought of them as more the PowerCenter, and more the-- >> John: The plumbing. >> Exactly. >> John: I'll say it. >> And we keep challenging them. It's like, you know, why aren't you bigger? Why isn't everybody using you? Because I think the tool set is robust enough right now. And again, it's finding these use cases to be able to apply this. >> Well, they made a big bet. The joke in Silicon Valley right now, in infrastructure companies, is that plumbers are turning into machinists, as kind of an analogy. But now with machine learning, you're starting to see things that they've made a bet on that's flowering, and it's important. And I think they made some good bets. They'll be on the right side of history, in my opinion. But I want to ask you a personal question, because you know, you mention waves. You mention the ERP waves and the software wave of the mini computer, which then became local area networks, inter-networking, et cetera. Basically the premise of what IT has turned into. With now, the disruption that's going on, how is it different? Because Informatica seems to be on that same software cycle in a new way. What is different about this new world order that's different than those days, the glory days, of rolling out SAP implementations, or Oracle ERP and CRMs. Shorter time cycles. What are the things that you're seeing that are key things that customers should pay attention to, they need to avoid, and things they should double down on, relative to this new wave of software? And how does Informatica fit into all that? >> Sure. The ERP wave was critical. It was the way to get everything under one umbrella. Very important, right? But today, the idea of single instance, companies can't keep up. They can't do that. So it's the nimble, it's the agile. What I'm really excited about with Informatica is that they've got the end to end solution, which is phenomenal, but they've also got the piece parts. And there's a lot of our clients that, you know, they're trying to integrate multiple ERP systems together, they're trying to integrate multiple platforms, so MDM is becoming much more important today. Data governance. Absolutely critical out there. They've had a gap, frankly, in data governance for years. And with their acquisition, their Axon tool-- Again, it's a game changer out there and a lot of our clients are aggressively looking at that, and trying to do that. >> Peter: How does it change the game for some of your clients? Give an example. You don't have to name the customer, but on a use case basis. >> Everybody needs, you know.
We talk about the need for governance, right? And it comes into whether it's paper based, whether it's automation-- Some way to get process standardization and so forth around governance, and get people accountable. The tools that have been out in the market-- There are some that are good, but they're not integrated. There's no interoperability between them. And what I like about Axon now is they can sell it as a single point solution. Great way to get in the door of a client. But, they can also then integrate that with all of the other platform pieces that Informatica has, and that tie is really powerful. >> Well, governance also plays a role when you think about, for example, the idea that we want greater distribution of data-- Data is going to be more distributed. We want some visibility into that data through metadata, and (mumbles) talked about that. But, we heard from the healthcare conversation this morning, and others, that one of the big barriers is, do I have access? Do I have rights? Do I have privileges to this data? And governance has to follow that process where people know in advance: What rights do I have? What access do I have? Am I using it properly? Am I breaking rules? That notion of governance can't just be centered on compliance and regulation, it has to be moved into more of an asset management approach. Do you agree? >> Right. Agreed. And the way we look at governance, it's expanding now. It's not the traditional data-owner, data-steward, data-operator any more. >> Yeah, it's not the central group. It's a corporate set of responsibilities. >> Right. And we're rolling governance now out to the end-user. So, how they are looking at data and interacting with data. Because data, now, it's a utility. It is something that everybody touches, everybody uses, not just an IT thing anymore. When you take that, and again you take the expanse of that into security. You know, as you talked about-- Secure@Source, for example. The play in tying the two of those together. Very powerful solution. And even within Accenture, you know, we're tying our data, our governance, our security practices, much more tightly together as a single, unified solution. >> John: How does the AI and machine learning fit in, 'cause we hear about Claire, their new interface, Alexa out there, and Amazon. I mean Google I/O's announcing neural nets that train computers! Certainly it's a lot of buzzwords out there. Does that make the master data management, and the MDM, and the data quality more relevant? Or less relevant? >> I think just as relevant as it's always been. There's a lot of people that sit and say that the traditional data stuff is a commodity now. And again, machine learning is absolutely essential, AI. We need that because we're scaling so much bigger out in industry today. But, MDM is not going away. The integration between platforms, the need for good data quality. And I think we almost took a shift in the industry to the buzzwords. Right? It's all about big data and AI and everything, and in some ways we almost left the traditional behind. And now we're coming back to realizing that you need good data to power the different data sources you've got, the big data and everything else, that then needs to be scaled, and that's where the machine learning-- >> And it's freed up for developers who have a DevOps mindset and don't want to get into the nuances of being a data wrangler. >> Well, the patterns of data usage are going to be important, thinking about MDM.
Because at the end of the day, you're not going to have copies of everything. >> No. >> You're going to have relationships, increasingly. >> Right. >> Peter: And MDM has to be able to capture that, too. >> Exactly. >> Alright, final question I have to ask you, what's the future for you guys? What do you guys see? 'Cause you guys always got the top brains in the industry working on things. what is Accenture's view of the future? What's the most important things coming down after this wave? Or is this wave just multiple sets, and to your clients, what are the top three things, or top things that you guys see as future waves or items that you're working on? >> You know, again, this data wave right now-- Again, it's the most exciting time that I've ever had in the career. And I see the growth that we're doing. And you know at Accenture, we have a lot of investment in research and development, we've got a team of data scientists that's out trying to mine data, figure out, you know, what the insights are that are out there. The liquid studios that we're pulling together. And, you know, as we talk to our clients, it's all about the art of the possible. It's not so much trying to sell a tool or solution. That's obviously important. But, where can we take you? What are the things that the industry hasn't thought of yet that we can take you as a company and help you disrupt into a new business market? >> Re-imagining the future. Thanks for coming, Albrecht. Appreciate it. Albrecht Powell with Accenture Analytics. Exciting this time in the industry-- I would agree data is certainly intoxicating at one level, but really great value opportunity. Thanks for coming on the Cube, and sharing the data with us as we analyze. Here on the Cube, more great coverage after this short break. At Informatica World 2017, I'm John Furrier, Peter Burris. We'll be right back with more. (futuristic electronic music)
Amit Walia, Informatica - Informatica World 2017 - #INFA17 - #theCUBE
>> Announcer: Live from San Francisco it's the CUBE. Covering Informatica World 2017. Brought to you by Informatica. >> Welcome back everyone. We are here live in San Francisco for Informatica World 2017 exclusive coverage from the CUBE. Third year covering the transformation of Informatica as a company. I'm John Furrier, Silicon Angle. My co-host this week is Peter Burris, General Manager of Wikibon.com and Head of Research for Silicon Angle Media. Our next guest is eight time CUBE alumni, Amit Walia Executive Vice President of Products at Informatica. Amit, great to see you. >> Good to be here. >> Thanks for spending the time to come on. Saw you had a nice dinner last night with all your top customers. Very happy customers. Welcome to the CUBE. >> Yes, thank you. We keep them happy. Eleventh year in a row we got number one in customer loyalty. We work hard for that. >> There's a lot of exciting things happening. I just want to jump into some of the products though because that's your wheelhouse. You guys have been an amazing product company. I've always been kind of bullish on you guys, very complimentary. The one thing that, when we've talked on FaceBook and also on the CUBE is that not everyone knows about Informatica. They know about the old Informatica. We had Jerry Held on yesterday talking about the transformation, how hybrid cloud's here to stay. You guys have made great strides on the product front, the platform front, decentralizing data with control. Now you got the new brand. What's going on, give us the update. You got to be pretty pumped now, you got a megaphone out there with the new CMO. >> Yeah, lots happening at that end. I'll go back and paint a picture about how we see where the industry is and then how we are basically transforming that. My fundamental belief is that we're going through this massive transformation. Pick any word, but underlying at the technology level, the systems of records, all the databases and all the apps are massively fragmenting. Cloud, on-premise, hundreds and hundreds of choices. Systems of engagement for customers are fragmenting, right. When I talk to customers, they're struggling to figure out what is the system of intelligence. What's the organizing principle? Take a great example, my customer data and what I know about you John, is available inside the system, within multiple databases, multiple apps, outside the systems, what you do on LinkedIn, FaceBook, Twitter, how do I get a handle of you to be able to effectively engage with you? That is a fundamental change that is happening in the industry is what is my organizing principle to have the system of intelligence? We've honed in at the metadata layer for that. We believe leave the data wherever it is because it's going to be in different places. Use your best of breed apps. Organize the metadata because the scale and scope of that, while small, power of that is very high. Yesterday in my keynote, I announced the launch of Claire, our AIML offering. The idea is that we are going to be the Google of the enterprise to bring the entire metadata together. When we apply machine learning to it, it's the same algorithms that LinkedIn applies, or FaceBook applies for photo tagging or relationships, or Amazon applies for recommendations. We're going to apply it for data and make that then be what I call organizing principle, the system of intelligence for an enterprise. That's the nutshell of what we're trying to do. 
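The "Google of the enterprise" framing above is easier to picture with a toy sketch: organize and search the metadata, and leave the data itself wherever it lives. This is a hypothetical illustration, not Informatica's Claire engine or its catalog API; the asset names, descriptions, and bare keyword index are invented, and a real catalog layers lineage, ranking, and ML-driven tagging on top.

```python
# Toy metadata catalog: index table descriptions and answer keyword queries
# without ever moving the underlying data.
from collections import defaultdict

catalog = [
    {"asset": "s3://lake/customers.parquet", "description": "customer master golden records"},
    {"asset": "warehouse.sales_daily", "description": "daily sales facts by region"},
    {"asset": "crm.accounts", "description": "raw CRM customer accounts"},
]

index = defaultdict(set)
for i, entry in enumerate(catalog):
    for token in entry["description"].lower().split():
        index[token].add(i)

def search(query: str):
    """Return assets whose metadata mentions every query term."""
    terms = query.lower().split()
    hits = set.intersection(*(index.get(t, set()) for t in terms)) if terms else set()
    return [catalog[i]["asset"] for i in sorted(hits)]

print(search("customer"))  # ['s3://lake/customers.parquet', 'crm.accounts']
```

The point of the sketch is scale: the index holds only metadata, which stays small enough to centralize and reason over even when the data itself is fragmented across clouds, SaaS apps, and on-premise systems.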
>> Also this Data 3.0 thing, I want to press you on this because this is really cool. You guys have increased the surface area of addressability of data and we talked about that last year, making it horizontally scalable yet with all the goodness of the controls as we talked about in the past. Now you're bringing in access methods via machine learning and AI techniques to make it accessible. Think Alexa, right. People at home, "Hey give me a song." How are you guys using the algorithms, because now algorithms have become a super important part of what to look at. FaceBook, you mentioned FaceBook and Google, they've been criticized for their algorithms suppressing quality data. News cycle, things pop up once they see some traction. How do you guys tweak and enable algorithms to surface the best data possible? >> The best way to describe it is that our philosophy is different. Claire, our AI engine, our goal is to make sure we can surface all of the data to the customer, but in an organized fashion. We're not looking to say, filter something. The best example is predictive maintenance. If I am BMW and I'm running a robotic driven shop floor, how do I know when something's about to go down? I have a lot of old, historical data on my shop floor, but real time streaming data's coming from the sensor of the robot. I want to marry the two together and then let the system tell me, boy, I feel like in the next 30 minutes, something is about to happen. We are doing those kinds of things, solving those problems, so we're not looking to filter or suppress anything. Our goal is to make sure we can bring more and more and more data together and with the help of machine learning, Claire, make it easy for customers to make decisions. Intelligent decisions, smart decisions, easier versus hundreds of people having to guess or predict, which ends up not being very smart. >> On the road map side, I want you to take a minute to explain it. You did a good job laying out the value proposition there, but I want to tie the cloud together with this because Jerry Held said yesterday, hybrid cloud's going to be a very long journey because legacy doesn't go away. You guys have a great business on-prem that's been historical for you guys. As you guys have modernized, what is the connection on a product basis that's available today and that's being worked on on a road map basis that says, you can do all this stuff with the data, but it's going to be cloud enabled. How do you get that cloud, hybrid cloud connection so the customer doesn't feel pain in moving to the cloud? >> First of all, I can boldly say that we were probably the only software company in the industry that disrupted our own industry to go to the cloud. By the way, data integration, which is our core model, 11 years ago we invested in the cloud. We didn't know where it would go and we announced that, Informatica, 11 years ago, and today, 11 years later, we are the number one market share leader in cloud integration, number one in the Gartner Magic Quadrant, and our cloud platform today is transacting a trillion transactions a month. In some ways, we were disrupting ourselves as you speak. >> Yeah, I mean the Gartner thing, I always say this 'cause those are old metrics, but the new metric is customer traction. You guys were in the announcement with Google Spanner as they globally GA'd their Spanner distributed database, which is a horizontally scalable database. You have a relationship with Amazon, you're in Microsoft. What is the customer uptake and what are some use cases?
Give us some specifics. >> Three specific use cases. The customer started the journey in cloud more around connecting cloud applications. Salesforce connected with Workday, connected to SAP, so on and so forth. Simple application integration, or API management. Where data gravity is moving to cloud, where fundamental workloads are going and we see more and more traction, is taking analytics to the cloud. I'm moving my workload to Redshift or I'm moving to an Azure data warehouse. That's where, by the way, between January and May, we have moved half a trillion data objects to cloud data warehouses. Half a trillion. Clearly in that context we work with AWS. Three years ago, we started with them. Azure -- >> Just to put an exclamation point on that, in January it was a billion so between January and now, it's up to a trillion. Huge. That's a hockey stick. >> Scale is a hockey stick over there because so much more is being created outside the enterprise and customers don't want to bring it on-premise. They say look, I just want to put it in Redshift or an Azure database and I want to process there, and over time, what they want, more to your point, is connect me to my on-premise data warehouse too. Let's say I've done some analytics here, connect the relevant analytics and move it to, let's say, my on-premise data warehouse, and over a period of time as I get comfortable with this hybrid, I may take this workload and 100% flip over to the cloud too. They want this bi-directional journey. That's what's really enabling customers. >> It's always kind of hard to connect the language that customers are used to speaking in to new concepts. It seems to me that data integration is the basis of your business. >> That's the foundation. We discovered data integration is the foundational layer and everything else we do is what I call more value added data management capabilities. Like MDM. Data integration allows you to connect, bring data together; MDM is a value added data management solution to say now I can get a 360 degree view of my customer, like Nordstrom is using us for. Or a 360 degree view of my products, or a 360 degree view of my suppliers, to make more business decisions. >> John: So integration is table stakes from your standpoint? Foundational. >> It's foundational. >> John: Foundational. Okay, better word. >> In that context, we operate like Switzerland in the world of data. Whether it's Amazon, Google, Azure, tomorrow Oracle, SAP, we connect to the whole world. >> Amit, you have a vision of where this is all going to go. It's one thing to say, we've got our product set and we're moving it to a new technology base, which is good. That'll improve productivity. This whole concept of data management is bigger than just moving existing tooling, existing practices to a new set of platforms, no matter how much more productivity you might get out of those new platforms. It means something more. It means the way your business operates differently, business thinks differently, it means different ways of institutionalizing work. Give us the vision that you're laying out to your product team about how, yes we're re-platforming, we're introducing these new development technologies and all these other things, but here's where we're going. Here's the role we want to have in business. What is the role that Informatica wants to have in business?
When we say data management, data management's a very broad word you could argue. Our goal is that we want to organize the enterprises data. The vision that Google has for the internet, organize the customer's data whether it's inside their four walls or outside, in the context of the business processes. I'll translate that for you in two ways. We used to optimize for the IT technical user. A couple of years ago we made a big pivot to put an AND to it. We are also optimizing it for the business user because data now is such a powerful asset that business users want direct access to it. One of the things you would see from us in the last three or four years is we have been putting out a lot of out of the box data solutions. Intelligent Data Lake is a great example of that. We are giving IT full control of it, but we have a bi-modal experience where a business user can self service analytics. I just want to walk in as a marketing analyst and understand what was my lead to revenue conversion. I don't care about all the underlying infrastructure. I don't (mumble), but I just want to do my job. IT also wants to make sure as business users are accessing it, there's governance, security, compliance issues. We're marrying the two together. That's a very high bar for ourselves. >> Let me see if I can follow up on that because I want to make sure that at least I understand it. When you say you want to be the Google for enterprises data, there's actually a couple subtle things in there. First off, number one is that Google is looking at mainly public data and you want to look at public and an enterprises private data. As you said, that requires a whole level of functionality >> Amit: Totally right. >> That Google doesn't worry about like privacy, like ownership, like management and control. Secondly, increasingly the enterprise concept, especially when it comes to data is being able to get access to any data, anywhere. It's not organize the internet. It's not organize the enterprises data, it's organize all data for that enterprise. >> For the enterprise. >> Is that right? >> Exactly. We don't own the data. The enterprise owns the data. Big difference for us. >> The enterprise is also going to go out to all those sources that Google's looking at - >> Two big differences, the data within the enterprise and outside the enterprise for the enterprise, and we don't own the data, we want to bring it together for the enterprise to consume and operate and execute a lot more easy and efficiently. >> We're not talking about just small corners of data. >> No, not at all. >> We're talking about the enterprise, all data that's possible -- >> We are going outside the world, we're looking at unstructured data because, for example when you are, let's say on Twitter. Today we're going to be Tweeting, that's unstructured data, but it's about you and me. Today if Nordstrom wants to figure out something, what John likes, what John thinks, they want that, they want to. We are bringing that together within the MDM to say, oh you know what John bought for you, here's what John is saying on FaceBook or here's what John's saying on Twitter. Marry the two together and you understand John a whole lot better. That's what we want to do. >> And make it addressable and make it available to not only databases and systems, but developers. >> Amit: Oh absolutely! 
>> When I asked the question about data management, kind of the vision of data management, in many respects, it's the enterprises access to data that's relevant to it, number one. The ability from a metadata standpoint to know where it is and have the properties of ownership and privacy and rights and privileges and identities, and number two, the ability to move it around according to, as you noted, the integration laws that the -- >> That's exactly right. Because we've been operating for the enterprise for the last 25 years, we understand what they need. What regulations, what security concerns, what governance and compliance issues. If I had to summarize that context, look, we want to organize the enterprises data whether it's inside the four walls or outside for them, at their level of scale and security and governance and then with the help of Claire, democratize that for any user to truly use it. >> Democratization's a big angle and I want to ask you that because as much as you see the future, and I think you do, we've been talking to you many times here in they keynotes, customers aren't in the future. You've got to kind of come to earth and get to reality so I've got to ask you the question for customers, because they're trying to just deal, I'm trying to move to the cloud, I've got some VM Ware, I've got Amazon over here, I've got Azure, I haven't really baked out my full how I'm going to integrate cloud in my business model, what are some of the use cases that you guys are engaging customers with? You have good vision, products are solid. When you go out to the field, talk to customers, what are the use cases? What are you engaging them on? >> The journey to cloud is a big use case. In the journey to cloud, as I said there are two specific journeys customers are on. One is I'm deploying these thousands and thousands or hundreds and hundreds of enterprise SaaS apps. Help me weave them together in the context of data integration or MDM. Second is, the whole data gravity going to cloud. We talked about data warehousing analytics. Second is all of that. Move my data warehousing, but give me the flexibility in the hybrid. As I said, right, I want to bring outside data within Redshift, but connect it to my. Those are our two biggest use cases we see. Third we see that rides on both of them is self-service analytics. If I'm able to do both of these, then I'm much more easily able to do self-service analytics. Those three are the ones -- >> John: Are primary use cases right now? >> Those are the three prime use cases. Second one, on the other hand we see governance and compliance come up very big. Clearly customers are realizing that all of this re-architecture that's happening, you still need the same governance and compliance. If I am a large bank, if I'm a large insurance company, the laws didn't change for me. Cloud may have come, Hadoop may have come, the laws still stay the same so governance and compliance is a huge one for us. Look at GDPR. There is a deadline in May 2018 and customers are unprepared for that. That's the number two, I see governance a lot I see. >> In Europe it's even worse. You could get a top line, is that the top line, four percent of you -- >> Amit: Customers don't realize if you're a US company, even if you transact with one, single European entity, you are now -- >> The liability's there so let's just go to the root cause of what causes that liability potential, that's security. Quickly, security obviously's on the mind of you guys. 
You have an interesting security product. You guys are digging in the product, what's the product vision on security? >> That's the last one I was going to say. Four years ago, we saw that coming that security is an unsolved problem at the data layer and that's where the world is going to organize itself. We invested, and we have to invest ahead of the curve. We launched the product Secure@Source. Today, it's basically the industry's number one product. 11 awards at Odyssey. Raymond James is a customer, deployed within their four walls. Seven thousand databases go through Secure@Source to give them a full view of my sensitive data, who's accessing it, all of those risks that are now coming to the data layer. As data gets democratized, the security issues become bigger and broader. >> Final question for you. I want you to take a minute to end the segment because I want to give you the chance to say that because you know I'm a big fan of product work. Watching you guys go private and seeing the transition with the new management team, the product guys came in. I've said this on the CUBE many times, you've got the brand marketing going on now, new CMO, things going to be pumping out there. What is special about Informatica right now from a product standpoint? What makes you guys unique? You guys have done some good things, products coming down the pike. What are the guiding principles for you as the leader of the product team to continue to stay on that wave and innovate and make these products valuable to customers? >> I think the biggest change I would say is that we are innovating at the space of a start up. But we have the skill and breadth in the world of data management that is unparalleled to anyone. In this space, whether it's the traditional architecture, big data architecture, real-time streaming architecture or a cloud architecture or it's MDM and security and governance, nobody can do it at scale as us. By the way, we firmly believe in the best of breed concept. All of those capabilities are best of breed within their own market. Our belief is that look, we can solve a customers transition a lot more seamlessly and a lot more risk-free, and a lot more in the future proof way. Of course, we are modeling ourselves to move at the pace of a startup. I call ourselves the hottest pre-IPO -- >> John: I was just going to ask the revenue question. >> A billion dollar in revenue company, not billion dollar market cap company. >> John: You're doing over a billion in revenue? >> Doing over a billion in revenue. >> I'm going to add one more thing to that Amit. I'm not even going to test it. We are especially impressed that you have made very, very bold promises the past few years and you've executed on them. You're one of the few companies in this space in the whole data management, this emerging data management next generation world that has executed on the promises that it's made. Your promises make sense and all the things that you said are excellent. The promises make sense, but your execution makes is safe for customers. >> Well we had some critical analysis yesterday so we're not going to just all fawn all over you guys, there's some things to work on. The big bets are paying out. You guys made some great bets. The cloud bet was key. Congratulations. Amit, great to see you. Coming on the CUBE, thanks for spending the time. You got a keynote coming up this afternoon. Real quick, what's going to be the topic? 
>> Well I'm going to talk about how Claire will be able to solve a lot of future-looking problems. Today's keynote is all about the futures and what the vision of the future is. I'm going to showcase a few examples of what machine learning and AI can do to increase productivity and help ease the pain of our users and customers. >> Get that data integrated, democratize it and create freedom for data to fly around and get those apps addressing it. This is the CUBE, bringing you all the data here inside the CUBE, but soon we'll have an AI bot doing all the interviews in the future sometime. I'm John Furrier with Peter Burris. We'll do them today. Informatica day two exclusive coverage from the CUBE. We'll be back with more coverage after this short break. Stay with us.
Steve Jones & Srikant Kanthadai, Capgemini - #infa16 - #theCUBE
>>live from San Francisco. It's the Cube covering Informatica World 2016. Brought to you by Informatica. Now here are your hosts John Furrier and Peter Burress. Okay. Welcome back, everyone. We are here live in San Francisco for Informatica World 2016. Exclusive coverage from Silicon Angle Media is the Cube. This is our flagship programme. We go out to the events and extract the signal to noise. I'm John from my co host, Peter Burst. We have tree conflict comedy Global Head of Data Management and Steve Jones, global vice president. Big data from Capt. Jeff and I insights and data. You. Good to see you again. You sure you're welcome back. Welcome to the Cube. Thank you. And you've got my name right? It was a tongue twister, but, uh, we were talking about big data before we started rolling and kind of like where we've come to talk about over the really big data. You look back only a few years ago. Go back five years, Duke movement to where it is now. The modernisation is certainly loud and clear, but it's just not about Hadoop anymore. There's a lot of operational challenges and also the total cost of owners who want to get your thoughts. What's the trends? What do you guys see as the big trends now relative to this modernisation of taking open source the next big day to the next level? >>I think part of the pieces were actually about to publish a report we've done within the massacre on exactly that question, Uh, particular and governance and how people are making it operational. We did a report recently with our captain consulting division around Operation Analytics. Really fascinating thing that found out was the two real interesting in governance, right? The age old thing on governance has been the business doesn't engage. Well, guess what we found when you look at big data programmes is when the big data programmes start to deliver value. Guess who wants to take them over business? Guess who then actually starts leading the governance efforts, the business. So suddenly, this piece where the history of sort of data management has been, you know, going you really care about quality and the business, to be honest, going? Yeah, we don't care that much. We're still using excel, um, to the stage of which you're delivering real analytical value those pieces are going through. It's something we've been on a long journey for. I mean, we talked the other day. 2011 was the first time at camp we published a white paper on on our learnings around Big Data and governance. Um, it's amazing. Five years ago, we were talking about actually how you do governance and big data because of some of our more, uh, sort of forward looking clients. But that shift and what we're finding in that the report is the fact that people are really looking to replace this substrate. It's absolutely not about just about Hadoop, but that's the foundation, right? And unlike sort of historical pieces where there hasn't really been a data foundation, there's been lots of data silos but not a data foundation. Companies are looking to move towards actual firm data foundations across their entire business. That's a huge leap for it organisations to make and in terms of its impact on, you know, MDM and data quality and pace of delivery. Um, and those are the pieces. >>So also talk about the trends outside the US, for instance, because now you have in the UK uh, talk about that because your clients have a global footprint. The governance then crosses over the boundaries, blurring if you will virtual. 
But you still have physical locations. >>Well, I'm based out of London in the UK, so I see that side of the pond more often than this side. But the trends are pretty similar. And to what Steve said: in fact, we were joking about it yesterday, and we said it's not one for a tweet, but maybe, big data doesn't need data quality. And my other favorite statement is, MDM is dead, long live MDM. Both of them are relevant. Big data doesn't need data quality in the sense that you cleanse all your data before you put it into a data warehouse or a data lake, because only part of it is data owned by you; the rest comes from external sources. Where you need quality is in building the context on top, when the end user or the analyst takes a view. If you build the context, then even good data can turn bad, because in a particular context that data is no longer relevant. But bad data can turn good, because you're bringing in the context. And there was this example we were talking about: you run a marketing campaign and you have all these likes and tweets and everybody loved it. Somebody then said, okay, how good is this campaign? Great, we need more. How good is it in the context of sales? Guess what, when the campaign ran, there was no difference to your sales. So this good data that you had on the marketing campaign has turned bad for the company, because that marketing was a wasted effort. So you need contextual quality, not pure data quality. If you look at ETL, you transform and do data quality before you load. Now you're talking about ELT, and that's where you need quality. You need the linkages, the references; the data changes, and real time has been the conversation earlier today. The context defines the quality. A data swamp could be a clean data environment. >>I mean, one of the reasons why, the presentation that I did on Monday was on avoiding a data swamp. And what we say is, you've already got one. The myth is that you don't have a data swamp today, which is: oh, we've got my perfect data warehouse and it's got a perfect schema. Really? And what does your business use? Excel spreadsheets. Where do they get the data from? Well, they get it from SAP; they download it, and we've got a macro somebody wrote in 1998, which means we can't upgrade that desktop from Office 97. Right? So that desktop stays on Office 97, because it's the only one that runs the supply chain spreadsheet. So the reality is, you already have it today. I think to the point you made about the country difference, one of the things we've seen, from a sort of culture difference between Europe and here in the US, is that the US has been very much the technology pioneer, right? The Hadoop stuff, the Spark stuff, all that technology push. European companies have taken quite a while to get into the Hadoop marketplace, but particularly the larger manufacturers, and I'd say the more robust ones like pharmaceuticals and these large-scale organisations, are now going all in, but after thinking about it. So what I mean is, we've seen lots of POCs; it used to be, like, four or five years ago, people doing POCs here in North America that were very technically centric. And then people are like, okay. >>Exactly. >>Whereas over in Europe now, we're seeing more people going, okay, we know where we want to get to, because we've seen all the technology and now it works; we're going to start by thinking about the governance and thinking about what's the right way to go about this. So from a timing perspective, the interesting thing was that at the beginning of last year we felt we would begin to see some early-stage larger programmes in Europe, maybe towards the end of the year. The reality was that by the middle of the year we were seeing very, very large pieces. There was almost a switch that happened. >>But let's return to this notion of governance, because it's really important, and you've said it here today about 20 times. The rules of data governance have been written piecemeal over the past few decades. It started off by asking which application owns what data, and is the data quality good enough so that the application runs or not. Then compliance kind of kicked in, and we utilised compliance-related rules to write the new rules of data governance. What is data governance in the context of big data? The reason I ask the question specifically, and maybe put some bounds on it, is that we're trying to get to a point where the business puts a value on data, treats data as an asset that has a value, and the only way we're going to be able to do that is through governance rules to support it. So what does data governance mean in a big data context? >>I think, yeah, the value is really the impact, and I go back to a very simple analogy. When you didn't have computers, you had your ledgers; you locked them up in a safe and took the key home, so you protected who had access to your data. You then put it on PCs, but you gave people access with logins. Then you said, I'll tell you what you can do with my data; that was the era of BI, because you had reports, and all they could do was print a report. Now you've given them access to do whatever they want with the data. So how do you know? The first thing on the governance aspect is, what are they doing with the data? Where did they get the data they used to come up with that? What is the exposure to your organisation if somebody has, you know, traded around labour rates, or fixed them, or done something like that? And then you work backwards on lineage. So now I need to know, first thing, not just who accesses my data; I need to know what they're doing with it, and I need to know where they got the data. >>And you don't know when they're going to access it and what they're going to do with it at any given time. >>But I think that's where the sort of contention comes in, right? To be honest, from a data management and data governance perspective, those things are all true. The other reality is, how do you actually show the business the value of those pieces today, which is ultimately the outcome? So the piece we're finding in the research, the research we're about to publish soon with Informatica, one of the things it's really finding is: when do you get the business to care about governance? And the answer is when you demonstrate an outcome which relies on having good governance.
So if you do a set of analytics and you prove that this is going to improve the effectiveness, the bottom line, the top line or whatever for the firm, particularly operational analytics and customer analytics, where there are real, measurable numbers: we can save you 6% on your global supply chain costs, but in order to do that you need a single view of product and parts, which means you need to do product MDM. That's a very easy way to get the business engaged in governance, as opposed to: we need to do product MDM. What? >>Or we're going to do a 360 view of the customer. >>So we're still pricing the value of data based on the outcome? >>Absolutely. >>And then presumably, at some point, there is some sum across all those different utilisations, and that will become the true value of the data. >>I think the piece, I'd say, if we sum it up, is that it becomes a challenge, because ultimately the business pays, right? So one of the things I like about the big data stuff and the programmes these large-scale companies are doing is the ability to deliver value to an area, what we call insight at the point of action, and that's the bit where I pay. So, yes, I could sum it up theoretically, and the CIO can say, well, I'm delivering this much value, but it's at those points of action. If you can say, right, I deliver you $2 million and it costs you $100,000, that's much better than having to say, in totality, this delivers you, you know, $2 billion and it costs you $20 million or $200 million. That's an abstract piece. >>Except when I'm thinking about investment, because I need to be able to appropriate the right set of resources, financial and otherwise, to the data, based not just on individual exploitations but across an entire range of applications, an entire range of utilisation, right? >>I think so. But again, in terms of the ability to bill and charge, my total is the summation of the individuals. I worked with a CFO once, with the CIO in the room making the business case for one of their programmes, and the CFO said, well, if I took all your business cases and added them together, this company would be twice the size and cost nothing to run. So there's been a history of theoretical use cases. What we're seeing on the data and outcome side is that, in particular with operational analytics, there are absolutely quantifiable outcomes. So then you can say, well, yes, if you add this up, we need to make an investment in the base platform. The two things we're finding are, first, because you can use these much more agile technologies, these projects don't take 12 months to deliver first value; and second, the incremental cost of working in a lake environment is so much less, you know, I don't have a 12-month schema change problem. So one of the things we're seeing is the ability to say: yes, as a strategy we're going to spend 20 million or whatever over the next five years on this, but every three months I'm going to prove to you that I've delivered value back. Because the one thing I've seen on strategic data governance programmes historically is, 18 months in: what have you delivered? What have you done for me? Prove that it has value, right, that
And we were talking about apportioning uh the budget, whose budget? Because it's now being done by the individual businesses in their own areas. So there's no CF or sitting there and saying, Well, this is the budget I give I t. And this is how you apportion it. It's all at the point of the business and they find we'll do all these fail fast programmes and I've then hit one, which makes me big bucks. And I love this concept because essentially talking about the horizontal disruption, which is what cloud and data does just fantastic. And I'm sure this is driving a lot of client engagements for you guys. So I got to ask a question on that thread Jerry Held talked about earlier today. I want to ask the question. He made a comment, but alternative questions. You guys, he said. Most CFOs know where their assets are. When you ask him to go down, the legend they go, Oh, yeah, they asked. What's about data? Where the data assets. The question is, when you go talk to your clients, uh, what do they look at when they say data assets? Because you're bringing up in the notion of not inventory of data I'm sitting around whether it's dirty, clean, you can argue and things will happen. But when it gets put to use for a purpose, Peter says, data with a purpose that's this would keep on narrative. What is there a chief data officer like a CFO role that actually knows what's going on? And probably no. But how do you have the clients? They're just share some colour because this is now a new concept of who's tracking the asset value. >>And I think there's two bits and I'll start without it. And then if you talk specifically post an L, which I think is a great example of what happens with data when it becomes an asset, is the ability to understand the totality of data within any nontrivial organisation is basically zero because it's not just inside your firewalls. I'd also question the idea that CFOs know where all the assets are. I'm working with a very large manufacturer, and after they've sold it, they need to service it, and they can't tell you where every asset is because that information now lives within a client. So actually knowing where all of the assets they need to service are, they might know their physical plants and factories are. But some of these assets a pretty big things they don't know necessarily where they are on planet Earth. So the piece on data is really to the stage of because it's also external data, right? So really the piece for me about government and other ones Do I understand the relationships of these pieces in terms of the do I value data as an individual pieces because of what I can do with it? Sometimes the data itself is the value, But most of the time we're finding in terms of when people describe value, it's to the outcome that it's based upon. And that's something that's much easier to define than how much is my, uh, product master worth. Well, I can't really say that, but you know what? I can absolutely say that 6% reduction in my supply chain costs because I have a product master. But I think post and l is a great example of what happens when you go the next step on data >>because you're looking at addressed it. And actually, it's not just posting now. We were talking to another uh, male company. A postal company. Where? Data asset. Okay, my address is our data assets, but I have multiple addresses for one person, and what they wanted to offer was based on the value of the packages that you get delivered. 
They wanted to give a priority, or a qualification, to the addresses. They said, this is a more trustworthy address, because anything above £50, this person gets it delivered there; that address gets a lot of the mail. So do you consider the value of the packages that get delivered to be a data asset? Most people wouldn't. They would say, yeah, the address is the asset, that's the data asset. But there's a second part to it which you don't even know about. So the answer really is yes and no, and it's all contextual, because in a particular context you can say, I know where everybody lives, I know where everybody is, and I have all the addresses. You almost have to look back after the outcome and reverse-track the data and say, okay, that stream. >>I would say that we've had 30 years of trying to say it's the data object that has the value, and it's never, ever happened. As soon as we start talking about the outcome and then backtracking, going, in order to get this outcome we needed addresses, which historically is what would have been the value, actually it was that plus the analytics of prioritising them for risk, and suddenly that's a lot more valuable. That outcome of, you know, this person tends to be here, this area people seem to see as lower risk, this is where I can therefore focus for those people. It gives you more information. >>The notion of the data swamp turning into data quality, because the context, as Sri says, is really key. Because now you can move data to context in real time, data in motion, as people call it these days, the buzzword, but that's the value. When you stumble upon that, that's where you say, well, I thought I had bad data; no, actually, it's hanging around waiting to be used, like potential energy. It's the same thing with PostNL: they're moving from being a postal supplier to delivering packages. Now, you know, they have a very short window to deliver packages. So just how do you get to a building? Do you have to go through the backyard? Do you have to call somebody to get in? Now that data becomes valuable, because otherwise all their deliveries go off the radar screen, right, because they just miss their schedule. >>I was going to say, on quality, a great example is that we spend a lot of time on, say, process data in manufacturing: we'll clean it up before it goes into the reporting structure, which is great, and that gives you really good operational reports. There's now an entire business of people doing digital discovery of processes, and they use the bad data to discover what your processes are and where your operational processes are currently breaking down. If I cleaned up the data, they wouldn't be able to do their jobs. It's fascinating stuff we're finding a lot with the data science piece, the ability to get different value out of data. >>Chemical reactions, alchemy, it's all the interactions of the data. This is interesting. And I want to ask you guys, I know we have a minute left, to take a minute to explain to the audience Capgemini and how you engage with the customer, and the context of their progress. Where are your customers on the progress bar of these kinds of conversations? Because we've had a nice conversation; I'd love to do an hour on this, we could geek out. But the reality is they've got to run a business, right?
And the tier-one system integrators like Capgemini all have kind of different differentiation. What do you guys do differently in this area of your practise? How are you engaging with your customers? And where are they on the progress bar? Are they like, you're talking gibberish to me, or are they on board? Where are they? >>I think we've got a bit of an advantage: we've been on this journey a lot longer than most. Like I say, in 2011 we were talking actual data governance and big data, and you don't talk about that if you haven't been doing it for a while. We were the first systems integrator to partner with Cloudera, with Pivotal, with Hortonworks. So what's interesting is that when people talk about data lakes, some people are thinking that stuff is new, while most of our clients are now looking at the problem of having, and knowing they will have, multiple data lakes, for PI reasons, for operational efficiency reasons, for budget reasons, whatever it may be. We're looking at how you collaborate beyond the firewall. So I'd say, obviously, we've got a continuum of customers, but a lot of them are going beyond the stage at which they're worrying about big data within their four walls, to the stage of: how do I collaborate beyond my four walls? And this, for us, is the switch on governance and data. What we do, and the difference between Capgemini and the others, is that Sri is the global MDM guy, the global data management guy. His team is in all of the countries, so he has P&L responsibility for that, and I have it for big data. >>And in the countries, you're out implementing the value extraction. >>We're in multiple countries, and I mean, it's really past the stage of kicking tyres. We were at the tyre-kicking stage a long way back, in 2011. By now we're sort of driving the Ferrari on the autobahn, you know, 90 miles an hour, straight and narrow. There's a lot more work to do, right? There are always more things, and they keep changing, and that's the best part. >>What we do next, that's the point for us. The reason we're in this is that it's about what's next, and I think the reason governance is changing fundamentally is this move towards global collaboration. The more you look at health exchanges and all of these things, the more people collaborate outside the four walls. That, for us, is the problem we want to solve next, which is why we're working on industrialising what we now consider the boring stuff, building a data lake and doing the internals and the ingestion, those pieces; we're not interested in putting bodies on that. It's about how you solve the next problem. >>Steve and Sri, thank you so much for joining us on the Cube. Great to see you again, and welcome to the Cube alumni club. Great to have you; we'd love to do this again. I love the context, and I love that you guys are on this, you know, data quality at the right time. The right message? Certainly we think it's relevant. So thanks for sharing your insights and the data here on the Cube, live streaming from San Francisco. You're watching the Cube; we'll be right back. >>It's always fun to come back to the Cube because
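To close with a concrete illustration of the "context defines quality" thread, the address discussion above can be sketched as a scoring function: an address is judged not on its own cleanliness but on its history of high-value deliveries. This is a hypothetical Python sketch; the field names, the sample data, and the use of the £50 figure from the conversation as a threshold are illustrative assumptions, not anything the guests described implementing.

```python
# Hypothetical sketch: "context defines quality". The same address data is
# scored differently depending on the question asked of it; here the context
# is trustworthiness for high-value deliveries, echoing the postal example.
from dataclasses import dataclass
from typing import List


@dataclass
class Delivery:
    address: str
    declared_value_gbp: float
    delivered_ok: bool


def address_trust(deliveries: List[Delivery], address: str,
                  high_value_threshold: float = 50.0) -> float:
    """Share of high-value packages successfully delivered to this address.

    Returns 0.0 when there is no high-value history, i.e. no evidence either way.
    """
    high_value = [d for d in deliveries
                  if d.address == address
                  and d.declared_value_gbp >= high_value_threshold]
    if not high_value:
        return 0.0
    return sum(d.delivered_ok for d in high_value) / len(high_value)


# Usage: plenty of mail does not make an address trustworthy for valuable
# parcels; a short history of high-value, successful deliveries does.
history = [
    Delivery("12 High St", 120.0, True),
    Delivery("12 High St", 80.0, True),
    Delivery("7 Park Lane", 20.0, True),   # lots of low-value mail only
]
print(address_trust(history, "12 High St"))   # 1.0
print(address_trust(history, "7 Park Lane"))  # 0.0, no high-value history
```

The design point mirrors the discussion: the same address data becomes good or bad only once the business context, here the value of what is being delivered, is layered on top.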