Saket Saurabh, Nexla - Data Platforms 2017 - #DataPlatforms2017


 

(upbeat music)

[Announcer] Live from the Wigwam in Phoenix, Arizona, it's theCUBE. Covering Data Platforms 2017. Brought to you by Qubole.

>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We are coming down to the end of a great day here at the historic Wigwam at Data Platforms 2017. A lot of great big data practitioners have been talking about the new way to do things, really coining the term data ops, or maybe not coining it but really leveraging it, as a new way to think about data and using data in your business, to be a data-driven, software-defined, automated company. So we're excited to have Saket Saurabh, he is the, and I'm sorry I butchered that, Saurabh.

>> Saurabh, yeah.

>> Saurabh, thank you, sorry. He is the co-founder and CEO of Nexla. Welcome.

>> Thank you.

>> So what is Nexla? Tell us about Nexla, for those that aren't familiar with the company.

>> Thank you so much. Yeah, so Nexla is a data operations platform. The way we look at data is that data is increasingly moving between companies, and one of the things driving that is the growth in machine learning. So imagine you are an e-commerce company, or a healthcare provider. You need to get data from your different partners. You know, suppliers and point-of-sale systems, and brands and all that. And when companies are getting this data from all these different places, it's so hard to manage. So just like cloud computing made it easy to manage thousands of servers, we think of data ops as something that makes it easy to manage those thousands of data sources coming from so many partners.

>> So you've jumped straight past the "it's a cool buzz term and way to think about things" into the actual platform. So how does that platform fit within the cloud and on-prem? Is it part of the infrastructure, does it sit next to the infrastructure, is it a conduit? How does that work?

>> Yeah, we think of it this way: if machine learning or advanced analytics is the application, then data operations is the underlying infrastructure for it. It's not the hardware or the storage, but a layer on top. The job of data operations is to get the data from where it is to where you need it to be, in the right form and shape, so that you can act on it.

>> And do you find yourself replacing legacy stuff, or is this a brand new demand because of all the variety and the many types of datasets coming in that people want to leverage?

>> Yeah, to be honest, some of this has always been there, in the sense that the day you connected a database to a network, data started to move around. But if you think of the scale that has happened in the last six or seven years, none of those existing systems were ever designed for that. So when we talk about data growing at a Moore's Law rate, when we talk about everybody getting into machine learning, when we talk about thousands of data sets across the many different partners you work with, and when the reports you get from your partners are no longer sufficient, you need the underlying data; you cannot feed a report into an algorithm. When you look at all of these things, we feel it is a new thing in some ways.

>> Right. Well, I want to unpack that a little bit, because you made an interesting comment before we turned on the cameras, which you just repeated: that you can't run an algorithm on a report.
And in a world where we've got all these shared data sets, it's funny too, right, because you used to run a sample; now you want, as you said, the raw data. Not only all the data, but the raw data, so that you can do with it what you wish. Very different paradigm.

>> Yeah.

>> It sounds like there's a lot more to it: you're not just parsing what's in the report, you have to give the data structure so it can be combined with other data sources. And that sounds like a rather challenging task. The structure, all the metadata, the context that gives the data meaning relevant to other data sets, where does that come from?

>> Yeah, so what happens, and this is how technology companies have started to evolve, is that you want to focus on your core business. And therefore you will use a provider that processes your payments, a provider that gives you search, a provider that supplies the data for your e-commerce system, for example. So there are different types of vendors you're working with, which means there are different types of data involved. So when I look at, say, a brand today, you could be a Nike, and your products are being sold on so many websites. If you want to really analyze your business well, you want data from every single one of those places, where your data team can access it. So yes, it is the raw data, it is the metadata, and it is the data coming from all those systems that you can look at together and say: when I ran this ad, this is how people reacted to it, this was the marketing lift from it, these are the purchases that happened across these different channels, this is how my top line or bottom line was affected. And to analyze everything together, you need all the data in one place.

>> I'm curious what you're finding on the change in business relationships. Because I'm sure there were agreements structured in another time which weren't quite as detailed, where the expectations of what was exchanged weren't quite this deep. Are you seeing people have to change their relationships to get this data? Is it already out there and they're getting it, or is this really changing the way people partner on data exchange, like the example you just used between, say, Nike and Foot Locker, to pick a name?

>> Yeah, so companies that have worked together have always had reports come in; you would get a daily report of how much you sold. Now, just a high-level report of how much you sold is not sufficient anymore. You want to understand where it was bought, in which city, under what weather conditions, by what kind of user, and all that. So companies are looking at this: they have built their data systems, they have their data teams, and unless they get the data, those teams cannot be effective. You cannot take a daily sales report and feed it into your algorithm, right? You need very fine-grained data for that. So companies are saying: hey, you were giving me a report before; I also need the underlying data. A report is for a business executive to look at and see how the business is doing; the underlying data is for the algorithm to understand, and maybe to identify things that a report might not.
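To make that last point concrete, here is a minimal sketch (hypothetical code, not Nexla's product or API; all feed and field names are invented) of the difference between a summary report and fine-grained data: event-level records from two partner feeds, each with its own schema, are normalized into a common shape and then sliced by dimensions, such as city and weather, that a pre-aggregated daily total would have discarded.

```python
# A minimal sketch of "you can't feed a report into an algorithm":
# normalized event-level records can be combined and re-aggregated
# along any dimension; a daily total cannot.
from collections import defaultdict

# Two hypothetical partner feeds for the same brand, each with its
# own field names.
retailer_a = [
    {"sku": "SHOE-1", "qty": 2, "store_city": "Phoenix", "wx": "sunny"},
]
retailer_b = [
    {"product_id": "SHOE-1", "units": 1, "city": "Austin", "weather": "rain"},
]

def normalize_a(rec):
    # Map retailer A's schema onto a common shape.
    return {"sku": rec["sku"], "units": rec["qty"],
            "city": rec["store_city"], "weather": rec["wx"]}

def normalize_b(rec):
    # Map retailer B's schema onto the same common shape.
    return {"sku": rec["product_id"], "units": rec["units"],
            "city": rec["city"], "weather": rec["weather"]}

events = [normalize_a(r) for r in retailer_a] + \
         [normalize_b(r) for r in retailer_b]

# With event-level data, any grouping is possible after the fact.
units_by_city_weather = defaultdict(int)
for e in events:
    units_by_city_weather[(e["city"], e["weather"])] += e["units"]

print(dict(units_by_city_weather))
# {('Phoenix', 'sunny'): 2, ('Austin', 'rain'): 1}
```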
>> Wouldn't there already have been, at least in the example of sell-through, structured data exchanged between partners, like vendor-managed inventory, where a downstream retailer might make its sell-through data accessible to suppliers who actually take ownership of the inventory and are responsible for stocking it at optimal levels?

>> Yeah, I think Walmart was the innovator in that, with the POS link system back in the day, for retail. But the point is that this need for data to go back and forth between a company and its partners exists across every sector. You need it in e-commerce; you need it in fintech, where we see companies who manage your portfolio needing to connect with the different banks and brokerages you work with to get the data. We see it in healthcare, across different providers and pharmaceutical companies. We see it in automotive: if every car generates data, an insurance company needs to be able to understand it and look at it.

>> It's a huge problem you're addressing, because this is the friction between inter-company applications. We went through this with the B2B marketplaces 15-plus years ago. But the reason we built those marketplace hubs was so that we could standardize the information exchange. If it's just Walgreens talking to Pfizer, and then doing another one-off deal with, I don't know, Lilly, I don't know if they both still exist, it won't work for connecting all of pharmacy with all of pharma. How do you ensure standards between downstream and upstream?

>> Yeah, so you're right, this has happened. When we do a wire transfer from one person to another, some data goes from one bank to another bank, and it still takes hours, for a tiny amount of data. That has all exploded; we are talking about zettabytes of data now, every year. So the challenge is significantly bigger. Now, coming to standards: what we have found is that two companies sitting together and defining a standard almost never works. It never works because applications change, systems change; change is the only constant. So the way we've approached it at our company is that we monitor the data. We sit on top of the data and learn the structure as we observe data flowing through. We have tons of data flowing through, and we're constantly learning the structure and identifying how it will map to the destination. So again, we apply machine learning to see how the structure is changing and how the data volume is changing. Say you are getting data from somewhere every hour, and then it doesn't show up for two hours. Traditionally, a system will go down and you may not even find out for five days that the data wasn't there. So we look at the data structure, the amount of data, the time when it arrives, and everything else, to instantly learn and to be able to inform the downstream systems of what they should be expecting, and whether there is a change somebody needs to be alerted about. So a lot of innovation is going into doing this at scale, without having to predefine something in a tight box that cannot be changed, because that is extremely hard to control.

>> All right, Saket, that's a great explanation. We're going to have to leave it there, we're out of time. And thank you for taking a few minutes out of your day to stop by.

>> Thank you.

>> All right. Jeff Frick with George Gilbert, we are at Data Platforms 2017, Phoenix, Arizona. Thanks for watching.

(electronic music)
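As a rough illustration of the monitoring approach Saurabh describes, here is a toy sketch (hypothetical, not Nexla's implementation; the class and all names are invented for the example): a monitor learns a feed's structure from the records it observes and its arrival cadence from past timestamps, then flags schema drift and late data instead of relying on a predefined standard.

```python
# Toy sketch of learning a feed's expectations from observation:
# structure is induced from the records themselves, cadence from
# past arrival times; anomalies trigger alerts downstream.
import statistics
import time

class FeedMonitor:
    def __init__(self):
        self.known_fields = set()   # structure learned from the data itself
        self.arrival_times = []     # history of batch arrival timestamps

    def observe(self, records, arrived_at):
        fields = set().union(*(r.keys() for r in records)) if records else set()
        # Fields never seen before count as structural drift.
        drift = fields - self.known_fields if self.known_fields else set()
        self.known_fields |= fields
        self.arrival_times.append(arrived_at)
        if drift:
            print(f"ALERT: new fields appeared: {sorted(drift)}")

    def check_late(self, now):
        if len(self.arrival_times) < 3:
            return  # not enough history to learn a cadence yet
        gaps = [b - a for a, b in zip(self.arrival_times, self.arrival_times[1:])]
        expected_gap = statistics.median(gaps)
        # A gap more than twice the learned cadence is treated as a late feed.
        if now - self.arrival_times[-1] > 2 * expected_gap:
            print("ALERT: feed is late; notify downstream systems")

# Usage: batches normally arrive hourly; the monitor learns that cadence.
mon = FeedMonitor()
t0 = time.time()
for i in range(4):
    mon.observe([{"sku": "SHOE-1", "units": 1}], t0 + i * 3600)
# A new field appears -> schema-drift alert.
mon.observe([{"sku": "SHOE-1", "units": 1, "promo": True}], t0 + 4 * 3600)
# More than two hours pass with no batch -> lateness alert.
mon.check_late(now=t0 + 4 * 3600 + 7500)
```

The design choice mirrors the interview: nothing about the feed is declared up front; expectations are induced from what actually flows through, so when a partner's application changes, the monitor adapts rather than breaking on a fixed contract.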

Published: May 25, 2017

