Lisa Cramer, LiveRamp & Chris Child, Snowflake | Snowflake Summit 2022


 

(upbeat music) >> Good afternoon, everyone. Welcome back to theCUBE's live coverage of Snowflake Summit 22, the fourth annual Snowflake Summit. Lisa Martin here with Dave Vellante. We're live in Vegas, as I mentioned. We've got a couple of guests here with us. We're going to be unpacking some more great information that has come out of the show news today. Please welcome Chris Child back to theCUBE, Senior Director of Product Management at Snowflake, and Lisa Cramer is here, Head of Embedded Products at LiveRamp. Guys, welcome. >> Thank you. >> Hi. >> Tell us a little bit about LiveRamp, what you guys do, what your differentiators are, and a little bit about the Snowflake partnership? >> Sure, well, LiveRamp makes it safe and easy to connect data. And we're powered by core identity resolution capabilities, which enable our clients to resolve their data, and connect it with other data sets. And so we've brought these identity infrastructure capabilities to Snowflake, and built into the Native Application Framework. We focused on two initial products around device resolution, which enables our clients to connect customer data from the digital ecosystem. This powers things like measurement use cases, and understanding campaign effectiveness and ROI. And the second capability we built into the Native Application Framework is called transcoding. And this enables a translation layer between identifiers, so that parties can safely and effectively share data at a person-based view. >> Chris, talk to us about this. Snowflake just announced a lot of news this morning, including the new Snowflake Native Application Framework. You alluded to this, Lisa, talk to us about that. What does it mean for customers, what does it do? Give us all the backstory. >> Yeah, so we had seen a bunch of cases for our customers where they wanted to be able to take application logic, and have other people use it. So LiveRamp, as an example of that, they've built a bunch of complicated logic to help you figure out who is the same person in different systems. But the problem was always that that application had to run outside of the Data Cloud. And that required you to take your data outside of Snowflake, entrust your data to a third party. And so every time, those companies have to become a vendor, they have to go through a security review, and go through a long, onerous process, to be allowed to process the really sensitive data that these customers have. So with the Native Application Framework, you can take your application code, all of the logic, and the data that's needed to build it together, and actually push that through secure data sharing into a customer's account, where it runs, and is able to access their data, join it with data from the provider, all without actually having to give that provider access to your core data assets themselves. >> Is it proper to think of the Native Application Framework as a PaaS layer within the Data Cloud? >> That's a great way to think about it. And so, this is where we've integrated with the marketplace as well. So providers like LiveRamp will be able to publish these applications. They'll run entirely on effectively a PaaS layer that's powered by Snowflake, and be able to deliver those to any region, any cloud, any place that Snowflake runs. >> So, we get a lot of grief for this term, but we've coined a term called "supercloud". Okay, and the supercloud is an abstraction layer that hovers above the hyperscale infrastructure.
Companies like yours build on top of that. So you don't have to worry about the underlying complexities. And we've said that, in order to make that a reality, you have to have a super PaaS. So is that essentially what you're doing? You're building your product on top of that? You're not worrying about, okay, now I'm going to go to Azure, I'm going to go to AWS, or I'm going to go to, wherever. Is that the right way to think about it? >> That's exactly right. And I think Snowflake has really helped us kind of shift the paradigm in how we work with our customers, and enabled us to bring our capabilities to where their data lives, right? And enabled them to kind of run the analytics, and run the identity resolution where their data sits. And so that's really exciting. And I think, specifically with the Native Application Framework, Snowflake delivered on the promise of minimizing data movement, right? The application is installed. You don't have to move your data at all. And so for us, that was a really compelling reason to build into it. And we love when our customers can maintain control of their data. >> So the difference between what you are doing as partners, and a SaaS, is that you're not worrying about all the capabilities there: the data, all the governance, and the security components. You're relying on the Data Cloud for that, is that right? Or is it a SaaS? >> Yeah, I think there's components, like certainly parts of our business still run in the SaaS model. But I think the ability to rely on some of the infrastructure that Snowflake provides, and honestly kind of the connectivity, and the verticalized solutions that Snowflake brings to bear with data providers, and technology providers, that matter most to that vertical, really enable us to kind of rely on some of that to ensure that we can serve our customers as they want us to. >> So you're extending your SaaS platform and bringing new capabilities, as opposed to building, or are you building new apps in the Data Cloud? This is, I'm sorry to be so pedantic, but I'm trying to understand from your perspective. >> Oh yeah, so we built new capabilities within the Data Cloud. It's based on our core identity infrastructure capabilities, but we wanted to build into the Native Application Framework, so that data doesn't have to move and we can serve our customers, and they can maintain control over their data in their environment. So we built new capabilities, but it's all based on our core identity infrastructure. >> So safe sharing reminds me of like when procurement says, do we have an MSA? Yes, okay, go. You know, it's just frictionless. Versus no, okay, send some paper, go back and forth and it just takes forever. >> That's one of the big goals that we see. And to your point on, is it a PaaS, is it a SaaS? We honestly think of it as something a little bit different, in a similar way to where, at Snowflake, we saw a whole generation of SaaS business models, and as a utility and a consumption-based model, we think of ourselves as different from a SaaS business model. We're now trying to enable application providers, like LiveRamp, to take the core technology and IP that they've built over many, many years, but deliver it in a completely new, different way that wasn't possible. And so part of this is extending what they're doing, and making it a little easier to deploy, and not having to go through the MSA process in the same way.
But also we do think that this will allow entirely new capabilities to be brought that wouldn't be possible, unless they could be deployed and run inside the Data Cloud. >> Is LiveRamp a consumption pricing model, or is it a subscription, or a combo? >> We are actually a subscription, but with some usage capabilities. >> It's a hybrid. >> Chris, talk a little bit about the framework that you guys have both discussed. How is it part of the overall Snowflake vision of delivering secure and governed, powerful analytics, and data sharing to customers, and ecosystem partners? >> So this, for us, we view this as kind of the next evolution of Snowflake. So Snowflake was all built on helping people consolidate their data, bring all your data into one place and then run all of your different workloads on it. And what we've seen over the years is, there are still a lot of different use cases where you need to take your data out of the Data Cloud in order to do certain different things. So we made a bunch of announcements today around machine learning, so that you don't have to take your data out to train models. And native applications are built on the idea of: don't bring your data to the applications you need. Whether they're machine learning models, whether they're identity resolution, whether they're really even just analytics. Instead, take the application logic and bring that into the Data Cloud, and run it right on your data where it is. And so the big benefit of that is, I don't need copies of my data that are getting out of sync, and getting out of date. I don't need to give a copy of my data to anyone else. I get to keep it, I get to govern it. I get to secure it. I know exactly what's going on. But now, we can open this up to workloads, not just ones that Snowflake's building, but workloads that partners like LiveRamp, or anyone else, is building. All those workloads can then run on a single copy of your data, in a single secure environment. >> And when you say in one place, Chris, people can get confused by that, 'cause it's really not in one place. It's the global thing that Benoit stressed this morning. >> That's right, and so once you write a native app once, the native app that they've written is one piece of code, one application, that now can be deployed by customers in any region, or on any cloud that they're running on, without any changes at all. So to your point on the PaaS, that's where it gets very PaaS-like, because they write once to the Snowflake APIs, and now it can run literally anywhere Snowflake runs. >> But the premise that we've put forth in supercloud is that this is a new era. It's not multicloud. And it's consistent with a digital business, right? You're building, you've got a digital business, and this is a new value layer of a digital business. If I've got capabilities, I want to bring them to the cloud. I want to bring them to, every company's a software company, software's eating the world, data's eating software. I mean, I could go on and on and on, but it's not like 10 years ago. This is a whole new life cycle that we're just starting. Is that valid? I mean, do you feel that way about LiveRamp? >> Definitely, I mean, I think it's really exciting to see all of the data connectivity that is happening. At the same time, I think the challenges still remain, right?
So there are still challenges around being able to resolve your data, and being able to connect your data to a person-based view in a privacy-safe way, to be able to partner with others in a data collaboration model, right? And to be able to do all of that without sharing anything from a sensitive identifier standpoint, or not having a resolved data set. And so I think you're absolutely right. There's a lot of really cool, awesome innovation happening, but the customer challenges kind of still exist. And so that's why it's exciting to build these applications that can now solve those problems, where that data is. >> It's the cloud benefit, the heavy lifting thing, for data? 'Cause you don't have to worry about all that. You can focus on campaign ROI, or whatever new innovation that you want to bring out. >> And think about it from the end customer's perspective. They now can come into their single environment where they have all their data, they can say, I need to match the identity, and they can pull in LiveRamp with a few clicks, and then they can say, I'm ready to take some actions on this. And they can pull in action tools with just a few more clicks. Compare that to the current marketing stack that you see: there's 20 different tools and you're schlepping data back and forth between each of them, and LiveRamp's just one stop on your journey to get this data out to where I'm actually sending emails or targeting ads. Our vision is that all that happens on one copy of the data, each of these different tools are grabbing the parts they need, again in a secure, well-governed, well-controlled way, enriching in ways that they need, taking actions that they need, pulling in other data sets that they need. But the end consumer maintains control over the data, and over the process, the entire way through. >> So, one copy of data. So you sometimes might make a copy, right? But you'd make as many copies as you need to, but no more, kind of thing, to paraphrase Einstein, or is that right? >> There's literally one copy of the data. So one of the nice things with Snowflake, with data sharing, and with native applications, the data is stored once in one file on disk in S3, which eventually is a disk somewhere. >> Yeah, yeah, right. >> But what can happen is, I'm really just granting permission to these different applications to read and write from that single copy of the data. So as soon as a new customer touches my website, that immediately shows up in my data. LiveRamp gets access to that instantly. They enrich it. Before I've even noticed that that new customer signed up, the data's already been enriched, the identity's been matched, and they're already put into a bucket about what campaign I should run against them. >> So the data stays where it is. You bring not just the compute, but the application. And then you take the results, right? And then I can read them back? >> You bring the next application right to that same copy of the data. So what'll happen is you'll have a view that LiveRamp is accessing and reading and making changes on, LiveRamp is exposing its own view, I have another application reading from the LiveRamp view, exposing its own view. And ultimately someone's taking an action based on that. But there's one copy of the data all the way through. That's the really powerful thing. >> Okay, so yeah, so you're not moving the data. So you're not dealing with latency problems, but if I'm in Australia and I'm running on US West, it's not a problem?
>> Yes, so if you do want to run across different clouds, we will copy the data in that case; we've found it's much faster. >> Okay, great, I thought I was losing my mind. >> No, but as long as you're staying within a single region, there will be no copies of the data. >> Yeah, okay, totally makes sense, great. >> One of the efficiencies there is the speed to be able to get the insights. That's what it's all about, being able to turn the volume up on the data from a value perspective. Thanks so much, guys, for joining us on the program today, talking about what LiveRamp and Snowflake are doing together and breaking down the Snowflake Native Application Framework. We appreciate your insights and your time, and thanks for joining us. >> Thank you both. >> Thank you guys. >> Thank you. >> For our guests and Dave Vellante, I'm Lisa Martin. You're watching theCUBE Live from Snowflake Summit 22 from Las Vegas. We'll be right back with our next guest. (upbeat music)
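To make the pattern Child describes concrete, here is a minimal sketch of the consumer-side flow in Python, using the snowflake-connector-python package. The idea is the one discussed above: the provider's application is installed in the consumer's account, the consumer grants it read access to local tables, and the logic runs where the data already lives. The application name, procedure, credentials, and table names are hypothetical placeholders, not LiveRamp's actual interface.

```python
# Minimal sketch (hypothetical names throughout): a consumer account grants
# an installed Snowflake Native Application read access to one of its own
# tables, then calls a procedure the app exposes. The data itself never moves.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder credentials
    user="analyst",
    password="...",
    warehouse="ANALYTICS_WH",
)
cur = conn.cursor()

# Scope the app's access to exactly one database, schema, and table.
cur.execute("GRANT USAGE ON DATABASE crm TO APPLICATION ramp_demo_app")
cur.execute("GRANT USAGE ON SCHEMA crm.public TO APPLICATION ramp_demo_app")
cur.execute("GRANT SELECT ON TABLE crm.public.customers TO APPLICATION ramp_demo_app")

# Run the provider's logic in place; it can join the consumer's rows with
# reference data shared into the app, without exporting anything.
cur.execute("CALL ramp_demo_app.core.resolve_identity('crm.public.customers')")
print(cur.fetchall())
```

The design point the sketch illustrates is the grant model: instead of shipping a copy of the customer table to a vendor, the consumer hands the application narrowly scoped read permissions, and can revoke them just as easily.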

Published Date : Jun 14 2022

Sasha Kipervarg, LiveRamp | Cloud Native Insights


 

>> Narrator: From theCUBE studios in Palo Alto and Boston, connecting with thought leaders around the globe, these are Cloud Native Insights. >> Hi, and welcome to another episode of Cloud Native Insights. I'm your host, Stu Miniman. And when we talk about Cloud Native, of course, it's not just moving to the cloud as a location, but how do we take advantage of what's happening in the cloud, and the changes that need to happen. And this is not only from a technology standpoint, it's an organizational standpoint. And we're also going to touch on the financial implications and something you've probably heard about, FinOps, relatively new in the last couple of years as a term. Of course, the financial engineering of cloud has been around for many years, and how that ties into DevOps. And to help us understand this movement and what's going on, I'm really thrilled that we have a practitioner in this space. I want to welcome Sasha Kipervarg. He's the head of Global Cloud Operations and special projects at LiveRamp. Sasha, thanks so much for joining us. >> Thanks very much, Stu, happy to be here. >> All right, so why don't we start off first, for those that don't know LiveRamp, I'm sorry, you're in the ad tech space. Maybe just give us a little bit about, you know, the organization and what your team does there? >> Sure, so LiveRamp is in the advertising technology space, and we help connect companies to their customers and send targeted advertising to them. We're based in San Francisco and have engineering teams across the globe, primarily New York, London, China, all over the map, really. And we're a fast growing company, we've gone from perhaps 400 to maybe 1,200, 1,300 employees over the last year and a half. >> Well, you know, that whole space is a whole separate discussion. I like, when I looked up a little bit about LiveRamp, the discussion point is, you know, cookies for eating, not for following you and looking at where you're going all over the place. So your role inside LiveRamp, though. Tell us a little bit, you know, where cloud fits in your org? >> Sure, so I'm responsible for the engineering teams that help other development teams operate in the cloud. So whereas on premise it would have been a traditional operations team, in the cloud it's basically an engineering team that are experts in all the different areas that other engineering teams need us to be in, so that we can express good practices and help them deliver products. >> Great, you actually had a real forcing function for cloud. You know, right now during the global pandemic we've seen lots of acceleration of people looking at cloud, if you could briefly just bring us back as to one of the things that helped push LiveRamp, you know, to go much heavier into cloud. >> Yeah, so we had some initial plans and we were exploring. But what really pushed us over the edge was we had a three to four day outage at our data center here in San Francisco during a heatwave. And during that time, the data center couldn't control their temperature. We had unusually warm temperatures in San Francisco, they weren't that warm. It was like maybe in the, you know, mid 90s. But for the Bay Area in the summertime, you know, where it's usually 70, it was a big deal. And so we had racks of servers going down because it was too hot. And so if we weren't quite convinced before that, we certainly were after that, and that made us realize that there were lots of good reasons to be in the cloud. And so we did it.
We put together a migration and over the course of a year, we not only containerized but we migrated our environment into GCP. >> I wonder if you could just bring us inside that move to the cloud a little bit. You talk about adopting containerization. You know, your applications, you know, how much of it did you just kind of move there? How much did you build new? Were there some things that you just said, hey, I can kind of, you know, adopt a SaaS equivalent, you know, how did your application portfolio look? >> Yeah, so it's probably good to think of them in terms of the infrastructure services that we use in the cloud, and then the customer facing applications themselves. And what we try to do is essentially containerize all of our infrastructure applications. Actually, let me rephrase that. We took the customer facing applications, and we containerized those. Now the applications themselves did not change, but they swapped out their underlying infrastructure for containers, running on the GCP native container service. On the back end of things we use the native services in GCP as much as possible. So if we were using a database on premise, we tried to use the native database service in the cloud with Google. I think the one interesting exception to that, which we're changing now, in fact, was we decided to run our hundred petabyte Hadoop cluster in the cloud ourselves rather than using the native service, because of some price concerns. Those price concerns have gotten better over time and we're now migrating to Dataproc, which is Google's native Hadoop service. >> Yeah, it's fascinating when you think about just how fast things change in the cloud, new services can become available and, as you're alluding to, the finances can change significantly over, you know, a couple of months or a quarter. Overall, how's the experience been? You know, moving to cloud, though? >> Well, it's been fantastic in some ways, painful in others because, you know, you discover, and maybe this begins to touch on the FinOps stuff, like, you discover that you've gone from quarterly planning cycles, where you opt to purchase a whole rack of servers and you implement them over the next quarter or something like that, to making by-the-second decisions, to spinning up resources via command line by developer and spending limitless operating expenses. So, it's quite a big shift. And I think a lot of companies are caught, you know, flat footed by it. We certainly were for a little bit. And there's some financial pain that gets expressed. And you know, the question that I would pose to the audience when they think about the cloud is, you know, we think of the migrations and we only think about their technical success, but if you migrate to the cloud and you do it technically and you containerize and it's on schedule, but then you blow your budget, was it really a success? Because ultimately, you know, the business needs to be profitable in order for things to work. >> Yeah, absolutely, Sasha. So what I've heard you talk about before is, in the pre-cloud model, you met with the budget team quarterly, and it was mostly a look-back function. And of course, when you think about leveraging the cloud, things are changing on a fairly regular basis. And are you able to understand what decisions you're making and what the impact will be on, you know, next month's and next quarter's billing?
So bring us inside a little bit as to, you know, that interaction and what that meant to your teams and how they had to think about, you know, engineering and finance together? >> Yeah, it's a fantastic question. So, I guess the first thing is, let me zoom out for a moment and just make sure that the audience understands that, you know, typically it's just engineering leadership, and a fairly small number of maybe high level developers, maybe an architect, that get together with finance once a quarter and have a conversation about what they want to spend and how much they want to spend, and where it should be implemented. And that is a fairly regular thing that's been going on for many years. When you move to the cloud, all of a sudden that decision needs to happen on a real time basis. And typically, companies are not set up for that kind of a conversation. There's usually like a large wall between finance and engineering. And it's because you want the engineering teams to be engineers and the finance folks to be doing finance related things. And the two don't really mix all that often. But when you give a developer an API to spend money, essentially, right, that's what you've done. They don't just spin up resources, they spend money by API. You need to have a real time conversation where they can make trade offs, where you can track the budget, and those expenses shift from something called CapEx to OpEx. And that's treated in a very different way on the books. Where we are today is we've created a team, we call it a FinOps practice. But it's a team that's cross functional by nature, that sits within engineering, that's made up of a FinOps practitioner, a person dedicated to the role. And then members of the finance team. And then many other members of engineering, and they work together to first express the cost by helping developers understand what they're actually spending and where they're spending it. And then the system also makes recommendations about how to optimize, and then the developers absorb that information and figure out what they should optimize, do that work. And then the system re-represents the information for them, and lets them know that their optimizations make sense or not from a financial perspective. The way that we've talked to developers, we've discovered that they care about efficiency. They care about efficiency in different ways. They care about CPU efficiency, they care about RAM efficiency. And it turns out, they care about how efficient their application is from a cost perspective too, right? And you can either tell them directly to care about it, or help them become aware. Or you can use proxies, like what I just mentioned about CPU, RAM, disk, network. If they understand how efficient their application is, they have a natural instinct to want to make it better on a daily and weekly basis. It's just sort of baked into their deep engineering persona. And we try to harness that. We try to position things in such a way that they can do the right thing, because most developers want to do the right thing. >> Yeah, it's really interesting to me, Sasha. I remember back, you know, you go back seven, eight years ago, and I looked at cloud models, and how cloud providers were trying to give more visibility and even give guidance to customers as to how they could adjust things to make them more financially reasonable. I've come from the infrastructure side, when I think about, you know, deployments in a data center.
It was very well understood: you had a systems engineer work with a customer, they deploy something, they understand what the growth is expected to be, and if you needed more, more compute, more storage, what the cost of that would be, you understand, you know, how many years you will be writing that off for, but everything's well understood, and as you said, like developers, often they've got n-minus-one technology, okay, here's some gear you could work on. But the finances were clearly written, they were put into some spreadsheet or understood, as opposed to the cloud. There is much more burden on the user to understand what they're doing, because you have that limitless capability as opposed to some fixed asset that you're writing off. We're huge proponents of leveraging the cloud. And often there are cost savings by going to the cloud. But it feels like some of this overhead of having to do the financial engineering is a cost that might not be considered in the overall movement to the cloud. >> Yeah, and maybe now is a good time to swing back to the concept of DevOps, right? Because I want to frame FinOps in this concept of having the budget overhead, and I want to link it to Agile, okay. So, part of the reason we moved to DevOps, which is an Agile movement that essentially puts the responsibility of owning infrastructure and deploying it into the hands of the engineers themselves. The reason that it existed was because we had a problem deploying, we had two different teams, typically operations and engineering. And one of them would write the code, and they would throw it over the wall to the operations team that would deploy the code. And because they were two different teams, and they didn't necessarily sit together or sometimes even report into the same leadership, they had different goals, right. And when there was a problem, the problem had to cross both of the team boundaries. And so it was slower to resolve issues. And so, people had the bright idea to essentially put the teams together, right. And allow the developers themselves to deploy the code. And of course, depending on the size of the company, this idea of DevOps was structured, or is structured, slightly differently. And, essentially what you had was a situation that worked beautifully, because if you had two separate teams that all of a sudden became one team that was fully responsible for writing the code, writing the tests and deploying the code, they saw each other's pain, they understood the problem really well. And it was an opportunity for them to go faster, and that was the powerful thing. And I think that's essentially what made the DevOps movement incredibly successful. It was the opportunity to be able to control their own destiny, and move faster, that made it successful. I view FinOps in a similar fashion. It is an opportunity for developers to understand their cost efficiency and deploy in the cloud by API, and do it in a fully responsible way. Beyond everything that we've been talking about related to DevOps, there is a higher goal here. And that is the goal of unit economics, which is figuring out precisely what your application actually costs being deployed and used by the consumer, on a unit basis, right. And that is the thing we're all trying to get to. And this FinOps gets us one step closer to that sort of financial nirvana. Now if you can achieve it, or even if you can achieve the basics of it.
You can structure your contracts in a different way, you can create products that take better advantage of your financial model. You can destroy certain products that you have that don't really make sense to operate in the cloud. You can fire customers. You can do a whole variety of things if you know what your full costs are, and FinOps allows us to do that. And FinOps allows developers to think of their applications in a way that perhaps they never have, in a fully transparent, holistic way. Like, there's no sense in building a Ferrari if it costs too much to operate, right. And FinOps helps you get there. >> It's such an important point, Sasha. I'm so glad you brought that up. Back in the traditional infrastructure data center world, we spent decades talking about Showback and Chargeback and what visibility you had. And of course, for the most part, it was, oh well, you know, that's sunk cost, or something that facilities takes care of, I'm not going to worry about it, and therefore we did not have a clear picture of IT and how it really impacted the bottom line of business. So FinOps, as you said, helps move us towards that ultimate goal that we know we've had for years. I want to tease out that thing that you mentioned there, speed. We understand that, absolutely, speed is one of the most important things, how do we react to the business? How do we react to the customer, as close to real time as possible? How do you make sure that FinOps doesn't slow things down? If I'm an engineer, and I need to think about, oh wait, I've been told that the best code to write is no code. But I have to constantly think about, am I being financially sound? Am I doing that? How do we make sure that this movement doesn't slow me down, but actually enables me to move forward faster? >> Yeah, I mean, let me mention a couple of things there. The first is, what I alluded to before, which is that if you don't think about this as a developer, it's possible that the finance folks in the company could decide, well hey, operating in the cloud doesn't make financial sense for us. And so we're not going to do it, and we're going to go back to a data center. And, you know, maybe that's the right business move for some businesses who aren't growing rapidly, for whom speed and flexibility isn't as important. Maybe they stay in the data center or they go back to a data center. And so, like, I would think a developer has a stake in the game, if they want to be flexible, if they want to continue to be flexible. And from a company perspective, like, we... You know, this idea is still being sort of fleshed out, and even within the FinOps movement, like, there is a question of how much time should a developer spend thinking about cost stuff? I'll tell you what my answer is, and perhaps I can touch on what other people think about it as well. My answer is that it's best to be transparent with developers as much as possible and share with them as much data as we possibly can, the right kind of data, right? Not overwhelm them with statistics, but data that helps them understand their applications and their applications' efficiency. And when you are implementing a FinOps practice within your org, if you get the sense that people are very touchy, and they're not used to this idea of talking about cost directly, you can talk about it in terms of proxies, right. And as I mentioned before, CPU, RAM, disk, network. Those are all good proxies for cost.
So if you tell them, hey, your application is efficient or inefficient on these different dimensions, go do something about it, right. Like, when you build your next architecture for your application, incorporate efficiencies across these particular dimensions. That will resonate, and that will ensure that developers don't feel like it's hampering their speed. I think the cultural shift that FinOps emphasizes is key. This is helping developers get the high-level understanding of why we're doing what we're doing and why it's important, and embedding it into not only their architectural design, but their daily operations. That is the key. Like, FinOps has multiple pieces to it. I think it's successful because it emphasizes a system that's made up of governance practices, rules that tell you how you should behave within the system. Tools like a CMP, and we can talk about that in a bit. But essentially, it's a cost management platform, which is a tool that is designed to figure out what you're spending and express it back to you. It's designed to surface anomalies, and there's a whole segment in the marketplace of these different kinds of tools. And then of course, the cultural shift. If you can do all three at your organization, whether you want to call it FinOps or not, you're going to be set up for success and it will solve that problem for you. >> So Sasha, one of the things I've really enjoyed the last decade or so is, it used to be that IT organizations thought what they were doing was the differentiator, and therefore they were a bit guarded about what they would share. And of course, these days, leveraging cloud, leveraging open source, there is much more collaboration out there. And LiveRamp, not only is using FinOps, but you're a member of the FinOps Foundation, which has over 1,500 individual members participating in it, overseen by the Linux Foundation. Maybe bring us in a little bit as to why LiveRamp decided to join this group, and for a final word on, really, kind of the mission of the FinOps Foundation. >> Yeah, I mean, as members of the audience might know, the FinOps Foundation recently moved to the Linux Foundation, and I think part of that move was to express the independence of the FinOps Foundation; it was connected to a company in the CMP space before, and I think J.R. and the team made a wonderful decision in doing so. And I wanted to give a shout out to them. I'm very excited about the shift, and we look forward to contributing to the codebase and all the conversations. In terms of how we discovered it: I was feeling the pain of all these different problems of being over my budget in the cloud. And I had arrived at like this idea of, like, I needed a dedicated person, a dedicated team that was cross-functional, in order to solve the problem. But, on a whim, I attended a FinOps course at a conference, and Mike Fuller, who was the author or one of the authors of the FinOps book, along with J.R., was teaching it, and I spent eight hours just in, like, literal wonder, thinking, holy crap, this guy and whoever came up with this concept put together and synthesized all of the pain that I had felt and all the different things I thought about in order to solve the problem, in a beautiful, holistic manner. And they were just presenting it back to me on a platter, back to everyone on a platter, and I thought that was beautiful.
And the week that I got back to work from the conference, I put together a presentation for the executives to position a FinOps practice as the solution for LiveRamp's budgetary cloud pain. We went for it, and we... It's helped us, it's helped lots of other companies. And I'm here today partly because I want to give back, because there's so much that I learned from being in the Slack channel. There's so much that I learned by reading the book, things that I hadn't thought of, that I hadn't experienced yet. So I didn't have the pain. But you know, J.R. and Mike, they had interviewed hundreds of different folks for the book, got lots of input, and they were talking about things that I hadn't experienced yet, that I was going to. And so I want to give back, they clearly want to give back. And I think it's a wonderful practice, a wonderful book, a wonderful Slack channel. I would recommend that anyone facing the budgetary challenge in the cloud join the organization. There is a monthly conversation where someone presents, and you learn a lot from doing it. You learn problems and solutions that you perhaps wouldn't have thought of, so I would highly recommend it. >> All right, well, Sasha, thank you so much for sharing your story with our community and everything that you've learned, and best of luck going forward. >> Thanks very much, Stu. It's great to talk. >> Alright, and if you want to learn more about what Sasha was talking about, it's under the Linux Foundation, and finops.org is their website. The Linux Foundation, of course, and theCUBE. Cloud Native is a big piece of what happens and what we're doing; we'll be at the KubeCon and CloudNativeCon shows this year. Look for more interviews in this space. I'm Stu Miniman. And look forward to hearing more about your Cloud Native Insights. (upbeat music)
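As a concrete illustration of the unit economics goal Kipervarg describes, here is a small, hypothetical sketch in Python with pandas: attribute spend from a tagged billing export to teams, join a usage metric, and compute cost per unit served. The file and column names are assumptions made for illustration; a real cost management platform does this continuously and at far larger scale.

```python
# Hypothetical FinOps-style unit economics: join a tagged billing export
# with a usage metric and compute cost per unit served. The files and
# column names are illustrative assumptions, not a real LiveRamp dataset.
import pandas as pd

billing = pd.read_csv("billing_export.csv")  # columns: team, service, cost_usd
usage = pd.read_csv("usage.csv")             # columns: team, requests_served

cost_by_team = billing.groupby("team", as_index=False)["cost_usd"].sum()
report = cost_by_team.merge(usage, on="team")
report["cost_per_1k_requests"] = 1000 * report["cost_usd"] / report["requests_served"]

# Surface the least efficient teams first, the way a FinOps review would.
print(report.sort_values("cost_per_1k_requests", ascending=False))
```

This is the "unit basis" view the interview points at: once cost per thousand requests is visible per team, the CPU and RAM proxies developers already watch can be traded off against an actual dollar figure.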

Published Date : Jul 9 2020

Hoshang Chenoy, Meraki & Matthew Scullion, Matillion | AWS re:Invent 2022


 

(upbeat music) >> Welcome back to Vegas. It's theCUBE live at AWS re:Invent 2022. We're hearing there are up to 50,000 people here. It feels like it; the energy at this show is palpable. I love that. Lisa Martin here with Dave Vellante. Dave, we had the keynote this morning that Adam Selipsky delivered, lots of momentum in his first year. One of the things that you said you were looking at in your breaking analysis that was released a few days ago was four trends, and in one of them, you said under Selipsky's rule in the 2020s, there's going to be a rush of data that will dwarf anything we have ever seen. >> Yeah, it was at least a quarter, maybe a third of his keynote this morning was all about data, and the theme is simplifying data and doing better data integration, integrating across different data platforms. And we're excited to talk about that. Always want to simplify data. It's like the rush of data is so fast. It's hard for us to keep up. >> It is hard to keep up with that. We're going to be talking with an alumni next about how his company is helping organizations like Cisco Meraki keep up with that data explosion. Please welcome back to the program Matthew Scullion, the CEO of Matillion, and Hoshang Chenoy joins us, data scientist at Cisco Meraki. Guys, great to have you on the program. >> Thank you. >> Thank you for having us. >> So Matthew, we last saw you just a few months ago in Vegas at Snowflake Summit. >> Matthew: We only meet in Vegas. >> I guess we do, that's okay. Talk to us about some of the things. I know that Matillion is a data transformation solution that was originally introduced for AWS, for Redshift. But talk to us about Matillion. What's gone on since we've seen you last? >> Well, I mean, it's not that long ago, but actually quite a lot. And it's all to do with exactly what you guys were just talking about there. This almost-hard-to-comprehend way the world is changing with the amounts of data that we now can and need to put to work. And our worldview is there's no shortage of data, but the choke point, certainly one of the choke points, maybe the choke point, is our ability to make that data useful, to make it business ready. And we always talk about the end use cases. We talk about the dashboard or the AI model or the data science algorithm. But before we can do any of that fun stuff, we have to refine raw data into business ready, usable data. And that's what Matillion is all about. And so since we last met, we've made a couple of really important announcements, and possibly at the top of the list is what we call the Data Productivity Cloud. And it really squarely addresses this problem. It's the result of many years of work, really the apex of many years of the outsize engineering investment Matillion loves to make. And the Data Productivity Cloud is all about helping organizations like Cisco Meraki, and hundreds of other enterprise organizations around the world, get their data business ready, faster. >> Hoshang, talk to us a little bit about what's going on at Cisco Meraki, how you're leveraging Matillion from a productivity standpoint. >> I've really been a Matillion fan for a while, actually even before Cisco Meraki, at my previous company, LiveRamp. And you know, we brought Matillion to LiveRamp because, you know, to Matthew's point, there is a stage in every company's data growth, as I want to call it, where you have different companies at different stages. But to get data "data ready," you really need a platform like Matillion, because it makes it really easy.
So you have to understand Matillion, I think it's designed for someone that uses a lot of code but also someone that uses no code, because the UI is so good. Someone like a marketer who doesn't really understand what's going on with that data but wants to be a data-driven marketer, when they look at the UI they immediately get it. They're just like, oh, I get what's happening with my data. And so that's the brilliance of Matillion, and to get data to that data ready part, Matillion does a really, really good job, because what we've been able to do is blend so many different data sources. So there is an abundance of data. Data is siloed, though. And the connectivity between different data is getting harder and harder. And so here comes Matillion with its really simple solution, easy-to-use platform, powerful, and we get to use all of that. It's really changed the way we've thought about our analytics, the way we've progressed our division, yeah. >> You're always asking about superpowers, and that is a superpower of Matillion, 'cause you know, low-code, no-code sounds great, but it only gets you a quarter of the way there, maybe 50% of the way there. You're kind of an "and," not an "or." >> That's a hundred percent right. And so I mentioned the Data Productivity Cloud earlier, which is the name of this platform of technology we provide. That's all to do with making data business ready. And so I think one of the things we've seen in this industry over the past few years is a kind of extreme decomposition in terms of vendors of making data business ready. You've got vendors that just do loading, you've got vendors that just do a bit of data transformation, you've got vendors that do data ops and orchestration, you've got vendors that do reverse ETL. And so with the data productivity platform, you've got all of that. And particularly in this kind of macroeconomic heavy weather that we're now starting to face, I think companies are looking for that. It's like, I don't want to buy five things, five sets of skills, five expensive licenses. I want one platform that can do it. But to your point, David, it's the "and," not the "or." We talk about the Data Productivity Cloud, the DPC, as being everyone ready. And what we mean by that is, if you are the tech-savvy marketer who wants to get a particular insight and you understand what a Rowan economy is, but you're not necessarily a hardcore, super geeky data engineer, then you can visually, low-code, no-code, get your data to a point where it's business ready. You can do that really quick. It's easy to understand, it's faster to ramp people onto those projects 'cause it, like, explains itself, faster to hand it over 'cause it's self-documenting. But there'll always be individuals, teams, "and," "or" use cases that want to high-code as well. Maybe you want to code in SQL or Python, increasingly of course in dbt, and you can do that on top of the Data Productivity Cloud as well. So you're not having to make a choice, but is that right? >> So one of the things that Matillion really delivers is speed to insight. I've always said that, you know, when you want to be business ready you want to make fast decisions, you want to act on data quickly, Matillion allows you to. This speed to insight is just unbelievably fast, because you blend all of these different data sources, you can find the deficiencies in your process, you fix that and you can quickly turn things around, and I don't think there's any other platform that I've ever used that has that ability.
So the speed to insight is so tremendous with Matillion. >> The thing I always assume is going on in our customers' teams, like the one you run, Hoshang, is that the visual metaphor, be it around the orchestration and data ops jobs, be it around the transformation, I hope it makes it easier for teams not only to build it in the first place, but to live with it, right? To hand it over to other people and all that good stuff. Is that true? >> Let me highlight that a little bit more and better for you. So, say for example, if you don't have a platform like Matillion, you don't really have a central repository. >> Yeah. >> Where all of your code meets, you could have a Git repository, you could do all of those things. But, for example, for definitions, business definitions, any of those kinds of things, you don't want it to live in just a spreadsheet. You want to have a central platform where everybody can go in, there's detailed notes, copious notes that you can make on Matillion, and people know exactly which flow to go to and be part of, and so I kind of think that that's really, really important, because that's really helped us in a big, big way. 'Cause when I first got there, you know, you were pulling code from different scripts and things and you were trying to piece everything together. But when you have a platform like Matillion and you actually see it seamlessly across, it's just so phenomenal. >> So, I want to pick up on something Matthew said about consolidating platforms and vendors, because we have some data from ETR, one of our survey partners, and they went out, every quarter they do surveys, and they asked the customers that were going to decrease their spending in the quarter, "How are you going to do it?" And number one, by far, like, over a third said, "We're going to consolidate redundant vendors." Way ahead of "we're going to optimize cloud resources," which was next at like 15%. So, confirms what you were saying, and you're hearing that a lot. Will you wait? And I think we never get rid of stuff, we talk about it all the time. We call it GRS, get rid of stuff. Were you able to consolidate or at least minimize your expense around that? >> Hoshang: Yeah, absolutely. >> What we were able to do is identify different parts of our tech stack that were just either deficient or duplicate, you know, so they're just like, we don't want any duplicate efforts, we just want to be able to have, like, a single platform that does things, does things well, and Matillion helped us identify all of those differences and how to choose the right tech stack. It's also about, like, Matillion is so easy to integrate with any tech stack, you know, it's just, they have a generic API tool that you can log into anything, besides all of the components that are already there. So it's a great platform to help you do that. >> And the three things we always say about the Data Productivity Cloud: everyone ready, which we spoke about, is whether it's a low-code, no-code, quasi-technical, quasi-business person using it, through to a high-end data engineer, you're going to feel at home on the DPC. The second one, which Hoshang was just alluding to there, is stack ready, right? So it is built for AWS, built for Snowflake, built for Redshift, pure tight integration, push-down ELT better than you could write yourself by hand. And then the final one is future ready, which is this idea that you can start now, super easy. And we buy software quickly nowadays, right? We spin it up, we try it out, and before we know it, the whole organization is using it.
And so the future ready talks about that continuum of being able to launch in five minutes, learn it in five hours, deliver your first project in five days, and yet still be happy that it's an enterprise scalable platform five years down the track, including integrating with all the different things. So Matillion's job, holding up our end of the bargain that Hoshang was just talking about there, is to ensure we keep putting the features, integrations, and support into the Data Productivity Cloud to make sure that Hoshang's team can continue to live inside it and do all the things they need to do. >> Hoshang, you talked about the speed to insight being tremendously fast, but if I'm looking at Cisco Meraki from a high-level business outcome perspective, what are some of those outcomes that Matillion is helping Cisco Meraki to achieve? >> So I can just talk in general, not giving you, like, any specific numbers or anything, but for example, we were trying to understand how well our small and medium business campaigns were doing, and we had to actually pull in data from multiple different sources. So not just our instances of Marketo and Salesforce, we had to look at our internal databases. So Matillion helped us blend all of that together. Once I had all of that data blended, it was then ready to be analyzed. And once we had that analysis done, we were able to confirm that our SMB campaigns were doing well, but these are the things that we need to do to improve them. When we did that, all of that happened so quickly, because they were like, well, you need to get data from here, you need to get data from there. And we're like, great, we'll just plug, plug, plug. We put it all together, built transformations, and you know, we produced this insight, and then we were able to reform, refine, and keep getting better and better at it. And you know, we had a 40X return on SMB campaigns. It's unbelievable. >> And there's the revenue tie-in right there. >> Hoshang: Yeah. >> Matthew, I know you've been super busy, tons of meetings, you didn't get to see the whole keynote, but one of the themes of Adam Selipsky's keynote was, you know, the three-letter word of ETL. They laid out a vision of zero ETL, and then they announced zero ETL for Aurora and Redshift. And you think about ETL, I remember the days they said, "Okay, we're going to do ELT." Which is like raising the debt ceiling, we're just going to kick the can down the road. So, what do you think about that vision? You know, how does it relate to what you guys are doing? >> So there was a, I don't know if this only works in the UK or it works globally, it was a good line many years ago: "Rumors of my death are premature," or so. I think an obituary had gone out in the Times by accident, and that's how the guy responded to it. Something like that. It's a little bit like that. The announcement earlier within the AWS space of zero ETL between platforms like Aurora and Redshift, and perhaps more over time, is really about data movement, right? So it's about, do I need to do a load of high-cost, in terms of coding and compute, movement of data between one platform and another? At Matillion, we've always seen data movement as an enabling technology, which gets you to the value-add of transformation. My favorite metaphor to bring this to life is one of iron. So the world's made of iron, right? The world is literally made of iron ore, but iron ore isn't useful until you turn it into steel. Loading data is digging out iron ore from the ground and moving it to the refinery.
Transformation of data is turning iron ore into steel, and the announcements you saw earlier from AWS are more about the quarry-to-the-factory bit than they are about the iron-ore-to-the-steel bit. And so, I think it's great that platforms are making it easier to move data between them, but it doesn't change the need for Hoshang's business professionals to refine that data into something useful to drive their marketing campaigns. >> Exactly, it's quarry to the factory, and very Snowflake-like in a way, right? You make it easy to get in. >> It's like, don't get me wrong, it's great to see investment going into the Redshift business and the AWS data analytics stack. We do a lot of business there. But yes, this stuff is also there on Snowflake, already. >> I mean, come on, we've seen this for years. You know, I know there's a big love fest between Snowflake and AWS 'cause they're selling so much business in the field. But look, we saw it with separating compute from storage, then AWS does it, and now, you know, why not? It's good sense. That's what customers want. They're customer obsessed. Data sharing is another thing. >> And if you take data sharing as an example, from our friends at Snowflake, when that was announced, a few people, possibly yourselves, said, "Oh, Matthew, what do you think about this? You're in the data movement business." And I was like, "Ah, I'm not really, actually; some of my competitors are in the data movement business. I have data movement as part of my platform. We don't charge directly for it. It's just part of the platform." And really what it's there to do is to get the data into a place where you can do the fun stuff with it, of refining it into steel. And so if Snowflake, or now AWS and the Redshift group, are making that easier, that's just faster to the fun for me, really. >> Yeah, sure. >> Last question, a question for both of you. If you had a brand new shiny car, you got a bumper sticker that you want to put on that car to tell everyone about Matillion, everyone about Cisco Meraki, what does that bumper sticker say? >> So for Matillion, it says Matillion is the Data Productivity Cloud. We help you make your data business ready, faster. And then for a joke I'd write, "Which you are going to need in the face of this tsunami of data." So that's what mine would say. >> Love it. Hoshang, what would you say? >> I would say that Cisco makes some of the best products for IT professionals. And I don't think you can really do the things you do in IT without a Cisco product. Really phenomenal products. And we've gone so much beyond just the IT realm. So you know, it's been phenomenal. >> Awesome. Guys, it's been a pleasure having you back on the program. Congrats to you now, Hoshang, an alumni of theCUBE. >> Thank you. >> And thank you for talking to us, Matthew, about what's going on with Matillion, so much since we've seen you last. I can imagine how much more is going to go on until we see you again. We appreciate especially having the Cisco Meraki customer example that really articulates the value of data for everyone. We appreciate your insights and we appreciate your time. >> Thank you. >> Privilege to be here. Thanks for having us. >> Thank you. >> Pleasure. For our guests and Dave Vellante, I'm Lisa Martin. You're watching theCUBE, the leader in live enterprise and emerging tech coverage.
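To put Scullion's iron-ore metaphor in code terms: loading gets raw data to the warehouse (the quarry-to-factory bit), while transformation refines it in place (the ore-to-steel bit). Here is a hypothetical push-down ELT step in Python, where the transformation is expressed as SQL and executed inside the warehouse rather than pulling rows out to transform them. All table, column, and credential names are placeholders, not Matillion's or Meraki's actual schema.

```python
# Hypothetical push-down ELT step: the "steel-making" happens inside the
# warehouse as SQL, so the data is loaded once and transformed in place.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # placeholders
    warehouse="TRANSFORM_WH", database="ANALYTICS",
)
conn.cursor().execute("""
    CREATE OR REPLACE TABLE marts.campaign_performance AS
    SELECT c.campaign_id,
           c.spend_usd,
           t.revenue_usd,
           t.revenue_usd / NULLIF(c.spend_usd, 0) AS return_on_spend
    FROM raw.campaigns c
    JOIN (SELECT campaign_id, SUM(revenue_usd) AS revenue_usd
          FROM raw.transactions
          GROUP BY campaign_id) t
      ON t.campaign_id = c.campaign_id
""")
```

The design choice mirrors the blending Chenoy describes: raw Marketo, Salesforce, and internal tables land once, and every campaign-performance question becomes a warehouse-side query rather than another extract.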

Published Date : Nov 29 2022


Tim Barnes, AWS | AWS Startup Showcase S2 E3


 

(upbeat music) >> Hello, everyone, welcome to theCUBE's presentation of the AWS Startup Showcase. We're in Season two, Episode three, and this is the topic of MarTech and the Emerging Cloud-Scale Customer Experiences, the ongoing coverage of AWS's ecosystem of large-scale growth, new companies, and growing companies. I'm your host, John Furrier. We're excited to have Tim Barnes, Global Director, General Manager of Advertiser and Marketing at AWS, here doing the keynote on cloud-scale customer experience. Tim, thanks for coming on. >> Oh, great to be here and thank you for having me. >> You've seen many cycles of innovation, certainly in the ad tech platform space around data, serving consumers and a lot of big, big-scale advertisers over the years as Web 1.0, 2.0, and now 3.0 have come along; cloud scale, the role of data, all big conversations changing the game. We see things like cookies going away. What does this all mean? Silos, walled gardens, a lot of new things are impacting the applications and expectations of consumers, which is also impacting the folks trying to reach those consumers. And this is creating a current situation that is challenging, but also an opportunity. Can you share your perspective on this current situation as the emerging MarTech landscape takes shape? >> Yeah, sure, John. It's funny, in this industry the only constant is change; it's an ever-changing industry, and never more so than right now. Whether it's the rise of privacy legislation, breaches of data security, or changes in how the top tech providers and browser controllers are changing their process for reaching customers, this is an inflection point in the history of both ad tech and MarTech. You hit the nail on the head with cookie deprecation, with Apple removing IDFA, changes to browsers, et cetera; we're at an interesting point. And by the way, we're also seeing an explosion of content sources and the ability to reach customers that's unmatched in the history of advertising. So those two things are somewhat at odds. So whether we see the rise of connected television or digital out-of-home, or you mention Web 3.0 and the opportunities it may present in the metaverse, et cetera, it's an explosion of opportunity, but how do we continue to connect brands with customers and do so in a privacy-compliant way? That's really the big challenge we're facing. One of the things that I see is the rise of modeling, or machine learning, as a mechanism to help remove some of these barriers. If you think about the idea of one-to-one targeting, well, that's going to be less and less possible as we progress. So as a brand advertiser or as a targeted advertiser, how am I going to still reach the right audience with the right message in a world where I don't necessarily know who they are? Modeling is a really key way of achieving that goal, and we're seeing that across a number of different angles. >> We've talked about this in the ad tech business for years: the behemoths of contextual and behavioral, those dynamics. And if you look at the content side of the business, you now have this massive wave of new sources; blogging has been around for a long time, you've got video, you've got newsletters, you've got all kinds of people self-publishing, that's been around for a while, right? So you're seeing all these new sources. Trust is a big factor, but everyone wants to control their data.
So this walled garden perpetuates value: I've got to control my data, but machine learning works best when you expose data, so this is kind of a paradox. Can you talk about the current challenge here and how to overcome it? Because you can't fight fashion, as they say, and we see people going down this road saying data's a competitive advantage, but I've got to figure out a way to keep it, own it, but also share it for the machine learning. What's your take on that? >> Yeah, I think first and foremost, if I may, I would just start with this: it's super important to make that connection with the consumer in the first place. So you hit the nail on the head; for advertisers and marketers today, the importance of gaining first-party access to your customer, with permission and consent, is paramount. And so just how you establish that connection point, with trust and with a very clear directive on how you're going to use the data, has never been more important. So I would start there if I were a brand advertiser or a marketer trying to figure out how to better connect with my consumers and get more first-party data that I could leverage. That's just building the scale of first-party data to enable you to actually perform some of the types of approaches we'll discuss. The second thing I would say is that, increasingly, the challenge exists with the exchange of the data itself. If I'm a data controller, if I own a set of first-party data that I have consent from consumers to use, and I pass that data over to a third party and that data is leaked, I'm still responsible for it. Or if somebody wants to opt out of a communication and that opt-out signal doesn't flow to the third party, I'm still liable, or at least, from the consumer's perspective, I've provided a poor customer experience. And that's where we see the rise of what I'd call the next generation of data clean rooms: the approaches you're seeing a number of customers take in terms of how they connect data without actually moving the data between two sources. We see that as a mechanism by which you can preserve accessibility to data; we call that federated data exchange, or federated data clean rooms, and I think you're seeing that from a number of different parties in the industry. >> That's awesome. I want to get into the data interoperability, because we have a lot of startups presenting in this episode around that area, but while I've got you here, you mentioned data clean rooms. Could you define for us, what is a federated data clean room? What is that about? >> Yeah, I would simply describe it as zero data movement in a private and secure environment. To be a little more explicit and detailed, it really is the idea that if I'm party A and I want to exchange data with party B, how can I run a query, for analytics or other purposes, without actually moving data anywhere? Can I run a query that has accessibility to both parties, that has the security and the levels of aggregation that both parties agree to, and then run the query and get those result sets back in a way that actually facilitates business between the two parties? And we're seeing that expand with partners like Snowflake and InfoSum; even within Amazon itself, AWS, we have data sharing capabilities within Redshift and some of our other data-led capabilities.
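To make Tim's definition concrete: a federated clean-room query is, at heart, a join both sides permit only at an agreed level of aggregation. Below is a minimal sketch of that idea; the table names, data, and minimum cohort size of three are invented, both tables live in one local SQLite database purely for demonstration, and a real clean room keeps each party's data on separate infrastructure with the controls enforced by the platform rather than by convention.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Party A: a marketer's first-party CRM. In practice match_key would be a
# hashed identifier, and this table would never leave party A's environment.
conn.execute("CREATE TABLE marketer_crm (match_key TEXT, segment TEXT)")
conn.executemany("INSERT INTO marketer_crm VALUES (?, ?)", [
    ("k1", "loyalty"), ("k2", "loyalty"), ("k3", "loyalty"),
    ("k4", "prospect"), ("k5", "prospect"),
])

# Party B: a publisher's ad exposure log, likewise staying put on party B's side.
conn.execute("CREATE TABLE publisher_exposures (match_key TEXT, campaign TEXT, converted INTEGER)")
conn.executemany("INSERT INTO publisher_exposures VALUES (?, ?, ?)", [
    ("k1", "spring", 1), ("k2", "spring", 0), ("k3", "spring", 1),
    ("k4", "spring", 0), ("k9", "spring", 1),
])

MIN_COHORT = 3  # the aggregation level both parties agreed to

# Only aggregates come back out: per-segment counts and rates, with any
# cohort below the agreed minimum suppressed entirely.
query = """
    SELECT m.segment,
           COUNT(*)         AS matched_users,
           AVG(p.converted) AS conversion_rate
    FROM marketer_crm m
    JOIN publisher_exposures p ON m.match_key = p.match_key
    GROUP BY m.segment
    HAVING COUNT(*) >= ?
"""
for row in conn.execute(query, (MIN_COHORT,)):
    print(row)  # ('loyalty', 3, 0.666...); 'prospect' is suppressed, only one match
```

The HAVING clause does the privacy work here: any segment whose matched cohort falls below the agreed threshold never leaves the room.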
And we're just seeing an explosion of demand and need for customers to be able to share data, but do it in a way where they still control the data and don't ever hand it over to a third party for execution. >> So if I understand this correctly, this is kind of an evolution to take away the middleman, if you will, between parties, which historically used to be the case, is that right? >> Yeah, I'd say this: the middleman still exists in many cases. If you think about joining two parties' data together, you still have the problem of the match key. How do I make sure that I get the broadest set of data to match up with the broadest set of data on the other side? So we have a number of partners that provide these types of services: LiveRamp, TransUnion, Experian, et cetera. So there's still a place for that so-called middleman in terms of helping to facilitate the transaction, but as for the clean room itself, I think that term is becoming outdated as a physical third-party location where you push data for analysis that's controlled by a third party. >> Yeah, great clarification there. I want to get into this data interoperability, because the benefit of AWS and cloud scale we've seen over the past decade, and looking forward, is that it's an API-based economy. So APIs and microservices, cloud-native stuff, are going to be the key to integration. And so connecting people together is kind of what we're seeing as the trend. People are connecting their data, they're sharing code in open source. So there's an opportunity to connect the ecosystem of companies out there with their data. Can you share your view on this interoperability trend, why it's important, and what's the impact to customers who want to go down this either automated or programmatic, connection-oriented way of connecting data? >> Never more important than it is right now. I mean, if you think about the way we transacted, and still do today to a certain extent, through cookie swaps and all sorts of crazy exchanges of data, those are going away at some point in the future; it could be a year from now, it could be later, but they're going away. And I think that puts a great amount of pressure on the broad ecosystem of customers who transact for marketers, on behalf of marketers, both for advertising and marketing. And so data interoperability, to me, is how we think about providing that transactional layer between multiple parties so that they can continue to transact in a way that's meaningful and seamless, and frankly at lower cost and at greater scale than we've done in the past, with less complexity. And so we're seeing a number of changes in that regard, whether that's data sharing and data clean rooms, or federated clean rooms as we described earlier, or the rise of next-generation identity solutions, for example the UID 2.0 Consortium, which is an effort to use hashed email addresses and other forms of identifiers to facilitate data exchange for the programmatic ecosystem. These are evolutions based on the notion that the old world is going away, the new world is coming, and part of that is how we connect data sources in a more seamless and, frankly, efficient manner. >> It's interesting, it's almost flipped upside down: you had this walled garden mentality, I've got to control my data, but now I have data interoperability. So you've got to own and collect the data, but also share it.
This is going to change the paradigm around identity platforms, attribution, and audiences as audiences move around, and with cookies going away, this is going to require a new abstraction, a new way to do it. So you mentioned some of those standards. Is there a path in this evolution that changes things for the better? What's your view on this? What do you see happening? What's going to come out of this new wave? >> Yeah, my father was always fond of telling me, "The customer of my customer is my customer." And I like to put myself in the shoes of the Marc Pritchards of the world at Procter & Gamble and think, what do they want? And frankly, their requirements for data and for marketing have not changed over the last 20 years. It's, I want to reach the right customer at the right time, with the right message, and I want to be able to measure it. In other words, summarizing: omnichannel execution with omnichannel measurement. And that's become increasingly difficult, as you highlighted, with the rise of the walled gardens and data increasingly living in silos. And so I think it's important that we as an industry start to think about what's in the best interest of the one customer who brings virtually 100% of the dollars to this marketplace, which is the CMO and the CMO office, and how we think about returning value to them in a way that is meaningful and actually drives the industry forward. And I think that's where the data interoperability piece becomes really important. How do we think about connecting the omnichannel channels of execution? How do we connect that with partners who run attribution offerings with machine learning, or partners who provide augmentation or enrichment data, such as third-party data providers, or even connecting the buy side with the sell side in a more efficient manner? How do I make that connection between the CMO and the publisher in a more efficient and effective way? These are all challenges facing us today. And I think at the foundational layer is how we think about, first of all, what data does the marketer have, what is their first-party data, how do we help them ethically source and collect more of that data with proper consent, and then how do we help them join that data with a variety of data sources in a way that they can gain value from it? And that's where machine learning really comes into play. So whether that's the notion of audience expansion, whether that's a cohort analysis that helps with contextual advertising, whether that's a more modeled approach to attribution versus a one-to-one approach, all of those things I think are in play as we think about returning value back to that customer of our customer. >> That's interesting. You broke down the customer needs in three areas: the CMO office and staff; partners and ISV software developers; and then third-party services. All different needs, if you will, kind of tiered, and at the center of it all is the user, the consumer, who has the expectations. So it's interesting, you have the stakeholders, you laid out those three areas as the customers, but the end user, the consumer, has a preference: they kind of don't want to be locked into one thing. They want to move around, they want to download apps, they want to play on Reddit, they want to be on LinkedIn, they want to be all over the place, they don't want to get locked in. So you now have this high-velocity user behavior.
How do you see that factoring in? Because with cookies going away and the convergence of offline and online really becoming predominant, how do you know who's paying attention to what, and when? Attention and reputation, all these things seem complex. How do you make sense of it? >> Yeah, it's a great question. I think the consumer, as you said, finds a creepiness factor in a message that follows them around their various sources of engagement with content. So first and foremost, there's the recognition by the brand that we need to be a little more thoughtful about how we interact with our customer and how we build that trust and that relationship with the customer. And that all starts, of course, with the opt-in process and the consent management center, but it also includes how we communicate with them. What message are we actually putting in front of them? Is it meaningful, is it impactful? Does it drive value for the customer? We've seen a lot of studies, I won't recite them, that state that most consumers do find value in targeted messaging, but I think they want it done correctly, and therein lies the problem. So what does that mean by channel, especially when we lose the ability to look at that consumer interaction across those channels? And I think that's where we have to be a little more thoughtful, frankly, going back to the beginning with contextual advertising, with advertising that perhaps has meaning, or has empathy with the consumer, that perhaps resonates with the consumer in a different way than just a targeted message. And we're seeing that trend both in television and connected television as those converge, but also in connectivity with gaming and other more nuanced channels. The other thing I would say is, I think there's a movement towards less interruptive advertising as well, which removes a little of those barriers between the consumer and the brand. Whether that be dynamic product placement, content optimization, or sponsorship-type opportunities within digital, I think we're seeing an increased movement towards those types of executions, which will also provide value to both parties. >> Yeah, I think you nailed it there. I totally agree with you on the contextual targeting; I think that's a huge deal, and it's proven over the years to provide benefit. People are trying to find what they're looking for, whether it's data to consume or a solution they want to buy. So I think that all ties together. The question is, these three stakeholders, the CMO office and staff you mentioned, the software developers, apps, or walled gardens, and then the ad servers, as they come together, have to have standards. And so, I'm trying to squint through all the movement and the shifting plates going on in the industry and figure out where the dots are connecting. And you've seen many cycles of innovation; at the end of the day, it comes down to who can perform best for the end user, as well as the marketers and advertisers, so that balance. What's your view on this shift? It's going to land somewhere; it has to land in the right area, and the market's very efficient. I mean, this ad market's very efficient.
>> Yeah, I mean, in some ways. So from a standards perspective, I support, and we interact extensively with, the IAB and other industry associations on privacy-enhancing technologies and how we think about these next generations of connection points, or identifiers, to connect with consumers. But I'd say this: with respect to the CMO, and I mentioned the publisher earlier, I think over the last 10 years, with the rise of programmatic, we certainly saw the power reside mostly with the CMO, who was able to amass a large pool of cookies, or purchase a large cohort of customers with cookie-based attributes, and then execute against that, almost in a fashion blind to the publisher. The publisher was left to say, "Hey, here's an opportunity, do you want to buy it or not?" with no real insight into why the marketer might be buying that customer. And I think we're seeing a shift back towards the publisher, and perhaps a healthy balance between the two. And so I do believe that over time we're going to see publishers provide a lot more of what I might almost describe as mini walled gardens: the ability for a great publisher, or a set of publishers, to create a cohort of customers that can be targeted through programmatic, or perhaps through programmatic guaranteed, in a way that balances the two. And frankly, thinking about that notion of federated data clean rooms, you can see an approach where publishers are able to share their first-party data with a marketer's first-party data without either party feeling like they're giving up something or passing all their value over to the other. And I do believe we're going to see some significant technology changes over the next three to four years that really rely on that interplay between the marketer and the publisher in a way that helps both sides achieve their goals; that is, increasing value back to the publisher in terms of higher CPMs, and of course better reach and frequency controls for the marketer. >> I think you really brought up a big point there we can maybe follow up on, but I think this idea of publishers getting more control, power, and value is an example of the market filling a void, and of the power law and the long tail: it's kind of a straight line, then it's got the niche communities, and it's growing in the middle there. And I think the middle of the torso of that power law is the publishers, because they have all the technology to measure the journeys and the click-throughs and all this traffic going on their platform; they just need to connect to someone else. >> Correct. >> That brings in the interoperability. So, as a publisher ourselves, we see that long tail getting really kind of fat in the middle, where new brands are going to emerge if they have an audience. I mean, some podcasts have millions of users, and some blogs are attracting massive audiences, niche audiences that are growing. >> I would say, just look at the rise of what we might not have considered publishers in the past, but that are certainly growing as publishers today: customers like Instacart or Uber, who are creating ad platforms; or gaming, which of course has been an ad-supported platform for some time but is growing immensely; or retail as a platform, with amazon.com of course being one of the biggest retail platforms with advertising-supported models, but we're seeing that growth across the board for retail customers. And I think that, again, there's never been more opportunities to reach customers.
We just have to do it the right way, in a way that's not offensive to customers, not creepy, if you want to call it that, and that also maximizes value for both parties, and that's both the buy and the sell side. >> Yeah, everyone's a publisher and everyone's a media company. Everyone has their own news network, everyone has their own retail; it's a completely new world. Tim, thanks for coming on and sharing your perspective and insights in this keynote. Tim Barnes, Global Director, General Manager of Advertiser and Marketing at AWS, here with Episode three of Season two of the AWS Startup Showcase. I'm John Furrier, thanks for watching. (upbeat music)
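A recurring building block in this conversation is the hashed-email match key behind efforts like the UID 2.0 Consortium. Stripped to its core, the idea is that both parties normalize an address the same way and then hash it, so records can be joined without exchanging raw emails. The sketch below assumes illustrative normalization rules; the actual UID 2.0 specification layers salting, encryption, and rotation on top, so treat this as intuition rather than the spec.

```python
import hashlib

def normalized_email(email: str) -> str:
    """Apply one agreed normalization so both parties hash identical strings.
    The exact rules here (dot and plus-tag stripping for Gmail) are illustrative."""
    local, _, domain = email.strip().lower().partition("@")
    if domain in ("gmail.com", "googlemail.com"):
        local = local.split("+", 1)[0].replace(".", "")
    return f"{local}@{domain}"

def match_key(email: str) -> str:
    """SHA-256 of the normalized address; each party computes this independently."""
    return hashlib.sha256(normalized_email(email).encode("utf-8")).hexdigest()

# Two records that look different still land on the same join key,
# and neither party ever has to exchange the raw address.
print(match_key("Jane.Doe+promo@gmail.com") == match_key("janedoe@gmail.com"))  # True
```

Because both sides apply identical normalization before hashing, the digests line up as a join key while the raw addresses stay put.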

Published Date : Jun 29 2022


Shruti Koparkar & Dr. Peter Day, Quantcast | Quantcast The Cookie Conundrum: A Recipe for Success


 

(upbeat music) >> Welcome back to the Quantcast Industry Summit on the demise of third-party cookies, The Cookie Conundrum: A Recipe for Success. We're here with Peter Day, the CTO of Quantcast, and Shruti Koparkar, Head of Product Marketing at Quantcast. Thanks for coming on. Talk about the changing advertising landscape. >> Thanks for having us. >> Thank you for having us. >> So we've been hearing the story: the big players want to keep the data, make it centralized, control all the leverage, and then you've got the other end, the open internet that still wants to be free and valuable for everyone. What are you guys doing to solve this problem? Because if cookies go away, what's going to happen there? How do people track things? You guys are in this business. First question: what is Quantcast's strategy to adapt to third-party cookies going away? What's going to be the answer? >> Yeah, so very rightly said, John. The Quantcast mission is to champion a free and open internet. And with that in mind, our approach to a world without third-party cookies is really grounded in three fundamental things. First is industry standards. We think it's really important to participate and to work with organizations who are defining the standards that will guide the future of advertising. So with that in mind, we've been participating with IAB Tech Lab and been part of their projects, and the same with Prebid, which is trying to figure out the pipes of identity, the ID pipes of the future. And then there's also the W3C, the World Wide Web Consortium; our engineers and our engineering team participate in their weekly meetings, trying to figure out what's happening with the browsers and keeping up with the progress there on things such as Google's FLoC. The second thing is interoperability. As you mentioned, there are lots of different ID solutions emerging. You have UID 2.0, you have LiveRamp, you have Google's FLoC, and there are more, and there will continue to be more. We really think it is important to build a platform that can ingest all of these signals, and so that's what we've done. The reason, really, is to meet our customers where they are. Today our customers use multiple Data Management Platforms, DMPs, and that's why we support multiple of those; this is not going to be much different from that. We have to meet our customers where they are. And then finally, of course, at the very heart of who Quantcast is, is innovation. As you can imagine, being able to take all of these multiple signals in, including the IDs and the cohorts, but also others like contextual and first-party consent, is becoming more and more important. And then there are many other signals, like time, language, and geolocation. So all of these signals can help us understand user behavior, intent, and interests in the absence of third-party cookies. However, there's something to note about them: they're raw, complex, and messy, all of these different signals; they're changing all the time; they're real-time; and they're incomplete in isolation. Just one of these signals cannot help you build up a true and complete picture. So what you really need is a technology like AI and machine learning to bring all of these signals together, combine them statistically, get an understanding of user behavior, intent, and interests, and then act on it.
Be it in terms of providing audience insights, or responding to bid requests, and so on and so forth. So those are the three fundamentals that our approach is grounded in: industry standards, interoperability, and innovation. And you know, you have Peter here, >> Yeah. >> who is the expert, so you can dive much deeper into it. >> So Peter, you're the CTO. You've got to tell us, how is this going to actually work? What are you guys doing from a technology standpoint to help with data-driven advertising in a third-party cookieless world? >> Well, this is not a shock. You know, I think anyone who's been close to this space has known that the third-party cookie has been reducing in quality, in terms of its pervasiveness and its longevity, for many years now. And the kind of death knell is really Google Chrome making the changes they're going to be making. So we've been investing in this space for many years, and we've had to make a number of hugely diverse investments. One of them is in how, as a marketer, I can tell if my marketing is still working in a world without (indistinct). The majority of marketers completely rely on third-party cookies today to tell them if their marketing is working or not. And so we've had to invest heavily in statistical techniques, closer to the kind of econometric models that marketers are used to from things like out-of-home advertising, to establish whether their advertising is working or not in a digital environment. And actually, as is often the case in these times of massive disruption, there's always opportunity to make things better. And we really think that's true: digital measurement has often mistaken precision for accuracy, and there's a real opportunity to see the wood for the trees, if you like, and start to come up with better methods of measuring the effectiveness of advertising without third-party cookies. And we've had to make countless other investments in areas like contextual modeling and targeting without third-party cookies, and connecting directly to publishers rather than going through this kind of LUMAscape that's tied together by third-party cookies. So if I were to enumerate all the investments we've made, I think we'd be here till midnight, but we've had to make a number of investments over a number of years, and that level of investment is only increasing at the moment. >> Peter, on that contextual piece, can you just double-click on that and tell us more? >> Yeah, I mean, contextual is, unfortunately, I think really poorly defined. It can mean everything from a publisher saying, "Hey, trust us, this page is about SUVs," to what's possible now, and what's only really been possible in the last couple of years, which is to build statistical models of the entire internet based on the content that people are actually consuming. This type of technology requires massive data processing capabilities and is able to take advantage of the latest innovations in areas like natural language processing. It really gives computers a much deeper and richer understanding of the internet, which ultimately makes it possible to organize the internet in terms of the types of content on pages. So this type of technology has only been possible for the last few years, but we've been using contextual signals since our inception.
They have always been massively predictive in terms of audience behaviors, in terms of where advertising is likely to work. And so we've been very fortunate to keep that investment going and take advantage of many of these innovations happening in academia and in kind of adjacent areas. >> On the AI and machine learning aspect, that seems to be a great differentiator in this day and age for getting the most out of the data. How are machine learning and AI factoring into your platform? >> I think it's how we've always operated, right from our inception, when we started as a measurement company. The way that we were giving our customers, who at the time were just publishers, just the publisher side of our business, insights into who their audience was, was using machine learning techniques. And that's never really changed. The foundation of our platform has always been machine learning, from before it was cool. A lot of our core teams have backgrounds in machine learning, with PhDs in statistics and machine learning, and that really drives our decision-making. I mean, data is only useful if you can make sense of it, organize it, and take action on it, and to do that at this kind of scale, it's absolutely necessary to use machine learning technology. >> So you mentioned contextual also; you know, in advertising, as everyone in that world knows, you've got the contextual and behavioral dynamics. The consensus is undeniable: people want an environment where there's trust and truth, but they also don't want to be locked in. They don't want to get walled into a walled garden; nobody wants to be in a walled garden. They want to be free to pop around and visit sites. There's more horizontal scalability than ever before, and yet the bigger players are becoming walled-garden vertical platforms. So with the future of AI, the experience is going to come from this data, from the behaviors out there. How do you get >> Yeah. >> that contextual relevance and provide the horizontal scale that users expect? >> Yeah, I think it's a really good point, and we're definitely at this kind of tipping point, we think, in the broader industry. You know, every publisher, right? We're really blessed to work with the biggest publishers in the world, all the way through to my mom's blog, right? So we get to hear the perspectives of publishers at every scale, and they consistently tell us the same thing: they want to more directly connect to consumers. They don't want to be tied into these walled gardens, which dictate how they must present their content, and in some cases what content they're allowed to present. And so, you know, our job as a company is to really level the playing field a little bit: provide them the same capabilities they're used to only having in the walled gardens, but give them more choice in terms of how they structure their content, how they organize their content, and how they organize their audiences, while making sure that they can fund that effectively, by making their audiences and their environments discoverable by marketers and measurable by marketers, and connecting them as directly as possible to make that kind of ad-funded economic model as effective on the open internet as it is in social.
And so a lot of the investments we've made over recent years have been really to realize that vision, which is: it should be as easy for a marketer to understand people on the open internet as it is in social media, and it should be as effective for them to reach people in that environment of really high-quality content as it is on Facebook. And so we've invested a lot of our R&D dollars in making that true, and we're now live with the Quantcast Platform, which does exactly that. And as third-party cookies go away, it only further emphasizes the need for direct connections between brands and publishers. So we just want to build technology that helps make that true, and gives that kind of technology to these marketers and publishers to connect and to deliver great experiences without relying on these kinds of walled gardens. >> Yeah. Direct to consumer, direct to audience, is a new trend. You're seeing it everywhere. How do you guys support this new kind of signaling that's happening in this new world? How do you ingest the content, ingest this consent signaling? >> We're really fortunate to have an amazing R&D team, and you know, we've had to do all sorts of things to realize our vision. This has meant things like, you know, we have crawlers which scan the entire internet at this point, extract the content of the pages, and make sense of it and organize it. We organize it for publishers, so that they can understand how their audiences overlap with potential competitors or collaborators, but more importantly we organize it for marketers, so they can understand what kind of high-impact opportunities are out there for them. So, you know, we've had to build a lot of technology. We've had to build analytics engines which can get answers back in seconds, so that marketers and publishers can interact with their own data, make sense of it, present it in a way that is compelling, and then let it drive their strategy as well as their execution. We've had to invest in areas like consent management, because we believe that a free and open internet is absolutely reliant on trust. And therefore we spend a lot of our time thinking about how we make it easy for end users to understand who has access to their data, and easy and friendly for end users to be able to opt out. And as a result of that, we've now got the world's most widely adopted consent management platform. So it's hard to tackle one of these problems without tackling all of them, and we're fortunate enough to have had a large enough R&D budget over the last four or five years to make a number of investments, everything from consent and identity, through to contextual signals, through to measurement technologies, which really bring advertisers and publishers closer together. >> Great insight there. Shruti, last word for you. What's the customer view as you bring these new capabilities to the platform? What are you guys seeing as the highlight from a platform perspective? >> So the initial response that we've seen from our customers has been very encouraging, both on the publisher side as well as the marketer side. I think, you know, one of the things we hear quite a lot is, you guys are at least putting forth a solution, an actionable solution, for us to test. Peter mentioned measurement. That really is where we started, because you cannot optimize what you cannot measure.
So that is where his team started, and we have some measurement capabilities, very initial, still in alpha, but they are available in the platform for marketers to test out today. So the initial response has been very encouraging; people want to engage with us. Of course, our fundamental value proposition is that the Quantcast Platform was never built to be reliant on third-party data, these stale segments; we've always operated on real-time, live data. The second thing is our premium publisher relationships. We have had the privilege of working, like Peter said, with some of the biggest publishers, but we also have a very wide footprint: we have first-party tags across over a hundred million web and mobile destinations. And, you know, as you must have heard, that sort of first-party footprint is going to come in really handy in a world without third-party cookies. We are encouraging all of our customers, publishers and marketers alike, to grow their first-party data, and that's a strong point that customers love about us and lean into quite a bit. So yeah, the initial response has been great. Of course, it doesn't hurt that we've made all these R&D investments. We can talk about consent, and, you know, I often say that consent sounds simple, but it isn't; there's a lot of technology involved, and lots of legal work as well. We have a very strong legal team with that expertise built in. So yeah, a very good response initially. >> Democratization: everyone's a publisher, everyone's a media company. They have to think about being a platform. You guys provide that. So congratulations, Peter, thanks for dropping the gems there. Shruti, thanks for sharing the product highlights. Thanks for your time. >> Thank you. >> Okay, this has been the Quantcast Industry Summit on the demise of third-party cookies and what's next: The Cookie Conundrum, A Recipe for Success, with Quantcast. I'm John Furrier with theCUBE. Thanks for watching. (upbeat music)
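As a closing illustration of the contextual modeling Peter describes, crawling pages and statistically organizing the internet by their content, here is a deliberately toy version. The topics, keywords, and weights are invented; a production system would learn such models from crawled text with the NLP techniques Peter alludes to, not from hand-written lists.

```python
from collections import Counter
import re

# Hand-written stand-ins for learned topic models; real systems would
# derive these vocabularies and weights statistically from crawled pages.
TOPIC_VOCABULARIES = {
    "automotive": {"suv": 3.0, "engine": 2.0, "towing": 2.0, "sedan": 2.0, "mileage": 1.5},
    "finance":    {"mortgage": 3.0, "rates": 2.0, "loan": 2.0, "credit": 2.0, "apr": 1.5},
}

def classify_page(text: str) -> list[tuple[str, float]]:
    """Score page text against each topic vocabulary; higher means a stronger signal."""
    words = Counter(re.findall(r"[a-z']+", text.lower()))
    scores = {
        topic: sum(weight * words[term] for term, weight in vocab.items())
        for topic, vocab in TOPIC_VOCABULARIES.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

page = "Our SUV review covers engine options, towing capacity and real-world mileage."
print(classify_page(page))  # [('automotive', 8.5), ('finance', 0.0)]
```

A score like this then becomes one of the many signals Shruti lists, combined with consent state, time, language, and geolocation, with no per-user identifier involved.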

Published Date : May 19 2021
