
Stefanie Chiras, Red Hat & Manasi Jagannatha, AWS | AnsibleFest 2022


 

(upbeat music) >> Hey everyone, welcome back to Chicago, theCUBE is live on the floor at AnsibleFest 2022, the first in-person Ansible event that we've covered since 2019. Lisa Martin here with John Furrier. John, great to be here. There's about 1400 to 1500 people here in person, the partner ecosystem is growing and evolving, and that's going to be one of the themes of our next conversation. >> Cloud scale is continuing to change the ecosystem, and this segment with AWS is going to be awesome. >> Exactly, we've got one of our alumni back with us, Stefanie Chiras joins us again, senior vice president, partner ecosystem success at Red Hat, and Manasi Jagannatha is also here, Global Alliance Manager at AWS. Ladies, welcome to the program. >> Both: Thank you. >> Manasi: Nice to be here. >> Stefanie: Yeah. >> So some exciting news that came out. First of all, it was great to see you on stage. >> Thank you. >> In front of a live audience. The community is, you talked about this before we went live, Ansible is nothing if not the community. So I can only imagine how great that felt to be on stage in front of live bodies announcing the next step with Ansible and AWS. Tell us about that. >> I mean, you can't compete with the energy that comes from a live event. And I remember the first AnsibleFest I came to, it's just this electric feeling born out of the community, born out of collaboration, and getting together feeds that collaboration in a way like nothing else. >> Lisa: Can't do it by video alone. >> You cannot. And so it was so fun cuz today was big news. We announced that Ansible will be available through the AWS marketplace, the next step in our partnership journey. And like most of our announcements, we do these because customers ask for them. And that's really what is key. And the combination of what Red Hat brings to the table and what AWS brings to the table, that's what underpins this announcement this morning.
>> Talk about it from a customer demand perspective and how you are not only meeting customers where they are, but you're speaking their language. >> Manasi: Yeah. >> Yeah, there's a couple of aspects, and then I want to pass it to Manasi because nothing speaks better than a customer experience. But the specifics, I think, of what come together: this is where technology, procurement, experience, accessibility all come together. And it took both of us in order to do that. But we actually talked about a great example today, TransUnion. >> So we have TransUnion, they are a credit reporting company and they're a giant customer. They use RHEL, they use AWS services. So while they were transitioning to the cloud, the first thing they wanted to know was compliance, right? Like, how do we have guardrails around compliance? That was a key feature for them. And then the other piece was how do we scale without increasing the complexity? And then the critical piece was being able to integrate with the depth of AWS services without having to do it over and over again. So what TransUnion did was they basically integrated Ansible Automation Platform with the AWS Cloud Control API, and that gave them the flexibility to basically integrate with what, 200-plus services? And it's amazing to see them grow over time. >> What's interesting is that Amazon, obviously cloud has been awesome. We've been covering it since the beginning. DevOps, infrastructure as code was the dream. Now it's apps as code; you had configuration as code before that. As cloud goes next level here, we're starting to see a lot more higher-level services on AWS being adopted by customers. And so I want to get into how the marketplace deal works. So what's in it for the customer? Because as they bring Ansible across the enterprise and edge, now we're seeing that develop. If I'm the customer, am I buying it through the marketplace? What's the mechanics of the deal?
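A rough sketch of what that kind of integration can look like: the playbook below uses the `amazon.cloud` collection, which wraps the AWS Cloud Control API behind standard Ansible tasks. The bucket name is hypothetical, and this assumes the collection and AWS credentials are already in place; it is an illustration of the pattern, not TransUnion's actual setup.

```yaml
# Hypothetical playbook: manage an AWS resource through the
# Cloud Control API via the amazon.cloud collection.
- name: Provision resources through Cloud Control
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Ensure an S3 bucket exists
      amazon.cloud.s3_bucket:
        bucket_name: example-compliance-bucket   # hypothetical name
        state: present
```

Because the collection is generated against the Cloud Control resource model, the same task shape applies across the many supported services rather than requiring a hand-written module per service.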
Can I just tap into the bill? Explain the marketplace workflow or how it works. >> Yeah, I'd love to do that. So customers come to the marketplace for three key benefits, right? Like one is the consumption-based model, pay as you go; you can get hourly, annual, and spot instances. For some services you even get per-second billing, right? Like, that's amazing, that's one. And then the other piece is, John and Stefanie, as you know, customers would love to draw down on their EDPs, right? Like they want a single- >> EDPs, explain that acronym. >> It's the enterprise discount program. So they want a single bill where they can use third-party services and AWS services, and they don't have to go through the hassle of saying, "Hey, let me combine all these different pieces." So combining that, and of course the power of Ansible, right? Like customers love Ansible, they've built playbooks. The beauty of it is whatever you want to build on AWS, there is most likely a playbook or a module that already exists. So they can just tap into that and build into- >> Operationally it's purchasing through the marketplace. >> And you know, I mean, being an engineer myself, we often get caught up in the technology aspect. Like what's the greatest technology? And everyone, as Manasi said, everyone loves the technology of Ansible, but the procurement aspect is also so important. And this is where I think this partnership really comes together. Ansible is now natively integrated into AWS billing. So one bill, you go and you log in. Now you have a Red Hat subscription, you get all the benefits from Red Hat that come along with that subscription. But Ansible is all about simplicity. This brings simplicity to that procurement model, and it allows you to scale within your AWS cloud environment that you have set up. And as Manasi mentioned, pull in those other native services from AWS. It's great.
>> It's interesting, one of the buzzwords Lisa and I were just talking about in the industry is the word multiplayer. I've heard people say that's multiplayer software, kind of a gaming analogy. But what you guys are doing is setting up, once they go with Ansible in the marketplace, they're just buying as things get more collaborative off the marketplace. So it kind of streamlines, if I get this right. >> Stefanie: Yep. >> The purchasing process. So they're already in, they just use it, it's on the bill. Is that kind of how it works? >> Yep. >> Absolutely, done, yeah. >> So the customer has a partnership with us, more on the technology side in this particular case, and with AWS on the procurement side, and it brings that together. >> So multiplayer software, is it multiplayer software? >> We like to talk about multi-partner solutions, and I think this provides a new grounding for other partners to come in and build upon that with their services capabilities, with their other technology capabilities. So clearly in my world, we talk about multi-partner. (both laugh) >> Well, what you're doing is empowering the developers. I know that one of Red Hat's goals is, let's make things much more seamless, much smoother for the developers as the buyer's journey has changed. And John, you've talked about that quite a bit. You're empowering those buyers to actually have a much simpler, streamlined process and to be able to start seeing automation become democratized across organizations. >> Yeah, and one of the things I love about the announcement as well is it pulls in the other values of Ansible Automation Platform in that simplicity model that you mentioned, with things like certified collections, certified collections that have been built by partners. We have built certified collections to go along with this offering, as well as part of the AWS offering that pulls in these other partner engagements together.
And as you said, it democratizes not only what we've done together, but what we've done with other partners together. >> Lisa: Right. >> Yeah. >> Can you talk about the depths of the partnership, the co-engineering, and sort of the evolution and the customer involvement in the expansion of the partnership? >> Yeah, I'd love to walk you through that. So we've had a longstanding partnership, coming up on 15 years now, Stefanie, can you believe it? >> Stefanie: Yeah. (laughs) >> 15 years we've been building. To give you some historical context, right? Back in 2008 we launched RHEL, and in 2015 we supported SAP workloads on RHEL. And then the list goes on, right? Like we've been launching Graviton instances, Arm instances, Nitro. The key to be noted here is that with every new instance launch, RHEL has always been supported on day one, right? Like that's been our motto. So that's one. And then in 2021, as you know, we launched ROSA, Red Hat OpenShift Service on AWS. And that's helped customers with their modernization journey to AWS. So that's been the context historically around where we were and where we are today. And now with Ansible, it just gives customers another tool in their arsenal, right? And then the goal is to make sure we meet customers where they are, give them all the Red Hat products that they love using on their hybrid workloads. >> Sounds like a lot is coming, maybe at re:Invent too, coming up. >> Yeah. >> What's next? >> This is the beginning, right? We'll continue to grow, based upon not only laying the building blocks for what customers can build with, and you mentioned it, Lisa, right? We follow this journey that Manasi talked about because of what customers ask for. So it's always a new adventure to determine what'll come next based upon what we hear from our joint customers. >> On that front though, Stefanie, talk about the impact of the broader ecosystem, that this is just scratching the surface.
One of the things, and we've been going through a whole transformation at Red Hat about how we engage with the ecosystem. We've done organizational shifts, we've done a complete revamp of how we engage with the ecosystem. One of our biggest focuses is to make sure that the partnerships that we have with one partner bring value to the rest of our partners. No better example than something like this, when we work with AWS to create accessibility and capability through a procurement model that we know is important to customers. But that then serves as a launch point for other partners to build certified collections around, or now around validated content, which we talked about today at AnsibleFest, that allows other partners to engage. And we're seeing a huge amount of services partners, right? Automation is so pervasive now as customers want to go out and scale. We're seeing services partners really come in and help customers go from, it's always challenging when you have a broad set of IT. You have cloud native over here, you have bare metal over here, you have virtual; it's complex. >> John: Yeah. >> There's sometimes an energy activation barrier to get over that initial automation. We're seeing partners come in with really skilled services capabilities to help customers get over that hump, to consolidate with an automation plan. It gets them better equipped to do day one automation and day two automation. And that's where Ansible Automation Platform is going. It's not just about configuration management, it's about day two management as well. >> Talk about those barriers a little bit more and how Ansible and AWS together are helping customers really knock those out of the park. Another baseball reference for you.
We see that a lot of organizations, the skills gap, which we've talked about already in the conversation today, but Ansible as being a facilitator of helping organizations to attract talent, to retain talent, but also customers that maybe don't know where to start, or don't know how to determine the ROI that automating processes will bring. How can this partnership help customers knock those out of the park? >> So I'll start and then I'll pass it to Manasi here. But I think one of the key things in this particular partnership is just plain old accessibility. Accessibility, which public cloud has taught the world: a new way to get fast access, that consumption-based pricing. Right? You can get your hands on it, you can test it out, you can have a team go in and test it out, and then you can see it's built for scale. So then you can scale it as far as you want to go forward. We clearly have an ecosystem of services partners, so does AWS, to help people then sort of take it to the next level as they want to build upon it. But to me the first step is about accessibility, getting your hands dirty. You can build it into those committed spend programs that you may have with AWS as well to try new things. But it's a great test bed. >> Absolutely. And then to add to what Stefanie said, together Red Hat and AWS, we have about a hundred thousand partners combined, right? Like resellers, SIs, GSIs, distributors. So the reach the combined partnership has just amplifies. >> Yeah, it's huge news. I think it's a big deal because you operationalize the heavy lifting of procurement for all your joint customers, and the scale piece is huge. So congratulations. I think it's going to make a lot of money for Ansible. So good call there. My question is, as we hear here, the next level's edge. So AWS has been doing a ton of hybrid since the Outposts announcement years ago. Now you've got all kinds of regional expansions, you've got Local Zones, you've got all kinds of new edge activity.
So are there dots connecting here, with the edge, with Red Hat Ansible? >> Do you want- >> Yeah, so I think we see two trends with our customers, right? Like mainly, I'm specifically talking about our RHEL customer base on AWS. We have hundreds to thousands of customers using RHEL on AWS, and 90% of Fortune 500 companies use RHEL, right? So with that customer base, they are looking to expand, to your point, into the edge. There's Outposts, there are so many hybrid environments that they're trying to expand in. So just adding Ansible, RHEL, ROSA, OpenShift, that entire portfolio just gives customers the plethora of products they need to run their workloads everywhere, right? Like we have certifications on Outposts, we have certifications with OpenShift, right? So it just completes the puzzle, if you- >> So it's a nice fit. >> Yeah. >> It is a really nice fit. And I love edge, and once you start going distributed, this automation aspect is key for all the reasons, for security reasons, to make sure you do it the same way every single time. It's just pervasive. But things like the Cloud Control API allow it to bridge into things like Outposts. It allows a simple way, one clean way to do the API, and then you can expand it out and get the value. >> So this is why you are on stage and you said that Ansible's going to expand the scope to be more enterprise architecture. >> Stefanie: That's right. >> That's essentially what you're getting at. This is now a distributed computing fabric at cloud scale on AWS. >> Stefanie: That's right. >> Did I get that right? >> Yep, and it touches all the different deployments you may have: on-prem, virtual, cloud native, you name it. >> So how do the people turn into architects? Cuz this is, again, we had this earlier conversation with Tom: multi-tool players, a baseball analogy I used. It signifies the best player. Your customers are becoming multiple-tool players, or operators. The new operator is now the top talent.
They got to run Ansible, they got to automate, they got to provide services to the cloud native developers. So this new role is emerging. It's not a cloud architect, but if it's going to be system-architecture-wide, what does this new person look like who's going to run all this? >> I think it's an interesting question. We were talking yesterday, actually, Tom and I were talking with the partners. We had Partner Day, the first ever at AnsibleFest, yesterday, which was great. We got a lot of insight. They talked a lot about this platform focus, right? Customers are looking to create that platform so that the developers can come in and build upon it without compromising what they want to do. So I do think there's a move in that direction to say, how do you create these platforms at a company with no compromises, but that provide that consistency. I would say one thing: in partnerships like this, I think customer expectations on the partner ecosystem to have it be trusted are increasing. They expect us, as we've done, to have our engineers roll up their sleeves together, to come to the table together. That's going to show up in our curated content. It's going to show up in our validated content. Those are the places I think where we come up from the bottom through our partnership and we help bridge that gap. >> John: Awesome. >> And trust was brought up a number of times this morning during the keynote. We're almost out of time here, but I think it's one of those words that a lot of companies use. But I think what you're showing is really the value in it from Ansible's perspective, from AWS's perspective, and ultimately the value in it for the customer. >> Stefanie: Yes. >> So I got to ask you one final question. >> Stefanie: Absolutely. >> And maybe, as re:Invent is around the corner, what's next for the partnership? Obviously big news today, Manasi, looking down the pipe- >> Stefanie: Big news today.
>> What are some of the things that you think are going to come next that you can share? >> I mean, at this point, and I'll pass it to Manasi to close us out, but we are continuing to follow, to meet our customers where they want to be. We are looking across our portfolio for different ways that customers want to consume within AWS. We'll continue to look at the procurement models through the partner programs that Manasi and the team have had. And to me the next step is really bringing in the rest of the ecosystem. How do we use this as a grounding step? >> Yeah, absolutely. So we are always listening to customer feedback, and they want more Red Hat products in the marketplace. So that's where we'll be. >> In the marketplace. >> Congratulations, great deal. >> Yes, great work there, guys. And customers always want more. That's the thing. But that's what keeps us going. So we love it. >> Absolutely. >> Thank you so much for joining John and me on the program today. It's been great to have you. And congratulations again. >> It's a pleasure. >> Thank you. >> For our guests and for John Furrier, I'm Lisa Martin. You're watching theCUBE live from Chicago at AnsibleFest 2022. This is only day one of our coverage. We'll be back after a short break for more. (upbeat music)

Published Date: Oct 18, 2022


Tim Barnes, AWS | AWS Startup Showcase S2 E3


 

(upbeat music) >> Hello, everyone, welcome to theCUBE's presentation of the AWS Startup Showcase. We're in Season two, Episode three, and this is the topic of MarTech and the Emerging Cloud-Scale Customer Experiences, the ongoing coverage of AWS's ecosystem of large-scale growth and new companies and growing companies. I'm your host, John Furrier. We're excited to have Tim Barnes, Global Director, General Manager of Advertising and Marketing at AWS, here doing the keynote on cloud-scale customer experience. Tim, thanks for coming on. >> Oh, great to be here, and thank you for having me. >> You've seen many cycles of innovation, certainly in the ad tech platform space around data, serving consumers, and a lot of big, big scale advertisers over the years as the Web 1.0, 2.0, now 3.0 coming, cloud scale, the role of data, all big conversations changing the game. We see things like cookies going away. What does this all mean? Silos, walled gardens, a lot of new things are impacting the applications and expectations of consumers, which is also impacting the folks trying to reach the consumers. And this is kind of creating a current situation which is challenging, but also an opportunity. Can you share your perspective of what this current situation is, as the emerging MarTech landscape emerges? >> Yeah, sure, John. It's funny, in this industry the only constant is change, and it's an ever-changing industry, never more so than right now. I mean, we're seeing, whether it's the rise of privacy legislation, or breaches of data security, or changes in how the top tech providers and browser controllers are changing their process for reaching customers, this is an inflection point in the history of both ad tech and MarTech. You hit the nail on the head with cookie deprecation. With Apple removing IDFA, changes to browsers, et cetera, we're at an interesting point.
And by the way, we're also seeing an explosion of content sources and ability to reach customers that's unmatched in the history of advertising. So those two things are somewhat at odds. So whether we see the rise of connected television or digital out-of-home, you mentioned Web 3.0 and the opportunities that may present in the metaverse, et cetera, it's an explosion of opportunity, but how do we continue to connect brands with customers and do so in a privacy-compliant way? And that's really the big challenge we're facing. One of the things that I see is the rise of modeling or machine learning as a mechanism to help remove some of these barriers. If you think about the idea of one-to-one targeting, well, that's going to be less and less possible as we progress. So how am I still, as a brand advertiser or as a targeted advertiser, how am I going to still reach the right audience with the right message in a world where I don't necessarily know who they are? And modeling is a really key way of achieving that goal, and we're seeing that across a number of different angles. >> We've always talked about in the ad tech business for years, it's the behemoth of contextual and behavioral, those dynamics. And if you look at the content side of the business, you have now this new, massive source of new sources: blogging has been around for a long time, you got video, you got newsletters, you got all kinds of people self-publishing, that's been around for a while, right? So you're seeing all these new sources. Trust is a big factor, but everyone wants to control their data. So this walled garden perpetuation of value, I got to control my data, but machine learning works best when you expose data, so this is kind of a paradox.
Can you talk about the current challenge here and how to overcome it? Because you can't fight fashion, as they say, and we see people kind of going down this road saying, data's a competitive advantage, but I got to figure out a way to keep it, own it, but also share it for the machine learning. What's your take on that? >> Yeah, I think first and foremost, if I may, I would just start with: it's super important to make that connection with the consumer in the first place. So you hit the nail on the head; for advertisers and marketers today, the importance of gaining first-party access to your customer, with permission and consent, is paramount. And so just how you establish that connection point, with trust and with a very clear directive on how you're going to use the data, has never been more important. So I would start there if I was a brand advertiser or a marketer trying to figure out how I'm going to better connect with my consumers and get more first-party data that I could leverage. So that's just building the scale of first-party data to enable you to actually perform some of the types of approaches we'll discuss. The second thing I would say is that increasingly, the challenge exists with the exchange of the data itself. So if I'm a data controller, if I own a set of first-party data that I have consent from consumers to use, and I'm passing that data over to a third party, and that data is leaked, I'm still responsible for that data. Or if somebody wants to opt out of a communication and that opt-out signal doesn't flow to the third party, I'm still liable, or at least from the consumer's perspective, I've provided a poor customer experience. And that's where we see the rise of the next generation, I call it, of data clean rooms, the approaches that you're seeing a number of customers take in terms of how they connect data without actually moving the data between two sources.
And we're seeing that as certainly a mechanism by which you can preserve access to data. We call that federated data exchange or federated data clean rooms, and I think you're seeing that from a number of different parties in the industry. >> That's awesome. I want to get into the data interoperability, because we have a lot of startups presenting in this episode around that area, but while I've got you here, you mentioned data clean room. Could you define for us, what is a federated data clean room? What is that about? >> Yeah, I would simply describe it as zero data movement in a private and secure environment. To be a little bit more explicit and detailed, it really is the idea that if I'm party A and I want to exchange data with party B, how can I run a query for analytics or other purposes without actually moving data anywhere? Can I run a query that has accessibility to both parties, that has the security and the levels of aggregation that both parties agree to, and then run the query and get those result sets back in a way that actually facilitates business between the two parties? And we're seeing that expand with partners like Snowflake and InfoSum; even within Amazon itself, AWS, we have data sharing capabilities within Redshift and some of our other data-led capabilities. And we're just seeing an explosion of demand and need for customers to be able to share data, but do it in a way where they still control the data and don't ever hand it over to a third party for execution. >> So if I understand this correctly, this is kind of an evolution to kind of take away the middleman, if you will, between parties that used to be historically the case, is that right? >> Yeah, I'd say this, the middleman still exists in many cases. If you think about joining two parties' data together, you still have the problem of the match key. How do I make sure that I get the broadest set of data to match up with the broadest set of data on the other side?
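To make the zero-data-movement idea concrete, here is a toy Python sketch, not any vendor's API, in which a party answers only aggregate queries and refuses results below an agreed row threshold, so individual records never leave its side. The class name, threshold, and sample data are all made up for illustration.

```python
# Toy model of a federated clean-room query: each party keeps its own
# records and answers only aggregate questions, refusing any result
# that would aggregate fewer than an agreed minimum number of rows.
MIN_ROWS = 3  # aggregation threshold both parties agree to

class CleanRoomParty:
    def __init__(self, records):
        self._records = records  # raw rows never leave this object

    def count_where(self, predicate):
        """Answer an aggregate count without exposing individual rows."""
        n = sum(1 for row in self._records if predicate(row))
        if n < MIN_ROWS:
            raise ValueError("result below the aggregation threshold")
        return n

# Party A (e.g. an advertiser) holds purchase records locally.
advertiser = CleanRoomParty(
    [{"user": i, "purchased": i % 2 == 0} for i in range(10)]
)
print(advertiser.count_where(lambda row: row["purchased"]))  # prints 5
```

Real systems such as Redshift data sharing enforce this at the query engine rather than in application code, but the contract is the same: aggregates cross the boundary, rows do not.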
So we have a number of partners that provide these types of services, from LiveRamp, TransUnion, Experian, et cetera. So there's still a place for that so-called middleman in terms of helping to facilitate the transaction, but as a clean room itself, I think that term is becoming outdated in terms of a physical third-party location, where you push data for analysis, that's controlled by a third party. >> Yeah, great clarification there. I want to get into this data interoperability, because the benefit of AWS and cloud scale we've seen over the past decade, and looking forward, is that it's an API-based economy. So APIs and microservices, cloud native stuff, is going to be the key to integration. And so connecting people together is kind of what we're seeing as the trend. People are connecting their data, they're sharing code in open source. So there's an opportunity to connect the ecosystem of companies out there with their data. Can you share your view on this interoperability trend, why it's important, and what's the impact to customers who want to go down this either automated or programmatic connection-oriented way of connecting data? >> Never more important than it has been right now. I mean, if you think about the way we transacted, and still do today to a certain extent, through cookie swaps and all sorts of crazy exchanges of data, those are going away at some point in the future; it could be a year from now, it could be later, but they're going away. And I think that puts a great amount of pressure on the broad ecosystem of customers who transact for marketers, on behalf of marketers, both for advertising and marketing. And so data interoperability to me is how we think about providing that transactional layer between multiple parties so that they can continue to transact in a way that's meaningful and seamless, and frankly at lower cost and at greater scale than we've done in the past, with less complexity.
And so, we're seeing a number of changes in that regard, whether that's data sharing and data clean rooms or federated clean rooms, as we described earlier, whether that's the rise of next generation identity solutions, for example, the UID 2.0 Consortium, which is an effort to use hashed email addresses and other forms of identifiers to facilitate data exchange for the programmatic ecosystem. These are sort of evolutions based on this notion that the old world is going away, the new world is coming, and part of that is how do we connect data sources in a more seamless and frankly, efficient manner. >> It's almost interesting, it's almost flipped upside down, you had this walled garden mentality, I got to control my data, but now I have data interoperability. So you got to own and collect the data, but also share it. This is going to kind of change the paradigm around my identity platforms, attributions, audience, as audiences move around, and with cookies going away, this is going to require a new abstraction, a new way to do it. So you mentioned some of those standards. Is there a path in this evolution that changes it for the better? What's your view on this? What do you see happening? What's going to come out of this new wave? >> Yeah, my father was always fond of telling me, "The customer, my customers is my customer." And I like to put myself in the shoes of the Marc Pritchards of the world at Procter & Gamble and think, what do they want? And frankly, their requirements for data and for marketing have not changed over the last 20 years. It's, I want to reach the right customer at the right time, with the right message and I want to be able to measure it. In other words, summarizing, I want omnichannel execution with omnichannel measurement, and that's become increasingly difficult as you highlighted with the rise of the walled gardens and increasingly data living in silos. 
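The hashed-identifier approach Tim mentions can be sketched in a few lines of Python. This is a deliberate simplification of what UID 2.0-style matching involves; real implementations add salting, rotation, and stricter normalization on top of the hash, and the email addresses below are made up:

```python
import hashlib

def match_key(email):
    """Derive a comparable match key from an email address.
    Real identity frameworks normalize more aggressively and add
    salting and key rotation on top of the hash."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Each party hashes its own list; only the hashes are ever compared,
# so neither side sees the other's raw email addresses.
party_a = {match_key(e) for e in ["Alice@Example.com", "bob@example.com"]}
party_b = {match_key(e) for e in ["alice@example.com", "carol@example.com"]}

print(len(party_a & party_b))  # prints 1: alice matches despite the casing
```

The normalization step is what makes the hashes comparable across parties; without the lowercase-and-strip, "Alice@Example.com" and "alice@example.com" would hash to different keys and the overlap would be lost.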
And so I think it's important that we, as an industry, start to think about what's in the best interest of the one customer who brings virtually 100% of the dollars to this marketplace, which is the CMO and the CMO office, and how we think about returning value to them in a way that is meaningful and actually drives the industry forward. And I think that's where the data interoperability piece becomes really important. How do we think about connecting the omnichannel channels of execution? How do we connect that with partners who run attribution offerings with machine learning, or partners who provide augmentation or enrichment data, such as third-party data providers, or even connect the buy side with the sell side in a more efficient manner? How do I make that connection between the CMO and the publisher in a more efficient and effective way? These are all challenges facing us today. And I think the foundational layer of that is: what data does the marketer have, what is the first-party data, how do we help them ethically source and collect more of that data with proper consent, and then how do we help them join that data with a variety of data sources in a way that they can gain value from it? And that's where machine learning really comes into play. So whether that's the notion of audience expansion, whether that's looking for some sort of cohort analysis that helps with contextual advertising, whether that's the notion of a more modeled approach to attribution versus a one-to-one approach, all of those things I think are in play as we think about returning value back to that customer of our customer. >> That's interesting, you broke down the customer needs into three areas: the CMO office and staff, partners and ISV software developers, and then third-party services. All different needs, if you will, kind of tiered, and at the center of that is the user, the consumer, who has the expectations.
So it's interesting, you have the stakeholders, you laid out those three areas of customers, but the end user, the consumer, has a preference: they don't want to be locked into one thing. They want to move around, they want to download apps, they want to play on Reddit, they want to be on LinkedIn, they want to be all over the place, they don't want to get locked in. So you now have this high-velocity user behavior. How do you see that factoring in? Because with cookies going away and the convergence of offline and online becoming predominant, how do you know someone's paying attention to what and when? Attention and reputation, all these things seem complex. How do you make sense of it? >> Yeah, it's a great question. I think that the consumer, as you said, finds a creepiness factor in a message that follows them around their various sources of engagement with content. So first and foremost, there's the recognition by the brand that we need to be a little bit more thoughtful about how we interact with our customer and how we build that trust and that relationship with the customer. And that all starts with, of course, opt-in processes and consent management, but it also includes how we communicate with them. What message are we actually putting in front of them? Is it meaningful, is it impactful? Does it drive value for the customer? I think we've seen a lot of studies, I won't recite them, that state that most consumers do find value in targeted messaging, but they want it done correctly, and therein lies the problem. So what does that mean by channel, especially when we lose the ability to look at that consumer interaction across those channels?
And I think that's where we have to be a little bit more thoughtful, frankly, by going back to the beginning with contextual advertising, with advertising that perhaps has meaning, or has empathy with the consumer, that perhaps resonates with the consumer in a different way than just a targeted message. And we're seeing that trend both in television and connected television as those converge, but also with gaming and other more nuanced channels. The other thing I would say is, I think there's a movement towards less interruptive advertising as well, which removes a little bit of those barriers for the consumer and the brand to interact, whether that be dynamic product placement and content optimization, or sponsorship-type opportunities within digital. I think we're seeing an increased movement towards those types of executions, which I think will also provide value to both parties. >> Yeah, I think you nailed it there. I totally agree with you on the contextual targeting, I think that's a huge deal, and it's proven over the years to provide benefit. People are trying to find what they're looking for, whether it's data to consume or a solution they want to buy. So I think that all ties together. The question is, these three stakeholders, the CMO office and staff you mentioned, the software developers, apps, or walled gardens, and then the ad servers, as they come together, have to have standards. And so, I'm trying to squint through all the movement and the shifting plates that are going on in the industry and figure out where the dots are connecting. And you've seen many cycles of innovation; at the end of the day, it comes down to who can perform best for the end user, as well as the marketers and advertisers, so that balance. What's your view on this shift?
It's going to land somewhere, it has to land in the right area, and the market's very efficient. I mean, this ad market's very efficient. >> Yeah, so from a standards perspective, I support, and we interact extensively with, the IAB and other industry associations on privacy-enhancing technologies and how we think about these next generations of connection points or identifiers to connect with consumers. But I'd say this, with respect to the CMO, and I mentioned the publisher earlier: I think over the last 10 years, with the rise of programmatic, we saw the power reside mostly with the CMO, who was able to amass a large pool of cookies, or purchase a large cohort of customers with cookie-based attributes, and then execute against that, in almost a blind fashion to the publisher. The publisher was left to say, "Hey, here's an opportunity, do you want to buy it or not?", with no real insight into why the marketer might be buying that customer. And I think we're seeing a shift back towards the publisher, and perhaps a healthy balance between the two. And so, I do believe that over time we're going to see publishers provide a lot more of what I might describe as mini walled gardens: the ability for a great publisher, or a set of publishers, to create a cohort of customers that can be targeted through programmatic, or perhaps through programmatic guaranteed, in a way that's a balance between the two. And frankly, thinking about that notion of federated data clean rooms, you can see an approach where publishers are able to share their first-party data with a marketer's first-party data, without either party feeling like they're giving up something or passing all their value over to the other. And I do believe we're going to see some significant technology changes over the next three to four years.
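The federated clean-room idea described here, matching a publisher's first-party data against a marketer's without either side handing over raw records, can be sketched as an intersection of locally hashed identifiers. This is a toy illustration under assumed conditions (email as the join key, no salting); real clean rooms add salts, aggregation thresholds, and governed query layers on top.

```python
import hashlib

def hash_ids(emails):
    # Each party hashes its own identifiers locally; only hashes are shared.
    return {hashlib.sha256(e.strip().lower().encode()).hexdigest()
            for e in emails}

marketer_crm = ["ann@example.com", "bob@example.com", "cat@example.com"]
publisher_subs = ["BOB@example.com", "dan@example.com"]

# The overlap is computed on hashes, never on the raw first-party data.
overlap = hash_ids(marketer_crm) & hash_ids(publisher_subs)
print(len(overlap))  # 1
```

Neither side learns the other's full list; they learn only the size (or membership) of the intersection, which is the property that lets both parties keep their first-party data without "passing all their value over to the other."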
Those rely on that interplay between the marketer and the publisher in a way that helps both sides achieve their goals: increasing value back to the publisher in terms of higher CPMs, and of course, better reach and frequency controls for the marketer. >> I think you really brought up a big point there we can maybe follow up on, but this idea of publishers getting more control, power, and value is an example of the market filling a void. And the power law of the long tail: it's kind of a straight line, then it's got the niche communities growing in the middle there, and I think the middle, the torso of that power law, is the publishers, because they have all the technology to measure the journeys and the click-throughs and all this traffic going on their platform, but they just need to connect to someone else. >> Correct. >> That brings in the interoperability. So, as a publisher ourselves, we see that long tail getting really fat in the middle, where new brands are going to emerge if they have an audience. I mean, some podcasts have millions of users and some blogs are attracting massive niche audiences that are growing. >> I would say, just look at the rise of what we might not have considered publishers in the past, but that are certainly growing as publishers today. Customers like Instacart or Uber, who are creating ad platforms, or gaming, which of course has been an ad-supported platform for some time but is growing immensely. Retail as a platform, of course, amazon.com being one of the biggest retail platforms with advertising-supported models, but we're seeing that growth across the board for retail customers. And I think that, again, there have never been more opportunities to reach customers. We just have to do it the right way, in a way that's not offensive to customers, not creepy, if you want to call it that, and that also maximizes value for both parties, that being both the buy and the sell side.
>> Yeah, everyone's a publisher and everyone's a media company. Everyone has their own news network, everyone has their own retail, it's a completely new world. Tim, thanks for coming on and sharing your perspective and insights on this keynote. Tim Barnes, Global Director and General Manager of Advertising and Marketing at AWS, here with Episode three of Season two of the AWS Startup Showcase. I'm John Furrier, thanks for watching. (upbeat music)

Published Date : Jun 29 2022


Ed Bailey, Cribl | AWS Startup Showcase S2 E2


 

(upbeat music) >> Welcome everyone to theCUBE's presentation of the AWS Startup Showcase; the theme here is Data as Code. This is season two, episode two of our ongoing series covering the exciting startups from the AWS ecosystem, and we talk about the future of data, the future of analytics, the future of development, and all kinds of cool stuff in multicloud. I'm your host, John Furrier. Today we're joined by Ed Bailey, Senior Technical Evangelist at Cribl. Thanks for coming on theCUBE here. >> I thank you for the invitation, thrilled to be here. >> The theme of this session is the observability lake, which I love, by the way, and I'm getting into that in a second. A breach investigation's best friend, which is a great topic. Couple of things: one, I like the breach investigation angle, but I also like this observability lake positioning, because I think this is a teaser of what's coming, more and more data usage where it's actually being applied specifically for things; here, it's the observability lake. So first, what is an observability lake? Why is it important? >> Why it's important is that technology professionals, especially security professionals, need data to make decisions. They need data to drive better decisions. They need data to achieve understanding. And that means they need everything. They don't need just what they can afford to store. They don't need just what a vendor is going to let them store. They need everything. And that's the point of the observability lake: you couple an observability pipeline with the lake to bring in your enterprise data, to make it accessible for analytics, to be able to use it, to be able to get value from it. And I think that's one of the things that's missing right now in the enterprises. Admins are being forced to make decisions about, okay, we can't afford to keep this, we can afford to keep this, so they're missing things. They're missing parts of the picture.
And by being able to bring it together, to have your cake and eat it too, where I can get what I need and do it affordably, I think that's the future, and it drives value for everyone. >> And it just makes a lot of sense, like the data lake or the earlier concept: throw everything into the lake, and you can figure it out, you can query it, you can take action on it in real time, you can stream it. You can do all kinds of things with it. And observability is important because it's the most critical thing people are doing right now for all kinds of things, from QA to administration to security. So this is where the breach piece comes in. I like that part of the talk, because the breach investigation's best friend implies that you've got the secret sauce behind it, right? So, what is the state of the breach investigation today? What's going on with that? Because we know breaches, we see 'em out there, but why is this the best friend of a breach investigator? >> Well, and this is unfortunate, but typically there's an enormous delay between breach and detection. There's an IBM study, I think it's 287 days from the actual breach to detection and containment. It's an enormous amount of time. And the key is, when you do detect a breach, you're bringing in your incident response team, and typically, without an observability lake, without Cribl's solutions around the observability pipeline, you're going to have an incomplete picture. The incident response team first has to understand the scope of the breach. Is it one server? Is it three servers? Is it all the servers? You've got to understand what's been compromised and what the impact is. How did the breach occur in the first place? And they need all the data to stitch that together, and they need it quickly. The more time it takes to get that data, the more time it takes for them to finish their analysis and contain the breach.
Hence the, I think, 87 to 90 days it takes to contain a breach. And so by removing that friction, by making it easier to achieve these goals, which shouldn't be hard in the first place, you speed up the containment and resolution time. Not to mention, many system administrators simply don't have the data, because they can't afford to store it in their SIEM, or they have to go to their backup team to get a restore, which can take days. There are just so many obstacles to getting resolution right now. >> I mean, you're crawling through glass there, right? Because you think about just the timing aspect. Where is the data? Where is it stored, and is it relevant, and-- >> And do you have it at all? >> And do you have it at all, and then, you know, that person doesn't work there anymore, they changed jobs. I mean, who is keeping track of all this? You guys now have this capability where you can come in and do the instrumentation with the observability lake without a lot of change to the environment, which is not the way it used to be. It used to be: buy a tool, build a platform. Cribl has a solution that eases the struggles of the enterprise. What specifically is that pain point? And what do you guys do specifically? >> Well, I'll start out with an example of what drew me to Cribl, back in 2018. I was running the Splunk team for a very large multinational, and the complexity of the data and the demands we were getting from security and operations were just an enormous issue to overcome. I had vendors come to me all the time saying they would solve my problems, but that meant moving to their platform, where I'd have to get rid of Splunk or do this or that, and I'm losing something. And what Cribl Stream brought in was that I could put it between my sources and my destinations and manage my data, and I would have flow control over the data.
I don't have to lose anything. I could keep using our existing analytics tools, and that sense of power and control, and not having to lose anything, I was like, there's something wrong here. This is too good to be true. And so what we're talking about now, in terms of breach investigation, is that with Cribl Stream I can create a clone of my data to an object store. And this is almost any object store: it can be AWS, it could be other vendors' object stores, it could be on-prem object stores. And then I can house all my data at the cheapest possible price. So instead of eating up my most expensive storage, I put all my data in my object store, and I only put the data I need for the detections in my SIEM. So if, and hopefully never, but if you do have a breach, LogStream has a wonderful UI that makes it trivial to pick my data out of my object store and restore it back into my SIEM, so that my IR team can develop a complete picture of how the breach happened. What's the scope? What is the lateral movement? And answer those questions. It just takes the friction away. Just like you said, no more crawling over glass. You're running to your solution. >> You mentioned the object store, and you're streaming that in. You talk about the Cribl Stream tool, and I'm assuming there, when you're streaming, the pipeline stuff, but is there a schema involved? Are there database challenges? How do you guys look at that? I know you're vendor agnostic, and I like that piece: you plug in and you leverage all the tools that are out there, Splunk, Datadog, whatever. But how about on the database side, what's the impact there? >> Well, so I'm assuming you're talking about the object store itself. We don't have to apply a schema; we can fit the data to whichever object store it is. We structure the data so it makes it easier to understand.
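The routing pattern Ed describes, clone everything to cheap object storage while forwarding only detection-relevant events to the SIEM, can be sketched without any Cribl-specific API. The predicate and the in-memory destinations below are assumptions purely for illustration; in Cribl Stream this is configured through routes and pipelines in the product, not hand-written code.

```python
def route_event(event, object_store, siem, needed_for_detection):
    """Full-fidelity copy to the object store; curated subset to the SIEM."""
    object_store.append(event)       # cheap storage keeps everything
    if needed_for_detection(event):
        siem.append(event)           # the expensive SIEM index gets less

object_store, siem = [], []
events = [
    {"source": "firewall", "action": "deny"},
    {"source": "app", "level": "debug"},
    {"source": "auth", "action": "failed_login"},
]
# Assumed predicate: only security-relevant actions reach the SIEM.
is_detection = lambda e: e.get("action") in {"deny", "failed_login"}
for e in events:
    route_event(e, object_store, siem, is_detection)
print(len(object_store), len(siem))  # 3 2
```

The design point is that the full copy and the filtered copy diverge at the pipeline, so cutting SIEM retention costs never means losing data: everything is still in the object store, ready to be restored later.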
For example, if I want to see communications from one IP to another IP, we structure it to make it easier to see that and query that. It's completely vendor neutral, and this makes it so simple, so simple to enable-- >> So no pre-defined schema needed. >> No, not at all. And this made it so much easier. I think it took us three hours to enable this for the enterprise, and we were able to start cutting our retention costs dramatically. >> Yeah, it's great when you get that kind of value; time to value is critical, and all the skeptics fall to the side pretty quickly. (chuckles) I got to ask you, well, go ahead. >> As I say, previously I would have to go to our backup team. We'd have to open up a ticket, we'd have to have a bridge, then we'd have to go through the process of pulling tape, and it could take hours if not days to restore the amount of data we needed. Now we're able to run to our goals and solve business problems instead of focusing on the process steps of getting things done. >> Right, so take me through the architecture here and some customer examples, 'cause you have Cribl Stream there, the observability pipeline. That's key, you mentioned that. >> Yes. >> And then they build out these observability lakes from that. So what is the impact of that? Can you share the customers that are using that solution? What are they seeing for benefits? What are some of the impacts? Can you give us some specifics? >> I mean, I can't share all the exact customer names, but I can definitely give you some examples. A referenceable customer would be TransUnion; I came from TransUnion, I was one of the first customers, and it solved an enormous number of problems for us. Autodesk is another great example. The idea is that we're able to automate data practices. I mean, just for example, what we were talking about with backups.
You have to put a lot of time into managing the backups in your analytics platforms. And then you're locked into custom database schemas, you're locked into vendors, and it's still expensive. So being able to spend a few hours, dramatically cut your costs, but still have the data available, that's the key. I didn't have to make compromises, 'cause before, I was having to say, okay, we're going to keep this, we're going to just drop this and hope for the best. And we just didn't have to do that anymore. I think it's the same thing for TransUnion and Autodesk: the idea that we're going to lower our cost, we're going to make it easier for our administrators to do their job, so they can spend more time on business value fundamentals, like responding to a breach. You're going to spend time working with your teams, getting value out of observability solutions, and stop spending time writing custom solutions using open source tools, 'cause your engineering time is the most precious asset for any enterprise, and you've got to focus your engineering time where it's needed the most. >> Yeah, and you can't underestimate the hassle and cost of ownership of swapping out pre-existing stuff just for the sake of having a functionality. I mean, that's a big-- >> It's painful, and that's a big thing about LogStream: being vendor neutral is so important. If you want to use the Splunk universal forwarder, that's great. If you want to use Beats, that's awesome. If you want to use Fluentd, even better. If you want to use all three, you can do that too. It's the customer's choice, and we're saying to people, use what suits your needs. And if you want to write some of your data to Elastic, that's great. Some of your data to Splunk, that's even better. Some of it to, take your pick, Exabeam as well. You have the choice to put your own solutions together and put your data where you need it to be.
We're not asking you to work only within our ecosystem, with only our partners. We're letting you pick and choose what suits your business. >> Yeah, you know, that's the direction I was just talking about with the Amazon folks around their serverless offerings. You can use any tool; they have that core architecture for everything, S3, and then you pick whatever you want to use on top of it, SageMaker or anything else. This is the new way, and that's the way it has to be to be effective. How do you guys handle that? What's been the reaction from customers? Do they roll their eyes and doubt you guys, or can you do it? Are they skeptical? How fast can you convert 'em over? (chuckles) >> Right, and that's always the challenge. And, I mean, the best part of my day is talking to customers. I love hearing the feedback: what they like, what they don't, and what they need. And of course I was skeptical. I didn't believe it when I first saw it, because I was used to being locked in. I was used to having to put in a lot of effort, a lot of custom code, like, what do you mean, it's this easy? This was 2018, and in our first demo, like 30 minutes in, I cut about half a million dollars out of our license. And I was stunned, because, I mean, this is easy. >> Yeah, I mean-- >> Yeah, exactly. And this is the future. And then, for example, the security team wanted to bring in a UBA solution that wasn't part of the vendor ecosystem we were in, and I was like, not a problem. We're going to use LogStream. We're going to clone a copy of our data to the UBA solution. We were able to get value from that UBA solution in weeks, when typically it's a six-month cycle to start getting value. It was just too easy, and the best part of it--
And the thing that just struck me was that my engineers can now spend their time on delivering value instead of on integrations and moving data around. >> Yeah, and also you can spend more time preventing breaches. But what's interesting and counterintuitive here is that as you add more flexibility and choice, you'd think it'd be harder to handle a breach, right? So, now let's go back to the scenario. Say an organization has a breach, and they have the observability pipeline and the lake in place, your observability lake. Take me through the investigation. How easy is it, what happens, how do they start it, what goes on? >> So, once your SOC detects a breach, typically you're going to bring in your incident response team. And what we did, and this is one more way that we removed that friction, we cleaned up the glass, is we delegate to the incident response team the ability to restore. Cribl calls it Replay: we replay data from the object store back into your SIEM. There's a very nice UI that gives you the ability to say, "I want data from this time period to this time period, and I want it to be all the data," or the ability to filter and say, "I want just this IP." For example, if I detect, okay, this IP has been breached, then I'm going to pull all the data that mentions this IP in this timeframe, hit a button, and it just starts. And it's going to restore as fast as the IOPS of your solution allow, and then it's back in your tool. One of the things I also want to mention is that we have an amazing enrichment capability. One of the things we would do is have pipelines so that as the data comes out of the object store, it hits the pipeline and we enrich it. We use GeoIP information and reverse DNS, and it gets processed through a threat intel feed. So the data's already enriched and ready for the incident response people to do their job.
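The Replay-plus-enrichment flow Ed walks through reduces to two steps: select archived events by time window and IP, then enrich each one before it lands back in the SIEM. The GeoIP and threat-intel lookups below are stand-in dictionaries, purely hypothetical; Cribl's actual Replay and enrichment features are configured in the product, not written like this.

```python
# Stand-in lookups; a real pipeline would query GeoIP and threat-intel feeds.
GEOIP = {"203.0.113.9": "NL"}
THREAT_INTEL = {"203.0.113.9": "known-c2"}

def replay(archive, start, end, ip=None):
    """Yield archived events within [start, end], optionally for one IP."""
    for ev in archive:
        if start <= ev["ts"] <= end and (ip is None or ev["src_ip"] == ip):
            yield ev

def enrich(ev):
    """Attach GeoIP and threat-intel context before restoring to the SIEM."""
    return {**ev,
            "geo": GEOIP.get(ev["src_ip"], "unknown"),
            "threat": THREAT_INTEL.get(ev["src_ip"], "none")}

archive = [
    {"ts": 100, "src_ip": "203.0.113.9", "msg": "login"},
    {"ts": 150, "src_ip": "198.51.100.7", "msg": "dns"},
    {"ts": 900, "src_ip": "203.0.113.9", "msg": "exfil"},
]
restored = [enrich(e) for e in replay(archive, 0, 200, ip="203.0.113.9")]
print(restored[0]["threat"])  # known-c2
```

Only the event inside the requested window and matching the suspect IP comes back, already carrying the context the incident responders need, which is the friction removal Ed is describing.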
And so it just removes the friction of getting to the point where I can start doing my job. >> You know, the theme of this episode of the showcase is Data as Code, and I've been saying this on theCUBE since around 13 years ago: developers are going to be dealing with data like they deal with software code, and you're starting to see it, you mentioned enrichment. Where do you see Data as Code going? How relevant is it now? Because when you add machine learning in here, that has to be enriched and iterated on too. We're talking about taking things off a branch and putting it back into the core. This is a data discussion, this isn't software, but it sounds the same. >> Right, and the irony is, I remember the first time saying it to an auditor. I was constantly working with auditors, and that's what I described: I'm going to show you the code that manages the data. This is the data's code that's going to show you how we transform it, how we secure it, where the data goes, how it's enriched. So you can see the whole story, the data life cycle, in one place. And that's how we handled our audits. And I think that is enormously positive, because it's so easy to be confused, so easy for complexity to get in the way of progress. And being able to represent your Data as Code is a step forward, 'cause the amount of data and the complexity of data are not getting simpler, they're getting more complex. So we need to come up with better ways to handle it. >> Now, you've been on both sides of the fence. You've been in the trenches as a customer, now you're a supplier with a great solution. What are people doing with these data engineering roles? Because there's not enough data engineering. I mean, if you say Data as Code, if you believe that to be true, and many people do, we do.
And you look at the history of infrastructure as code that enabled DevOps, AIOps, MLOps, DataOps, it's happening, right? So data stack ops is coming. Obviously security is huge in this. How does that data engineering role evolve? Because it seems more and more that there's going to be a big push towards an SRE version of data, right? >> I completely agree. I was working with a customer yesterday, and I spent a large part of our conversation talking about implementing development practices for administrators. It's a new role, a new way to think of things, 'cause traditionally your Splunk or Elastic administrator is thinking about operating systems and memory and how to use the vendor's proprietary tools, and that's just not quite the same. And so we started talking about how you need to start getting used to code reviews, the idea of getting used to making sure everything has a comment. One thing I told him was, if you have a function, it has to have a comment, just by default. The standards of how you write things, how you name things, all really start to matter. And you also have to start considering your skillset. Probably one of the best hires I ever made was a guy with a math degree, because I needed his help to understand how machine learning works and how to pick the best type of algorithm. And I think this is going to evolve: you're going to move from the gray-bearded administrator to some other gray-bearded administrator with a math degree.
How do you, what's your reaction to that? >> I completely agree, 'cause operational data, operational security data, is the most volatile data in the enterprise. It changes on a whim, you have developers who change things. They don't tell you what happened, the vendor doesn't tell you what happened, and so there's that idea, that life cycle of managing data. So the same types of standards and disciplines that database administrators have practiced for years have to filter down into the operational areas, and you need tooling that's going to give you the ability to manage that data, manage it in flight in real time, in order to drive detections, in order to drive response. All those business value things we've been talking about. >> So I've got to ask you about the larger role that you see with observability lakes. We were talking before we came on camera live here about how exciting this kind of concept is, and you were attracted to the company because of it. I love the observability lake concept because it puts all that data in one spot, you can manage it. But you've got machine learning and AI around the corner that also can help. How has all this changed the landscape of data security and things? Because it makes a lot of sense, and I can only see it getting better with machine learning. >> Yeah, definitely does. >> Totally, and so the core issue, and I don't want to say, so when you talk about observability, most people assume observability is only an operational or an application support process. It's also a security process. The idea is that you're looking for your unknown unknowns. This is what keeps security administrators up at night: I'm being attacked by something I don't know about. How do you find those unknowns? And that's where your machine learning comes in.
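The routing idea behind an observability lake, every event lands in cheap full-fidelity storage while only the slice a detection tool needs is forwarded to it, can be sketched generically like this. It is an illustration of the concept, not Cribl's actual configuration or API; the severity threshold and field names are invented:

```python
# Route each event two ways: full copy to the lake, trimmed copy to the
# detection tool (SIEM) only when it crosses a severity threshold.

def route(event, lake, siem, min_severity=7):
    lake.append(event)  # full-fidelity copy, cheap storage, replayable later
    if event.get("severity", 0) >= min_severity:
        # Forward only the fields the detection tool needs.
        siem.append({k: event[k] for k in ("ts", "src", "severity")})

lake, siem = [], []
events = [
    {"ts": 1, "src": "web01", "severity": 3, "payload": "..."},
    {"ts": 2, "src": "db01", "severity": 9, "payload": "..."},
]
for e in events:
    route(e, lake, siem)
# lake holds both events in full; siem holds only the trimmed high-severity one
```

The payoff Ed describes follows from this shape: the expensive analytics tool only ingests what drives detections, while the lake keeps everything so you can replay or re-enrich data when a new "unknown unknown" shows up.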
And that's where you have to understand there are so many different types of machine learning algorithms. The guy that I hired, I mean, had started educating me about the umpteen number of algorithms, and how they apply to different data, and how you get different value, and how you have to test your data constantly. There's no such thing as the magical black box of machine learning that gives you value. You have to iterate, just like the developer practices: keep testing, over and over again, like data scientists do, for example. >> The best friend of a machine learning algorithm is data, right? You've got to keep feeding it data, and when the data sets are baked and secure and vetted, even better, all cool. Great stuff, great insight. Congratulations, Cribl, great solution. Love the architecture, love the pipelining of the observability data and streaming that into a lake. Great stuff. Give a plug for the company, where you guys are at, where people can get information. I know you guys have got a bunch of live feeds on YouTube, Twitch, here in theCUBE. Where else can people find you? Give the plug. >> Oh, please, please join our Slack community, go to cribl.io/community. We have an amazing community. This was another thing that drew me to the company: having a large group of people who are genuinely excited about data, about managing data. If you want to try Cribl out, we have some great tools you can try out. We have a cloud platform with one terabyte of free data. So go to cribl.io/cloud or cribl.cloud and sign up, you know, it just never times out. It's not a 30-day trial, it's forever, up to one terabyte. Try out our new products as well, like Cribl Edge. And then finally come watch Nick Decker and I, every Thursday, 2:00 PM Eastern. We have live streams on Twitter, LinkedIn and YouTube Live. And my Twitter handle is EBA 1367. Love to chat, love to have these conversations. And also, we are hiring. >> All right, good stuff.
Great team, great concepts, right? Of course, we're theCUBE here. We've got our video lake coming on soon. I love this idea of having these videos. Hey, video's data too, right? I mean, we've got to keep coming to you. >> I love it, I love videos, it's awesome. It's a great way to communicate, it's a great way to have a conversation. That's the best thing about us, having conversations. I appreciate your time. >> Thank you so much, Ed, for representing Cribl here on the Data as Code series. This is season two, episode two of the ongoing series covering the hottest, most exciting startups from the AWS ecosystem. Talking about the future of data, I'm John Furrier, your host. Thanks for watching. >> Ed: All right, thank you. (slow upbeat music)

Published Date : Apr 26 2022



Gokula Mishra | MIT CDOIQ 2019


 

>> From Cambridge, Massachusetts, it's theCUBE covering MIT Chief Data Officer and Information Quality Symposium 2019 brought to you by SiliconANGLE Media. (upbeat techno music) >> Hi everybody, welcome back to Cambridge, Massachusetts. You're watching theCUBE, the leader in tech coverage. We go out to the events. We extract the signal from the noise, and we're here at the MIT CDOIQ Conference, Chief Data Officer Information Quality Conference. It is the 13th year here at the Tang building. We've outgrown this building and have to move next year. It's fire marshal full. Gokula Mishra is here. He is the Senior Director of Global Data and Analytics and Supply Chain-- >> Formerly. Former, former Senior Director. >> Former! I'm sorry. It's former Senior Director of Global Data Analytics and Supply Chain at McDonald's. Oh, I didn't know that. I apologize my friend. Well, welcome back to theCUBE. We met when you were at Oracle doing data. So you've left that, you're on to your next big thing. >> Yes, thinking through it. >> Fantastic, now let's start with your career. You've had, so you just recently left McDonald's. I met you when you were at Oracle, so you cut over to the dark side for a while, and then before that, I mean, you've been a practitioner all your life, so take us through sort of your background. >> Yeah, I mean my beginning was really with a company called Tata Burroughs. Those days we did not have a lot of work getting done in India. We used to send people to U.S. so I was one of the pioneers of the whole industry, coming here and working on very interesting projects. But I was lucky to be working on mostly data analytics related work, joined a great company called CS Associates. I did my Master's at Northwestern. In fact, my thesis was intelligent databases. So, building AI into the databases and from there on I have been with Booz Allen, Oracle, HP, TransUnion, I also run my own company, and Sierra Atlantic, which is part of Hitachi, and McDonald's. 
>> Awesome, so let's talk about use of data. It's evolved dramatically as we know. One of the themes in this conference over the years has been sort of, I said yesterday, the Chief Data Officer role emerged from the ashes of sort of governance, kind of back office information quality compliance, and then ascended with the tailwind of the Big Data meme, and it's kind of come full circle. People are realizing actually to get value out of data, you have to have information quality. So those two worlds have collided together, and you've also seen the ascendancy of the Chief Digital Officer who has really taken a front and center role in some of the more strategic and revenue generating initiatives, and in some ways the Chief Data Officer has been a supporting role to that, providing the quality, providing the compliance, the governance, and the data modeling and analytics, and a component of it. First of all, is that a fair assessment? How do you see the way in which the use of data has evolved over the last 10 years? >> So to me, primarily, the use of data was, in my mind, mostly around financial reporting. So, anything that companies needed to run their company, any metrics they needed, any data they needed. So, if you look at all the reporting that used to happen it's primarily around metrics that are financials, whether it's around finances around operations, finances around marketing effort, finances around reporting if it's a public company reporting to the market. That's where the focus was, and so therefore a lot of the data that was not needed for financial reporting was what we call nowadays dark data. This is data we collect but don't do anything with it. 
Then, as the capability of the computing, and the storage, and new technologies and new techniques evolved, and were able to handle more variety and more volume of data, people quickly realized how much potential they have in the other data, outside of the financial reporting data, that they can utilize too. So, some of the pioneers leveraged that and actually improved a lot in their efficiency of operations, came out with innovation. You know, GE comes to mind as one of the companies that actually leveraged data early on, and a number of other companies. Obviously, you look at today, data is defining some of the multi-billion dollar companies, and all they have is data. >> Well, Facebook, Google, Amazon, Microsoft. >> Exactly. >> Apple, I mean Apple obviously makes stuff, but those other companies, they're data companies. I mean largely, and those five companies have the highest market value on the U.S. stock exchange. They've surpassed all the other big leaders, even Berkshire Hathaway. >> So now, what is happening is, because of the market changes, the forces that are changing the behavior of our consumers and customers, which I talked about, everyone now is digitally engaging with each other. What that does is all the experiences now are being captured digitally, all the services are being captured digitally, all the products are creating a lot of digital exhaust of data, and so now companies have to pay attention to engaging with their customers and partners digitally. Therefore, they have to make sure that they're leveraging data and analytics in doing so. The other thing that has changed is the time to decision, the time to act on the data insights that you get, is shrinking, and shrinking, and shrinking, so a lot more decision-making is now going real time.
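The shift Mishra describes, from reports a human reads later to real-time, programmatic decision-making, can be sketched minimally. The threshold, signal, and actions here are invented purely for illustration:

```python
# Programmatic decision-making: the decision is a function evaluated as data
# arrives, not a report reviewed after the fact.

def decide(reading, threshold=0.8):
    # Convert an incoming operational signal into an action in real time.
    if reading["load"] > threshold:
        return {"action": "scale_out", "reason": f"load={reading['load']}"}
    return {"action": "hold", "reason": "within limits"}

# Example: a high reading triggers an action immediately.
print(decide({"load": 0.93})["action"])  # scale_out
print(decide({"load": 0.50})["action"])  # hold
```

Strategic decisions stay manual, as he notes; it is at this operational level, where the time from data to action keeps shrinking, that logic like this runs on every event.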
Therefore, you have a situation now where you have the capability, you have the technology, you have the data, and you have to make sure that you convert that into what I call programmatic data decision-making. Obviously, there are people involved in more strategic decision-making. So, that's more manual, but at the operational level, it's going to more programmatic decision-making. >> Okay, I want to talk about that. By the way, I've seen a stat, I don't know if you can confirm this, that 80% of the data that's out there today is dark data, or it's data that's behind a firewall or not searchable, not open to Google's crawlers. So, there's a lot of value there-- >> So, I would say that percent is declining over time as companies have realized the value of data. So, more and more companies are removing the silos, bringing that dark data out. I think the key to that is companies being able to value their data, and as soon as they are able to value their data, they are able to leverage a lot of it. I still believe there's a large percent still not used or accessed in companies. >> Well, and of course you talked a lot about data monetization. Doug Laney, who's an expert in that topic, we had Doug on a couple years ago, just after he wrote Infonomics. He was on yesterday. He's got a very detailed prescription, and he makes strong cases as to why data should be valued like an asset. I don't think anybody really disagrees with that, but then he gave kind of a how-to-do-it, which will, somewhat, make your eyes bleed, but it was really well thought out, as you know. But you talked a lot about data monetization, you talked about a number of ways in which data can contribute to monetization. Revenue, cost reduction, efficiency, risk, and innovation. Revenue and cost is obvious. I mean, that's where the starting point is. Efficiency is interesting.
I look at efficiency as kind of doing more with less, but it's sort of a cost reduction, so explain why it's not in the cost bucket, it's different. >> So, it first starts with doing what we do today cheaper, better, faster, and doing more comes after that, because if you don't understand, and data is the way to understand, how your current processes work, you will not take the first step. So, the first step is to understand how can I do this process faster, and then you focus on cheaper, and then you focus on better. Of course, faster is because of some of the market forces and customer behavior that's driving you to do that process faster. >> Okay, and then the other one was risk reduction. I think that makes a lot of sense here. Actually, let me go back. So, one of the key pieces of efficiency is time to value. So, if you can compress the time, or accelerate the time, and you get the value, that means more cash in house faster, whether it's cost reduction or-- >> And the other aspect you look at is, can you automate more of the processes, and in that way it can be faster. >> And that hits the income statement as well, because you're reducing headcount cost, or maybe not reducing headcount cost, but you're getting more out of your people because you're reallocating them to more strategic initiatives. Everybody says that, but the reality is you hire fewer people because you just automated. And then, risk reduction, the degree to which you can lower your expected loss. That's, thinking in insurance terms, tangible value, certainly to large corporations, but even midsize and small corporations. Innovation, I thought, was a good one, but maybe you could give us an example of how in your career you've seen data contribute to innovation. >> So, I'll give an example from the oil and gas industry. If you look at the speed of innovation in the oil and gas industry, they were all paper-based.
I don't know how much you know about drilling. A lot of the assets that go into figuring out where to drill, how to drill, and actually drilling and then taking the oil or gas out, and of course selling it to make money. All of those processes were paper-based. So, if you can imagine trying to optimize a paper-based innovation, it's very hard. Not only that, it's very much by itself; because it's on paper, it's in someone's drawer or file. So, it's siloed by design, and so one thing that the industry has gone through, they recognized that they have to optimize the processes to be better, to innovate. To find, for example, shale gas was a resulting output of digitizing the processes, because otherwise you can't drill faster, cheaper, better to leverage the shale gas drilling that they did. So, the industry went through actually digitizing a lot of the paper assets. So, they went from not having data to knowingly creating the data that they can use to optimize the process, and in the process they're innovating new ways to drill the oil well cheaper, better, faster. >> In the early days of oil exploration in the U.S., go back to the Osage Indian tribe in northern Oklahoma, and they brilliantly, when they got shuttled around, they were pushed out of Kansas, and they negotiated with the U.S. government that they maintain the mineral rights, and so they became very, very wealthy. In fact, at one point they were the wealthiest per capita individuals in the entire world, and they used to hold auctions for various drilling rights. So, it was all gut feel, all the oil barons would train in, and they would have an auction, and again, it was gut feel as to which areas were the best, and then of course they evolved, you remember it used to be you drill a little hole, no oil, drill a hole, no oil, drill a hole. >> You know how much that costs? >> Yeah, the expense is enormous, right? >> It can vary from 10 to 20 million dollars. >> Just a giant expense.
So, now today, fast-forward to this century, and you're seeing much more sophisticated-- >> Yeah, I can give you another example in pharmaceuticals. Developing new drugs is a long process. So, one of the initial processes is to figure out which molecules to explore in the next step, and you could have a thousand different combinations of molecules that could treat a particular condition, and now, with digitization and data analytics, they're able to do this in a virtual world, kind of creating a virtual lab where they can test out thousands of molecules. And then, once they can bring it down to a few, the physical aspect of that starts. Think about innovation really shrinking their processes. >> All right, well, I want to say this about clouds. You made the statement in your keynote, asking how many people out there think cloud is cheaper, or maybe you even said cheap, but cheaper, I inferred, than on-prem, and it was a loaded question, so nobody put their hand up, they're afraid. But I put my hand up, because we don't have any IT. We used to have IT. It was a nightmare. So, for us it's better, but in your experience, I think I'm inferring correctly that you meant cheaper than on-prem, and certainly we talk to many practitioners who have large systems that, when they lift and shift to the cloud, they don't change their operating model, they don't really change anything, they get a bill at the end of the month, and they go "What did this really do for us?" And I think that's what you mean-- >> So what I mean, let me make it clear, is that there are certain use cases where cloud is and, as you saw, people did raise their hands saying "Yeah, I have use cases where cloud is cheaper." I think you need to look at the whole thing. Cost is one aspect. The flexibility and agility of being able to do things is another aspect.
For example, if you have a situation where your stakeholders want to do something for three weeks, and they need five times the computing power, and the data that they are buying from outside to do that experiment, now imagine doing that in a physical world. It's going to take a long time just to procure and get the physical boxes, and then you'll be able to do it. In the cloud, you can enable that, you can get GPUs depending on what problem you are trying to solve. That's another benefit. You can get a fit-for-purpose computing environment for that, and so there's a lot of flexibility, agility, all of that. It's a new way of managing it, so people need to pay attention to the cost, because it will add to the cost. The other thing I will point out is that if you go to the public cloud, because they make it cheaper, because they have hundreds and thousands of these canned CPUs, this much computing power, this much memory, this much disk, this much connectivity, and they build thousands of them, and that's why it's cheaper. Well, if your need is something that's very unique and they don't have it, that's when it becomes a problem. Either you need more of those, and the cost will be higher. So, now we are getting to the IoT world. The volume of data is growing so much, and the type of processing that you need to do is becoming more real-time, and you can't just move all this bulk of data, and then bring it back, and move the data back and forth. You need a special type of computing, which is at the, what Amazon calls it, edge computing. And the industry is kind of trying to design it. So, that is an example of hybrid computing evolving out of the cloud, or out of the necessity that you need a special-purpose computing environment to deal with new situations, and all of it can't be in the cloud. >> I mean, I would argue, well, I guess Microsoft with Azure Stack was kind of the first, although not really.
Now, they're there, but I would say Oracle, your former company, was the first one to say "Okay, we're going to put the exact same infrastructure on prem as we have in the public cloud." Oracle, I would say, was the first to truly do that-- >> They were doing hybrid computing. >> You now see Amazon with Outposts has done the same, Google kind of has a similar approach as Azure, and so it's clear that hybrid is here to stay, at least for some period of time. I think the cloud guys probably believe that ultimately it's all going to go to the cloud. We'll see, it's going to be a long, long time before that happens. Okay! I'll give you last thoughts on this conference. You've been here before? Or is this your first one? >> This is my first one. >> Okay, so your takeaways, your thoughts, things you might-- >> I am very impressed. I'm a practitioner, and finding so many practitioners coming from so many different backgrounds and industries, it's very, very enlightening to listen to their journeys, their stories, their learnings in terms of what works and what doesn't work. It is really invaluable. >> Yeah, I tell you, it's always a highlight of our season, and Gokula, thank you very much for coming on theCUBE. It was great to see you. >> Thank you. >> You're welcome. All right, keep it right there everybody. We'll be back with our next guest. Dave Vellante here, Paul Gillin is in the house. You're watching theCUBE from MIT. Be right back! (upbeat techno music)

Published Date : Aug 1 2019



Jacques Nadeau, Dremio | Big Data SV 2018


 

>> Announcer: Live from San Jose, it's theCUBE, presenting Big Data Silicon Valley. Brought to you by SiliconANGLE Media and its ecosystem partners. >> Welcome back to Big Data SV in San Jose. This is theCUBE, the leader in live tech coverage. My name is Dave Vellante, and this is day two of our wall-to-wall coverage. We've been here most of the week, had a great event last night, about 50 or 60 of our CUBE community members were here. We had a breakfast this morning where the Wikibon research team laid out its big data forecast, the eighth big data forecast and report that we've put out, so check that out online. Jacques Nadeau is here. He is the CTO and co-founder of Dremio. Jacques, welcome to theCUBE, thanks for coming on. >> Thanks for having me here. >> So we were talking a little bit about what you guys do. Three-year-old company. Well, let me start. Why did you co-found Dremio? >> So, it was a very simple thing I saw. So, over the last ten years or so, we saw a regression in the ability for people to get at data. You see all these really cool technologies that came out to store data, data lakes, you know, SQL systems, all these different things that make developers very agile with data. But what we were also seeing was a regression in the ability for analysts and data consumers to get at that data, because the systems weren't designed for analysts, they were designed for data producers and developers. And we said, you know what, there needs to be a way to solve this. We need to be able to empower people to be self-sufficient again at the data consumption layer. >> Okay, so you solved that problem with, as you said, what you called a self-service data platform. >> Yeah, yeah, so a self-service data platform, and the idea is pretty simple. It's that, no matter where the data is physically, people should be able to interact with a logical view of it. And so, we talk about it a little bit like it's Google Docs for your data.
So people can go into the system, they can see the different data sets that are available to them, collaborate around those, create changes to those that they can then share with other people in the organization, always dealing with the logical layer, and then, behind the scenes, we have physical capabilities to interact with all the different systems we interact with. But that's something that business users shouldn't have to think as much about, and so, if you think about how people interact with data today, it's very much about copies. So every time you want to do something, typically you're going to make a copy. I want to reshape the data, I make a copy. I want to make it go faster, I make a copy. And those copies are very, very difficult for people to manage, and they end up mixing the business meaning of the data with the physical, "I'm making copies to make them faster" or whatever. And so our perspective is that, if you can separate the physical concerns from the logical, then business users are much more likely to be able to do something self-service. >> So you're essentially virtualizing my corpus of data, independent of location, is that right, I mean-- >> It's part of what we do, yeah. No, it's part of what we do. So, the way we look at it is, there are kind of several different components to try to make something self-service. It starts with, yeah, virtualizing or abstracting away the details of the physical, right? But then, on top of that, you expose a very user-friendly interface that allows people to sort of catalog and understand the different things, you know, search for things that they want to interact with, and then curate things, even if they're non-technical users, right? So the goal is that, if you talk to sort of even large internet companies in the Valley, it's very hard to even hire the amount of data engineering that you need to satisfy all the requests of your end-users of data.
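The "logical view over physical data" idea Jacques describes, consumers query a curated shape while no copy of the data is made, can be illustrated generically with a plain SQL view. This uses SQLite for brevity; it is a sketch of the concept, not Dremio's engine or syntax, and the table and column names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The physical layer: raw data as it was produced.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER, status TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                 [(1, 1250, "paid"), (2, 400, "void"), (3, 990, "paid")])

# The logical layer: a curated, business-friendly shape. No copy is made;
# the reshaping lives in the view definition, not in a duplicated data set.
conn.execute("""CREATE VIEW paid_orders AS
                SELECT id, amount_cents / 100.0 AS amount_dollars
                FROM raw_orders WHERE status = 'paid'""")

rows = conn.execute("SELECT id, amount_dollars FROM paid_orders ORDER BY id").fetchall()
print(rows)  # [(1, 12.5), (3, 9.9)]
```

If the physical storage moves or changes shape, only the view definition has to change; everyone querying `paid_orders` is untouched, which is the separation of physical from logical concerns he is arguing for.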
And so the goal of Dremio is basically to figure out different tools that can provide a non-technical experience for getting at the data. So that's sort of the start of it, but then the second step is, once you've got access to this thing and people can collaborate and sort of deal with the data, then you've got these huge volumes of data, right? It's big data, and so how do you make that go faster? And then we have some components that deal with, sort of, speed and acceleration. >> So maybe talk about how people are leveraging this capability, this platform, what the business impact is, what have you seen there? >> So a lot of people have this problem, which is, they have data all over the place and they're trying to figure out "How do I expose this to my end-users?" And those end-users might be analysts, they might be data scientists, they might be product managers that are trying to figure out how their product is working. And so, what they're doing today is they're typically trying to build systems internally to provide these capabilities. And so, for example, we're working with a large auto manufacturer. And they've got a big initiative where they're trying to make the data that they have, they have huge amounts of data across all sorts of different parts of the organization, and they're trying to make that available to different data consumers. Now, of course, there are a bunch of security concerns that you need to have around that, but they just want to make the data more accessible. And so, what they're doing is they're using Dremio to basically catalog all the data below, expose that to the different users, applying lots of different security rules around that, and then create a bunch of reflections, which make things go faster as people are interacting with them. >> Well, what about the governance factor? I mean, you heard this in the Hadoop world years ago.
"Ah, we're going to harden Hadoop," and really, there was no governance and it became more and more important. How do you guys handle that? Do you partner with people? Is it up to the customer to figure that out? Do you provide that? >> It's several different things, right? It's a complex ecosystem, right? So it's a combination of things. You start with partnering with different systems to make sure that you integrate well with those things. So the different things that control some parts of credentials inside the systems, all the way down to "What's the file system permissions?", right? "What are the permissions inside of something like Hive and the metastore there?" And then other systems on top of that, like Sentry or Ranger, are also exposing different credentialing, right? And so we work hard to sort of integrate with those things. On top of that, Dremio also provides a full security model inside of the sort of virtual space that we work in. And so people can control the permissions, the ability to access or edit any object inside of Dremio, based on user roles and LDAP and those kinds of things. So it's kind of multiple layers that have to be working together. >> And tell me more about the company. So founded three years ago, I think a couple of raises, >> Yep >> who's backing you? >> Yeah, yeah, yeah, so we founded just under three years ago. We had great initial investors in Red Point and Lightspeed, so two great initial investors, and we raised about 15 million on that round. And then we actually just closed a B round in January of this year and we added Norwest to the portfolio there. >> Awesome, so you're now in the mode of, I mean, they always say, you know, software is such a capital-efficient business, but you see software companies raising, you know, 900 million dollars and so, presumably, that's to compete, to go to market and, you know, differentiate with your messaging and branding.
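The "multiple layers that have to be working together" can be sketched as a chain of checks: access is granted only if the underlying system's permissions (file system, metastore, Sentry- or Ranger-style policies) and the virtual layer's own role-based ACL all agree. Every name and structure below is hypothetical, purely to illustrate the layering, not any product's real policy model.

```python
# Underlying-system permissions (e.g. what an HDFS path or a
# Ranger policy would allow), keyed by physical source.
SOURCE_PERMS = {"hdfs:/data/sales": {"analysts", "engineers"}}

# Virtual-layer ACL on logical objects, plus which source backs each view.
VIRTUAL_ACL = {"sales_by_region": {"analysts"}}
VIEW_BACKING = {"sales_by_region": "hdfs:/data/sales"}

# Group membership, as it might come from an LDAP lookup.
USER_GROUPS = {"alice": {"analysts"}, "bob": {"marketing"}}

def can_read(user, view):
    """Grant access only if every layer independently permits it."""
    groups = USER_GROUPS.get(user, set())
    source = VIEW_BACKING[view]
    return bool(groups & VIRTUAL_ACL[view]) and bool(groups & SOURCE_PERMS[source])

print(can_read("alice", "sales_by_region"))  # True
print(can_read("bob", "sales_by_region"))    # False
```

The design point is that the virtual layer's ACL narrows, never widens, what the underlying systems already allow.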
Is that sort of the phase that you're in now? You've kind of developed a product, it's technically sound, it's proven in the marketspace, and now you're scaling the go-to-market, is that right? >> That's exactly right. So we've had a lot of early successes, a lot of Fortune 100 companies using Dremio today. For example, we're working with TransUnion. We're working with Intel. We actually have a great relationship with OVH, which is the third-largest hosting company in the world, so a lot of great companies; Daimler is another one. So working with a lot of great companies, seeing sort of great early success with the product with those companies, and really looking to say "Hey, we're out here." We've got a booth for the first time at Strata here and we're sort of letting people know about, sort of, a better way, or easier way, for people to deal with data >> Yeah. >> A happier way. >> I mean, it's a crowded space, right? There's a lot of tools out there, a lot of companies. I'm interested in how you sort of differentiate. Obviously simplification is a part of that, the breadth of your capabilities. But maybe, in your words, you could share with me how you differentiate from the competition and how you break out from the noise. >> Yeah, yeah, yeah, so you're absolutely right, it's a very crowded space. Everybody's using the same words and that makes it very hard for people to understand what's going on. And so, what we've found is very simple: typically, in the first meeting we have with a customer, within the first 10 minutes we'll demo the product. Because so many technologies are just that, technologies, not products, and so you have to figure out how to use them. You've got to figure out how you would customize it for your certain use-case.
And what we've found with our product is, by making it very, very simple, the light goes on for people in a very short amount of time, and so we also do things on our website so that you can see, in a couple of minutes, or even less than that, little animations that sort of give you a sense of what it's about. But really, there's this light bulb that goes on, and you figure this out over the course of working with different customers, right? There's this light bulb that goes on for people that are so confused by all the things that are going on, and if we can just sit down with them, show them the product for a few minutes, all of a sudden they're like "Wait a minute, "I can use this", right? So you're frequently talking to buyers that are not the most technical parts of the organization initially, and so most of the technologies they look at are technologies that are very difficult to understand, and they have to look to others to try to even understand how it would fit into their architecture. With Dremio, we have customers that have installed it and, within an hour or two, started to see real value. And that sort of excitement happens even in the demo, with most people. >> So you kind of have this bifurcated market. Since the big data meme, everybody says they're data-driven, and you've got a bifurcated market in that you've got the companies that are data-driven and you've got companies who say they're data-driven but really aren't. Who are your customers? Are they in both? Are they predominantly in the data-driven side? Are they predominantly in the trying-to-be data-driven? >> Well, I would say that they all would say that they're data-driven. >> Yeah, everyone, who's going to say "Well, we're not data-driven." >> Yeah, yeah, yeah. So I would say >> We're dead.
>> I would say that everybody has data and they've got some ways that they're using it well and other places where they feel like they're not using it as well as they should. And so, I mean, the reason that we exist is to make it so it's easier for people to get value out of data, and so, if they were getting all the value they think they could get out of data, then we probably wouldn't exist and they would be fully data-driven. So I think that everybody, it's a journey and people are responding well to us, in part, because we're helping them down that journey. >> Well, the reason I asked that question is that we go to a lot of shows and everybody likes to throw out the digital transformation buzzword and then use Uber and Airbnb as an example, but if you dig deeper, you see that data is at the core of those companies and they're now beginning to apply machine intelligence and they're leveraging all this data that they've built up, this data architecture that they built up over the last five or 10 years. And then you've got this set of companies where all the data lives in silos and I can see you guys being able to help them. At the same time, I can see you helping the disruptors, so how do you see that? I mean, in terms of your role, in terms of affecting either digital transformations or digital disruptions. >> Well, I'd say that in either case, so we believe in a very sort of simple thing, which is that, so going back to what I said at the beginning, which is just that I see this regression in terms of data access, right? And so what happens is that, if you have a tightly-coupled system between two layers, then it becomes very difficult for people to sort of accommodate two different sets of needs. And so, the change over the last 10 years was the rise of the developer as the primary person for controlling data and that brought a huge amount of great things to it but analysis was not one of them. 
And there's tools that try to make that better, but that's really the problem. And so our belief is very simple, which is that a new tier needs to be introduced between the consumers and the producers of data. And so that tier may interact with different systems, it may be more complex or whatever, for certain organizations, but the tier is necessary in all organizations, because the analysts shouldn't be shaken around every time the developers change how they're doing data. >> Great. John Furrier has a saying that "Data is the new development kit", you know. He said that, I don't know, eight years ago and it's really kind of turned out to be the case. Jacques Nadeau, thanks very much for coming on theCUBE. Really appreciate your time. >> Yeah. >> Great to meet you. Good luck and keep us informed, please. >> Yes, thanks so much for your time, I've enjoyed it. >> You're welcome. Alright, thanks for watching everybody. This is theCUBE. We're live from Big Data SV. We'll be right back. (bright music)

Published Date : Mar 9 2018



Marie Wieck & Greg Wolfond | IBM Interconnect 2017


 

>> Announcer: Live from Las Vegas, it's theCUBE. Covering InterConnect 2017. Brought to you by IBM. >> Hey, welcome back everyone. Live in Las Vegas, the Mandalay Bay. We're at IBM InterConnect 2017. This is theCUBE's exclusive coverage, three days of wall-to-wall. Day three winding down here at the event. Great show about cloud, data, and blockchain. Our next guest is Marie Wieck, who's the general manager of the Blockchain group within IBM, and Greg Wolfond, who's the chairman and CEO of SecureKey Technologies, announced a partnership with IBM. A lot of great success of blockchain. It's now a business unit in IBM. Marie, great to see you. Congratulations on the new role. >> Absolutely, we're really excited. We've seen so much momentum in blockchain that we really are investing heavily, created a new division, part of our Industry Platforms team, and we're off to the races. >> Exciting. >> So six weeks in the role now. >> Six weeks, I guess the business model is to keep running hard, (laughter) 'cause you guys have made great success. We had Ramesh, one of your workers in your division, on early, he came from the labs, or the research team, >> Marie: Right, research. >> and now he's in Solutions. The traction has been pretty amazing, so take us through, from a business standpoint, obviously you've now got the P&L up and running, you're going, engaging customers on use cases. Where'd this progress come from? Was it just the internal coalescing of IBMers and customers coming together? Give us why this is at its point today. >> I mean I think the most important point about blockchain is that it really is a network effect. The whole idea of a shared distributed ledger, where everybody has visibility to the appropriate parts of data that they want, gives you some really interesting new business models, but you can't have a network effect, you can't have a community and an ecosystem, if you don't have a common set of standards and a way to drive interoperability.
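The shared-ledger idea described here, a record every participant can independently re-verify, can be illustrated with a toy hash chain: each entry commits to the hash of the previous one, so any tampering breaks verification. This is a teaching sketch only, not Hyperledger Fabric's actual block format, endorsement, or consensus.

```python
import hashlib, json

def entry(prev_hash, payload):
    """Build a ledger entry whose hash covers the payload and the link back."""
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return {"prev": prev_hash, "payload": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Re-derive every hash; any edited payload or broken link fails."""
    prev = "0" * 64
    for e in chain:
        body = json.dumps({"prev": e["prev"], "payload": e["payload"]},
                          sort_keys=True)
        if e["prev"] != prev or hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
            return False
        prev = e["hash"]
    return True

# Hypothetical supply-chain events, in the spirit of the examples discussed.
chain = [entry("0" * 64, {"event": "container 42 loaded"})]
chain.append(entry(chain[-1]["hash"], {"event": "container 42 cleared customs"}))
print(verify(chain))  # True

chain[0]["payload"]["event"] = "tampered"
print(verify(chain))  # False
```

Because every member can run `verify` themselves, trust comes from the structure rather than from any single party holding the master copy.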
So just 15 months ago, we launched, with 30 other members, the Hyperledger Project in the Linux Foundation. It's been the fastest growing open source project since the Linux Foundation started, so really impressive momentum, and, you know, if you think back just a year, at InterConnect last year in February, we had this little demo of trading marbles. This year, fast forward a year, we have a new division, we have 400 clients that we're working with on real production level use cases. We have eight networks in production. We've got now version one of the standard, which really brings a lot of the enterprise requirements, and we're seeing all kinds of new use cases. Supply chain, health care, government, financial services, all where we're really talking about being real now and trusted for business. >> And I would add that Ginni Rometty on stage, hammering home the focus, >> Exactly. >> like big time, at a Watsonesque level, >> Marie: Absolutely. >> so that must be mobilizing the IBMers in the new division. What's the buzz internally? (laughter) People want to come work for your division now, I mean what's happening? >> I do get an awful lot of emails from a lot of people who are very interested, but I always know when there's real momentum, when there are people who are doing it that we didn't tell to do it, you know. So we're starting with a pretty small team internally, my group itself, Direct Line, is about 200 people. There's about 600 people in the extended team across the different functions across IBM, but when I do a search on our internal directory and search for blockchain, there's over a thousand people who have that name already in their title or in their description, because they're working on it, and they see the power of it. >> Innovative people get intoxicated by blockchain, because they can just see the disruption elements.
Greg, I want to ask you, because you're actually doing it. Not only is it intoxicating to kind of grok what blockchain can be, there are some real use cases right now, really jamming hard on blockchain with the ledger. Can you just share quickly how that's playing out in the context of IBM and in the marketplace? >> Yeah, so SecureKey's a digital ID company. So we started in Canada years ago doing this login service for government. You show up, you want to see your taxes, your unemployment, your pension, any of 80 different departments, we made it easy for citizens to go there. You can redirect into a federated login with your TD login, or your Royal Bank login. We have millions of Canadians who use that, and we had hundreds of thousands a month, but it's really a login service, and it saved the federal government I think eight hundred million dollars to get that done. But we wanted to move to the next step, which is sharing identity, so digital identity and how do I share my attributes from TD Bank or Royal Bank, or my data from Equifax or TransUnion, in a trusted way with parties I want to, and not share it in other ways. And we couldn't do that without Hyperledger. So we can talk a little bit about why we went to it, but we have a network in Canada, we tested already phase one, we're launching later this year with Royal Bank, TD Bank, Bank of Montreal, Scotia. Where a citizen can show up at a Telco to create a new account. Is it okay to share my name and address and my credit score? Yes, done, account's open in seconds. FINTRAC changed the rules in Canada so you can open a bank account. Can I show up at a bank and share my attributes from the province and from a credit agency, and create my bank account in seconds? And we've all had this problem, right? I talk to my wife... >> I mean we live it everyday, I mean identity theft is I think front and center in mainstream life.
Everyone has either had someone close to them, or themselves, get the phone call: the credit score's dropping, or hey, someone's had my identity for a couple weeks. This is brutal, even the credit cards are gettin'... >> It's funny, when I started this business two of my friends had their identity taken over and someone put mortgages on their homes, and I said there had to be a better way to do it. With blockchain, if we can take data from different sources, that the bank knows it's me and I can log in right now, that I possess this phone, that the province knows it's me and I can turn on the camera and check it's me, we can raise the ID validation score for everyone in the whole industry. For healthcare, to government, to banking, and we not only raise the ID validation, we also raise the AuthScore, because I'm not just logging in with my bank, I must have this phone, with this SIM in it, and if it's canceled it's not me. And normally people would put that through brokers in the middle, but NIST in the U.S. said, we don't want brokers in the middle. They could peek, they could see your data. I have single points of failure. If this is identity for health, here's how it goes down. I have honeypots of data. People are collecting all of my stuff in one place, it's encrypted, but the bad guy's going to get that, right? They could go after the person, and say I need the keys, I have a member of your family... >> I mean we're living in a world, in cloud, Marie knows, there's no perimeter anymore. >> Marie: Right right. The security experts that are state of the art right now are saying, even saying on theCUBE on day one here, data is the new perimeter. So there it is, right, this is fundamental, what you're saying, this is the new perimeter, the data, and you distribute it. >> So no broker, right, means less of a threat matrix for people to hack. You don't need a trusted third party to arbitrate.
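The "raise the ID validation score" idea, where independent attestations from a bank login, device possession, and a government record each add confidence, can be sketched with a simple combination rule. The noisy-OR rule below is an illustration chosen for this example, not SecureKey's actual scoring model, and the confidence numbers are invented.

```python
def combined_score(confidences):
    """Combine independent attester confidences: the overall score is the
    probability that not every attester is wrong (noisy-OR)."""
    miss = 1.0
    for c in confidences:
        miss *= (1.0 - c)   # chance this attester is wrong, compounded
    return 1.0 - miss

# Hypothetical confidences: bank login, telco device check, provincial record.
bank_only = combined_score([0.90])
bank_telco_gov = combined_score([0.90, 0.80, 0.85])
print(round(bank_only, 3), round(bank_telco_gov, 3))  # 0.9 0.997
```

The point is the monotone behavior: each additional independent source can only raise the score, which is why pulling attributes from several parties beats any single login.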
>> You shouldn't have to get other credentials and things to go, right? If I can login at my bank right now and prove I've got the mobile device, can I release data from different sources? Ten percent of Americans move every year. If I show up at an apartment, can I share that I'm Greg and my bank says I'm me, that I have this device from my mobile company, can I share a background check to say that it's me? We're going to do that in about eight seconds, compared to the landlord having to go and pay a real estate agent one month's rent to vet you. And then when you do that, imagine the power now, right? Would you like to sign up for internet? Share your data, yes, click. Would you like contents insurance, click. Totally taking friction out for consumers, but making sure that the parties who provide that data, whether it's my bank, whether it's my government, they can't track me. I don't want my government or my bank knowing if I go to mental health, or if I go to a cancer clinic. Really important that they don't know, right? >> Yeah, healthcare here, I don't know what it's like in Canada, but certainly in the United States you can't get information about yourself (laughter). >> And it's a perfect connection to blockchain, 'cause the whole notion of blockchain in our mind is about a trusted network, and how do you get trust if you don't know who the people are who are participating. So, we signed an agreement with the Food and Drug Administration in the United States, to focus on leveraging blockchain for exchange of information around patients, privately and securely, for clinical drug trials. You know, it's just one example of how you now bring that trust element, that's built on a blockchain already, as a new interoperable component of these new supply chain networks. Whether it's around supply chain in global sourcing, whether it's the provenance of food or diamonds, there's some really interesting aspects that you can now add on top.
And we're now even connecting, you mentioned Cognitive, you know, now apply Watson on top of that. How do you increase the trust level? In our new version one delivery of Hyperledger on the IBM cloud, we actually provide a trust score for the network, on a scale of one to a hundred. What if Watson could actually look at your use case and offer recommendations and suggestions for how to improve the trust level? Improving it means getting more members, so it's more distributed, and there's more sharing of information. But they're not going to want to share that information if there isn't a trust model. >> So give us a glimpse as to, sort of, your business. You got 200 people, but you've got thousands of people within IBM that you can tap, in addition to the huge portfolio of things like Cognitive. So you've got this startup (laughs), >> Marie: Right. >> inside of IBM. >> Marie: A startup in IBM. >> And you said it's inside the Industry Platform's team, so what is that, and what are you actually building out? >> So, we are building, we're taking, and contributing, we're investing really, in the Hyperledger project ourselves. We are one of now 122 members of the open source and open community project, and we're actually developing and contributing content there. >> Dave: Big committer there. >> Big committer, and we provide a support model for anybody who wants to use just Hyperledger, but we take Hyperledger back, and now we're delivering it as a secure platform on our high security network, that is production grade, you know, enterprise strong, would be Ginni's word for that, right, and delivering that on our cloud, or letting you take a container and put it on your own enterprise if you really want your own private cloud. But we're also building industry solutions on top of that. So we announced a partnership with Maersk, for global shipping, on global trade digitization to provide greater visibility.
>> But on that deal, just to interrupt, that Ramesh was put in, that wasn't a solution specifically for them, that was an industry scope solution. >> Correct. So it's really a partnership. So in this case again, it's that network effect, it's that ecosystem. It's not Maersk the customer, it's Maersk and IBM the partners, who are now bringing forward, as the anchor tenants in this new network, the rest of our ecosystems, and we were interested 'cause we have a big supply chain business for all our hardware as well. >> And you're selling a SAS product, is that right? >> Correct. >> So it's a subscription-based model? >> Correct. >> And then services on top of that? >> And services both to develop new blockchain applications, we've had a number of our clients here from the 20 thousand at InterConnect that've come up with new ideas. We're going to help them build that, in a services kind of model, but many of these are going to be essentially SAS networks where either they're going to pay a membership fee, or they're going to pay per transaction, a percentage of the price, or they're going to participate in the savings, because this is actually going to streamline the opportunity. In the case of SecureKey, the model we see is customers willing to pay for the validation of an identity for an individual if they're participating in a critical transaction. A bank would certainly be willing to pay to increase the confidence that Greg is Greg, if he was applying for a mortgage online. >> And the consumption is through the IBM cloud, correct? >> Yeah, so there's a toolkit, we're big believers in open source. It's open at the ends, really easy using things like Bluemix to connect to the endpoints. And for us, it's just a magnificent coming together, because things like the high security network to turn banks on quickly, where they trust it, and they can put their data in a secure and trusted way, make this all go faster.
>> Dave: But that's the only place in the world I can get this, correct? >> Marie: It is certainly the only place that you can get that level of security in a blockchain network. >> But from a competitive standpoint, somebody else has to build this out, and create as a competitive product as IBM has, and run it on somebody else's cloud, for them to compete, correct? >> That is correct. >> The strategy is not to spam the world's clouds, it's to say hey, we've got this solution, here's how you get it, here's how you consume it. >> And we really firmly believe that if this is an interoperable set of standards, there will be other networks, there will be other participants. We want them all to be interoperable. We want a global identity standard for interlocking networks, because that is actually the tide that raises all boats. So if they want to take Hyperledger and put it on their own private cloud or somebody else's cloud, we support that thoroughly. We think that the most enterprise grade cloud though, is with IBM. >> You just got thousands of people doing it, and you say, go for it. >> Exactly. >> Dave: Bring it on baby! >> First of all, you had me at blockchain beginning the interview. I love blockchain, and I think it's very intoxicating from a disruption standpoint. Any entrepreneur, any innovator... This is a bulldozer on existing business models, and of how people do things. So, I'm sure the organic growth that you guys see is proving it, internal IBM and external. How do people get involved? What's your plans on building the ecosystem now, because you got a tiger by the tail here as the GM of this division now. You got to run hard, you got to embrace people, you got to have events, what's your plan, and how do I get involved if I'm someone watching and we want to get involved? >> So, great simple ways to get involved, the developers, we want 'em to be involved directly through the cloud and through developerWorks. 
You get free access, you can get started quickly. In three clicks you can have a four peer Hyperledger network up and running on Bluemix, and you can start your own services and create. If you are a customer, what we're really suggesting is come and bring us your use case. Bring the participants in your network as well. Come into one of our IBM garages, and we'll work that out for you. And I think it's important that, we think blockchain has a huge potential or I wouldn't be in this new role, but we also think it's not for everything. It's not the panacea for every business problem. We want to make sure the people are using it in the right way for areas where it really makes the most impact, and then we'll help you implement that and develop it. And then we really see the whole ecosystem around our partners, you're going to onboard people into a blockchain network. You're going to have to integrate with your back ends. You're going to extend your mobile devices to provide these new services through apps. So our GSI community is really helping with the integration and the onboarding. Our ISVs are developing new services that run on those blockchain networks, and we just launched our new IBM Cloud for Financial Services, has a blockchain zone, for all those fintech startups to get access and reuse components, so that we can accelerate the effect. >> Alright, well, congratulations Marie, great to see you in the new role, congratulations, >> Marie: Thank you. >> We're super excited for you, and looking forward to getting the update soon at our new studio. We'll try to rope you into our new Palo Alto studio. Greg, great to hear your success. This is the nirvana, I mean, secure ID is like, the big, I mean easily, not like with some either token or engineered identity system, and this is a home run. >> It's privacy, and it's as we talked about before the broadcast. Facebook, would you trust Facebook to go see your medical records? 
Would you unlock your title using Facebook? You want things that are private, where people aren't tracking you and are more secure than that, so this is really... >> Don Tapscott called Facebook data fracker. (laughing) We provide all our data for Facebook, they've got billionaires on it. Thanks so much for spending the time. >> Thank you. >> Blockchain revolution here inside theCUBE, bringing you really trusted content here on theCUBE. Distribute it out around the world, I'm John Furrier, Dave Vellante, thanks for watching. More great coverage coming up here, stay with us.

Published Date : Mar 22 2017

