
Itamar Ankorion, Qlik & Peter MacDonald, Snowflake | AWS re:Invent 2022



(upbeat music) >> Hello, welcome back to theCUBE's AWS re:Invent 2022 coverage. I'm John Furrier, host of theCUBE. Got a great lineup here: Itamar Ankorion, SVP of Technology Alliances at Qlik, and Peter MacDonald, Vice President of Cloud Partnerships and Business Development at Snowflake. We're going to talk about bringing SAP data to life with a joint Snowflake, Qlik and AWS solution. Gentlemen, thanks for coming on theCUBE. Really appreciate it. >> Thank you. >> Thank you, great meeting you John. >> Just to get started, introduce yourselves to the audience, then we're going to jump into what you guys are doing together, a unique relationship here, a really compelling solution in cloud. Big story about applications and scale this year. Let's introduce yourselves. Peter, we'll start with you. >> Great. I'm Peter MacDonald. I am Vice President of Cloud Partners and Business Development here at Snowflake. On the cloud partner side, that means I manage the AWS relationship along with Microsoft and Google Cloud: what we do together in terms of complementary products, GTM, co-selling, things like that. Importantly, that includes working with other third parties like Qlik on joint solutions. On business development, it's negotiating custom commercial partnerships, with large companies like Salesforce and Dell, and with smaller companies, some from our venture portfolio. >> Thanks Peter, and hi John. It's great to be back here. So I'm Itamar Ankorion, and I'm the senior vice president responsible for technology alliances here at Qlik. With that, I own strategic alliances, including our key partners in the cloud, Snowflake and AWS among them. I've been in the data and analytics enterprise software market for 20-plus years, and my main focus is product management, marketing, alliances, and business development. I joined Qlik about three and a half years ago through the acquisition of Attunity, which is now the foundation for Qlik data integration. So again, my team focuses on creating joint solution alignment with our key partners to provide more value to our customers. >> Great to have both of you guys, senior executives in the industry, on theCUBE here, talking about data. Obviously bringing SAP data to life is the theme of this segment, but this re:Invent, it's all about the data, the big data end-to-end story, a lot about data being intrinsic, as the CEO said on stage, in organizations in all aspects. Take a minute to explain what you guys are doing from a company standpoint, Snowflake and Qlik, and the solutions. Why here at AWS? Peter, we'll start with you at Snowflake: what you guys do as a company, your mission, your focus. >> That's great, John. Yeah, so here at Snowflake, we focus on the data platform, and until recently, data platforms required expensive on-prem hardware appliances. Despite all that expense, customers had capacity constraints, expensive maintenance, and limited functionality that all impeded these organizations from reaching their goals. Snowflake is a cloud-native SaaS platform, and we've become so successful because we've addressed these pain points and added other special capabilities. For example: securely sharing data across both the organization and the value chain without copying the data, support for new data types such as JSON and semi-structured data, and advanced in-database data governance. Snowflake integrates with complementary AWS services and other partner products.
So we can enable holistic solutions that include, for example here, both Qlik and AWS SageMaker and Comprehend, and bring those to joint customers. Our customers want to convert data into insights, along with advanced analytics platforms and AI. That is how they build holistic data-driven solutions that will give them competitive advantage. With Snowflake, our approach is to focus on customer solutions that leverage data from existing systems such as SAP, wherever they are, in the cloud or on-premise. And to do this, we leverage partners like Qlik and AWS to help customers transform their businesses. We provide customers with a premier data analytics platform as a result. Itamar, why don't you talk about Qlik a little bit, and then we can dive into the specific SAP solution here and some trends. >> Sounds great, Peter. So Qlik provides modern data integration and analytics software used by over 38,000 customers worldwide. Our focus is to help our customers turn data into value and help them close the gap between data all the way through insight and action. We offer Qlik data integration and Qlik data analytics. Qlik data integration helps automate the data pipelines to deliver data to where customers want to use it, in real time, and make the data ready for analytics. And then Qlik data analytics is a robust platform for analytics and business intelligence that has been a leader in the Gartner Magic Quadrant for over 11 years now in the market. And both of these come together into what we call Qlik Cloud, which is our SaaS-based platform, providing a more seamless way to consume all these services and accelerate time to value with customer solutions. In terms of partnerships, both Snowflake and AWS are very strategic to us here at Qlik, so we have made very comprehensive investments to ensure a strong joint value proposition that we can bring to our mutual customers: everything from aligning our roadmaps through optimizing and validating integrations, collaborating on best practices, and packaging joint solutions like the one we'll talk about today. With that investment, we are an elite-level, top-level partner with Snowflake. Our technology is validated as Snowflake Ready across the entire product set, and we have hundreds of joint customers together. And with AWS we've also partnered for a long time. We're here at re:Invent; we've been here since the inaugural re:Invent, so that kind of gives you an idea of how long we've been working with AWS. We provide very comprehensive integration with AWS data analytics services, and we have several competencies, ranging from data analytics to migration and modernization. So that's our focus, and again, we're excited about working with Snowflake and AWS to bring solutions together to market. >> Well, I'm looking forward to unpacking the solutions specifically, and congratulations on the continued success of both your companies. We've been following them obviously for a very long time and seeing the platform evolve beyond just SaaS, with a lot more going on in cloud these days, kind of next generation emerging. You know, we're seeing a lot of macro trends that are going to be powering some of the things we're going to get into real quickly. But before we get into the solution, what are some of those power dynamics in the industry that you're seeing, in trends specifically, that are impacting your customers, that are taking us down this road of getting more out of the data, and specifically the SAP data, but in general trends and dynamics?
What are you hearing from your customers? Why do they care? Why are they going down this road? Peter, we'll start with you. >> Yeah, I'll go ahead and start. Thanks. I'd say we continue to see customers being very eager to transform their businesses, and they know they need to leverage technology and data to do so. They're also increasingly depending upon the cloud to bring that agility, that elasticity, the new functionality necessary to react in real time to ever-evolving customer needs. You look at what's happened over the last three years, and boy, the macro environment for customers, it's all changing so fast. With our partnerships with AWS and Qlik, we've been able to bring to market innovative solutions like the one we're announcing today that spans all three companies. It provides a holistic solution and an integrated solution for our customers. >> Itamar, let's get into it. You've been with theCUBE, you've seen the journey, you have your own journey, many, many years, you've seen the waves. What's going on now? I mean, what's the big wave? What's the dynamic powering this trend? >> Yeah, in a nutshell I'll call it, it's all about time. You know, it's time to value, and it's about real-time data. I'll kind of talk about that a bit. So, I mean, you hear a lot about data being the new oil, and we definitely see more and more customers seeing data as their critical enabler for innovation and digital transformation. They look for ways to monetize data. They look at data as the way in which they can innovate and bring different value to their customers. So we see customers wanting to use more data, to get more value from data. We definitely see them wanting to do it faster than before. And we definitely see them looking for agility and automation as ways to accelerate time to value and also reduce overall costs. I did mention real-time data, so we definitely see more and more customers who want to be able to act and make decisions based on fresh data. So yesterday's data is just not good enough. >> John: Yeah. >> It's got to be down to the hour, down to the minutes, and sometimes even lower than that. And then I think we're also seeing customers look to their core business systems where they have a lot of value, like SAP, like the mainframe, and thinking, okay, our core data is there; how can we get more value from this data? So those are key things we see all the time with customers. >> Yeah, we did a big editorial segment this year on what we called data as code. Data as code is kind of a riff on infrastructure as code, and you start to see data proliferating into all aspects, fresh data. It's not just where you store it, it's how you share it, it's how you turn it into an application; it's intrinsically involved in all aspects. This is the big theme this year, and that's driving all the conversations here at re:Invent. And I'm guaranteeing you, it's going to keep happening for another five to 10 years. It's not stopping. So I got to get into the solution. You guys mentioned SAP, and you've announced the solution by Qlik, Snowflake and AWS for your customers using SAP. Can you share more about this solution? What's unique about it? Why is it important, and why now? Peter, Itamar, we'll start with you first. >> Let me jump in; I'll jump in because I'm excited. We're very excited about this solution, and it's also a proven solution, by the way; we've seen proven customer success with it.
So to your point, it's ready to scale, and it's starting; I think we're going to see a lot of companies doing this over the next few years. But before we jump to the solution, let me maybe take a few minutes just to clarify the need, and why we're seeing customers jump to do this. So customers that use SAP use it to manage the core of their business. Think order processing and management, finance, inventory, supply chain, and so much more. So if you're running SAP in your company, that data creates a great opportunity for you to drive innovation and modernization. What we see customers wanting to do is more with their data, and more means they want to take SAP data together with non-SAP data and use them to drive new insights. They want to use real-time data to drive real-time analytics, which they couldn't do to date. They want to bring together descriptive with predictive analytics, adding machine learning and AI to drive more value from the data. And naturally they want to do it faster, so they look for ways to iterate faster on their solutions, to have freedom with the data, and agility. And I think this is really where cloud data platforms like Snowflake and AWS, you know, bring that value to be able to drive that. Now to do that you need to unlock the SAP data, which is also a lot of where Qlik comes in, because the typical challenge these customers run into is the complexity inherent in SAP data: tens of thousands of tables, proprietary formats, complex data models, licensing restrictions, and more. Then you have performance issues; they usually run into how to handle the throughput and the volumes while maintaining low latency and impact. And where do you find the knowledge to really understand how to get all this done? So these are the things we looked at when we came together to create a solution and make it unique. So when you think about its uniqueness, we put together a lot, and I'll go through the three or four key things that come together to make this unique. First is data delivery: how do you handle the SAP data delivery? How do you get it from ECC, from HANA, from S/4HANA? How do you deliver the data and the metadata, and integrate them well into Snowflake? And what we've done is we've focused a lot on optimizing that process and the continuous ingestion, the real-time ingestion of the data, in a way that works really well with the Snowflake data cloud. The second thing is we looked at SAP data transformation. So once the data arrives at Snowflake, how do we turn it into being analytics-ready? That's where data transformation and data warehouse automation come in, and these are all elements of this solution: creating derivative datasets, creating data marts, and all of that is done by, again, creating an optimized integration that pushes down SQL-based transformations so they can be processed inside Snowflake, leveraging its powerful engine. And then the third element is bringing together data visualization and analytics that can take all the data that's now organized inside Snowflake, bring other data in, bring machine learning from SageMaker, and then you get to create a seamless integration to bring analytic applications to life. So these are all things we put together in the solution. And maybe the last point is that we actually took the next step with this, and we created something we refer to as solution accelerators, which we're really, really keen about.
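To make the pushdown idea concrete before the conversation turns to those accelerators: below is a minimal sketch of a SQL-pushdown transformation written with Snowflake's Snowpark for Python, the mechanism Peter cites a little later. The SAP table and field names (VBAK, VBAP, VBELN, NETWR) are standard SAP objects, but the pipeline itself is a hypothetical illustration of the pattern, not Qlik's actual generated code.

```python
# A minimal sketch, assuming SAP sales-order tables have already been
# replicated into a Snowflake schema by a CDC tool. Connection values
# and target names are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark import functions as F

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "TRANSFORM_WH", "database": "SAP_REPLICA", "schema": "RAW",
}).create()

# VBAK = sales order headers, VBAP = sales order line items (standard SAP).
orders = session.table("VBAK").select(F.col("VBELN"), F.col("ERDAT"))
items = session.table("VBAP").select(
    F.col("VBELN").alias("ORDER_ID"), F.col("POSNR"), F.col("NETWR"))

# The join and aggregation compile to SQL and execute inside Snowflake's
# engine; nothing is pulled back to the client, which is the whole point
# of pushdown.
order_facts = (
    orders.join(items, orders["VBELN"] == items["ORDER_ID"])
          .group_by("VBELN", "ERDAT")
          .agg(F.sum("NETWR").alias("ORDER_VALUE"),
               F.count("POSNR").alias("LINE_COUNT"))
)

# Persist an analytics-ready derivative table for downstream BI tools.
order_facts.write.mode("overwrite").save_as_table("ANALYTICS.ORDER_FACTS")
```

The design point is that the client only builds a query plan; the heavy joins over tens of thousands of SAP tables run where the data lives.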
Think of these accelerators as prepackaged templates for common business analytics needs like order to cash, finance, inventory. And we can dig into that a little more later, but this gets the next level of value to the customers, all built into this joint solution. >> Yeah, I want to get to the accelerators, but real quick, Peter, your reaction to the solution, what's unique about it? And obviously with Snowflake, we've been seeing the progression of data applications, more developers developing on top of Snowflake; data as code kind of implies a developer ecosystem. This is kind of interesting. I mean, you've got a partnership with Qlik and AWS; it's kind of developer-like thinking, a real solution. What's unique about this SAP solution that's different from what customers can get anywhere else? >> Yeah, well listen, I think first of all, you have to start with the idea of the solution. This is three companies coming together to build a holistic solution that is all about, you know, creating a great opportunity to turn SAP data into value, as Itamar was talking about. That's really what we're talking about here, and there's a lot of technology underneath it. I'll talk more about the Snowflake technology, what's involved here, and then cover some of the AWS pieces as well. But you know, we're focusing on getting that value out and accelerating time to value for our joint customers. As Itamar was saying, you know, there's a lot of complexity with the SAP data and a lot of value there. How can we manage that in a prepackaged way, bringing together best-of-breed solutions with proven capabilities, and bringing this to market quickly for our joint customers? You know, Snowflake and AWS have been strong partners for a number of years now, and that's not only in how Snowflake runs on top of AWS, but also in how we integrate with their complementary analytics and other products. And so, you know, we want to be able to leverage those in addition to what Qlik is bringing in terms of the data transformations, bringing data out of SAP, and the visualization as well. All very critical. And then we want to bring in the predictive analytics AWS brings with SageMaker; we'll talk about that a little bit later on. Some of the technologies that we're leveraging are some of our latest cutting-edge technologies that really make things easier for both our partners and our customers. For example, Qlik leverages Snowflake's recently released Snowpark for Python functionality to push down those data transformations from Qlik into Snowflake that Itamar's mentioning, and we also leverage Snowpark for integrations with Amazon SageMaker. There's a lot of great new technology that just makes this easy and compelling for customers. >> I think that's the big word, easy button, here for what may look like a complex kind of integration, kind of turnkey; a really, really compelling example of the modern era we're living in, as we always say in theCUBE. You mentioned accelerators, SAP accelerators. Can you give an example of how that works with the technology from the third-party providers to deliver this business value, Itamar, 'cause that was an interesting comment. What's the example? Give an example of this acceleration. >> Yes, certainly. I think this is something that really makes this truly, truly unique in the industry, and again, a great opportunity for customers. So we kind of talked earlier about how there's a lot of things that need to be done with SAP data to turn it into value.
And these accelerators, as the name suggests, are designed to do just that: to kind of jumpstart the process and reduce the time and the risk involved in such projects. So again, these are prepackaged templates. We basically took a lot of knowledge, a lot of configurations, and best practices about how to get things done, and we put 'em together. So think about all the steps it includes: things like data extraction, so already knowing which tables, all the relevant tables, that you need to get data from in the context of the solution you're looking for, say, like order to cash; we'll get back to that one. How do you continuously deliver that data into Snowflake in an efficient manner, handling things like data type mappings, metadata naming conventions and transformations? Then the data models you build, all the way to data mart definitions and all the transformations that the data needs to go through, moving through the steps until it's fully analytics-ready. And then on top of that, even adding a library of comprehensive analytic dashboards and integrations with machine learning and AI, and putting all of that together in a way that's pre-integrated and tested to work with Snowflake and AWS. So this is where, again, you get this entire recipe that's ready. So take, for example, I think I mentioned order to cash. Again, all these things I just talked about: for those who are not familiar, order to cash is a critical business process for every organization, especially if you're in retail, manufacturing, enterprise. This is where, you know, it starts with booking a sales order, followed by fulfilling the order, billing the customer, then managing the accounts receivable when the customer actually pays, right? So in this whole process, you've got sales order fulfillment and billing that impact customer satisfaction, and you've got receivables and payments that impact working capital and cash liquidity. So as a result, this order to cash process is the lifeblood for many businesses, and it's critical to optimize and understand. So the solution accelerator we created specifically for order to cash takes care of understanding all these aspects and the data that needs to come with it: everything we outlined before to make the data available in Snowflake in a way that's really useful for downstream analytics, along with dashboards that are already common for that use case. So again, this enables customers to gain real-time visibility into their sales orders, fulfillment, and accounts receivable performance. That's what the accelerators are all about. And very similarly, we have another one, for example, for finance analytics, right? So this one optimizes financial data reporting and helps customers get insights into P&L and financial risk and stability; or inventory analytics, which helps with, you know, improved planning and inventory management, utilization, and increased efficiencies in the supply chain. So again, these accelerators really help customers get a jumpstart and move faster with their solutions. >> Peter, this is the easy button we just talked about: getting things going, you know, getting the ball rolling, getting some acceleration. A big part of this is the three companies coming together doing this. >> Yeah, and to build on what Itamar just said, the SAP data obviously has tremendous value.
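As a brief aside, the order-to-cash accelerator Itamar walks through above ultimately feeds dashboard KPIs computed over a curated mart. Here is a hedged sketch of one such KPI, days sales outstanding; the mart name and columns are assumptions for illustration, not the accelerator's actual schema, and it reuses a Snowpark session like the one in the earlier sketch.

```python
# A hedged sketch of one order-to-cash KPI: days sales outstanding (DSO),
# computed monthly over a hypothetical analytics-ready mart.
from snowflake.snowpark import functions as F

o2c = session.table("ANALYTICS.ORDER_TO_CASH")  # hypothetical mart

monthly = o2c.with_column(
    "MONTH", F.date_trunc("month", F.col("INVOICE_DATE")))

# Classic DSO formula: (open receivables / invoiced amount) * days in period.
dso = monthly.group_by("MONTH").agg(
    ((F.sum("OPEN_RECEIVABLE") / F.sum("INVOICED_AMOUNT")) * F.lit(30))
        .alias("DAYS_SALES_OUTSTANDING"))

dso.show()  # the kind of figure an order-to-cash dashboard would chart
```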
Those sales orders, distribution data, financial data: bringing that into Snowflake makes it easily accessible, but it also enables it to be combined with other data, which is one of the things that Snowflake does so well. So you can get a full view of the end-to-end process and the business overall. You know, for example, I'll just take one example that may not come to mind right away: looking at the impact of weather conditions on supply chain logistics is relevant and material and of interest to our customers. How do you bring those different data sets together in an easy way, bringing the data out of SAP, bringing maybe other data out of other systems through Qlik or through Snowflake, directly bringing data in from our data marketplace, and bringing that all together to make it work? You know, fundamentally, the organizational silos and data fragmentation that exist otherwise make it really difficult to drive modern analytics projects, and that in turn limits the value that our customers are getting from SAP data and these other data sets. We want to enable that and unleash it. >> Yeah, time to value. This is great stuff. Itamar, final question: who are the customers using this? What do you have? I'm sure you have customer examples already using the solution. Can you share what these examples look like, the use cases and the value? >> Oh yeah, absolutely. Thank you. Happy to. We have customers across different sectors: manufacturing, retail, energy, oil and gas, CPG. Customers in those sectors typically have SAP, so we have customers in all of them. A great example is Siemens Energy. Siemens Energy is a global provider of gas and power services, with, you know, over what, 28 billion, 30 billion in revenue and 90,000 employees. They operate globally in over 90 countries. They've used SAP HANA as a core system, running on premises in multiple locations around the world. And what they were looking for is a way to bring all this data together so they can innovate with it. And the thing is, as Peter mentioned earlier, not just the SAP data, but also data from other systems, to bring it together for more value. That includes finance data, logistics data, customer CRM data. So they bring data from over 20 different SAP systems, okay, with Qlik data integration, feeding that into Snowflake in under 20 minutes, 24/7, 365 days a year. They get data from over 20,000 tables, with hundreds of millions of records going in daily. So it is a great example of the type of scalability, agility and speed that they can get to drive this kind of innovation. So that's a great example, with Siemens. You know, another one that comes to mind is a global manufacturer. Very similar scenario, but they're using it for real-time executive reporting, so it's more like visibility into the production data, as well as for financial analytics. So think about everything from audit to tax to financial intelligence, because all the data's coming from SAP. >> It's a great time to be in the data business again. It keeps getting better and better. There's more data coming. It's not stopping, you know; it's growing so fast, it keeps coming. Every year, it's the same story, Peter. It's like, it doesn't stop coming. As we wrap up here, let's just get customers some information on how to get started.
I mean, obviously you're starting to see the accelerators; it's a great program there. What a great partnership between the two companies and AWS. How can customers get started to learn about the solution and take advantage of it, getting more out of their SAP data, Peter? >> Yeah, I think the first place to go is to talk to Snowflake, talk to AWS, talk to our account executives that are assigned to your account. Reach out to them, and they will be able to educate you on the solution. We have it packaged up very nicely, and it can be deployed very, very quickly. >> Well gentlemen, thank you so much for coming on. Appreciate the conversation. Great overview of the partnership between, you know, Snowflake and Qlik and AWS on a joint solution, getting more out of the SAP data. It's really kind of a key solution, bringing SAP data to life. Thanks for coming on theCUBE. Appreciate it. >> Thank you. >> Thank you John. >> Okay, this is theCUBE coverage here at re:Invent 2022. I'm John Furrier, your host of theCUBE. Thanks for watching. (upbeat music)

Published: Dec 1, 2022

Molly Burns, Qlik & Samir Shah, AARP | AWS re:Invent 2022



(slow upbeat music) >> Good afternoon, and welcome back to Sin City. We're here at AWS re:Invent with wall-to-wall coverage on theCUBE. My name is Savannah Peterson, joined by Dave Vellante, and we're very excited to have two exciting guests from Qlik and AARP with us. Molly and Samir, thank you so much for being here. Welcome to the show. >> Thank you for having us. >> Thank you for having us. >> How's it been so far for you, Molly? >> It's been a great show so far. We've got a big booth presence out here. We've had a lot of people coming by, doing demo stations, and we're just really, really getting the voice of the customer, so we've really enjoyed the event. >> Ah, love a good VOC conversation myself. How about for you, Samir? >> Oh, it's been great, meeting a lot of product folks, meeting a lot of other people trying to do similar things that we're doing, getting confirmation we're doing the right thing, and learning new things. And obviously, you know, being here with Molly, it's been a highlight of my experience. >> What's the best thing you learned from your peers this week? >> You know, one of the things that we're all talking about is, how do we get data in the right place at the right time? And, you know, that's something that people are now starting to think about. >> Very hot topic. >> You know, doing it, and then not only getting it to the right place, but taking insights and taking action on it as it's getting there. So those are the conversations going around in the circles I've been hanging around with. >> You hearing the same thing at the booth? >> Yeah, absolutely. >> And how are you guys responding? >> Well, I think, as a company, with the shifts in the market, people are really trying to determine what workloads belong in which cloud, and what belongs on-prem. And so talking about those real-time transformations, the integration points, the core systems they're coming from, and really how to unlock that data, is just really powerful and meaningful. So that's been a pretty consistent theme throughout the conference, and in a lot of conversations that we have on a regular basis. >> I believe that. Molly, let's stick with you for a second. Just in case the audience isn't familiar, tell us a little more about Qlik. >> Yeah, so Qlik is a robust, end-to-end data pipeline. It starts with really looking at all of your source systems, whether it's mainframe, SAP, relational database, kind of name your flavor as it relates to sources, and getting those sources over into the target landing spot, whether it be Amazon or other cloud players, or even if you're managing hybrid workloads. So that's one piece of the end-to-end platform. And then the second piece is really having all that data analytics-ready, coming right through that real-time data pipeline, and really being able to use the data, to monetize the data, to make sense of the data. Qlik really does all that data preparation work underneath the visualization layer, which is where all the work happens, and then you get to see the output of that through the visualization from Qlik, which is, you know, the dashboards, the things that people are used to seeing. >> I love that! So at AARP, what are you using Qlik for? What sort of dashboards are you pulling together? >> So when we started our journey to AWS, we knew that, you know, we're going to have our applications distributed in the cloud, but again, how do we get the data there, in the right place at the right time?
So, as members are taking action, they're calling into the call center, using our website, using our mobile apps. We wanted to be able to take that information and stream it, so we use Qlik to take those changes as they happen, stream them to Kafka, and then push that data out to the applications that need it, in the time that they need it. So instead of waiting for a batch job to happen overnight, we're able to now push this data in real time. And by doing that, we're able to personalize the engagement for our members. So if you come in, we know what you're doing, we can personalize the value that we put in front of you, and just make that engagement a lot more engaging for you. >> Yeah. >> And in the channel that you choose to come in with, right? Rather than a channel that we are trying to push to you. >> Everyone wants that personalized experience. As we discussed, I love AARP; I've done a lot of work with AARP, and I look forward to being a member. But in case the audience isn't familiar, you have the largest membership database of any company on Earth that I'm aware of. How many members does AARP have? >> We have nearly 38 million members, and 66,000 volunteers, and 2,300 employees, across every state in the United States. >> It's a perfect use case for Qlik, right? 'Cause you've been around for a while. You've got data in a million different places. You've got a mainframe, right? You know, I hear Amazon's trying to put all the mainframes in the cloud, but I'm guessing the business case isn't there for you. But you want the data that's coming out of that mainframe to be part of that data pipeline, right? So can you paint a picture of how what Molly was describing about the data pipeline fits with AARP? >> Yeah, it was actually a perfect use case. And you know, when we engaged with Qlik, what we wanted to be able to do is take that data in the mainframe and get it distributed into the cloud accurately, securely, and make sure that we can track the lineage, and be able to say, hey, application A only needs name and address; application B needs name, address, and payment. So we were able to do all of that within a couple of weeks, right? And getting that data out there, knowing that it's going to the right place, knowing it's secure, and knowing it's accurate, regardless of the application it goes to, we don't have to worry about syncing data across different applications. Now we know that there's a source of truth, and everything is done through the pipeline, and it's controlled in a way that we can measure everything that's going through, how it's going through, and how it's being used by the applications that are consuming it. >> So you've got the provenance and the lineage of that data, and that's what Qlik ensures. Is that right? Is that your role, or is that a partner role, combined? >> No, yes, that's absolutely Qlik's role. So for our new offering, Qlik Cloud data integration, it's a comprehensive solution delivered as a service: it delivers real-time data movement, automated transformations, catalog and lineage, all extremely important. And in the case of Samir and AARP, they're trying to unlock the most valuable assets of their data, in SAP and the mainframe. And surprisingly, sometimes the most valuable data in an organization is the hardest to actually get access to. >> Sure. >> You know, just statistically, 70% of Fortune 500 companies still rely on the mainframe.
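To picture the pattern Samir describes above, change events streamed through Kafka and fanned out to the applications that need them, here is a minimal consumer sketch. The topic name, event shape, and routing logic are all hypothetical stand-ins for the general pattern, not AARP's actual code.

```python
# A minimal fan-out sketch: consume change-data-capture events from a
# Kafka topic and hand each change to a downstream application. Topic,
# event fields, and the handler are hypothetical.
import json
from kafka import KafkaConsumer  # kafka-python package

consumer = KafkaConsumer(
    "member-changes",                       # hypothetical CDC topic
    bootstrap_servers=["broker1:9092"],
    group_id="personalization-service",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="latest",
)

def refresh_member_profile(event: dict) -> None:
    """Stand-in for the real service call that updates personalization
    so the next web, mobile, or call-center touchpoint sees fresh data."""
    print(f"refreshing profile for member {event.get('member_id')}")

for message in consumer:
    change = message.value
    # Route only the fields this consumer is entitled to, in the spirit
    # of the governed distribution Samir describes (name/address vs. payment).
    if change.get("op") in ("insert", "update"):
        refresh_member_profile(change)
```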
So when you think about that statistic, and even when Samir and I are talking about it... >> That's a lot. >> Yeah. >> And that's a lot of scale, that's a lot of data. >> It's a lot of data. >> Yeah. >> So, you know, the mainframe isn't a thing of the past. Companies are still relying on it. People have been saying that for years, but when we're talking about getting the complex data out of there to really make something meaningful for AARP, we're really proud of the results and the opportunity that we've been able to provide to really improve the member experience, and how people are able to consume AARP and all the different offerings that they have, kind of like you mentioned, Savannah, in the way that you go about it. >> Well, it's also the high-risk data. High-value data, high-risk data. You don't want to mess with it. You want to make sure that you've got that catalog to be able to say, okay, this is what we did with that data, this is where it came from. And then you essentially publish to other tools, analytic tools, in the cloud. Can you paint a picture of how that extends to the cloud? >> Sure, so there's a couple of different things that we do with it. So once we get the data into our streaming apps, we can publish it over to, like, our website. We can publish it to the call center, to mobile apps, to our data warehouse, where we can run analytics and AI on it. And then, obviously, for a lot of our journeys we use a journey orchestration tool, and we've built a CDP, a customer data platform, to get those insights in there, to drive, you know, personalization and experience. >> I'm smiling as you're talking, Samir, because I'm thinking of all the personalized experiences that my mother has with AARP, and it is so fun to learn about the technology that's serving that to her. >> Exactly. >> This segment is actually becoming a bit more personal for me than I expected, for a couple of reasons. So this is great. Molly, Qlik has been a part of the AWS ecosystem since the get-go. How have things changed over the years? >> Yeah, so Qlik still remains the enterprise integration tool of choice for AWS, especially- >> Let's call that a casual, and justified, brag. >> Yeah. >> Because that's awesome. That's great, congratulations on that. >> Thank you. For SAP and the mainframe especially. So the relationship continues to evolve, but we've been part of the ecosystem since inception. So we look at how we continue to evolve the partnership, and honestly, a lot of our customers' landing spot is AWS. So the partnership evolves really on two fronts: one with Amazon itself, in a partnership lane, and two, with our customers, and what we're doing with them, and how we're able to really optimize what that looks like. And then secondly, earlier this year we announced an offering with Amazon called Qlik Ramp, where we can come in and do a half-day architecture deep dive, look at SAP and the mainframe, and how they get to the Amazon landing spots, whether it's S3, Redshift, or EMR. So we've got a lot of different things going on in the Amazon ecosystem, whether it's customer-forward and first, and how we can maximize the relationship, spend, et cetera, with Amazon; and then also how we can deliver, you know, a shorter time to value throughout that process with something like Qlik Ramp. Because we want to qualify and solve customers' needs just as equally as we want to, you know, say when we're not the right fit. >> So data is a complicated- >> Love that honesty and transparency. >> Data is a complicated situation for most companies, right?
And there's a lack of resources, a lack of talent. There's hyper-specialization. And you were just talking about the evolution of the cloud and the relationship. How does automation fit into the equation? Are you able to automate a lot of that data integration through the pipeline? >> Yeah. >> What's your journey look like there? Were you resistant to that at first? 'Cause you've got to trust the data. Take us through that. >> Yeah, so the first thing we wanted to make sure of is security, right? We've got a lot of data, and we're going to make sure privacy- >> Very personal data, too. >> Exactly. And privacy and security are number one. So we want to make sure anything that we're doing with the data is secure, and it's not given out anywhere. In terms of automation, what we've been able to do is take these changes, and, you know, in technology the one thing you can guarantee is that it's going to break. The network's going to go down, or a server goes down, a database goes down; that's the only guarantee we have. And by using the product that we have today, we're able to take those outages and minimize them, because there are retry processes, there are ways of going back and saying, hey, I've missed this much data, how do we bring it back in? You don't want data to get out of sync, because that causes downstream problems. >> Yeah. >> So all of that is done through the product, right? We don't have to worry about it. You know, we get notifications, but it's not like, oh, I've got to page someone at two o'clock in the morning because the network's gone down, and how's the data sync going to come back up when it comes back up? All of that's done for us. >> Yeah, and just to add to that, automation is a key component. I mean, the data engineering teams definitely see the value of automation and how we're able to deliver that. So improving the experience, but also the overall landscape of the environment, is critical. >> Yeah, we've seen the stats: data scientists, data pros, spend, you know, 80% of their time wrangling data, 20% of their time... >> Data preparation. >> You know, extracting value from it. So. >> Yeah, it's so sad. It's such a waste of human capital, and you're obviously relieving that, and letting folks do their jobs more efficiently. >> The thing is too, you know, I'm somebody who loves data: you dive into the data, you get really excited, then after a while you're like, ugh! >> I'm still here. >> I'm slogging through this data. Taking a bath in it. >> But I think. >> I want to get to the insights. >> I think that world's changing a little bit. >> Yes, definitely. >> So as we're starting to get data that's coming through with high fidelity and richness, right? In the old days, we'd put it in a database, normalize it, and then, you know, we'd go and do our magic, and hopefully, you know, something comes out, with the least amount of the frustration you just spoke about. Well now, because it's moving in real time, and we can send the data to the areas where we want it, and add automation and machine learning on top of that, it now becomes a commodity to massage that data into the format that you want. Then you can concentrate on the value work, right? Which is really where people should be spending their time, rather than, oh, I've got to manipulate the data, make sure it's done in a consistent way, and then make sure it's compliant and done the same way every single time.
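The resilience Samir credits the tooling with, automatic retries and catch-up when a network or database drops, follows a well-known pattern. A simple hedged sketch of retry with exponential backoff is below; the sync_batch function is a hypothetical stand-in for one unit of data movement, not the actual mechanism inside Qlik.

```python
# A generic retry-with-exponential-backoff sketch, illustrating the kind
# of automated recovery described above. sync_batch is a hypothetical
# stand-in for replicating one batch of changes downstream.
import random
import time

def sync_batch() -> None:
    """Stand-in for one unit of data movement (e.g., shipping a batch
    of change records to a target system)."""

def run_with_retries(max_attempts: int = 5, base_delay: float = 1.0) -> None:
    for attempt in range(1, max_attempts + 1):
        try:
            sync_batch()
            return  # success: data stays in sync, nobody gets paged
        except (ConnectionError, TimeoutError) as exc:
            if attempt == max_attempts:
                raise  # retries exhausted: now it is worth an alert
            # Exponential backoff with jitter avoids hammering a
            # recovering network or database.
            delay = base_delay * 2 ** (attempt - 1) + random.uniform(0, 1)
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.1f}s")
            time.sleep(delay)

run_with_retries()
```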
>> It may be too early to, you know, quantify the business impact, but have you seen, for example... you know, what I was describing creates data silos, 'cause nobody's going to use the data if it's not trusted. So what happens is it goes into a silo, they put a brick wall around it, and then, you know, they do their thing with it. They trust it for that one use case, and then they don't share it. Has that begun to change as you've seen more integration that's automated and augmented? >> Absolutely. I mean, you know, if you're bringing in data and you're showing that it's consistent, this is where governance and compliance come in, right? So as long as you have a data catalog, you can make sure that this data is coming through with the lineage that you said it's going to have: here's the source, here's the target, here's who gets only what they need, rather than giving them everything. And being able to document that in a way that's automated, rather than somebody going in and running a report, is key, because that's where the trust comes in, rather than, oh, Samir has to go in and manipulate this stream so that, you know, Molly can get the reports she wants. Instead, hey, it's all going in there, the reports are coming out, they're audited, and that's where the trust factor comes in. >> And that enables scale. >> Yeah. >> Cloud confidence and scale. Big topics of the show this week. >> Yep. >> It's been the whole thing. Molly, what's next for Qlik? >> Yeah, Qlik's on a big journey. We've released a lot of things, most recently Qlik Cloud data integration as a service, but we're just continuing to grow, from a customer base and from a capabilities perspective. We also recently became HIPAA compliant and went through some other certifications. >> Congratulations, that is not an easy process. >> Thank you, thank you. >> Yeah. >> And so for us, it's really just about expanding, having that same level of fidelity of the data, and really just getting all of that pushed out to the market, so everybody really sees the full value of Qlik, and so that we can make your data Qlik. And just for a minute, back to your earlier point. >> Beautiful pun drop there, Molly. Just going to note that. >> Thank you, Savannah. >> Yeah. >> But back to your earlier point, just about the time that people are spending: when you're able to automate, and you're getting data delivered in real time, and operational systems are able to see that... 'Cause you're trying to create the least amount of disruption you can, right? 'Cause that's a critical part of the business. When you start to automate and relieve that burden, then people have time to spend on the real things. >> Right. >> Future-forward, prescriptive analytics, machine learning; not data preparation, but solving problems, fixing soft gaps. >> Staring at a spreadsheet, yeah. >> Right? It's actually the full end-to-end pipeline, and so that's really where I feel like the power is unleashed. And as more sources and targets come to light, right? They're all over the showroom floor, so we don't have to mention any of 'em by name, but it's just continuing to move into that world, to have more SaaS integrations, and to be able to serve the customer and meet them exactly where they're at, at the place that they want to be. And for Samir, what we did in the transformation there, unlocking that data from the mainframe and SAP, getting it into Qlik Cloud, has been a huge business driver for them. And so, because of partners like AWS, and Samir and AARP, we're constantly evolving.
And really trying to listen to the voice of the customer, to become better for all of you. >> Excellent. >> Love that community-first attitude. Very clear that you both have it, both AARP and Qlik, with that attitude. We have a new challenge this year at re:Invent on theCUBE, a little prompt here. >> Okay. >> We're going to put 30 seconds on the clock, although I'm not super crazy about watching the clock, so feel comfortable with however much time you need. >> Whatever works. >> Yeah, yeah, yeah, yeah, whatever works. But we're looking for, essentially, your Instagram reel, your hot take, your thought leadership sizzle, with the key theme from this year's show. Molly, your smile is platinum and perfect, so I'm going to start with you. I feel like you've got this. >> Okay, great. >> Yeah. >> Just the closing statement is what you're looking for? >> Sure, yeah, a sexy little sound bite. What's going to be your big takeaway from your experience here in Vegas this week? >> Yeah, so the experience in Vegas this week has been great, but I think it's more than just the experience in Vegas; it's really the experience of the year, where we're at with the technology shift. And we're continuing to see the need for cloud, the move to cloud, mixed workloads, hybrid workloads, unlocking core data, making sure that we're getting insights, analytics, and value out of that, and really just working through that kind of consistent evolution, which is exactly what it is. You never get to a point where that's it, there's a bow on it, and it's perfect. It's continuously evolving. >> Yeah. >> And I think that's the most important part that you have to take away. Samir's got his environment in a great place today, but in six months there may be some new things or transformations that he wants to look at, and we want to be there at the ready to work with him, roll up our sleeves, and kind of get into that. So the shift to the cloud is here to stay. Qlik is a hundred percent here to stay, ready to serve our customers in any capacity that we can. And I think that's really my big takeaway from this week. And I've loved it; this has been great with both of you. You both are super high energy. >> Aw, thank you. >> And Samir and I have had a great time over the event as well. >> Well, nailed it. You absolutely nailed it. All right, Samir, shoot your shot. >> So. >> Savannah. >> What I would say, I'm pretty, so. (laughing) >> I like to keep the smiles organic on stage; my perverse sense of humor, everyone just tolerates. >> Yeah, the one thing I think I'm hearing a lot is, we have to look at data in motion. Streaming data is the way it's going to go. Whether it's customer data or operational data, it doesn't matter, right? We can't have these silos that you spoke about. Those days are gone, right? And if we really want to make a difference, and utilize all of the technology that's being built out there, all of the new features that were, you know, just in the keynotes, we can't have these separate silos, and the data has to go across; trusted data, it has to go across. The second thing I think we're all talking about is, we have to look at things differently. Unlearning the old is harder than learning the new. So we were just talking about event-driven architecture. >> Understatement of the century. Sidebar, that was, yeah. >> So, you know, a lot of us techies are used to calling APIs. Well, now we have to push the data out instead of pulling it.
That just means retraining our brains, retraining our architects, retraining our developers, to think in a different way. And then the last thing I think I've learned is, we technology folks have to put the customer first, right? >> Yes, absolutely. >> What does a customer want? How do they want to feel when they engage with you? Because if we don't do that, none of this technology matters. And you know, we have to get away from the days when the IT guys would go in the back black room, (laughing) code up and then, you know, push something out, and not think about: what am I doing, and how am I impacting your mother? >> Yes, the end customer. It's no longer the person at the end of a terminal, looking at the green screen. >> And just one last thing. I think it's also about fit-for-purpose transformations, and that's how we have to start thinking about how we're doing business, 'cause there's a paradigm shift, right? From ETL to ELT, right? Extract, Load, Transform your data. And so as we're seeing that, I think it's really just about that fit for purpose, looking at the transformations, the right transformations, and what's going to move the needle for the business. >> What a great closing note! Molly, Samir, thank you both for being here. >> Both: Thank you! >> This was a really fantastic chat; love where we took it. And thank all of you for tuning in to our live coverage from AWS re:Invent, here in fabulous Las Vegas, Nevada. I just want to give my mom a quick shout-out, since she got a holler throughout this segment, as well as Stacy and all of my friends at AARP; I miss you all. My name's Savannah Peterson, joined by Dave Vellante. You're watching theCUBE, the technology leader in coverage for events like this. (slow upbeat music)

Published: Nov 30, 2022

Colleen Kapase, Snowflake & Poornima Ramaswamy, Qlik | Snowflake Summit 2022



(bright music) >> Hey everyone, welcome back to theCUBE's continuing coverage of Snowflake Summit 22, live from Caesars Forum in Las Vegas. I'm Lisa Martin, here with about 7,000-plus folks, and this next Cube segment, two words: girl power. Please welcome one of our alumni back to the program, Colleen Kapase, SVP, Worldwide Partners and Alliances at Snowflake, and Poornima Ramaswamy, EVP of Global Partnerships and Chief of Staff to the CEO at Qlik. Ladies, welcome to the program! >> Thank you, very happy to be here. Amazing event! >> Isn't it? It's so great to see this many people. Yesterday, the keynote, we got in barely; standing room only. I know there was at least one overflow room, maybe two. People are chomping at the bit to hear what Snowflake and its ecosystem have been up to over the last three years, since 2019. >> It's been phenomenal! Since the last time we met together, as humans coming together, and then seeing the step-function growth three years later... we didn't grow gradually, we just jumped three years ahead, and people have just been hungry for the information and the sharing and the joint education, so it's been a phenomenal show. >> It has been. Poornima, talk to us about the Qlik partnership with Snowflake. What's it all about? What's your joint vision, your joint strategy? Give us all that good stuff. >> Sure. So speaking of three years, this relationship has been in existence for the last three years. We were at the last Snowflake conference in 2019, and I liked what Frank said: even though we were not in person, in real life, the innovation has continued, and our relationship has strengthened over the last three years as well. And it's interesting that everything Frank said, and everything that was mentioned at the keynote yesterday, is completely in alignment with Qlik's vision and strategy as well. We are focused on making data available for quick decision making, in a timely manner, for in-the-moment business decisions. The world has gone topsy-turvy in the last two years, so you want to know things that are changing as they happen, and not one day late, one month late, or one quarter late, because then the world has already passed you; that business moment has passed you. That's been our focus. We've got a dual product strategy and portfolio, and we collaborate really strongly with Snowflake on both of those, to make the most data available on the Snowflake platform in the shortest amount of time, so that it's fresh and timely for business decision makers to get access to, so they can make decisions as they're dealing with supply chain challenges and people challenges and so on, and can make those moments count. >> They have to. One of the things that we've learned in the pandemic is that access to real-time data is no longer a "oh, that's great, nice to have." It's table stakes for businesses in every industry. Consumer expectations have risen to a level we've probably never seen, and let's face it, they're not going to go down. Nobody's going to want less data, slower. (laughs) Colleen, talk about the Qlik partnership from Snowflake's perspective.
Yeah, it's been fabulous, and we started on the BI side and keep evolving it, frankly, with more technology, more solutions, making that real-time access not just about the BI side of having the business intelligence and seeing the data, but moving beyond that to the governance side. And that's such a huge piece of the relationship as well, and the trust that executives have in the data: who's seeing it, and how are we leveraging it? And we keep expanding that too, and having some fun too. I know you guys have been making some acquisitions. >> Talk to us about what's going on at Qlik, and some news today as well, acquisitions news. What's the deal? >> Yeah, so like I mentioned, we have a dual product strategy: a Qlik data integration platform and a Qlik analytics platform. And we are strengthening both, making sure that we align with Snowflake's vision of all workloads, SaaS only, and governed. So the first announcement today was that, while we do provide real-time data into Snowflake using our Qlik data integration platform, that real-time data has to make its way into the hands of the business decision makers as well. So we launched what we call direct query into Snowflake: as data gets into the Snowflake platform, customers, for specific use cases, can now choose to access that data as it comes in, by querying it directly on Snowflake. And there are other use cases where the data has already been prepared and so on, and there they'll continue using the Qlik analytics platform, but this direct query access will make a world of difference in terms of that active intelligence, that in-the-moment decision making. The second announcement was about SaaS first and going all in on SaaS. So we are making our data movement investments in our SaaS platform, and one of our first investments is on the Snowflake platform, going direct into Snowflake: our data ingestion, our real-time data replication, is going to be available natively on the Snowflake platform through the SaaS data transformation investment that we've made. So those are the two big announcements. And governance has been the cornerstone of our platform end to end, right from the beginning, and that strength continues; again, that's completely in alignment with the vision that Snowflake has as well. >> I couldn't agree more. That native integration... we used to think about bringing the data to the work, and now it's bring the work to the data, because that's the secure environment, the governed environment, and that's what we're seeing with our product roadmaps together and where we're going. And it gives customers just peace of mind. When you're bringing the work to the data, it's more secure, it's more governed, and that real-time access, it's speed, because boy, so many executives have to make real-time decisions quickly. The world is moving faster than it ever has before, and I've never had an executive say, "Oh yeah, I'll just wait and get the data later." That's not a conversation they have. I need it, and I need it now, and I need it at my fingertips, and I need more of my entire organization to have access to that data, whatever I feel secure and safe sharing with them. And so, having Qlik make that possible is just fantastic. >> The security piece is absolutely critical. We've seen such changes to the threat landscape in the last couple of years. It's no longer a question of if we get hit by a cyber attack; it's a matter of when.
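Before the conversation picks the security thread back up: to picture the direct query capability Poornima just announced, analytics querying data as it lands in Snowflake rather than a pre-extracted copy, here is a minimal sketch using the standard snowflake-connector-python package. The account details, table, and columns are hypothetical; this illustrates the general idea of live querying, not Qlik's actual integration.

```python
# A minimal direct-query sketch with snowflake-connector-python: the SQL
# runs against live tables inside Snowflake, so freshly ingested rows are
# included. Connection values and table/column names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ANALYTICS_WH", database="RETAIL", schema="CURATED",
)

with conn.cursor() as cur:
    # In-the-moment view: activity from the last 15 minutes, by channel.
    cur.execute("""
        SELECT channel, COUNT(*) AS interactions
        FROM customer_interactions
        WHERE event_time >= DATEADD('minute', -15, CURRENT_TIMESTAMP())
        GROUP BY channel
        ORDER BY interactions DESC
    """)
    for channel, interactions in cur.fetchall():
        print(channel, interactions)

conn.close()
```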
And the volume of data just keeps proliferating, proliferating, proliferating, which obviously is not going to slow down either. So having the governance factor, the ability to share data securely, leveraging powerful analytics across customers and partners and the ecosystem, it sounds like to me a pretty big differentiator of what Snowflake is delivering to its customers and the ecosystem. >> It is, and I would say one of the things that has held folks back from moving to the cloud before, was governance. Is this just going to be a free for all, Lisa? I'm not feeling secure with that. And so, having the ability to extend our ecosystem and work on that governance together gives executives peace of mind, that they can easily determine who's going to have access to what, which makes a transition to the cloud faster. And that's what we're looking for, because to have our customers experience the benefits of cloud and the moving up and moving down from a data perspective and really getting access to the data cloud, that's where the nirvana is, and so you guys are helping make that possible and provide that peace of mind, so it's amazing. >> You talk about peace of mind, and it's one of those things we think, oh, it's a marketing term or it's a soft term. It's actually not, it's completely measurable, and it's something that I talk to a lot of C-suite executives about, and the statement of "I sleep better at night," is real. There's gravity with it, knowing that they can trust where the data is. The access is governed. It just keeps getting more and more critical every day. >> Colleen: Well, it's a newsworthy event, frankly- >> Absolutely, nobody wants to be a headline. >> If things don't go right, that's people's jobs on the line, that's reputations, and that's careers, so that is so important, and I think with a lot of our customers that's our conversations directly, of how can you ensure that this is going to be a secure experience? And it's Snowflake and some of our superpowers, and frankly, some of our partners' superpowers too, together it's better. >> I can bring this home with a customer example, a couple of customer examples. So Urban Outfitters, I think they're a well-known brand. They've got about 650 stores, and to your point, governed autonomy is what I call it. But then it's not just about helping with decision making at the top. You want to enable decision making at all levels, so we speak about data democratization. It's about not just strategic decisions that you make for a two-year timeframe or a five-year timeframe. It's about decisions that you want to make today in the first half of the day versus the second half of the day. So Urban Outfitters is a common customer, and during the pandemic they had to change their stores into distribution centers. They had to look at their supply chain landscape, because there were supply chain bottlenecks that are still happening today. So, with the power of both Qlik data integration and Qlik analytics, but then the combined power of Qlik and Snowflake, the customer actually was able to make insights available to their in-store managers, to their distribution centers, and from a time perspective, what used to take them days, or, in fact, sometimes even weeks, they're now able to get data in 15 minutes refresh time for their operational decision makers, their distribution centers and their order taking systems, so they're able to make decisions on which brands are moving, not moving. Do they need to change the product position in their stores?
Do they need to change their suppliers today, because of what's going to be in their inventory one month later? Because they are foreseeing, they're able to predict, the supply chain bottlenecks that are coming. They're able to do all of that today because of that power of a governed autonomous environment that we've built, with real-time data making fresh data available through Snowflake, and easy-to-use dashboards and visualization through the analytics platform that we've got. And another customer, ABB: 37 different SAP source systems being refreshed every two minutes, worldwide, for B2B transactions to be able to make all of those decisions. >> And what you're talking about there, especially with the Urban Outfitters example, I think that's one that everybody as a consumer of clothing and apparel can relate to. What you just described, what Qlik and Snowflake enabled there, that could have very well saved that organization. We saw a lot of retailers that were not able to make that pivot. >> Poornima: Yep, no, and it did. >> You are exactly right. I think the differentiation for a lot of our core customers together, of coming through, not just surviving but thriving through the pandemic, was access to data and supply chain management, and it's these types of solutions that are game changing, and that's why Snowflake's not being sold just to the IT department, it's the business decision makers where they have to make decisions. And one of the things that surprised us the most was, we had the star schema COVID data up on our data marketplace, and the access our customers had to that to determine supply chain management. What's open? What are the rules per state, per region? Where should we put supply? Where should we not? It was phenomenal. So when you have tools like what Qlik offers together with that data coming through the community, I think that's where a lot of executives experience the power of the data cloud, and that's what we want to see. And we're helping real businesses. We say we want to drive outcomes. Supply chain management was a massive outcome that we helped over the last two years. >> And that was critical, obviously we're still in that from a macro economic perspective. It's still a challenge for a lot of folks, but it was life and death. It was, initially, how do we survive this? And to your point, Colleen, now we've got this foundation, now we can thrive, and we can leave the competition who wasn't able to move this fast in the dust behind us. >> A forcing function for change, really, and then that change wasn't just different, it was better. >> Yeah, it is better, and it now sets the foundation for the next stage of innovation, which is AutoML and AI/ML. You're looking back, you're saying, okay, this is all the data, so these are the decisions I had to make in the moment. But now they can start looking at what are the midterm and the long term strategic decisions I have to make, because I can now predict what are the interconnectedness, or the secondary level and the tertiary level impact, of worldwide events. There's a pandemic; we are past the pandemic. There's flood somewhere. There's fire somewhere. China shuts down every so often. You need new suppliers. How do you get out of your way in terms of making daily decisions, but start planning ahead? I think AutoML, AI/ML, and data are going to be the foundation for that, and real-time data at that.
So what Snowflake's doing in terms of the investment in that space, and Qlik has acquired companies in the AutoML space, driving more automation, that time-to-business-value and time-to-predictive-insights is going to become very key. >> Absolutely key, and also really a lifeline for organizations to be able to do that. >> And I have to say, it's a source of pride for us to see our partners growing and thriving in this environment too. Like some of these acquisitions they're making, Lisa, in the machine learning space, it's awesome. This is where customers want to go. They've got all this fabulous data. They now know how to access it real time. How do I use queries to make me smarter? How do I use this machine learning to look at a vast amount of data in a very real-time fashion and make business decisions from it? That's the future, that's where we're going. So to see you guys expand from BI, to governance, to machine learning, we're really, Lisa, watching companies in our ecosystem grow as we grow, and that's the piece I take a lot of personal pride in, and it's the fun part of the job, frankly. >> Yeah, as you should take pride in that, and that's something, too, that's been thematic as we've been covering this show yesterday and today: the growth and the substance of the Snowflake ecosystem. You see it, you feel it, and you hear it. >> Yeah, well, in Frank Slootman's book, "Amp It Up," there's actually a section that he talks about, because I think he has some amazing lifelong advice on his journey of growth, and he tells us, hey, you can attach your company, your personal career energy, to an elevator going up, a company with a high growth story, or to one that's flat or declining. And it's harder in a flat and declining space, and Snowflake we certainly see as an elevator skyrocketing up, and these organizations surrounding us with their technologies and capabilities to have joint outcomes, they're doing fantastic too. I've heard this story over and over again this week. I love seeing this story too with Qlik, and it's just amazing. >> I bet. Ladies, thank you so much for joining me, talking about the Snowflake-Qlik partnership, the better-together power, and also, you're just scratching the surface. The future, the momentum, you can feel it. >> Yeah, I love it. >> We appreciate your insights and your time and good luck! >> Thank you, thank you. >> And let's let the girl bosses go! (laughs) >> Exactly! (laughs) For my girl boss guests, I'm Lisa Martin. You're watching theCUBE's coverage of Snowflake Summit 22, live from Caesar's Forum in Las Vegas. I'll be right back with my next guest. (bright music)

Published Date : Jun 15 2022


Itamar Ankorion, Qlik & Kosti Vasilakakis, AWS | AWS re:Invent 2021


 

>>Hello, and welcome back to theCUBE's continuous coverage of AWS re:Invent 2021. We're here live, real people, and we're pleased to bring you this hybrid event, the most important hybrid event of the year, to wrap up really 2021 and kick off next year. We're going to dig into the intersection of machine learning and business intelligence. Itamar Ankorion is here as the senior vice president of technology alliances at Qlik, and Kosti Vasilakakis is the head of product growth for low-code, no-code machine learning at AWS. Gentlemen, welcome to theCUBE. >>Thanks for having us. >>I think the first time you were on at re:Invent was definitely early in the last decade of my life. I had black hair and it was maybe 2013, I want to say. So it's been quite a run. >>And it's definitely been a privilege. I had a chance to attend pretty much all re:Invents from the first one, eh, with much fewer people, and I've seen this growth year over year. And what's amazing about it, beyond the scale, how much you grow the number of people, is that the pace of innovation keeps accelerating. It's just phenomenal. >>We're lucky that we chose data as sort of our business passion. But, um, so speaking of data, what are you hearing from customers about what they want to do with their data and bringing together business intelligence and machine learning? It's being injected in, but what are they telling you that they want, that they need? What's the opportunity that you're hearing now? >>So, uh, I think first of all, this is a fascinating topic, because we're talking kind of about the intersection of what everybody wants to look to do as the next frontier of data, with predictive data. Because descriptive analytics have been around for a long time, but how can you use predictive analytics, prescriptive analytics, to enrich what we've had with descriptive analytics and, at the end of the day, improve the business? And what I love, talking to people around here and just listening to customers express, you know, their needs, is how can they get more value out of data? So they have data they don't use; a lot of the data sits in aggregate, and they want to use it in more ways. And that's what's exciting, to discuss those new ways they want to bring it together. >>Because anything you'd add to that from the AWS perspective? >>I'll tell you what we don't hear from our customers, and that is, we've stopped hearing "what is AI and machine learning?" On the contrary, we are hearing, how can we make the teams that already do AI and ML a lot more productive and make a lot more of it? For example, how can they iterate a lot faster across the ML workflow? How can they train and build really large, state-of-the-art natural language processing models like GPT-3? How can we help customers build, train and tune customer-specific models, to be able to bring hyper-personalization to their products? And the other thing we're hearing is, how can we help the teams that are not tapping into AI and ML get the most power of it? In a way, how could you actually potentially either democratize the building and development of machine learning models, or how can you, in another way, expose machine learning into applications that analytics users are already using? >>Yeah.
So when we first met, success was measured in, yeah, I got the Hadoop cluster to work, technically. But to your point, the customers want to get more value out of that data now. And so they want to operationalize machine intelligence. Is that what active intelligence is? >>Um, so active intelligence is something that we here at Qlik started to talk about, but we believe it really represents what customers are trying to achieve. And the reason we use the word active is, think about active as not being passive. So, uh, traditional BI kind of relied on pre-configured historical data sets, which were great for what they did, but today they're kind of out of gas in terms of supporting real-time decisioning and action. So what active intelligence is all about is really enabling customers to take informed action, not just make an informed decision: informed action in the moment, so when that action needs to happen. So in order to accommodate that, again, this is really the difference between active and passive: active intelligence is all about innovations to bring real-time data, so it's not just historical data. >>I need real-time data that's relevant to what's happening now. I need a way to get an intelligent data pipeline, a data pipeline that makes real-time data available in the form and the structure that allows me to make a decision or to take action. And finally, it's really designed to drive action, right? So whether it's a manual action or whether it's even completely automated, it's intelligent, it's informed. So that's what active intelligence is all about, and by the way, predictive data fits really well into that entire paradigm, right? >>I mean, we've been talking for years about real-time, and it's like, okay, what is real time? Well, real time is before you lose the customer, before you lose the patient, before the machine explodes, right? So your point about predictive, yeah. Now you guys made an announcement yesterday, uh, ADA, which stands for AI for Data Analytics. What's that all about? >>Well, ADA aims to address the very point I mentioned before: our customers are asking us, how can we give our business teams, who have a lot more business needs, access to machine learning? AI for Data Analytics is a set of partner solutions that are ML-powered, and they're focusing across the spectrum of analytics, from data warehousing, business intelligence, business process automation, and other business applications. And the idea is to help our partners bring to our customers a lot of those capabilities. For example, we've built integrations with Qlik, Tableau, Snowflake, Workato, Pegasystems. And those usually take two flavors: either we help our partners build ML and embed it into their applications, and in a way make them more intelligent, as Itamar mentioned, or we help our partners expose machine learning capabilities from AWS right within their UI. >>So for example, yesterday we launched the Snowflake integration with SageMaker. Now a Snowflake user can use the same user experience they're used to, the SQL queries that they love, and trigger an AutoML process in SageMaker right from the same UI, and get ML back into the same UI. And I'm quite excited to also discuss today the integration we announced with Qlik, the Qlik SageMaker integration. So I think, um, what a setup, yeah.
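The SQL-triggered AutoML flow Kosti describes can be approximated directly with the SageMaker Python SDK. A minimal sketch, assuming a hypothetical labeled CSV already staged in S3 and a placeholder IAM role; the Snowflake integration wraps a step like this behind its SQL interface, which is not reproduced here.

```python
# A sketch of kicking off a SageMaker Autopilot (AutoML) job programmatically,
# the kind of step the Snowflake integration hides behind SQL.
# Assumes the sagemaker SDK, a placeholder execution role, and a hypothetical
# labeled CSV of deals already staged in S3.
import sagemaker
from sagemaker.automl.automl import AutoML

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder

automl = AutoML(
    role=role,
    target_attribute_name="deal_lost",  # hypothetical label column
    sagemaker_session=session,
    max_candidates=10,                  # cap the model candidates explored
)

# Autopilot explores preprocessing, algorithms, and hyperparameters, then
# ranks the candidate models it trained.
automl.fit(
    inputs="s3://my-bucket/deals/train.csv",  # placeholder S3 path
    job_name="deal-loss-automl",
    wait=False,  # return immediately and poll for status instead of blocking
)
print(automl.describe_auto_ml_job()["AutoMLJobStatus"])
```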
You mentioned customers want to create more machine learning. They want to build faster, new, more machine learning capabilities, which is where, by the way, the no-code, low-code, you know, comes to mind. How can you use Autopilot, which is a SageMaker product, for enabling faster creation of models? So I want to create models faster. They also want to be able to use models, in a sense monetize them, turn them into value, to make them available to more users, where those users are. >>Eh, so, you know, BI environments or experiences, like the ones we provide with Qlik. And again, our active intelligence platform is all about weaving the data into the applications, into the environments, into the analytic workflows that users have. So we introduced, and are super excited about, uh, we've announced two integrations: a very robust integration between Qlik Cloud and Amazon SageMaker. And that includes both our new analytic connector for Amazon SageMaker and our integration with Amazon SageMaker Autopilot. So with the integration with SageMaker, we now have Qlik Sense interacting directly and seamlessly with any model deployed within SageMaker. So again, very much like Kosti mentioned, in your experience as a user, seamlessly, you now also have predictive data. So as you're working in the application, as you're interacting with your data, dynamically data is interchanged between Qlik and SageMaker, enriching your decisions and your actions with predictive data sets. And that's what's so cool about it. So again, in the Qlik environment, we bring real-time data in, prepare it for analytics, and then feed that real-time data to SageMaker to get the real-time prediction back, in the same experience for the user. So we're really, really excited about that. >>Translate what that means for customers: everything happens faster? Does it unlock new capabilities? Can we unpack it a little bit? >>Absolutely. So we are, in a way, bridging the chasm between the data science world and the business teams. The data science teams are building machine learning models to make predictions, and now with the first integration that Itamar mentioned, we actually expose those machine learning models in an application that the business team uses, Qlik, and with the same dashboards that they are very familiar with, they can now trigger those machine learning models and get real-time predictions in the dashboards themselves, powered by machine learning. So in a way, this chasm between the two worlds of data science and business users is completely bridged. And the second integration we built, with Autopilot, helps data engineers use machine learning technology powered by AWS SageMaker. So data engineers creating different pipelines can now, with a building block, add AutoML capabilities in that pipeline without them really knowing machine learning. So we bridge the gap for the business teams, getting access to what the data science teams build, and we also bridge the skill-set gap for the data engineers to tap into machine learning. >>You mentioned monetization before. So this to me is key, because who's going to be doing the monetization? It's the business lines that are going to do that, not the data scientists; they're going to enable that, but ultimately it's those data consumers that are building those, I call them data products, that they can ultimately monetize.
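The first integration, a dashboard exchanging the rows a user is looking at with a deployed model, reduces to an endpoint invocation under the hood. A minimal sketch with boto3, using a hypothetical endpoint name and feature layout; Qlik's connector automates this round trip, so only the mechanics are shown.

```python
# A minimal sketch of the round trip: send in-view rows to a deployed
# SageMaker endpoint and join the predictions back into the view.
# The endpoint name and feature layout are hypothetical.
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

def score_rows(rows):
    """rows: list of feature lists, e.g. [[deal_age_days, amount, stage], ...]"""
    payload = "\n".join(",".join(str(v) for v in row) for row in rows)
    response = runtime.invoke_endpoint(
        EndpointName="deal-loss-predictor",  # placeholder endpoint
        ContentType="text/csv",
        Body=payload,
    )
    # Many built-in SageMaker containers return one prediction per line.
    return response["Body"].read().decode("utf-8").splitlines()

# Example: enrich two in-view deals with a predicted loss score.
print(score_rows([[45, 250000, 3], [10, 80000, 1]]))
```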
And that's, I'm interested in, low-code, no-code, which sits in your title too, so that all plays in, doesn't it? >>Yeah, and we're heavily invested in that whole space. So for example, today we just launched SageMaker Canvas. That is a low-code, no-code capability for analysts and business users. But we realized we don't need to only innovate on the technology side; we need to also innovate on the partnerships that we build, and those integrations help expose our technology to wherever our customers want it to be, be it in Qlik. So let them use the machine learning technology that we are innovating on exactly where they want to be. >>Can you give us some customer examples and use cases, maybe make it real for us? >>Uh, for sure. And I think as you think about these use cases, one of the other things I want you to kind of envision is the fact that all this predictive data and all this integration that we're talking about can actually express itself in a lot of different experiences for the user. It can be a dashboard. It can also be conversational analytics, which is part of what we offer in the cloud, so you can actually converse and interact with the data; you don't have to actually look at it. It can be alerts that actually watch automatically and inform you that you need to take action, so you don't actually look at the data; the data will come to you when it needs you, including based on predictive data. So there are a lot of options for how you're going to do it. >>Let me give you an example; let me try and maybe pick one that is intuitive, I think, for many people: sales, right? So you have sales, you have a lot of orders, you're trying to close a quarter. You have a forecast, the deals you expect to close. Uh, and then you can use machine learning, for example, to forecast or to try to project which deals you're going to lose. So now, again, that can look at a lot of different aspects of the deal: the timing, the funnel, the volume, the amounts, a lot of other parameters, right, and predict if you're going to lose a deal. So now, if there's a deal that my salesperson is telling me he's going to win, but the model is telling me we may lose, well, I probably want to double-click on that one. >>Right? >>So I can now bring that information right in, again, in the moment, to the seller or to the management, so they can identify it and take action. Now, not only can I bring it to them, but I can also, you know, from the machine learning, know what is the likely reason we'd lose. And if I know the likely reason, it also becomes prescriptive: I now know what to do to try and fix it, right? So I can either do it manually, or I can also integrate it, uh, again, you know, with Qlik Cloud. We also have Qlik application automation, which is again also kind of a low-code, no-code environment to orchestrate processes, so I can automatically also update Salesforce or the CRM back, okay, so that the system gets updated. So you've got an example, exactly the example, of active intelligence. It allows me to take informed action in the now, in the moment, to make the best decision. >>And as a Salesforce salesperson, maybe I prioritize, and the machine's helping me direct my resources. Is this available today? Is it in general availability? >>Available right now, right?
Anyone can go start it right now in Qlik Cloud. >>Congratulations. Um, last question. So what does the future hold for this partnership? Where are you guys headed? Give us a little direction. >>First of all, we'd love to scale those integrations. So if you're a customer of Qlik, please go ahead and test them and give us the feedback. And second, for us, we really want to learn from our customers and improve those integrations we bring to them. We really want to hear what technologies they want to expose to a lot more users. And we are aspiring to build on that partnership and get a lot more tightly aligned with, uh, with Qlik. >>And, uh, thank you, Kosti. And, uh, we see tremendous additional opportunities. I think, as Amazon says, we're in day one; that's how we kind of feel about it. There's only so much we've put into it, but the market is so dynamic, there are so many new needs that are coming up, so we kind of think about it that way. So first of all, we're on a journey to expand Qlik Cloud, adding more services. It's actually a platform where we're bringing both data services, data integration, data management, everything related to the analytics pipeline, and of course the analytic services. So it all comes together in one environment that makes it more agile, faster, to build these new modern, active intelligence type experiences. So as we do that, we're going to be adding more services, creating more opportunities to integrate with more services from the AWS side. So we're really excited to look at that, and just like Kosti, you mentioned with Canvas, you know, Amazon keeps coming up with new services and new capabilities. So there's going to be a lot more opportunity. Eh, we're going to keep, uh, again, within the spirit of our partnership, where we want to, you know, jump in first, innovate quickly and, uh, you know, create these integrations that add value to customers. >>Ah, the flywheel. I love it. Great, great to have you guys, awesome to reconnect. All right, appreciate it. Thank you for watching. This is theCUBE, and we're covering AWS re:Invent 2021. We're the leader in high-tech coverage. Right back.
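The sales scenario Itamar walks through above, flagging deals where the seller forecasts a win but the model predicts a loss, can be sketched with a toy classifier. All features, data, and thresholds below are fabricated for illustration.

```python
# A toy illustration of the deal-loss example: train a simple model on past
# deals, then surface open deals where the seller says win but the model
# disagrees. Data and feature names are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical deals: [days_in_stage, discount_pct, touches_last_30d]; 1 = lost.
X_hist = np.array([[60, 5, 1], [10, 15, 9], [45, 2, 2], [7, 20, 12],
                   [90, 0, 0], [14, 10, 7], [75, 3, 1], [5, 25, 15]])
y_hist = np.array([1, 0, 1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X_hist, y_hist)

# Open deals with the seller's own forecast attached.
open_deals = [
    {"name": "Acme renewal", "features": [70, 2, 1], "seller_says_win": True},
    {"name": "Globex upsell", "features": [8, 18, 10], "seller_says_win": True},
]

for deal in open_deals:
    p_loss = model.predict_proba([deal["features"]])[0, 1]
    if deal["seller_says_win"] and p_loss > 0.5:
        # The disagreement, not the raw score, is what earns a double click.
        print(f"Review {deal['name']}: model loss probability {p_loss:.0%}")
```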

Published Date : Dec 1 2021


Joe DosSantos, Qlik | CUBE Conversation, April 2019


 

>> From the SiliconANGLE Media office in Boston, Massachusetts, it's theCUBE! Now here's your host, Stu Miniman! >> I'm Stu Miniman and this is a CUBE Conversation from our Boston area studio. Going to dig in to discuss the data catalog, and to help me do that, I want to welcome to the program first-time guest Joe DosSantos, who is the global Head of Data Management Strategy at Qlik. Joe, thank you so much for joining us. >> Good to be here Stu. >> All right, so the data catalog, let's start there. People, in general, know what a catalog is. Well, maybe some of the millennials might not know as much as those of us that have been in the industry a little bit longer. So start there and help level set us. >> So our thinking is that there are lots of data assets around and people can't get at them. And just like you might be able to go to Amazon and shop for something, and you go through a catalog, or you go to the library and you can see what's available, we're trying to approximate that same kind of shopping experience for data. You should be able to see what you have, you should be able to look for things that you need, you should be able to find things you didn't even know were available to you. And then you should be able to put them into your cart in a secure way. >> So Joe, step one is, I've gathered my data lake, or whatever oil or water analogy we want to use for gathering the data, and then we've usually got analytic tools and lots of things there, but this is a piece of that overall puzzle, do I have that right? >> That's exactly right, so if you think about what are the obstacles to analytics, there are studies out there that say less than one percent of analytics data is actually being analyzed. We're having trouble with the pipelines to get data into the hands of people who can do something meaningful with it. So what is meaningful? Could be data science, could be natural language, where maybe you have an Alexa at home and you just ask a question, and that information is provided right back to you. So somebody wants to do something meaningful with data but they can't get it. Step one is go retrieve it, so our Attunity solution is really about how do we start to effectively build pipelines to go retrieve data from the source? The next step though is how do I understand that data? Cataloging isn't about just having a whole bunch of boxes on a shelf, it's being able to describe the contents of those shelves, it's being able to know that I need that thing. If you were to go into an Amazon.com experience and you say I'm going on a fishing trip and you're looking for a canoe, it'll offer you a paddle, it'll offer you lifejackets. It guides you through that experience. We want data to be the same way, this guided trip through the data that's available to you in that environment. >> Yes, it seems like - metadata is something we often talk about but it seems like even more than that. >> It really is, metadata is a broad term. If you want to know about your data, you want to know where it came from. I often joke that there are three things you want to know about data: what is it, where did it come from and who can have access to it under what circumstances. Now those are really simple concepts but they're really complex under the covers. What is data? Well, is this private information, is this personally identifiable information, is it a tax ID, is it a credit card? I come from TD Bank and we were very preoccupied with the idea of someone getting data that they shouldn't.
You don't want everyone running around with credit cards, how do I recognize a credit card, how do I protect a credit card? So the idea of cataloging is not just about making everything available, it's security. I'm going to give you an example of what happens when you walk into a pharmacy. If you walk into a pharmacy and you want a pack of gum or shampoo, you walk up to the shelf and you grab it; it's carefully marked in the aisles, it's described, but it's public, it's easy to get, there aren't any restrictions. If you wanted chewing tobacco or cigarettes, you would need to present somebody with an ID who would need to see that you are of age, who would need to validate that you are authorized to see that, and if you wanted Oxycontin, you'd best have a prescription. Why isn't data like that? Why don't we have rules that stipulate what kind of data belongs in what kind of category and who can have access to it? We believe that you can, so a lot of impediments to that are about availability and visibility, but also about security, and we believe that once you've provisioned that data to a place, then the next step is understanding clearly what it is and who can have access to it, so that you can provision it downstream to all of these different analytic consumers that need it. >> Yeah, data security is absolutely front and center, it's the conversation at board levels today. So the catalog, is it a security tool, or does it work with kind of your overall policies and procedures? >> So you need to have a policy. One of the fascinating things that exists in a lot of companies is, you ask people, please give me the titles of the columns that constitute personally identifiable information, and you'll get blank stares. So if you don't have a policy, you don't have a construct, you're hopelessly lost. But as soon as you write that down, now you can start building rules around that. You can know who can have access to what under what circumstances. When I was at TD we took care to try and figure out what the circumstances were that allowed people to do their job. If you're in marketing you need to understand the demographic information, you need to be able to distribute a marketing list that actually has people's names and addresses on it. Do you need their credit card number? Probably not. We started to work through these scenarios of understanding what the nature of data was on a must-have basis, and then you don't have to ask for approval every single time. If you go to Amazon you don't ask for approval to buy the canoe, you just know whether it's in stock, if it's available and if it's in your area. Same thing with data, we want to remove all of the friction associated with that, just because the rules are in place. >> Okay, so now that I have the data, what do I do with it? >> Well this is actually really an important part of our Qlik story. So Qlik is not trying to lock people into a Qlik visualization scenario. Once you have data, what we're trying to do is to say that discovery might happen across lots of different platforms. Maybe you're a Tableau user, I don't know why, but there are Tableau users - no, in fact we did use Tableau at TD - but if you want to provision data and discover things in comparable BI tools, no problem.
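Joe's pharmacy analogy maps naturally onto a classification-driven access policy. A minimal sketch of how a catalog might encode it; the classifications, roles, and rules are invented for illustration and do not reflect Qlik's actual data model.

```python
# A minimal sketch of the pharmacy analogy as catalog policy: every column
# carries a classification, and the catalog decides at "shopping" time
# whether a requester can check an asset out. All names here are invented.
from enum import Enum

class Classification(Enum):
    PUBLIC = "public"          # shampoo: anyone can take it off the shelf
    RESTRICTED = "restricted"  # cigarettes: show ID (role check)
    CONTROLLED = "controlled"  # Oxycontin: prescription (explicit approval)

# Column-level tags for a hypothetical customer dataset.
CATALOG = {
    "customer.name":        Classification.RESTRICTED,
    "customer.postal_code": Classification.PUBLIC,
    "customer.credit_card": Classification.CONTROLLED,
}

# Which classifications each role may see without special approval.
ROLE_RULES = {
    "marketing": {Classification.PUBLIC, Classification.RESTRICTED},
    "support":   {Classification.PUBLIC},
}

def can_access(role, column, has_approval=False):
    level = CATALOG[column]
    if level is Classification.CONTROLLED:
        return has_approval  # needs the "prescription"
    return level in ROLE_RULES.get(role, set())

assert can_access("marketing", "customer.name")
assert not can_access("marketing", "customer.credit_card")
assert can_access("marketing", "customer.credit_card", has_approval=True)
```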
Maybe you want to move that into a machine learning type of environment: you have TensorFlow, you have H2O libraries doing predictive modeling, you have R and Python. All of those things are things that you might want to do. In fact, these days a lot of times people don't want analytics and visualizations, they want to ask the questions. Do you have an Amazon Alexa in your house? >> I have an Alexa and a Google Home. >> That's right, so you don't want a fancy visualization, you want the answer to a question, so a catalog enables that; a catalog helps you figure out where the data is that answers a question. So when you ask Alexa what's the capital of Kansas, it's going through the databases that it has, that are neatly tagged and cataloged and organized, and it comes back with Topeka. >> Yeah. >> I didn't want to stump you there. >> Thank you Joe. Boy, I think back in the day, there were people doing ontological studies as to how to put these things together. As a user I'm guessing, using a tool like this, I shouldn't have to figure out how to set all this up; there's got to be way better tools and things like that. Just like in the discussion of metadata, most systems today do that for me, or at least a lot of it, but how much do I as a customer customize stuff and how much does it do it for me? >> So when you and I have a conversation we share a language, and if I say where do you live, you know that living implies a house, implies an address, and you've made that connection. And so effectively all businesses have their own terminology and ontology of how they speak, and what we do is, if we have that ontology described to us, we will enforce those rules, so we are able to then discover the data that fits that categorization of data. So we need the business to define that for us, and again a lot of this is about process and procedure. Anyone who works in technology knows that very little of the technological problems are actually about technology; they're about process and people and psychology. What we're doing is, if someone says I care deeply and passionately about customers, and customers have addresses, and these are the rules around them, we can then apply those rules. Imagine the governance tools are there to make laws; we're like the police, we enforce those laws at time of shopping, in that catalog metaphor. >> Wow Joe, my mind is spinning a little bit, because one of the problems you have if you work for a big customer is you'd have different parts of the company that would all want the same answer, but they'd ask it in very different ways, and they don't speak the same language. So does a catalog help with that? >> Well it does and it doesn't. I think that we are moving to a world in which, for a lot of questions, truth is in the eye of the beholder. So if you think about a business that wants to close the books, you can't have revenue that was maybe three million, maybe four million. But if you want to say, what was the effectiveness of the campaign that we ran last night? Was it more effective with women or men - why? Anytime someone asks a question like why, or I wonder if, these are questions that invite investigation, analysis, and we can come to the table with different representations of that data. It's not about truth, it's about how we interpret that. So one of the peculiar and difficult things for people to wrap their arms around is, in the modern data world with data democratization, two people can go in search of the same question and get wildly different answers. That's not bad, that's life, right?
So what's the best movie that's out right now? There's no truth, it's a question of your tastes, and what you need to be able to do, as we move to a democratized world, is ask: what were the criteria that were used? What was the data that was used? And so we need those things to be cited, but the catalog is effectively the thing that puts you in touch with the data that's available. Think about your college research projects. You wrote a thesis or a paper, you were meant to draw a conclusion, you had to go to the library and get the books that you needed. And maybe, hopefully, no one had ever combined all of those ideas from those books to create the conclusion that you did. That's what we're trying to do every single day in the businesses of the world in 2019. >> Yeah, it's a little scary; in the world of science most things don't come down to a binary answer. There's the data to prove it, and what we understand today might not hold; if we look and add new data to it, it could change. Bring in some customer examples as to what they're doing, how this impacts it, and what hopefully brings more certainty into our world. >> Absolutely, so I come from TD Bank and I was the Vice President of Information Management Technology there, and we used Data Catalyst to catalog a very large data lake: we had a Hadoop data lake that was six petabytes, had about 200 different applications in it. And what we were able to do was to allow self service to those data assets in that lake. So imagine you're just looking for data, and instead of having to call somebody or get a pipeline built and spend the next six months getting data, you go to a portal, you grab that data. So what we were able to do was to make it very simple and reduce that time. We usually think that it takes about 50% of your time in an analysis context to find the data, to make the data useful; what if that was all done for you? So we created a shopping experience for that at an enterprise level. What was the goal? Well, at TD we were all about legendary customer experience, so what we found very important were customer interactions and their experiences, their transactions, their web clicks, their behavioral patterns. And if you think about it, what any company is looking to do is to catch a customer in the act of deciding, and what are those critical things that people decide? In a bank it might be when to buy a house, when you need mortgages and you need potentially loans and insurance. For a healthcare company it might be when they change jobs, for a hospital it might be when the weather changes. And everybody's looking for an advantage to do that, and you can only get that advantage if you're creative about recognizing those moments through analytics and then acting in real time with streaming to do something about that moment. >> All right, so Joe, one of the questions I have is, is there an aspect of time when you go into this? Because I understand if I ask questions based on the data that I have available today, but if I'd asked that two weeks before, it would be different data, and if I kept watching it, it would keep changing. And so I've got certain apps I use, like when's the best time to buy a ticket, when is the best time to do that; how does that play in? >> So there are two different dimensions to this, the first is what we call algorithmic decay.
If you're going to try and develop an algorithm, you don't want the data shifting under your feet as you do things, because all of a sudden your results will change if you're not right. And the sad reality is that most humans are not very original, so if I look at your behavior for the past ten years, or if I look at the past twenty, it won't necessarily be different from somebody else's. So what we're looking to do is catch mass patterns; that's the power of big data, to look at a lot of patterns to figure out the repeatability in most patterns. At that point you're not really looking for the data to change. Then you go to score it, and this is where the data changes all the time. So think about big data as looking at a billion rows and figuring out what's going on. The next thing would be traditionally called fast data, which is now based on an algorithm - this event just happened, what should I do? That data is changing under your feet regularly; you're looking to stream that data, maybe with a change data capture tool like Attunity, and you're looking to get that into the hands of people in applications to make decisions really quickly. Now what happens over time is people's behaviors change - only old people are on Facebook now right, you know this - so demographics change, and the things that used to be very predictive fail to be, and there has to be a capability in an industry, in an enterprise, to be able to deal with those algorithms as they start to decay and replace them with something fresher. >> All right Joe, how do things like government compliance fit into this? >> So governance is really at the core of the catalog. You really need to understand what the rules are if you want to have an effective catalog. We don't believe that every single person in a data democratized world should have access to every single data element. So you need to understand, what is this data, how should I protect it, and how should I think about the overall protection of this data and the use of this data. This is a really important governance principle: to figure out who can have access to these data sets under what circumstances. Again, nothing to do with technology, but the catalog should really enforce your policy, and a really good catalog should help to enforce the policies that you're coming up with, with who should have access to that data under what circumstances. >> Okay, so Joe, this is a pretty powerful tool. How do customers measure that they're getting adoption, that they're getting the results that they were hoping to when they roll this out? >> No one ever woke up one day and said, boy, would it be great if I stockpiled petabytes of data. At the end of the day, >> I know some storage companies that say that.
So if you understand what those are you then start from the analytical outcome to the data that supports that and that's how you get started. How can I get the datasets that I'm pretty sure are going to drive the needle and then start to build from there to make me able to answer more and more complex questions. >> Well great those are some pretty powerful use cases, I remember back in the early Hadoop days it was like let's not have the best minds of our time figuring out how you can get better ad clicks right? >> That's right it's much easier these days. Effectively Hadoop really allows you to do, what big data really allows you to do is to answer questions more comprehensively. There was a time when cost would prevent you from being able to look at ten years worth of history, those cost impediments are gone. So your analytics are can be much better as a result, you're looking at a much broader section of data and you can do much richer what-if analysis and I think that really the secret of any good analytics is encouraging the what-if kind of questions. So you want in a data democratized world to be able to encourage people to say I wonder if this is true, I wonder if this happened and have the data to support that question. And people talk a lot about failing fast, glibly, what does that mean? Well I wonder if right now women in Montana in summertime buy more sunglasses. Where's the data that can answer that question? I want that quickly to me and I want in five minutes to say boy Joe, that was really stupid. I failed and I failed fast but it wasn't because I spent the next six weeks looking for the data assets, it's because I had the data, got analysis really quickly and then moved on to something else. The people that can churn through those questions fastest will be the ones that win. >> Very cool, I'm one of those people I love swimming into data always seeing what you can learn. Customers that want to get started, what do you recommend, what are the first steps? >> So the first thing is really about critical use case identification. Again no one wants to stockpile data so we need to start to think about how the data is going to affect an outcome and think about that user outcome. Is it someone asking in natural language a question of an application to drive a certain behavior? Is it a real time decision, what is the thing that you want to get good at? I've mentioned that TD wanted to be good about customer experience and offer development. If you think about what Target did there's a notorious story about them being able to predict pregnancy because they recognized that there was an important moment, there was a behavioral change in consumers that would overall change how they buy. What's important to you, what data might be relevant for that, anchor it there, start small, go start to operationalize the pipes that get you the data that you need and encourage a lot of experimentation with these data assets that you've got. You don't need to create petabytes of data. Create the data sets that matter and then grow from use case to use case. One of our customers SunLife did a wonderful job of really trying to articulate seven or eight key use cases that would matter and built their lake accordingly. First it was about customer behavior then it was employee behavior. If you can start to think about your customers and what they care about there's a person out there that cares about customer attrition. 
There's a person out there that cares about employee attrition, there's a person out there that cuts costs about cost of delivery of goods. Let's figure out what they need and how to use analytics to drive that and then we can start to get smart about the data assets that can really cause that analytics to explode. >> All right well Joe, really appreciate all the updates on the catalogs there, data at the center of digital transformation for so many customers and illuminating some key points there. >> Happy to be here. >> All right thank you so much for watching theCUBE, I'm Stu Miniman. (upbeat music)
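The sunglasses question Joe poses is a good shape for that fail-fast loop: with the data already provisioned, the hypothesis can be checked and discarded in minutes. A toy version, with a dataset fabricated for the sketch:

```python
# A toy version of the "I wonder if" question: with data at hand, check the
# sunglasses hypothesis in minutes and move on. The data is fabricated.
import pandas as pd

sales = pd.DataFrame({
    "state":  ["MT", "MT", "MT", "NY", "NY", "MT"],
    "season": ["summer", "summer", "winter", "summer", "winter", "summer"],
    "gender": ["F", "M", "F", "F", "M", "F"],
    "sunglasses_units": [3, 1, 0, 2, 1, 4],
})

answer = (
    sales[(sales.state == "MT") & (sales.season == "summer")]
    .groupby("gender")["sunglasses_units"]
    .sum()
)
print(answer)  # fail fast or follow up, either way within minutes
```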

Published Date : May 17 2019


Itamar Ankorion, Qlik | CUBE Conversation, April 2019


 

>> From the SiliconANGLE Media office in Boston, Massachusetts, it's theCUBE. Now here's your host, Stu Miniman. >> I'm Stu Miniman, and this is a CUBE Conversation from our Boston area studio. We spend a lot of time talking about digital transformation. Of course, at the center of that digital transformation is data. This segment, we're going to be talking about the data integration platform. Joining me for that segment is Itamar Ankorion, who's the senior vice president of enterprise data integration with Qlik. Thanks so much for joining me. >> Thanks for having me here. >> All right, so as I just said, you know, digital transformation: when you talk to any user, you know, there's some that might say, oh, there's a little bit of hype, I don't understand it, but really, leveraging that data, you know, there are very few places where that is not core to what they need to do, and if they're not doing it, their competition will do it. So can you bring us inside a little bit? The customers you're talking to, you know, where does that fit into their business needs, and, you know, how does the data integration platform, you know, help them solve that issue? >> Absolutely. So, as you mentioned, the digital transformation is driving a lot of innovation, a lot of efforts by corporations, and virtually any organization that we're talking to sees data as a core component of enabling the digital transformation. The data creates new analytics, and those analytics power the digital transformation, whether it's in making better decisions, whether it's embedding the analytics and the intelligence into business processes and custom applications to enrich the experience and make it better. So data becomes key, and the more data you can make available through the process, the faster you can make a development in the process, the faster you can adapt your process to accommodate the changes, the better it will be. So we're seeing organizations, virtually all of them, looking to modernize their data strategy and their data platforms in order to accommodate these needs. >> Yeah, it's such a complex issue. We've been at, you know, chief data officer events, we talk about data initiatives. You know, we worry a little bit that the C-suite sometimes hears it like, they heard data is the new oil, and they came and they said, you know, according to the magazine I read, we need to have a data strategy, and give me the value of data. But, you know, where is the rubber hitting the road? You know, what are some of those steps that they're taking? You know, how do I help, you know, get my arms around the data, and how does that help make sure it can move along that spectrum from kind of the raw ore to, you know, real value? >> I think you made a great point, talking about the ore to value, or as we refer to it, the road to ready. And part of the whole innovation that we're seeing is the modernization of the platform, where organizations are looking to tap into the tremendous amount of data that is available today. So a couple of things have happened in the last decade. First of all, we have significantly more data that is available than ever before, because of digitization of data and new sources becoming available. But beyond that, we have the technologies, the platforms that can both store and process large amounts of data. So we have foundations.
But in the end, to make it happen, we need to get all the data to where we want to analyze it, and find a way to put it together and turn it from raw material into ready material—ready products that can be consumed. And that's really where the challenge is. We're seeing a lot of organizations, especially the CIOs and the enterprise architecture and data architecture teams, on a journey to understand how to put together these kinds of architectures and data systems. And that's where, with our data integration platform, we focused on accommodating the new challenges they have encountered in trying to make that happen. >> Yeah, help us unpack that a little bit. You know, here today, everything should just work together. When I rolled out the industry's leading CRM in our company, it's like, oh, I've got hundreds of data sources and hundreds of tools I could put together, and it should be really easy for me to just allow my data to flow and get to the right place. But I always find it's not that easy; I've been having a hard time finding that. >> That's a good point. And let's take a step back and understand what the new challenges, or the new needs, are that we're seeing—because we talked about the transformation and modern analytics fueled by data being part of it. Modern analytics created new types of challenges that didn't exist before, and therefore the traditional data integration tools didn't do the job; they didn't meet those modern needs. Let me touch on a few of those. So first of all, when customers are implementing modern analytics, many times what they're trying to do is AI and machine learning—we'll use those terms and talk about them—and machine learning and AI get smarter the more data you give them. So it's all about the scale of data. And what we're seeing with customers is that if, in the past, a data warehouse system typically had five, ten, twenty data sources going into it, we're now seeing a hundred times that number of sources. We have customers that work with five hundred, six hundred, some with over two thousand sources of data feeding the data analytics system. So scale becomes a critical need, and when we talk about scale, you need the ability to bring data from hundreds or thousands of source systems efficiently, with very low impact, and ideally do it with fewer resources—because again, you need to scale. The second trend, and the second need we ran into, has to do with the fact that modern analytics, for many organizations, means real-time analytics or streaming analytics. They want to be able to process data in real time and respond to it. To do that, you need a way to move data, to capture it in real time, and to be able to make it available—and do that in a very economic fashion. And then the third one is that, in order to deal with the scale, and in order to deal with the agility that the customers want, the question is: where are they doing the analytics? Many of them are adopting the cloud, and we've been seeing multi-cloud adoption. So in order to get data to the cloud, now you're dealing with the challenge of efficiency. I have limited network bandwidth, I have a lot of data that I need to move around—how can I move all of that, and do it more efficiently?
And the only thing I would add to that is that beyond the mechanics of how you move the data—with scale, with efficiency, even in real time—there's also how you approach the process: being aware of what has changed, the operations you can implement, and accommodating any type of architecture. I need to have a platform that lets me choose, and then change those choices over time. So I need to be able to be agile and flexible. >> Yeah, well, a lot to unpack there, because, you know, I just made the comment: if you talk about us humans, giving us more data doesn't mean we're actually going to get better. We need to have those toolings in there to take that data and give me the insights, which I can then act on. Otherwise, we understand, for most people, if I have to make decisions or choices and more gets thrown at me, there's less and less likelihood that I can act on it. And boy, the data lakes—I remember the first time I heard "data lakes," it was, you know, we talked about what infrastructure we're building; and now, in the last couple of years, the public cloud tends to be a big piece of it, even though we know data is going to live everywhere—public cloud, private cloud, and edge gets into a piece of it. So for the data integration platform, how easy is it for customers to get started? We'll talk about the diversity of everything else, but where do they start? Give me a little bit of the customer journey, if you would—and maybe, if you have a customer example, that would be a great way to illustrate it. >> Absolutely. So first of all, it's a journey, and I think that journey started quite a few years ago. I mean, Hadoop is now over ten years old, and we're actually seeing a big change, shifting the market from what was initially the Hadoop ecosystem into a much broader set of technologies, especially with the cloud, in order to store and process large scales of data. So the journey customers were going through for a few years was very experimental: customers were trying it on for size, they were trying to understand how to build the processes around it, and the solutions back then were very batch-oriented, with MapReduce in the early days of Hadoop. But when you look at it today, it's already evolved significantly, and you're seeing these big data systems needing to support different and diverse types of workloads. Some of them are machine learning and data science, some of them are streaming analytics, some of them are serving data for microservices to power agile applications. So there's a lot of need for the data in the journey, and what we're seeing is that customers, as they move through this journey, sometimes need to pivot, and if they find new technology that comes out, they need the ability to accommodate, to adapt, and to adopt new technologies as they go through it. So that's the journey we have worked with our customers through. And as they evolved, once they figured it out, the scale came along. So it's very common to see a customer start with a smaller project and then scale it up. For many of the customers we worked with, that's how it worked out. And you asked for an example.
So one of our customers is one of the world's largest automotive companies, and they decided on a strategy to turn what they believe is a huge asset they have—which is data—to account. But the data is in a lot of silos across manufacturing facilities, supply facilities, inventory and others, so the plan was to bring it all together into one place, combine it with data they bring from the car itself, and, by having all the data in one place, be able to derive new insights and information that they can use, as well as potentially sell or monetize in other ways. As they got started, they initially rolled it out to a set number of their data centers and their sources of information—manufacturing facilities. So they started small. But then very quickly, once they figured out they could do it fast and worked out the process to scale it—today there are over five hundred systems they have on board, and over two hundred billion changes in data being fed daily into their data lake. So it's a very, very large scale system. If you want, we can talk about what it takes to put together something so big. >> Yeah, please, take the next step—that would be perfect. >> Okay. So I think one of the key things customers have to understand—and we're seeing it with the enterprise architecture teams—is that when you need to scale, you need to change the way you think about things. At the end of the day, there are two fundamental differences in the approach, and in the underlying technology that enables it. We talked earlier about a lot of things to keep in mind; now I'm going to focus on only two, which should be easy to take away. First is the move from batch to real time—or from batch to the delta, to the changes. Traditionally, data integration was done in a batch process: you reload the data. Today, if you want to scale, if you want to work in real time, you need to work based on the delta, on the change. The fundamental technology behind it is called change data capture, and it's a technology and an approach that allows you to find and identify only the changes in the enterprise data systems. And imagine all the innovation you can get by capturing and processing only the changes. First of all, you have significantly less impact on the source systems, so you can scale, because you're moving less data. It's very efficient as you move the data around, because it's only a fraction of the data. And it can be real time, because again, you're capturing the data as it changes. So the move from batch to real time, or to streaming data based on changes—change data capture is fundamental in creating a modern data integration environment. >> I'm assuming there's an initial load that has to go in, something like that? >> Correct, but you do that once, and then for the rest of the time you're really moving only the deltas. The second difference—the first one was moving from batch to streaming based on change data capture—is how you approach building it, which is moving from a development-led platform to automation. Through automation, you can take workloads that have traditionally been in the realm of the developer and allow people without development skills to implement such solutions very quickly.
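To make the batch-versus-delta point concrete, here is a minimal sketch of the change data capture pattern in Python. It is illustrative only: the `Change` record, the in-memory tables, and the helper names are invented for this example, and a production CDC tool of the kind described here would typically read the source database's transaction log rather than receive changes as a list.

```python
# Minimal sketch of change data capture (CDC): one initial full load,
# then only the deltas (inserts, updates, deletes) move downstream.
from dataclasses import dataclass
from typing import Dict, Iterable, Optional

@dataclass
class Change:
    op: str              # "insert", "update", or "delete"
    key: int             # primary key of the affected row
    row: Optional[dict]  # new row image (None for deletes)

def initial_load(source_rows: Iterable[dict], target: Dict[int, dict]) -> None:
    """One-time full copy of the source table into the target."""
    for row in source_rows:
        target[row["id"]] = row

def apply_changes(changes: Iterable[Change], target: Dict[int, dict]) -> None:
    """Ship only the deltas; far less data moves than a periodic full reload."""
    for c in changes:
        if c.op == "delete":
            target.pop(c.key, None)
        else:  # insert or update
            target[c.key] = c.row

target_table: Dict[int, dict] = {}
initial_load([{"id": 1, "qty": 10}, {"id": 2, "qty": 5}], target_table)
apply_changes([Change("update", 1, {"id": 1, "qty": 7}),
               Change("delete", 2, None)], target_table)
print(target_table)  # {1: {'id': 1, 'qty': 7}}
```

The efficiency claim in the interview falls out of the arithmetic: if a source table changes by a fraction of a percent per day, streaming the deltas moves orders of magnitude less data than nightly full reloads, which is what makes feeding hundreds or thousands of sources feasible.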
So again, the move is from developer-led to configuration-based, automation-based products. What we've done at Attunity is, first, we have been one of the pioneers and innovators in change data capture technology. So the platform that is now part of the Qlik data integration platform brings with it over fifteen years of innovation and optimization in change data capture, with a broad set of supported data sources and lots of optimizations, ranging from data sources like SQL Server and Oracle, through mainframes, and on to SAP systems. And then one of the key focuses we had is: how do we take complex processes and automate them? So from a user perspective, you can click a few buttons, turn a few knobs, and you have the optimized solution available for moving data across very diverse sets of systems. So through moving onto the delta, and through the automation, you allow the scale. >> So in a lot of the systems I'm familiar with, it's the metadata—it comes in with the system, and I don't have to, as an admin or somebody setting it up, do all of this. Even think about the way I treat photos these days: it used to be, I took photos, and trying to sort them was ridiculous; now my Apple or Google account will do facial recognition, but also timestamp, location, all those things—I can sort it and find it. It's built into the system. >> Absolutely, and the metadata is critical to the whole process. First of all, because when you bring data from one system to another system, somebody needs to understand the data, and the process of getting data into a lake and into a data warehouse is becoming a multi-step data pipeline. In order to trust the data and understand it, you need to understand all the steps it went through. And we also see different teams taking part in this process, so for a team to be able to pick up the data and work on it, it needs to understand its metadata. By the way, this is also where the Qlik data integration platform brings the Attunity software together with the Qlik Data Catalyst: together they provide a unique value proposition, because you have the ability to capture changed data as it changes, deliver that data virtually anywhere—any data lake, any cloud platform, any analytics platform—and then refine the data to generate analytics-ready data sets, and, together with the Qlik Data Catalyst, create derivative data sets and publish all of them to a catalog that makes it really easy to understand which data exists and how to use it. So we have an end-to-end solution for streaming data pipelines that generates analytics-ready data sets—going from raw to ready in an accelerated fashion. >> So, Itamar, for your customers that rolled this out, how do they measure success? What are their critical KPIs? Is there some journey map that helps them go along? What are some commonalities? >> So it's a great question, and naturally, for many organizations, it's about ROI, it's about total cost of ownership, it's seeing results. As I mentioned earlier, agility and the time to value are really changing: customers are looking to get results within a matter of very few months, sometimes even weeks, versus what it used to be, which was many months and sometimes even years. So again, the whole point is to do it much, much faster.
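A rough way to picture the automation point above: the pipeline is described by configuration rather than by hand-written developer code, and the engine also publishes catalog metadata as a side effect. The spec format, the stand-in source, and the catalog shape below are all invented for illustration—this is not Attunity's or Qlik's actual configuration syntax.

```python
# Hypothetical "configuration, not code" sketch: one generic engine
# interprets a declarative spec, so adding a pipeline means adding config.
SOURCES = {"orders": [{"id": 1, "qty": 10}, {"id": 2, "qty": 5}]}  # stand-in source

def run_pipeline(spec: dict, catalog: list) -> dict:
    """Execute a pipeline described entirely by its spec."""
    target = {row["id"]: row for row in SOURCES[spec["source_table"]]}
    if spec.get("publish_metadata"):  # register the data set for discovery
        catalog.append({"name": spec["name"], "rows": len(target)})
    return target

catalog: list = []
spec = {"name": "orders_ready", "source_table": "orders", "publish_metadata": True}
print(run_pipeline(spec, catalog))  # {1: {'id': 1, 'qty': 10}, 2: {'id': 2, 'qty': 5}}
print(catalog)                      # [{'name': 'orders_ready', 'rows': 2}]
```

The design choice the interview describes is the same one shown here in miniature: the per-source knowledge lives in the engine, so onboarding source number five hundred costs a config entry, not a development project.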
So as a metric for success, for customers that buy our solution to enable large-scale strategic initiatives—where they have dozens to hundreds of data sources—one of the key metrics is how many data sources we have onboarded and made available, and how many analytics-ready data sets we have published or made available to their users. And I'll give you an example, another example from one of our customers, a very large corporation in the United States. Before Attunity, after trying to move to the cloud and build a cloud data lake and analytics platform, in two years they were able to move only two or three data sets to the cloud. After they deployed our data integration platform, they moved thirty data sets within three months—a completely different result. And the other thing they pointed out when they talked about their solution is that, unlike traditional data integration software—and they took one of those traditional ETL platforms as an example—where they said it takes seven months to get a new person skilled on the platform, with our data integration platform they could do that in a matter of hours to a few days. So again, the ability to get results much faster is completely different when you have that kind of software, which goes back to the dimension of automation versus development-based approaches. >> It really seems like the industry's going through another step function, just as we saw from traditional data warehouses to when Hadoop rolled out—just the order of magnitude of how long it took and the business value returned. It seems like we're going through yet another step function there. So, final thing: what are some of the first things that people usually get started with, and any final takeaways you want to share? >> Sure. First, for what people are starting to work with: they're usually selecting a platform of choice where they're going to get started with respect to where they run analytics. And the one takeaway I'll give customers is: don't assume that the platform you chose is where you're going to end up, because new technologies come to market, new options come, customers go through mergers and acquisitions—things change all the time. As you plan, make sure you have the right infrastructure to allow you to support and make changes as you move through these waves of innovation. So that may be the key takeaway. And the other one is: make sure you're building the right infrastructure that can accommodate speed, in terms of real time; accommodate scale, in terms of enabling both data lakes and leading cloud data stores with the right efficiency to scale; and enable agility, in respect to being able to deploy solutions much, much faster. >> Well, Itamar, I think those are some real important things to say. We know that the only constant in the industry is change, and therefore we need solutions that can help keep up with that and be able to manage those environments. And, you know, the role of IT is to be able to respond to those needs of the business fast, because if I don't choose the right thing, the business will go elsewhere. Itamar Ankorion, thank you so much for sharing all the latest on the data integration platforms. >> Thank you. >> All right, and as always, there's lots more on theCUBE.net. I'm Stu Miniman, and as always, thanks for watching.

Published Date : May 16 2019


Itamar Ankorion & Drew Clarke, Qlik | CUBE Conversation, April 2019


 

>> From the Silicon Angle Media Office in Boston, Massachusetts, it's theCUBE. Now here's your host, Stu Miniman. >> Hi, I'm Stu Miniman, and welcome to a special edition of CUBE Conversations here in our Boston area studio. First of all, to my right, a first-time guest on the program, Drew Clarke, who's the chief strategy officer at Qlik; and welcome back to the program Itamar Ankorion, who's a senior vice president of enterprise data integration, now with Qlik, with a new title due to the acquisition of Attunity. So thanks so much for joining us, gentlemen. >> Great to be here. >> All right, Drew, you know, Attunity we've had on the program, and we've had Qlik on the program, but maybe for the audience just give us a quick level set on Qlik. And you know, the acquisition is some exciting news, so let's start there and we'll get into it. >> Sure, thanks, Stu. Qlik—we're a twenty-five-year-old company in the business analytics space. A lot of people know about our products, QlikView and Qlik Sense. We have fifty thousand customers around the world, from large companies to kind of small organizations. >> Yeah. All right, so we talk a lot about data on our program. You know, I looked through some of the Qlik documentation, and it resonated with me a bit, because when we talk about digital transformation on our program, the key thing that differentiates the old way of doing things from the modern is that I need to be data driven—I need to make my decisions on the analytics piece of that. So, Itamar, let's start there and talk about, other than the logo on your card changing, what's the same and what's different going forward for you? >> Well, first, we're excited about this merger and the opportunity that we see in the market, because there's a huge demand for data, primarily for doing new types of analytics and business intelligence that are fueling the transformation. And part of the main challenge customers and organizations have is making more data available faster, and putting it in the hands of the people who need it. So for our part, coming from Attunity, we spent the last few years innovating and creating technology that has helped organizations modernize how they create new data architectures—to support faster data, and more agility in terms of enabling data for analytics. And now, together with Qlik, we can continue to expand that, and at the end of the day provide more data out to more people. >> So, you know, Drew, it's interesting: there's been no shortage of data out there. We've been talking about data growth for decades, but actually getting access to and storing data—it's in silos more than ever, it's spread out all over. We say the challenge of our time is really building distributed architectures, and data is really all over the place; and customers—their stats are all over the place as to how much is searchable, how much is available, how much is usable. So explain a little bit the challenge you're facing, and how you're helping move customers along that journey. >> Well, what you bring up is the idea of data and analytics for decision making, and really it's about making that decision making go faster, by getting the right kind of language to the right individuals. And we really believe in this concept of data literacy, and data literacy was defined, I think well, by two professors who co-authored a white paper—one professor was from MIT, the other from Emerson College, a communications school. Data literacy is the ability to read, understand, analyze, and argue with data. And the more you can actually get that working inside an organization, the better you are at decision making and the better competitive advantage you have—whether you're going to win, or you're going to accomplish a mission. And now, with what you said about the proliferation of data, it gets harder: where do you find it? And you need it in real time—and that's where the acquisition of Attunity comes in. >> Okay, I need to ask a follow-up on that. One of my favorite events I ever did was with two other MIT professors—yes, being in the Boston area, we're putting a lot of the MIT professors here—Andy McAfee and Erik Brynjolfsson, who talked about racing with the machine. Because, you know, who's the best chess player out there? Was it the human grandmaster, or was it the computer? And the studies showed that if you put the grandmaster together with the computer, they could actually beat either the best computer or the best person. So when you talk about data and analytics, everybody's looking at the AI and ML pieces, and it's like, OK, how do these pieces go together? How does that fit into the data literacy piece—the people and the machine learning? >> Well, what you bring up is the idea of augmenting the human, and we believe very much in a cognitive interface between the technology, the software, and the person at that decision-making point. And so what you'll see from our own perspective is that we were part of a second generation of BI—of self-service—and we've moved rapidly into this third generation, which is the cognitive augmentation of the decision maker. And so, to that idea that data literacy is arguing with data: well, how do you argue and actually have up-to-date machine learning recommendations, with the human still making the decision? That's an important component of our own technology that we bring to the table. And with Attunity, the data side needs to be there faster and more effectively. >> Yeah. So, Itamar, please fill us in on that. With big data, we talked about the three V's. So where are we today? How do I get at and leverage all of that data? >> So that's exactly where we've been focused over the last few years, working with customers that were focused on building new data lakes, new data warehouses, looking at the cloud—building basically modern, new foundations for enabling the organization to use way more data than ever before. So it goes back to the volume, at least one V out of the ones you mentioned. And the other one, of course, is the velocity—how fast it is. And I've actually come to see that there are, in a sense, two dimensions of velocity that come together. One is how timely the data you're using is, and one of the big changes we're seeing in the market is that the user expectation and the business need for real-time data is becoming ever more critical. We used to talk to customers about real-time data meaning that when they ask for data, they get a response very quickly—but it's last week's data. Well, that doesn't cut it anymore. So what we're seeing is that, first of all, there's the dimension of getting data that is real-time data—data that represents things as they currently are—and the second one is how quickly you can actually make that happen. Because business dynamics change much faster now, and the speed of change in the industry accelerates, customers need the ability to put solutions together and make data available to answer business questions much faster. They cannot do it in the order of months and years; they need to do it in the order of days, sometimes even hours. And that's where our solutions come in. >> Yeah, it's interesting. You know, my background is on the infrastructure side; I spent a lot of time in the cloud world. And you talk about what we need for real time: well, it used to be, you rolled out a server, and that took me a week or a month; a VM reduced that time; and now we're containerized in a Kubernetes world, and we're talking a much shorter timeframe. And it's like, oh, if you show me the way something was an hour ago—oh my gosh, that's not the way the world is. And I think for years, in the Hadoop world, we talked about what real time is and how do I really define that. And the answer we usually came up with: it is getting the right information, in the right place, to the right person. Or from the sales standpoint, it's: I need that information to save that client, so they get what they need. So we still use some of those terms—scale and real time sort of require context. But where does that fit into your customer discussions? >> Well, two parts to what you bring up. You know, I think what you're saying is absolutely still true: right data, right person, right time. It gets harder, though, with just the volumes of data. Where is it? How do you find it? How do you make sure the right pieces get to the right place? And you brought up the evolution of the compute infrastructure—analytics likes to be close to the data, but if you have data everywhere, how do you make sure that part works? We've been investing in a lot of our own cloud analytics infrastructure, which is now done on a microservices basis: it runs on Kubernetes clusters, and it can work in whatever cloud compute infrastructure you want—be it Amazon or Azure or Google, or your local platform data centers. But you need that small piece tied to the right kind of data on the side, and so that's where you see a great match between the two solutions. And to the second part: the response from our customers after the acquisition was announced was tremendous. One of our customers, who works in the manufacturing space, said, I think this is exactly what I was looking to do from an analytics space—I needed more data in real time, and I was looking at a variety of solutions. She said, thank you very much, you made my life a little easier; I can narrow down to one particular platform. So we have manufacturing companies, we have military units and organizations, healthcare organizations—I've had just countless feedback coming in along those same lines. >> All right, Itamar, for the Attunity customers, what does this mean for them, coming into the Qlik family? >> Well, first of all, it means we have a much broader opportunity to serve them. Qlik is a much, much bigger company, so we have more resources we can bring to bear, both to continue to enhance the Attunity offering as well as to create integrations with other products, such as the Qlik Data Catalyst, which Qlik acquired several months ago. There's a great synergy between those products—the Attunity products and the Data Catalyst—to provide a much more comprehensive enterprise data integration platform. So while the Qlik data integration platform, consisting of Attunity and the Qlik Data Catalyst, will remain independent and provide solutions for any data platform, analytics platform, or cloud platform, as it already does today, we'll continue to invest in it. There are also opportunities to create unique synergies with some other Qlik analytics technologies, such as the Associative Big Data Index and some others, to provide more value, especially at scale. >> All right, Drew, please expand on that a little bit if you can. There are so many pieces—I know we're going to spend a little time going deeper into some of the other ones. But when you talk to your customers, when you talk to your partners, what do you want to make sure their key takeaways are? >> Right. So there are a couple of important points, as Itamar made, on the data integration platform. That's a combination of the Attunity products plus the Data Catalyst, which was acquired through Podium Data. Both of those components are available, and will continue to be available, for our customers to use with whatever analytics platform they choose. So we have customers who use the data for data science, and they want to work in R and Python and their own kind of machine learning, or work with platforms like DataRobot—and they'll be able to continue to do that with that same speed. They also could be using other analytics and visualization tools—we actually have a number of customers who do that, and we'll continue to support it. So that's the first point, and I think, as Itamar said, it's the important one. The second is, we do think there is some value in using Qlik Sense with the platform, and we've been investing in a platform called the Associative Big Data Index. That sounds like a very complicated piece, but what we've done is taken our unique value proposition as an analytics company—which is the ability to work with data, ask questions of it, and have the answers come to you very quickly—and taken that same associative experience that people use in our products and brought it down to the data lake. And that's where you start to see the same kind of thing people love about QlikView and Qlik Sense brought into the data lake, and that's where, as Itamar was bringing up, from a scale perspective, you have both kinds of opportunities. >> Drew, I really appreciate you sharing the importance of these coming together. We're going to spend some more time digging into the individual pieces. I might be able to ask: are we past the data lakes? Has it got to a data swamp, or a data ocean? Because, you know, there are lots of sources of data, and, as I always say, a lake seems a little bit more pristine than the average environment. So thank you both so much, and we look forward to having more conversations with you. All right, and be sure to check out theCUBE.net for all our videos. I'm Stu Miniman. Thanks so much for watching.

Published Date : May 16 2019


Drew Clarke, Qlik | CUBE Conversation, April 2019


 

>> From the SiliconANGLE Media office in Boston, Massachusetts, it's theCUBE. Now here's your host, Stu Miniman. >> Hi, I'm Stu Miniman, and this is a CUBE Conversation from our Boston area studios. The ecosystem around data and analytics definitely isn't becoming any simpler today. Joining me for this segment is Drew Clarke, who's the chief strategy officer at Qlik. And Drew, let's start there: we talk about the wave of big data, a lot of companies have wrapped themselves in the cloak of AI today, and you've got machine learning in there. So help give us a little bit about where Qlik fits into that ecosystem and differentiates itself from this very diverse field. >> Yeah, sure, and I get that question a lot, Stu: who is Qlik, and what makes us unique? As a strategy professional, I spend a lot of time talking and working with customers that are looking at companies, and I always come back to: what is that core part? Every company comes from something, and then how does it fit into the landscape? So I actually use our history to explain a little bit about who we are. We're a 25-year-old company in the business analytics space, and our very first customer was Tetrapak, which makes cardboard boxes of all different sizes—so if you think about Amazon, when you order something, it shows up at your doorstep or your door in a different-size box. Well, Tetrapak had a problem: their salespeople were selling inventory they didn't have. They needed to be able to sell what they had, but they also wanted to make sure they showed what they did not have. So they signed on and did a project with Qlik—this was in Sweden—and they developed a product which is really a product configurator tied to a visualization. So the answer they had to a business question was: tell me what products are and are not available, and be able to dynamically make selections as sales reps were answering the questions. That was the genesis of our product. We had a choice back then: do we stay in the product-configurator space, or do we move into visualization and analytics? And so we took that unique package—what we call the associative engine, with the visual piece—and we went and started on the business intelligence, or analytics, journey. And for where we've evolved as a company, another great example is another customer: a couple of years ago there was the tsunami in Japan—do you remember that, Stu? >> Of course. >> When that happened, one of our customers was in consumer products, and they had a lot of supply, or ingredients, that came out of Japan. And they knew, okay, the tsunami hit—big impact on their supply chain—and they had to actually make an announcement; they had earnings on Wall Street, and they needed to be able to tell their investors within the week whether this was a big impact or not a big impact on their forward-looking revenues. And they tried answering the question using traditional analytics: show me what products were impacted by the tsunami. That's a first-order question; as you know, it's an easy question to ask. Well, now you're going down into the ingredients, you're looking at where the data is in the supply chain, and you come back with an answer that says: these are the ones that are impacted. The next question the business asked was: okay, tell me what products were not affected.
And now think about that "not" question—going through every single row. Oh, and tell me what the inventory is, and can we run campaigns and sales where we know we're either going to miss our revenue numbers or we're going to hit them. And they tried a different, traditional way of answering the question, and they couldn't answer it, because they got stuck at that first question. It was Qlik that actually came in and helped them answer the second question—show what products were not affected, and do we have inventory—and they were able to make that decision. So that's where we start: what makes us unique is this combination of analytics and a visual interface, and that's been our core differentiator in the market, from 25 years ago to where we are today. >> Yeah, and boy, that history has changed quite a lot. Think about data visualization: we used to do infographics many years ago—just, how do I tell a story with that data? There are the creative things you can do with it, but as well, us as humans, we look at all of those data points out there, and most of the time it's not static. I love when people are sharing—it's like, okay, let me give you charts for something over a 100-year period, and you can watch it ebb and flow and change and the like. So there's so much technology: 25 years ago, cloud had many different terms—I can argue, having worked with plenty of people, that we had the xSPs back in the 90s and the pre-cloud things. But there are some challenges that we've been trying to solve, and some major breakthroughs we've had, with some of these journeys and these technology waves. So bring us up to today: we talk about things like speed and scale and agility impacting what we're doing. You've got the why and the core, but the how and the what have changed dramatically. >> Stu, you really are a technical guy at heart, right? So one of the things you said at the beginning there, where you talked about looking at an infographic, and the human component of: how do you look at this information, how do you understand it? It's getting bigger and harder to understand. One of the things that we firmly believe in is that the human being is an integral part of the decision-making process. And so, you think about a scatter plot with 30,000 data points—how do you actually make sense of it? We spend a lot of time on the human brain and how it looks at information at this big-data scale: we're a predator as a human, we're binocular, and we look for certain things, and so we spend a lot of time around that visual interface. And I think Stephen Few writes about this, and Edward Tufte, with his documentation around how you present information in a great way. Well, you take those 30,000 data points on a scatterplot, and, bringing it forward in our technology, we show density as heat, because that's what we look for. And we look for patterns, and we look for outliers, as a predator, as an individual. So we present the information in a way that a human is wired to receive it. But underneath—and this is where I think your second part was going—underneath is: how do you keep that elegance while responding to modern compute and infrastructure at all these sizes?
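A minimal sketch of that associative "excluded values" idea, using plain Python sets: selecting the affected ingredients immediately partitions products into associated (affected) and excluded (not affected) sets, so the "not" question costs one set difference rather than a hand-written query per exclusion. The data here is invented for illustration; Qlik's actual engine is far more general than this.

```python
# Illustrative associative selection over two linked tables:
# products -> ingredients, with some ingredients sourced from Japan.
product_ingredients = {
    "choc_bar":    {"cocoa", "sugar", "matcha"},
    "fruit_snack": {"apple", "sugar"},
    "green_tea":   {"matcha"},
}
affected_ingredients = {"matcha"}  # the selection: tsunami-impacted supply

# Products associated with the selection (they use an affected ingredient)
affected = {p for p, ing in product_ingredients.items()
            if ing & affected_ingredients}
# The "second question": everything NOT associated, via one set difference
not_affected = set(product_ingredients) - affected

print(sorted(affected))      # ['choc_bar', 'green_tea']
print(sorted(not_affected))  # ['fruit_snack']
```

The point of the design is that the excluded set is maintained alongside the included one, so the follow-up question is as cheap to ask as the first.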
There was a tool that was, oh let me train this AI on Twitter and what they got back they had to turn it off really quick because it became a troll and then much worse and the language was awful, so sometimes if you just let the data run wild the algorithm doesn't understand what's going on. How do you balance that and make sure we're getting good decisions and good information? And we say, if you automate a bad process you haven't done a good thing. >> Right, right. Well that comes through a number of layers from automation there's kinda the data, getting it from the raw source, getting it ready for the analytical consumption, and is it a machine, is it a human, is it a human augmented with kinda the intelligence? And as you progress through this data journey of bringing the data into now the common terms are data lakes and data swamps. How do you find the right information and where do you put the right kinda governance? And governance, not being a bad word, but governance being a, I'm confident that information is correct. And so you see the introduction of data catalogs, so much like a card catalog in a library if you're old enough to actually remember that. >> I know the Dewey Decimal System. >> Okay, there you go so I was a page, that was one of my first paying job was to put books back in the library. And you want to be able to find the right information and know that it's been curated, been set up, but it doesn't have to be written all out. You want to have that progressive kind of bringing of that information for the user to be able to do that. And as you kind of fan out from the central that raw data out to kinda where the analytics users are kinda engaging and working with it. That governance allows for that confidence, but then you need to know that you're scaling and the speed. You don't want to wait if you had a request. The decision just like, even what happened to that customer, tsunami happened, I have earnings on a set day in days from an event. I can't wait a month to come up with the answer. I need that speed, I need that faster. >> Alright, so who's the one inside the customer that work's on this, you know, we've all heard that there's skill gaps out there. Years ago it was like, okay we're going to build this giant army of data scientists. It's not like we're saying we don't need data scientists but we don't have enough time to train enough PhD's to fill the jobs. So where are we today, where do the customers fit organizationally, and if you can get into a little bit of where the product touches them? >> Sure, so what you bring up is the. Great interviewer, broad question, so many different ways we can go with this. And I come back to the idea of what a lot of people come and talk about is this citizen data scientist, but it's really about data literacy. And these are individuals who need to be comfortable working with data, and how do you actually have that confidence level of when I'm looking at it do I know is it real? Am I having the right conversation? Just recently I had the opportunity to see a number of presentations by college seniors who were presenting their senior thesis' on how they're working on a particular theme. And I was in this behavioral sciences and leadership department, it was at the United States Military Academy at Westpoint. 
And when you think about leadership and you think about behavioral sciences and you think about a lot of the softer side of it, but everyone of these cadets had data and you can see them looking at the empirical data, looking at the R coefficients, is this noise, is this signal, what's causation versus correlation. What you see is this language of data literacy in the curriculum and you flash forward and you look at every department in a company and you see people are coming in who understand there's data that can be used to be informing my decision so I don't need to wait for this white lab coat PhD on data science. It's like well, is there causation is there correlation? So marketing, finance, sales, we're seeing this at that data citizen at the edges in a company and it's coming out of the universities. >> Yeah I was at a conference recently and the analysts up on the keynote stage says, you want to teach your team machine learning? Get a summer intern that's taken the courses and have them spend a week training you up on it. So excellent, so sounds like if someone wants to get started with Qlik, relatively low bar. I don't have to go through some six month training class to be able to start getting some business value and rolling this out. >> Yeah, exactly. Stu, you can go right on our website and you can sign up and start to use our product right in the cloud. If you want to put it on a desktop you can do that. And you just drag in your first data files and I encourage you to actually bring in a complicated dataset. Don't go with a simple excel file, a lot of companies can do bars, charts, and graphs. But what you really want to do is bring in two different datasets and bring it into, and remember the associative engine of bringing different data together? And it's the second and third question that you're really looking for those insights. And so you can very quickly assemble the information. You don't need to go back and learn what a left outer joint is because our engine takes care of that for you. You want to understand what's going on? It's transparent. And then you start finding insights within minutes of being able to use that. >> Yeah well if you go back to the Hitchhikers Guide to the Galaxy, sometimes the answer's easy, I have to know the right questions to be able to ask. Alright, Drew I want to give you the final takeaway for this piece. >> Okay, so if you're thinking about dealing with any data and you want to answer not just the question, but it's usually the second and third, and you want to have a speed of use. You can do that with our platform, but think about it really in that concept of data literacy and you want that right information for the individuals to read and write, that's okay and it's easy. It's analyzing and arguing and that's where the competitive advantage so take a look at that. >> Alright, well Drew Clarke really appreciate the updates on Qlik and be sure to check out theCUBE.net. There's a nice little search bar on top, you can search by company, search by person, actually a lot of the key metadata you can search for in there. Thousands of videos in there. Never a registration to be able to get it. So I'm Stu Miniman and thanks as always for watching theCUBE. (upbeat music)

Published Date : May 13 2019


Peter MacDonald & Itamar Ankorion | AWS re:Invent 2022


 

What are you hearing from your customers? Why do they care? Why are they going down this road? Peter, we'll start with you. >> Yeah, I'll go ahead and start, thanks. I'd say we continue to see customers being very eager to transform their businesses, and they know they need to leverage technology and data to do so. They're also increasingly depending upon the cloud to bring the agility, the elasticity, and the new functionality necessary to react in real time to ever-evolving customer needs. You look at what's happened over the last three years, and boy, the macro environment, the customers, it's all changing so fast. With our partnerships with AWS and Qlik, we've been able to bring to market innovative solutions like the one we're announcing today that spans all three companies. It provides a holistic, integrated solution for our customers. >> Itamar, let's get into it. You've been with theCUBE, you've seen the journey, you have your own journey, many, many years, you've seen the waves. What's going on now? I mean, what's the big wave? What's the dynamic powering this trend? >> Yeah, in a nutshell I'll call it: it's all about time. You know, it's time to value, and it's about real-time data. I'll talk about that a bit. You hear a lot about data being the new oil, and we definitely see more and more customers treating data as their critical enabler for innovation and digital transformation. They look for ways to monetize data. They look at data as the way in which they can innovate and bring different value to their customers. So we see customers wanting to use more data to get more value from data. We definitely see them wanting to do it faster than before. And we definitely see them looking to agility and automation as ways to accelerate time to value and also reduce overall costs. I did mention real-time data, and we definitely see more and more customers who want to be able to act and make decisions based on fresh data. Yesterday's data is just not good enough. >> John: Yeah. >> It's got to be down to the hour, down to the minutes, and sometimes even lower than that. And then I think we're also seeing customers look to their core business systems where they have a lot of value, like SAP, like the mainframe, and thinking, okay, our core data is there, how can we get more value from this data? So those are key things we see all the time with customers. >> Yeah, we did a big editorial segment this year on what we called data as code. Data as code is kind of a riff on infrastructure as code, and you start to see data proliferating into all aspects, fresh data. It's not just where you store it, it's how you share it, it's how you turn it into an application, intrinsically involved in all aspects. This is the big theme this year, and that's driving all the conversations here at RE:Invent. And I'm guaranteeing you, it's going to keep happening for another five and 10 years. It's not stopping. So I've got to get into the solution. You guys mentioned SAP, and you've announced the solution by Qlik, Snowflake and AWS for your customers using SAP. Can you share more about this solution? What's unique about it? Why is it important and why now? Peter, Itamar, we'll start with you first. >> Let me jump in — I'll jump in because I'm excited. We're very excited about this solution, and it's also, by the way, a solution where we've already seen proven customer success.
So to your point, it's ready to scale, and I think we're going to see a lot of companies doing this over the next few years. But before we jump to the solution, let me take a few minutes just to clarify the need — why we're seeing customers jump to do this. Customers that use SAP use it to manage the core of their business. So think order processing and management, finance, inventory, supply chain, and so much more. So if you're running SAP in your company, that data creates a great opportunity for you to drive innovation and modernization. What we see customers wanting to do is more with their data, and more means they want to take SAP with non-SAP data and use it together to drive new insights. They want to use real-time data to drive real-time analytics, which they couldn't do to date. They want to bring together descriptive and predictive analytics, adding machine learning and AI to drive more value from the data. And naturally they want to do it faster — to find ways to iterate faster on their solutions, with freedom and agility around the data. And this is really where cloud data platforms like Snowflake and AWS bring the value to be able to drive that. Now, to do that you need to unlock the SAP data, which is a lot of where Qlik comes in, because the typical challenge these customers run into is the complexity inherent in SAP data: tens of thousands of tables, proprietary formats, complex data models, licensing restrictions, and more. Then there are the performance issues they usually run into — how do we handle the throughput and the volumes while maintaining low latency and impact? And where do we find the knowledge to really understand how to get all this done? These are the things we looked at when we came together to create a solution, and they're what make it unique. When you think about its uniqueness, we put together a lot, and I'll go through three or four key things that come together to make this unique. First is SAP data delivery: how do you get the data from ECC, from HANA, from S/4HANA; how do you deliver the data and the metadata; and how does that integrate well into Snowflake? What we've done is focus a lot on optimizing that process and the continuous ingestion — the real-time ingestion of the data — in a way that works really well with the Snowflake Data Cloud. The second thing is SAP data transformation: once the data arrives in Snowflake, how do we turn it into being analytics-ready? That's where data transformation and data warehouse automation come in, and these are all elements of this solution — creating derivative datasets, creating data marts. All of that is done by, again, creating an optimized integration that pushes down SQL-based transformations so they can be processed inside Snowflake, leveraging its powerful engine. And then the third element is bringing together data visualization and analytics that can take all the data that's now organized inside Snowflake, bring other data in, bring machine learning from SageMaker, and then create a seamless integration to bring analytic applications to life. So these are all things we put together in the solution. And maybe the last point is that we actually took the next step with this and created something we refer to as solution accelerators, which we're really, really keen on.
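Before getting to the accelerators, here is a minimal sketch, in Snowpark for Python, of the pushdown pattern just described — a transformation that compiles to SQL and runs inside Snowflake's engine. The schema (replicated SAP order tables named VBAK and VBAP, with a handful of columns) is an illustrative assumption, not the product's actual model; in the joint solution, Qlik generates and orchestrates logic like this automatically.

```python
# Hedged sketch: a pushdown transformation with Snowpark for Python.
# Assumes the SAP tables were already replicated into Snowflake by a CDC
# tool; table and column names here are illustrative stand-ins.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "ANALYTICS_WH", "database": "SAP_RAW", "schema": "ECC",
}).create()

orders = session.table("VBAK").select("VBELN", "VKORG", "ERDAT")  # order headers
items = session.table("VBAP").select(col("VBELN").alias("ITEM_VBELN"),
                                     "NETWR")                     # line items

# Derive an analytics-ready mart: net order value by sales org and day.
daily_value = (
    orders.join(items, orders["VBELN"] == items["ITEM_VBELN"])
          .group_by("VKORG", "ERDAT")
          .agg(sum_("NETWR").alias("NET_ORDER_VALUE"))
)

# The DataFrame chain compiles to SQL and executes inside Snowflake;
# save_as_table materializes the result without the data ever leaving.
daily_value.write.mode("overwrite").save_as_table("SAP_MARTS.SALES.DAILY_ORDER_VALUE")
```

The design point is that the whole chain runs as a single statement on Snowflake's engine, so the transformation scales with the warehouse rather than with an external ETL server.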
Think of the accelerators as prepackaged templates for common business analytic needs like order to cash, finance, inventory. And we can dig into that a little more later, but this gets the next level of value to customers, all built into this joint solution. >> Yeah, I want to get to the accelerators, but real quick, Peter, your reaction to the solution — what's unique about it? And obviously with Snowflake we've been seeing the progression to data applications, more developers developing on top of Snowflake; data as code kind of implies a developer ecosystem. This is kind of interesting. I mean, you're partnering with Qlik and AWS — it's a developer-style, real solution. What's unique about this SAP solution that's different from what customers can get anywhere else? >> Yeah, well listen, I think first of all you have to start with the idea of the solution. These are three companies coming together to build a holistic solution that is all about creating a great opportunity to turn SAP data into value, as Itamar was talking about — that's really what we're talking about here, and there's a lot of technology underneath it. I'll talk more about the Snowflake technology involved here, and then cover some of the AWS pieces as well. But we're focusing on getting that value out and accelerating time to value for our joint customers. As Itamar was saying, there's a lot of complexity with the SAP data and a lot of value there. How can we manage that in a prepackaged way, bringing together best-of-breed solutions with proven capabilities, and bring this to market quickly for our joint customers? Snowflake and AWS have been strong partners for a number of years now, and that's not only in how Snowflake runs on top of AWS, but also in how we integrate with their complementary analytics and other products. So we want to be able to leverage those in addition to what Qlik is bringing in terms of the data transformations, bringing data out of SAP, and the visualization as well — all very critical. And then we want to bring in the predictive analytics AWS brings with SageMaker; we'll talk about that a little bit later on. Some of the technologies we're leveraging are our latest cutting-edge technologies that really make things easier for both our partners and our customers. For example, Qlik leverages Snowflake's recently released Snowpark for Python functionality to push down those data transformations from Qlik into Snowflake that Itamar was mentioning, and we also leverage Snowpark for integrations with Amazon SageMaker. There's a lot of great new technology that just makes this easy and compelling for customers. >> I think that's the big word — easy button — here for what may look like a complex kind of integration: kind of turnkey, a really compelling example of the modern era we're living in, as we always say in theCUBE. You mentioned accelerators, SAP accelerators. Can you give an example of how that works with the technology from the third-party providers to deliver this business value, Itamar? 'Cause that was an interesting comment. Give an example of this acceleration. >> Yes, certainly. I think this is something that really makes this truly, truly unique in the industry, and again, a great opportunity for customers. So we talked earlier about how there's a lot that needs to be done with SAP data to turn it into value.
And these accelerators, as the name suggests, are designed to do just that — to jumpstart the process and reduce the time and the risk involved in such projects. So again, these are prepackaged templates. We basically took a lot of knowledge, a lot of configurations, and best practices about how to get things done, and we put them together. Think about all the steps it includes: things like data extraction — already knowing which tables, all the relevant tables, you need to get data from in the context of the solution you're looking for, say order to cash (we'll get back to that one); how you continuously deliver that data into Snowflake in an efficient manner, handling things like data type mappings, metadata naming conventions, and transformations; the data models you build, all the way to data mart definitions and all the transformations the data needs to go through, moving through steps until it's fully analytics-ready. And then on top of that, even adding a library of comprehensive analytic dashboards and integrations with machine learning and AI, and putting all of that together in a way that's pre-integrated and tested to work with Snowflake and AWS. So this is where, again, you get this entire recipe, ready to go. Take, for example — I think I mentioned order to cash. For those who are not familiar, order to cash is a critical business process for every organization, especially in retail and manufacturing; it's big. This is where, you know, it starts with booking a sales order, followed by fulfilling the order, billing the customer, and then managing the accounts receivable when the customer actually pays, right? So in this whole process, you've got sales order fulfillment and billing that impact customer satisfaction, and you've got receivables and payments that impact working capital and cash liquidity. So as a result, this order-to-cash process is the lifeblood for many businesses, and it's critical to optimize and understand. The solution accelerator we created specifically for order to cash takes care of understanding all these aspects and the data that needs to come with it — everything we outlined before — to make the data available in Snowflake in a way that's really useful for downstream analytics, along with dashboards that are already common for that use case. So again, this enables customers to gain real-time visibility into their sales orders, fulfillment, and accounts receivable performance. That's what the accelerators are all about. And very similarly, we have another one, for example, for finance analytics, which optimizes financial data reporting and helps customers get insights into P&L and financial risk and stability; or inventory analytics, which helps improve planning, inventory management, and utilization, and increase efficiencies in the supply chain. So again, these accelerators really help customers get a jumpstart and move faster with their solutions. >> Peter, this is the easy button we just talked about — getting things going, getting the ball rolling, getting some acceleration. A big part of this is the three companies coming together doing this. >> Yeah, and to build on what Itamar just said, the SAP data obviously has tremendous value.
Those sales orders, distribution data, financial data — bringing that into Snowflake makes it easily accessible, but it also enables it to be combined with other data, which is one of the things Snowflake does so well. So you can get a full view of the end-to-end process and the business overall. For example, I'll take one that may not come to mind right away: looking at the impact of weather conditions on supply chain logistics is relevant and material and of interest to our customers. How do you bring those different data sets together in an easy way — bringing the data out of SAP, bringing maybe other data out of other systems through Qlik or through Snowflake, bringing data in directly from our data marketplace — and make it all work together? You know, fundamentally, the organizational silos and data fragmentation that exist otherwise make it really difficult to drive modern analytics projects, and that in turn limits the value our customers get from SAP data and these other data sets. We want to enable that and unleash it. >> Yeah, time to value. This is great stuff. Itamar, final question: who's using this? I'm sure you already have customer examples using the solution. Can you share what these examples look like — the use cases and the value? >> Oh yeah, absolutely. Thank you, happy to. We have customers across different sectors — manufacturing, retail, energy, oil and gas, CPG. Customers in those sectors typically have SAP, and we have customers in all of them. A great example is Siemens Energy, a global provider of gas and power services — roughly 28 to 30 billion in revenue, 90,000 employees, operating globally in over 90 countries. They've used SAP HANA as a core system, running on-premises in multiple locations around the world, and what they were looking for is a way to bring all this data together so they can innovate with it. And the thing is, as Peter mentioned earlier, it's not just the SAP data, but also data from other systems, brought together for more value — that includes finance data, logistics data, customer CRM data. So they bring data from over 20 different SAP systems with Qlik data integration, feeding it into Snowflake in under 20 minutes, 24/7, 365 days a year. They get data from over 20,000 tables — hundreds of millions of records going in daily. So it's a great example of the scalability, agility, and speed they can get to drive this kind of innovation. That's a great example with Siemens. Another one that comes to mind is a global manufacturer — a very similar scenario, but they're using it for real-time executive reporting. So it's more like visibility into the production data, as well as financial analytics — think about everything from audit to taxes to financial intelligence, because all the data's coming from SAP. >> It's a great time to be in the data business again. It keeps getting better and better. There's more data coming, it's not stopping, it's growing so fast, it keeps coming. Every year it's the same story, Peter — it just doesn't stop. As we wrap up here, let's get customers some information on how to get started.
I mean, obviously you're starting to see the accelerators — it's a great program there. What a great partnership between the two companies and AWS. How can customers get started and learn about the solution, and take advantage of it to get more out of their SAP data, Peter? >> Yeah, I think the first place to go is to talk to Snowflake, talk to AWS, talk to the account executives assigned to your account. Reach out to them and they will be able to educate you on the solution. We have it packaged up very nicely, and it can be deployed very, very quickly. >> Well gentlemen, thank you so much for coming on. Appreciate the conversation. Great overview of the partnership between Snowflake, Qlik and AWS on a joint solution for getting more out of SAP data — it's really a key solution, bringing SAP data to life. Thanks for coming on theCUBE. Appreciate it. >> Thank you. >> Thank you, John. >> Okay, this is theCUBE coverage here at RE:Invent 2022. I'm John Furrier, your host of theCUBE. Thanks for watching. (upbeat music)
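As a rough illustration of the order-to-cash accelerator Itamar described, the sketch below computes one typical KPI — average days from sales order to cleared payment — with the aggregation pushed into Snowflake. The table and column names are hypothetical stand-ins, not the accelerator's actual data model.

```python
# Hedged sketch of an order-to-cash KPI of the kind an accelerator might
# ship. The mart tables and columns below are illustrative assumptions.
from snowflake.snowpark import Session

def order_to_cash_cycle_days(session: Session):
    # The aggregation runs inside Snowflake; only summary rows come back.
    return session.sql("""
        select o.vkorg                                as sales_org,
               avg(datediff('day', o.erdat, p.augdt)) as avg_days_order_to_cash
        from   sap_marts.sales.orders o
        join   sap_marts.finance.cleared_payments p
               on p.vbeln = o.vbeln
        group  by o.vkorg
        order  by avg_days_order_to_cash desc
    """).collect()
```

An accelerator's bundled dashboards would then trend a number like this alongside fulfillment and receivables metrics.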

Published Date : Nov 23 2022



Sanjeev Mohan, SanjMo & Nong Li, Okera | AWS Startup Showcase


 

(cheerful music) >> Hello everyone, welcome to today's session of theCUBE's presentation of AWS Startup Showcase, New Breakthroughs in DevOps, Data Analytics, Cloud Management Tools, featuring Okera from the cloud management and migration track. I'm John Furrier, your host. We've got two great special guests today: Nong Li, founder and CTO of Okera, and Sanjeev Mohan, principal at SanjMo and former research vice president of big data and advanced analytics at Gartner. He's a legend — been around the industry for a long time, seen the big data trends from the past and present, and knows the future. Got a great lineup here. Gentlemen, thank you for this: life in the trenches, lessons learned across compliance, cloud migration, analytics, and use cases for Fortune 1000s. Thanks for joining us. >> Thanks for having us. >> So Sanjeev, great to see you. I know you've seen this movie — I was saying that in the open — at Gartner you saw all the visionaries and leaders; you know everything about this space. It's changing extremely fast, and one of the big topics right out of the gate is not just innovation — we'll get to that, that's the fun part — but the regulatory compliance and audit piece of it. It's keeping people up at night, and frankly, if not done right, it slows things down. A big part of the showcase here is solving these problems. Share your thoughts: what's your take on this wide-ranging issue? >> So, thank you, John, for bringing this up, and I'm so happy you mentioned the fact that there's this notion that it can slow things down. Well, I have to say that the old way of doing governance slowed things down, because it was very much about command and control. But the new approach to data governance is, in my opinion, actually liberating data. If you want to democratize or monetize — whatever you want to call it — you cannot do it 'til you know you can trust said data and it's governed in some way. So data governance has actually become very interesting, and today, to talk about three different areas within regulatory compliance, for example: we all know about the EU GDPR; we know California has CCPA, and in fact California is now getting an even more stringent version called CPRA in a couple of years, which is more aligned to GDPR. That's the first area — we know we need to comply with that, we don't have any way out. But then there are other areas: there's insider trading, and there's how you secure the data that comes from third parties — vendors, partners, suppliers. So Nong, I'd love to hand it over to you and see if you can shed some light on how our customers are handling these use cases. >> Yeah, absolutely, and I love what you said about balancing agility and liberating data in the face of what may be seen as things that slow you down. So we work with customers across verticals, with old and new regulations. You brought up GDPR — one of our clients is using this to great effect to power their ecosystem. They're a very large retail company that has operations and customers across the world; obviously the importance of GDPR and the requirements it imposes on them are very top of mind, and at the same time, being able to do effective targeting analytics on customer information is equally critical, right? So they're exactly at that spot where they need this customer insight to power their business, and the regulatory concerns are extremely prevalent for them.
So in the context of GDPR, you'll hear about things like consent management and right to be forgotten, right? I, as a customer of that retailer, should be able to say "I don't want my information used for this purpose — use it for this, but not this." And you can imagine, at a very, very large scale, when you have a billion customers, managing that — all the data you've collected over time through all of your devices, all of your telemetry — is really, really challenging. And they're leveraging Okera, embedded into their analytics platform, so they can do both, right? Their data scientists and analysts who need to do everything they're doing to power the business don't have to think about these very granular customer-filtering requirements; they leverage us to do that. So that's the new stuff — GDPR, relatively new at this point — but we obviously also work with customers that have regulations from a long, long time ago. I think you also mentioned insider trading and the supply chain. So we'll talk to customers, and they want really data-driven decisions on their supply chain — everything about their production pipeline. They want to understand all of that, and of course that makes sense: whether you're the CFO or anyone making business decisions, you need that information readily available, and supply chains, as we know, get more and more complex, more and more integrated into manufacturing and other verticals. So there you're a little bit stuck, right? You want to be data-driven on those supply chain analytics, but at the same time, knowing the details of the supply chain across all of your dependencies exposes your internal team to blackout periods or insider trading concerns. For example, if you knew Apple was buying a bunch of something, that's maybe information only a select few people can have. And the way that manifests into data policies is that you need very, very scalable, per-employee data restriction policies, so people can do their jobs more easily, right? If we talk about speeding things up: instead of a very complex process for them to get approved — and approved against SEC regulations and all that kind of stuff — you can now give them access to the part of the supply chain that they need, and no more, and limit their exposure and the company's exposure. So one of our customers was able to do this and got a two-orders-of-magnitude, 100x reduction in the policies needed to manage a system like that. >> When I hear you talking like that, I think the old view of "Oh yeah, regulatory, it kind of slows down innovation, got to go faster" involved pretty basic variables — not a lot of combinations of things to check. Now with cloud there seem to be combinations, Sanjeev, because how complicated has the regulatory compliance and audit environment gotten in the past few years? I hear security in the supply chain, I hear insider threats — these are security challenges, not just compliance-department, G&A kind of functions. You're talking about large-scale, potentially combinations of access and distribution; I mean, it seems complicated. How much more complicated is it now than it was just a few years ago? >> So, the way I look at it is — and I'm just mentioning these companies as an example — when PayPal or Ebay, all these companies, started, they started in California.
Anybody who ever did business on Ebay or PayPal, guess where that data was? In the US, in some data center. Today you cannot do that. Today, data residency laws are really tough, and so now these organizations have to really understand what data needs to remain where. On top of that, we now have so many regulations. Earlier on, if you were in healthcare, you needed to be HIPAA compliant; in banking, PCI DSS. But today, in the cloud, you really need to know: what data do I have, what sensitive data do I have, and how do I discover it? So that data discovery becomes really important. Then, what roles do I have? For example, let's say I work for a bank in the US and I decide to move to Germany. Now, the old-school approach is that a new rule will be created for me, because of German... >> John: New email address, all these new things happen, right? >> Right, exactly. So you end up with a mass of rules and... and these are all static. >> Rules and tools, oh my god. >> Yeah. So Okera actually makes a lot of this dynamic, which reduces your cloud migration overhead. And Nong used some great examples — in fact, sorry if I take just a second, without mentioning any names: one of the largest banks in the world is going global in the digital space for the first time, and they're taking Okera with them. So... >> But what's the point? This is my next topic: cloud migration. I want to bring this up because of complexity. When you're in that old-school kind of data center, waterfall world with these old rules and tools, you have to roll this out, and it's a pain for everybody — a huge hassle. Cloud gives the agility, we know that, and cloud's becoming more secure; and I think now people see on-premises — certainly things that need to stay on-premises for security, I get that. But when you start getting into agility, and you now have cloud regions, you can start being more programmatic. So I want to get your thoughts on cloud migration: companies are now lifting and shifting and replatforming — and what's the refactoring beyond that? Because you can replatform in the cloud, and some are still holding back on that; but once you're in the cloud, the companies that are winning are the ones refactoring in the cloud, doing things differently with new services. Sanjeev, you start. >> Yeah, so in fact a lot of people tell me, "You know, we're just going to lift and shift into the cloud." But then you're literally using cloud as a data center. You still have all the — if I may say — junk you had on-prem; you just moved it into the cloud, and now you're paying for it. In cloud, nothing is free: every bit of storage, every bit of processing, you're going to pay for. The most successful companies are the ones that are replatforming: they're taking advantage of platform as a service or software as a service, and that includes things like pay-as-you-go — you pay for exactly the amount you use, so you scale up and scale down, or scale out and scale in, pretty quickly, you know? So you're handling that demand. Without replatforming, you are not really utilizing your- >> John: It's just hosting. >> Yeah, you're just hosting. >> It's basically hosting if you're not doing anything right there. >> Right. The reason people sometimes resist replatforming is that there's a hidden cost we don't really talk about: PaaS adds 3x to IaaS cost.
So, some organizations that are very mature, with a few thousand people in the IT department — for them, they're like, "No, we just want to run it in the cloud; we have the expertise, and it's cheaper for us." But in the long run, to get the most benefit, people should think of using cloud as a service. >> Nong, what's your take? Because you see examples of companies — I'll just call one out, Snowflake, for instance — that are essentially a data warehouse in the cloud. They refactored and replatformed, and they have a competitive advantage with the scale, so they have things that others that are just hosting, or even on-premise, don't have. There's a new model developing with real advantages. How should companies think about this when they have to manage these data lakes and all these new access methods, but want to maintain that operational stability, control, and growth? >> Yeah, so there are a few topics that all (indistinct) wrap into this one. When enterprises move to the cloud, they do it maybe for some cost savings, but a ton of it is agility, right? The speed the business can run at is just so much faster. So we'll work with companies in the context of cloud migration for data, where they might have a data warehouse they've been using for 20 years, building policies over that time, right? And approving access and those kinds of things taking a long time made more sense back then: if it took you months to procure physical infrastructure and get machines shipped to your data center, then data access taking that long feels okay, right? That's the same rate everything else is moving at. In the cloud, you can spin up new infrastructure instantly, so you don't want approvals for getting policies, creating rules — all that stuff Sanjeev was talking about — to be slow; that's a huge, huge problem. So this is a very common environment we see, where they're trying to do exactly that. And then, for replatforming: again, they've been building these roles and processes and policies for 20 years. What they don't want to do is take 20 years to migrate all that stuff into the cloud, right? That's probably an experience nobody wants to repeat, and frankly, for many of them, the people who did it originally may or may not be involved in this kind of effort. So we work with a lot of companies like that: they want stability, they've got to keep the business running as normal, and they've got to get moving into the new infrastructure, doing it in a new way with all the lessons learned. So, as Sanjeev said, one of these big banks that we work with — a classical story of on-premise data warehousing, maybe a little bit of Hadoop, moving onto AWS, S3, Snowflake, that kind of setup, with extremely intricate policies — said, let's go reimagine how we can do this faster, right? What we like to talk about is: as an organization, you need a design where, if you onboarded 1,000 more data users, that's got to be way, way easier than the first 10 you onboarded, right? You've got to get it to be easier over time, in a really significant way. >> Talk about the data authorization safety factor, because I can almost imagine all the intricacies of these different tools create specialization among the people who operate them, and each one might have its own little authorization nuance. The trend is not to have that siloed mentality. What's your take on clients that say, "Hey, you know what?
They're great at what they do, but we want to make sure they're enabled, they do some enterprise investments, they see broader adoption much easier. A lot of those things. >> And I can hear the sirens in the background, that's someone who's not using your platform, they need some help there. But that's the case, I mean if you don't get this right, there are some consequences, and I think one of the things I would like to bring up on next track is, to talk through with you guys is, the persona pigeonhole role, "Oh yeah, a data person, the developer, the DevOps, the SRE," you start to see now, developers and with cloud developers, and data folks, people, however they get pigeonholed, kind of blending in, okay? You got data services, you got analytics, you got data scientists, you got more democratization, all these things are being kicked around, but the notion of a developer now is a data developer, because cloud is about DevOps, data is now a big part of it, it's not just some department, it's actually blending in. Just a cultural shift, can you guys share your thoughts on this trend of data people versus developers now becoming kind of one, do you guys see this happening, and if so, how? >> So when, John, I started my career, I was a DBA, and then a data architect. Today, I think you cannot have a DBA who's not a developer. That's just my opinion. Because there is so much of CICD, DevOps, that happens today, and you know, you write your code in Python, you put it in version control, you deploy using Jenkins, you roll back if there's a problem. And then, you are interacting, you're building your data to be consumed as a service. People in the past, you would have a thick client that would connect to the database over TCP/IP. Today, people don't want to connect over TCP/IP necessarily, they want to go by HTTP. And they want an API gateway in the middle. So, if you're a data architect or DBA, now you have to worry about, "I have a REST API call that's coming in, how am I going to secure that, and make sure that people are allowed to see that?" And that was just yesterday. >> Exactly. Got to build an abstraction layer. You got to build an abstraction layer. The old days, you have to worry about schema, and do all that, it was hard work back then, but now, it's much different. You got serverless, functions are going to show way... It's happening. >> Correct, GraphQL, and semantic layer, that just blows me away because, it used to be, it was all in database, then we took it out of database and we put it in a BI tool. So we said, like BusinessObjects started this whole trend. So we're like "Let's put the semantic layer there," well okay, great, but that was when everything was surrounding BusinessObjects and Oracle Database, or some other database, but today what if somebody brings Power BI or Tableau or Qlik, you know? Now you don't have a semantic layer access. So you cannot have it in the BI layer, so you move it down to its own layer. So now you've got a semantic layer, then where do you store your metrics? Same story repeats, you have a metrics layer, then the data centers want to do feature engineering, where do you store your features? You have a feature store. 
And before you know, this stack has disaggregated over and over and over, and then you've got layers and layers of specialization that are happening, there's query accelerators like Dremio or Trino, so you've got your data here, which Nong is trying really hard to protect, and then you've got layers and layers and layers of abstraction, and networks are fast, so the end user gets great service, but it's a nightmare for architects to bring all these things together. >> How do you tame the complexity? What's the bottom line? >> Nong? >> Yeah, so, I think... So there's a few things you need to do, right? So, we need to re-think how we express security permanence, right? I think you guys have just maybe in passing (indistinct) talked about creating all these rules and all that kind of stuff, that's been the way we've done things forever. We get to think about policies and mechanisms that are much more dynamic, right? You need to really think about not having to do any additional work, for the new things you add to the system. That's really, really core to solving the complexity problem, right? 'Cause that gets you those orders of magnitude reduction, system's got to be more expressive and map to those policies. That's one. And then second, it's got to be implemented at the right layer, right, to Sanjeev's point, close to the data, and it can service all of those applications and use cases at the same time, and have that uniformity and breadth of support. So those two things have to happen. >> Love this universal data authorization vision that you guys have. Super impressive, we had a CUBE Conversation earlier with Nick Halsey, who's a veteran in the industry, and he likes it. That's a good sign, 'cause he's seen a lot of stuff, too, Sanjeev, like yourself. This is a new thing, you're seeing compliance being addressed, and with programmatic, I'm imagining there's going to be bots someday, very quickly with AI that's going to scale that up, so they kind of don't get in the innovation way, they can still get what they need, and enable innovation. You've got cloud migration, which is only going faster and faster. Nong, you mentioned speed, that's what CloudOps is all about, developers want speed, not things in days or hours, they want it in minutes and seconds. And then finally, ultimately, how's it scale up, how does it scale up for the people operating and/or programming? These are three major pieces. What happens next? Where do we go from here, what's, the customer's sitting there saying "I need help, I need trust, I need scale, I need security." >> So, I just wrote a blog, if I may diverge a bit, on data observability. And you know, so there are a lot of these little topics that are critical, DataOps is one of them, so to me data observability is really having a transparent view of, what is the state of your data in the pipeline, anywhere in the pipeline? So you know, when we talk to these large banks, these banks have like 1000, over 1000 data pipelines working every night, because they've got that hundred, 200 data sources from which they're bringing data in. Then they're doing all kinds of data integration, they have, you know, we talked about Python or Informatica, or whatever data integration, data transformation product you're using, so you're combining this data, writing it into an analytical data store, something's going to break. So, to me, data observability becomes a very critical thing, because it shows me something broke, walk me down the pipeline, so I know where it broke. 
And before you know it, this stack has disaggregated over and over and over, and you've got layers and layers of specialization happening. There are query accelerators like Dremio or Trino, so you've got your data here — which Nong is trying really hard to protect — and then layers and layers of abstraction on top; and networks are fast, so the end user gets great service, but it's a nightmare for architects to bring all these things together. >> How do you tame the complexity? What's the bottom line? >> Nong? >> Yeah, so, there are a few things you need to do, right? We need to rethink how we express security permissions. I think you guys have, maybe in passing, (indistinct) talked about creating all these rules and all that kind of stuff — that's been the way we've done things forever. We've got to think about policies and mechanisms that are much more dynamic, right? You need to really think about not having to do any additional work for the new things you add to the system. That's really, really core to solving the complexity problem, because that's what gets you those orders-of-magnitude reductions; the system's got to be more expressive and map to those policies. That's one. And then second, it's got to be implemented at the right layer — to Sanjeev's point, close to the data — so it can service all of those applications and use cases at the same time, with that uniformity and breadth of support. So those two things have to happen. >> Love this universal data authorization vision that you guys have. Super impressive — we had a CUBE Conversation earlier with Nick Halsey, who's a veteran in the industry, and he likes it. That's a good sign, 'cause he's seen a lot of stuff too, Sanjeev, like yourself. This is a new thing: you're seeing compliance being addressed programmatically — I'm imagining there will be bots someday, very quickly, with AI scaling that up — so controls don't get in innovation's way and people can still get what they need. You've got cloud migration, which is only going faster and faster. Nong, you mentioned speed — that's what CloudOps is all about; developers want things in minutes and seconds, not days or hours. And then finally, ultimately: how does it scale up for the people operating and/or programming? These are three major pieces. What happens next? Where do we go from here? The customer's sitting there saying, "I need help, I need trust, I need scale, I need security." >> So, I just wrote a blog — if I may diverge a bit — on data observability. There are a lot of these little topics that are critical; DataOps is one of them. To me, data observability is really having a transparent view of the state of your data anywhere in the pipeline. So when we talk to these large banks, they have over 1,000 data pipelines working every night, because they've got a hundred, two hundred data sources from which they're bringing data in. Then they're doing all kinds of data integration — we talked about Python or Informatica or whatever data integration and transformation product you're using — so you're combining this data, writing it into an analytical data store, and something's going to break. So to me, data observability becomes a very critical thing, because it shows me something broke and walks me down the pipeline, so I know where it broke. Maybe the data drifted — and I know Okera does a lot of work in data drift, you know? Nong, jump in any time, because I know we have use cases for that. >> Nong, before you get in there, I just want to highlight a quick point. I think you're onto something there, Sanjeev, because we've been reporting, and we believe, that data workflows are intellectual property — and have to be protected. Nong, go ahead, your thoughts. >> Yeah, I mean, the observability thing is critically important. I would say, when you want to think about what's next, it's really about effectively bridging the tools, processes, systems, and teams focused on data production with the data analysts and data scientists focused on data consumption. Bridging those two — which covers a lot of the topics we talked about — is where you've got to draw it. For observability and pipelines and data movement, understanding that is essential. And I think broadly, on all of these topics, where all of us can be better is if we're able to close the loop — get the feedback loop of success. Data drift is an example of the loop rarely being closed: it drifts upstream, and downstream users can take forever to figure out what's going on. And we'll have similar examples related to buy-ins, or data quality, all those kinds of things. So I think that's a problem a lot of us should think about: how do we make sure that loop is closed as quickly as possible? >> Great insight. Quick aside: as the founder and CTO, how's life going for you? You feel good? I mean, you started a company, it's doing great, it's not drifting — it's right in the mainstream, right in the wheelhouse of where the trends are. You guys have crosshairs on the real issues. How are you feeling? Tell us a little bit about how you see the vision. >> Yeah, I obviously feel really good. I mean, we started the company a little over five years ago, and there were a few things we bet would happen that were out of our control — I don't think we would've predicted GDPR and security and those kinds of things being as prominent as they are. Those things have really matured, probably as well as we could've hoped, so that feels awesome. Yeah, (indistinct) really expanded in these years, and it feels good. Feels like we're in the right spot. >> Yeah, it's great — data is competitive advantage, and it certainly has a lot of issues. It could be a blocker if not done properly, and you're doing great work. Congratulations on your company. Sanjeev, thanks for being my co-host in this segment. Great to have you on — I've been following your work, and you continue to unpack it at the new firm you started. SanjMo — good to see your Twitter handle taking on the name of your new firm. Congratulations. Thanks for coming on. >> Thank you so much, such a pleasure. >> Appreciate it. Okay, I'm John Furrier with theCUBE. You're watching today's session presentation of AWS Startup Showcase, featuring Okera, a hot startup — check 'em out — a great solution with a really great concept. Thanks for watching. (calm music)
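As a rough sketch of the observability checks discussed in this segment, the snippet below compares a column's current statistics against a trailing baseline and flags drift before downstream users hit it. The single metric and threshold are illustrative; real observability tooling tracks many more signals.

```python
# A sketch of a data observability check, under simplified assumptions:
# flag a column whose current batch drifts too far from a trailing baseline.
import statistics

def drift_check(baseline: list, current: list, max_sigmas: float = 3.0) -> dict:
    """Flag drift when the current mean moves more than max_sigmas
    baseline standard deviations away from the baseline mean."""
    base_mean = statistics.fmean(baseline)
    base_std = statistics.pstdev(baseline) or 1e-9   # guard against zero stdev
    shift = abs(statistics.fmean(current) - base_mean) / base_std
    return {"mean_shift_sigmas": round(shift, 2), "drifted": shift > max_sigmas}

# e.g. daily order amounts: trailing week vs. today's load
print(drift_check([100, 102, 98, 101, 99], [140, 150, 138, 145, 142]))
# -> {'mean_shift_sigmas': 30.41, 'drifted': True}
```

Closing the loop, in Nong's terms, means a flag like this reaching the producing team while the batch is still fresh, rather than downstream users discovering it weeks later.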

Published Date : Sep 22 2021



Ajay Vohora, Io-Tahoe | SmartData Marketplaces


 

>> Narrator: From around the globe, it's theCUBE, with digital coverage of smart data marketplaces. Brought to you by Io-Tahoe. >> Digital transformation has really gone from a buzzword to a mandate, but digital business is a data business. And for the last several months we've been working with Io-Tahoe on an ongoing content series focused on smart data and automation to drive better insights and outcomes — essentially putting data to work. And today we're going to do a deeper dive on automating data discovery. One of the thought leaders in this space is Ajay Vohora, the CEO of Io-Tahoe, who joins me once again. Ajay, good to see you. Thanks for coming on. >> Great to be here, David, thank you. >> So let's start by talking about some of the business realities: what are the economics driving automated data discovery? Why is that so important? >> Yeah, on this one, David, it's a number of competing factors. We've got the reality of data, which may be sensitive — so there's control. Then there are three other elements: wanting to drive value from that data through innovation; the ability to exchange data, because you can't really drive a lot of value without exchanging data; and managing the cost overheads. Data discovery is at the root of managing all of that in an automated way — classifying the data and setting policies to put that automation in place. >> Yeah, look, we have a picture of this. If we could bring it up, guys — because I want you, Ajay, to help the audience understand where data discovery fits in here. This is, as we talked about, a complicated situation for a lot of customers. They've got a variety of different tools, and you've really laid it out nicely here in this diagram. So take us through where that piece fits. >> Yeah, I mean, we're at the right-hand side of this exchange, you know. We're really now in a data-driven economy where everything's connected through APIs that we consume online, through mobile apps. And what's not apparent is the chain of activities and tasks that have to go into serving that data to an API. At the outset there may be many legacy systems, technologies, platforms — on-premise, in cloud, hybrid, you name it — and across those silos, getting to a unified view is the heavy lifting. I think we've seen some great impacts that BI tools such as Power BI, Tableau, Looker, and Qlik — and they're in our ecosystem — have had on visualizing data, and CEOs, managers, people working in companies day to day get a lot of value from asking, "What's the real-time activity? What was the trend this month versus last month?" As for the tools to enable that, we hear a lot of good things about what we're doing with Snowflake and MongoDB on the public cloud platforms, GCP and Azure, around enabling and building those pipelines to feed into those analytics. But what often gets hidden is how you source that data — data that could be locked into a mainframe, a data warehouse, or IoT feeds — and pull all of that together. And the reality is that's a lot of heavy lifting: hands-on work that can be time-consuming. And the issue there is that the data may have value — it might have the potential to have an impact on the top line for a business, on outcomes for consumers — but you're never really sure unless you've done the investigation, discovered it, unified it, and been able to serve it through to other technologies.
>> Guys, if you would bring that picture back up again, because, Ajay, you made a point and I want to land on that for a second. There's a lot of manual curating. An example would be the data catalog. You know, data scientists complain all the time that they're manually wrangling data. And so you're trying to inject automation into the cycle. And then the other piece that I want you to address is the importance of APIs. You really can't do this without an architecture that allows you to connect things together, that sort of enables some of the automation. >> Yep, I mean, I'll take that in two parts, David. The APIs: virtual machines connected by APIs, business rules and business logic driven by APIs, applications; everything across the stack, from infrastructure down to the network and hardware, is all connected through APIs. And the work of serving data through to an API, building those pipelines, is often miscalculated, in terms of just how much manual effort it takes. And that manual effort (we've got a nice list here of what we automate down at the bottom, those tasks of indexing, labeling, mapping across different legacy systems) all takes away from the job of a data scientist or data engineer looking to produce value, monetize data, and help that business deliver to consumers. >> Yeah, it's that top layer that the business sees, of course; there's a lot of work that has to go into achieving that. I want to talk about some of the key tech trends that you're seeing. And one of the things that we talk about a lot is metadata. The importance of metadata, you know, can't be overstated. What are some of the big trends that you're seeing, in metadata and elsewhere? >> Yeah, I'll summarize it as five. There's a trend now to look at metadata more holistically across the enterprise. And that really makes sense when you're trying to look across different data silos and apply a policy to manage that data. So that's the control piece, that's that lever. The other side, sometimes competing with that control around sensitive data and around managing the cost of data, is innovation: being able to speculate and experiment and try things out where you don't really know what the outcome is. If you're a data scientist or engineer, you've got a hypothesis, and therefore you've got that tension between control over data, and innovation and driving value from it. So enterprise wide metadata management is really helping to unlock where that latent value might be across those sets of data. The other piece is adaptive data governance. Those controls from the data policemen, the data stewards, where they're trying to protect the organization, protect the brand, protect consumers' data, are necessary, but in different use cases you might want to nuance and apply a different policy to govern that data, relevant to the context: you might have data that is less sensitive, that can be used for innovation, and adapting the style of governance to fit the context is another trend that we're seeing coming up here. A few others: one area where we're working quite extensively is automating data discovery. We're now breaking that down into what we can direct, where a business outcome is a known, upfront objective, and directing that data discovery towards it. That means applying our algorithms, our technology and our tools, towards solving a known problem. The other one is autonomous data discovery.
And that means, you know, trying to allow background processes to understand what changes are happening with data over time, flagging those anomalies. And the reason that's important is, when you look over a length of time to see different spikes, different trends and activity, that's really giving a data ops team the ability to manage and calibrate how they're applying policies and controls to the data. And the last two, David, that we're seeing: there's this huge drive towards self-service. So re-imagining how to put policy and data governance into the hands of a data consumer inside a business, or indeed the consumer themselves, to self-serve if they're a banking customer or healthcare customer, with the policies and the controls and rules in place to adaptively serve those data marketplaces that we're involved in creating. >> I want to ask you about the autonomous data discovery, the adaptive data governance. Is the problem we're addressing there one of quality, in other words, machines are better than humans at doing this? Is it one of scale, that humans just don't scale that well? Is it both? Can you add some color to that? >> Yeah, honestly, it's the same equation that existed 10 years ago, 20 years ago; it's just being exacerbated. It's that equation of: how do I control all the things that I need to protect? How do I enable innovation where it is going to deliver business value? How do I exchange data between a customer, somebody in my supply chain, safely, and do all of that whilst managing the fourth leg, which is cost overheads? There's not an open checkbook here. I've got to figure out, if I'm the CIO or CDO, how I do all of this within a fixed budget. So those aspects have always been there; now, with more choices (infrastructure in the Cloud, API driven applications, On-premises), the options a business has for putting its data to work are expanding. It's also then creating a layer of management and data governance that really has to manage those four aspects: control, innovation, exchange of data, and the cost overhead. >> That top layer of the first slide that we showed was all about the business value. So, I wonder if we could drill into the business impact a little bit. What are your customers seeing specifically in terms of the impact of all this automation on their business? >> Yeah, so we've had some great results. I think a few of the biggest have been helping customers move away from manually curating their data and their metadata. There used to be a time where, with data initiatives or data governance initiatives, there'd be teams of people manually feeding a data catalog. And it's great to have that inventory of classified data, to be able to understand a single version of the truth, but having 10, 15 people manually process that and keep it up to date, when the data keeps moving, doesn't hold: the reality is, what's true about data today changes. Add another few sources and a few months' time to your business, start collaborating with new partners, and suddenly the landscape has changed and the amount of work has gone up. What we're finding is that by automating that data discovery and feeding our data catalog, we're releasing a lot more time for our customers to spend on innovating with and managing their data. A couple of others are around self-service data analytics: moving the choices of what data might have business value into the hands of business users and data consumers, to get faster cycle times around generating insights.
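A minimal illustration of that autonomous-discovery idea: a background check that flags anomalous swings in a profiling metric so a data ops team can recalibrate policies. The metric (daily row counts) and the z-score threshold are assumptions made for the sketch:

```python
from statistics import mean, stdev

# Flag days whose row count deviates sharply from the trailing window,
# a simple stand-in for the anomaly flagging described above.
def flag_anomalies(daily_row_counts, window=7, threshold=3.0):
    """Return indices of days that fail a trailing-window z-score test."""
    flagged = []
    for i in range(window, len(daily_row_counts)):
        history = daily_row_counts[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(daily_row_counts[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

counts = [1000, 1020, 990, 1010, 1005, 995, 1015, 1008, 5400, 1012]
print(flag_anomalies(counts))  # day 8's spike (5400 rows) gets flagged
```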
And we're really helping them by automating the creation of those data sets that are needed for that. And the last piece, I'd have to say, where we're seeing impact more recently, is in the exchange of data. There are a number of marketplaces out there who are now being compelled to become more digital, to rewire their business processes, and everything from RPA initiatives to automation-driven digital transformation is having CIOs, chief data officers, and enterprise architects rethink how they rewire the pipelines for their data to feed that digital transformation. >> Yeah, to me, it comes down to monetization. Now, of course, that's for a for-profit industry. For non-profits, for sure, it's cost cutting, or in the case of healthcare, which we'll talk about in a moment, I mean, it's patient outcomes. But the job of a Chief Data Officer has gone from data quality and governance and compliance to really figuring out how data can be monetized (not necessarily selling the data, but how it contributes to the monetization of the company), and then really understanding, specifically for that organization, how to apply that. And that is a big challenge. We sort of chatted about it 10 years ago, in the early days of Hadoop. Back then, 1% of the companies had enough engineers to figure it out, but now the tooling is available, the technology is there, and the practices are there. And that really, to me, is the bottom line, Ajay: it's show me the money. >> Absolutely. It definitely is focusing in on the single view of that customer, and where we're helping there is to pull together those disparate, siloed sources of data to understand the needs of the patient, or of the broker if it's insurance, or of the supply chain manager if it's manufacturing. And providing that 360 view of data is helping that individual unlock the value for the business. So data's providing the lens, provided you know which data it is that can assist in doing that. >> And, you know, you mentioned RPA before. I had an RPA customer tell me she was a Six Sigma expert, and she told me, "We would never try to apply Six Sigma "to a business process, "but with RPA we can do so very cheaply." Well, what that means is lower costs. It means better employee satisfaction and, really importantly, better customer satisfaction and better customer outcomes. Let's talk about healthcare for a minute, because it's a really important industry. It's one that is ripe for disruption and has really been, up until recently, pretty slow to adopt a lot of the major technologies that have been made available. But what are you seeing in terms of this theme we're using, of putting data to work, in healthcare specifically? >> Yeah, I mean, healthcare has had a lot thrown at it. There's been a lot of change in terms of legislation recently, particularly in the U.S. market; in other economies, healthcare is on a path to becoming more digital. And part of that is around transparency of price. To operate effectively as a healthcare marketplace, having that price transparency around what an elective procedure is going to cost, before taking that step forward, is super important to making an informed decision. So if we look at the U.S., for example, we've seen that healthcare costs annually have risen to $4 trillion, but even with all of that cost, we have healthcare consumers who are sometimes reluctant to take up healthcare even if they have symptoms.
And a lot of that is driven by not knowing what they're opening themselves up to. And, you know, I think, David, if you or I were to book travel, a holiday maybe, or a trip, we'd want to know what we're in for, what we're paying for, upfront. But sometimes in healthcare, the option might be there in the plan, but the cost that comes with it isn't. So recent legislation in the U.S. is certainly helpful in bringing forward that price transparency. The underlying issue there, though, is the disparate data formats being used across payers, patients, employers, and different healthcare departments to try and make that work. And where we're helping on that aspect, in particular related to price transparency, is in making that data machine readable. Sometimes with data the beneficiary might be a person, but in a lot of cases now we're seeing different systems interact and exchange data in order to process the workflow, generating online lists of pricing from a provider that's been negotiated with a payer, and that is really an enabling factor. >> So guys, I wonder if you could bring up the next slide, which is kind of the nirvana. So, if you saw the previous slide, the middle there was all different shapes and, presumably, disparate data; this is the outcome that you want to get, where everything fits together nicely and you've got this open exchange. It's not opaque as it is today, it's not bubble gum, band-aids and duct tape. Describe this sort of outcome that you're trying to achieve, and maybe a little bit about what it's going to take to get there. >> Ajay: Yeah, that's the culmination of a number of things. It's making sure that the data is machine readable, making it available to APIs; those could be RPA tools. We're working with technology companies that employ RPA for healthcare, specifically to manage that patient and payer data, to bring that together. In our data discovery, what we're able to do is classify that data and have it made available to a downstream tool, technology, or person, to apply that workflow to the data. So this looks like nirvana, it looks like utopia, but it's, you know, the end objective of a journey that we can see in different economies that are at different stages of maturity in turning healthcare into a digital service, even so that you can consume it from where you live, from home, with telemedicine and telecare. >> Yeah, so, and this is not just for healthcare; you want to achieve that self-service data marketplace in virtually any industry. You're working with TCS, Tata Consulting Services, to achieve this. You know, a company like Io-Tahoe has to have partnerships with organizations that have deep industry expertise. Talk about your relationship with TCS and what you guys are doing specifically in this regard. >> Yeah, we've been working with TCS now for a long while, and we'll be announcing some of those initiatives here. We're now working together to reach their customers with their brilliant Business 4.0 framework, where they're re-imagining with their clients how their business can operate with AI, with automation, and become more agile and digital. Our technology, and the reams of patents that we have in our portfolio, being able to apply that at scale, on a global scale, across industries such as banking, insurance and healthcare, is really allowing us to see a bigger impact on consumer outcomes, patient outcomes.
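To ground the machine-readable pricing idea Ajay describes, here is a toy example of a negotiated-rate file that downstream systems or RPA bots could query directly. The JSON layout is a simplified assumption for the sketch, not the actual CMS price-transparency schema:

```python
import json

# A simplified, hypothetical machine-readable negotiated-price file.
record = json.loads("""
{
  "provider": "Example Health System",
  "payer": "Example Insurer",
  "procedures": [
    {"code": "29881", "description": "Knee arthroscopy", "negotiated_rate": 4200.0},
    {"code": "70553", "description": "MRI brain", "negotiated_rate": 950.0}
  ]
}
""")

def price_lookup(doc, code):
    """Answer 'what will this procedure cost?' programmatically."""
    for proc in doc["procedures"]:
        if proc["code"] == code:
            return proc["negotiated_rate"]
    return None

print(price_lookup(record, "29881"))  # 4200.0
```

Once prices are published in a structured form like this, the consumer-facing comparison the conversation describes becomes a simple lookup rather than a manual data-prep exercise.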
And the feedback from TCS is that we're really helping those initiatives remove that friction. They talk a lot about data friction; I think that's a polite term for the image that we just saw, with the disparate technologies and the legacy that has built up. So if we want to create a transformation, having that partnership with TCS across industries is giving us that reach and that impact on many different people's day-to-day jobs and lives. >> Let's talk a little bit about the Cloud. It's a topic that we've hit on quite a bit here in this content series. But, you know, the Cloud companies, the big hyper-scalers, they've put everything into the Cloud, right? But customers are more circumspect than that. At the same time, machine intelligence, ML, AI: the Cloud is a place to do a lot of that. That's where a lot of the innovation occurs. And so what are your thoughts on getting to the Cloud, putting data to work, if you will, with machine learning, stuff that you're doing with AWS? What's your fit there? >> Yeah, David, we work with all of the Cloud platforms, Microsoft Azure, GCP, IBM, but we're expanding our partnership now with AWS. And we're really opening up the ability to work with their Greenfield accounts, where a lot of that data and technology is in the customer's own data centers. And that's across banking, healthcare, manufacturing, and insurance. And for good reason: a lot of companies have taken the time to see what works well for them with the technologies the Cloud providers are offering, and in a lot of cases, testing services or analytics using the Cloud, moving workloads to the Cloud to drive data analytics, is a real game changer. So there's good reason to maintain a lot of systems On-premise, if that makes sense from a cost and liability point of view, and the number of clients we work with that do have, and will keep, their mainframe systems written in COBOL is no surprise to us. But equally, they want to tap into technologies that AWS has, such as SageMaker. The issue is, as a Chief Data Officer, I don't have the budget to move everything to the Cloud at once. I might want to show some results first, upfront, to my business users, and work closely with my Chief Marketing Officer to look at what's happening in terms of customer trends and customer behavior. What are the customer outcomes, patient outcomes, and partner outcomes that you can achieve through analytics and data science? So we're working with AWS and with clients to manage that hybrid topology: some of that data being in the Cloud, being put to work with AWS SageMaker, and Io-Tahoe being used to identify where the data is that needs to be amalgamated and curated to provide the dataset for machine learning and advanced analytics to have an impact for the business. >> So what are the critical attributes of what you're looking at to help customers decide what to move and what to keep, if you will? >> Well, one of the quickest outcomes that we help customers achieve is to build that business glossary, you know, the items of data that mean something to them across those different silos, and pull all of that together into a unified view. Once they've got that, a data engineer working with a business manager can think through: how do we want to create this application? Now, what is the churn model, the loyalty or the propensity model, that we want to put in place here?
How do we use predictive analytics to understand the needs of a patient? That sort of innovation is what we're unlocking, and applying tools such as SageMaker on AWS to then do the computation and build those models delivers that outcome across the value chain. And it goes back to the first picture that we put up, David: the outcome is that API. On the back of it, you've got a machine learning model that's been developed in a tool such as Databricks or a Jupyter notebook. That data has to be sourced from somewhere. Somebody has to say, "Yep, "you've got permission to do what you're trying to do "without falling foul of any compliance around data." And it all goes back to discovering that data, classifying it, indexing it in an automated way, to cut those timelines down to hours and days. >> Yeah, it's the innovation part of your data portfolio, if you will, that you're going to put into the Cloud and apply tools to, like SageMaker and others, or tools on Azure; I mean, whatever your favorite tool is, you don't care. The customer's going to choose that. And you know, the Cloud vendors, maybe they want you to use their tool, but they're making their marketplaces available to everybody. But it's that innovation piece, the pieces where you want to apply that self-service data marketplace and really drive, as I said before, monetization. All right, give us your final thoughts. Ajay, bring us home. >> So final thoughts on this, David: at the moment, we're seeing a lot of value in helping customers discover their data using automation, automatically curating a data catalog. That unified view is then being put to work through our APIs, with an open architecture to plug in whatever tool or technology our clients have decided to use. And that open architecture is really feeding into the reality of what CIOs and Chief Data Officers are managing, which is a hybrid On-premise and Cloud approach to using best of breed. Business users want to use a particular technology to get their business outcome, and having the flexibility to do that no matter where your data is sitting, On-premise or on Cloud, is where self-service comes in. So that self-service view of what data I can plug together and exchange, monetizing that data, is where we're starting to see some real traction with customers, who are now accelerating and becoming more digital to serve their own customers. >> Yeah, we really have seen a cultural mind shift going from sort of complacency (and obviously COVID has accelerated this), but the combination of that cultural shift and the Cloud and machine intelligence tools gives me a lot of hope that the promises of big data will ultimately be lived up to in this next 10 years. So Ajay Vohora, thanks so much for coming back on theCUBE. You're a great guest, and we appreciate your insights. >> Appreciate it, David. See you next time. >> All right, keep it right there, everybody; we're right back after this short break. (techno music)

Published Date : Sep 17 2020


Distributed Data with Unifi Software


 

>> Narrator: From the Silicon Angle Media Office in Boston, Massachusetts, it's theCUBE. Now, here's your host, Stu Miniman. >> Hi, I'm Stu Miniman, and we're here at the east coast studio for Silicon Angle Media. Happy to welcome back to the program a many-time guest, Chris Selland, who is now the Vice President of strategic growth with Unifi Software. Great to see you, Chris. >> Thanks so much, Stu, great to see you too. >> Alright, so Chris, we've had you on in your previous role many times. >> Chris: Yes >> I think not only is it the first time we've had you on since you made the switch, but also the first time we've had somebody from Unifi Software on. So, why don't you give us a little bit of background on Unifi and what brought you to this opportunity. >> Sure, absolutely, and happy to sort of open up the relationship with Unifi Software; I'm sure it's going to be a long and good one. I joined the company about six months ago at this point, so earlier this year. I actually had worked with Unifi for a bit as partners, when I was previously at the Vertica business inside of HP/HPE, as you know, for a number of years prior to that, where we did a lot of work together. I also knew the founders of Unifi, who were actually at Greenplum, which was a direct Vertica competitor. Greenplum was acquired by EMC; Vertica was acquired by HP. We were sort of friendly, respected competitors, and so I have known the founders for a long time. It was partly the people, but it was really the idea, the product. I was actually just reading the piece that Peter Burris did on wikibon.com about distributed data, and it played so into our value proposition. We just see it's where things are going. I think it's where things are going right now, and I think the market's bearing that out. >> The piece you reference, it was actually a Wikibon research meeting; we run those weekly, internally, and we're actually going to be broadcasting them as video soon, cause, of course, we do a lot of video. We pull the whole team together, and this was one George Gilbert actually led for us, talking about what architectures I need to build when I start doing distributed data. With my background really more in kind of the cloud and infrastructure world, we see it's a hybrid, and many times multi-cloud, world. And, therefore, one of the things we look at that's critical is: wait, if I've got things in multiple places (I've got my SaaS over here, I've got multiple public clouds I'm using, and I've got my data center), how do I get my arms around all the pieces? And of course data is critical to that. >> Right, exactly, and the fact is that more and more people need data to do their jobs these days. Working with data is no longer just the domain of data scientists; I mean, organizations are certainly investing in data scientists, but there's a shortage, and at the same time, marketing people, finance people, operations people, supply chain folks: they all need data to do their jobs. And as you said, where is it? It's distributed, it's in legacy systems, it's in the data center, it's in warehouses, it's in SaaS applications, it's in the cloud, it's on premise. It's all over the place, so, yep. >> Chris, I've talked to so many companies that are... everybody seems to be nibbling at a piece of this. We go to the Amazon show, and there's this just ginormous ecosystem that everybody's picking at. Can you drill in a little bit on what problems you solve there?
I have talked to people working on everything from just trying to get the licensing in place, to trying to empower the business units to do things, to governance compliance, of course. So where's Unifi's point in this? >> Well, having come out of essentially the data warehousing market, and now, of course, with all the investments in HDFS, Hadoop infrastructure, and open source infrastructure, there's been this fundamental thinking that, well, the answer is: if I get all of the data in one place, then I can analyze it. Well, that just doesn't work. >> Right. >> Because it's just not feasible. So really, when you step back, it's one of these ah-ha ideas that makes total sense, right? What we do is we basically catalog the data in place. So you can use your legacy data that's on the mainframe. Let's say I'm a marketing person trying to do an analysis of selling trends, marketing trends, marketing effectiveness. I want to use some order data that's on the mainframe, I want some clickstream data that's sitting in HDFS, I want some customer data in the CRM system, or maybe it's in Salesforce, or Marketo, I need some data out of Workday, and I want to use some external data. I want to use, say, weather data to look at seasonal analysis; I want to do neighborhooding. So, how do I do that? You know, I may be sitting there with Qlik or Tableau or Looker or one of these modern BI or visualization products, but at the same time, where's the data? So our value proposition starts with: we catalog the data and we show where the data is. Okay, you've got these data sources, this is what they are, we describe them. And then there's a whole collaboration element to the platform that lets people, as they're using the data, say: well, yes, that's order data, but that's old data, so it's good if you use it up to 2007, but the more current data's over here. Things like that. And then we also help the person use it. And again, I almost said IT, but it's not just IT, and it's not just data scientists; it's really about democratizing the use. Because business people don't know how to do inner and outer joins and things like that, or what a schema is. They just know: I'm trying to do a better job of analyzing sales trends. I've got all these different data sources, but once I've found them, once I've decided what I want to use, how do I use them? So we answer that question too. >> Yeah, Chris, it reminds me a lot of some of the early value propositions we heard when Hadoop and the whole big data wave came. It was: how do I, as a smaller company, or even as a bigger company, do it faster, do it for less money than things used to take? Okay, it used to be millions of dollars and it was going to take me 18 months to roll out. Is it right to say this is kind of an extension of that big data wave, or what's different and what's the same? >> Absolutely, we use a lot of that stuff. I mean, we've got flexibility in what we can use, but for most of our customers we use HDFS to store the data, we use Hive as the most typical data format (you have flexibility around there), and we use MapReduce, or Spark, to do transformation of the data.
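A sketch of the pattern Chris is describing: register data where it already sits as an external Hive table, so nothing has to move until a transformation is actually needed. This is a generic PySpark illustration with hypothetical paths and table names, not Unifi's actual implementation:

```python
from pyspark.sql import SparkSession

# Catalog-in-place: expose existing files as an external Hive table,
# then transform only when a consumer actually needs the data.
spark = (SparkSession.builder
         .appName("catalog-in-place")
         .enableHiveSupport()
         .getOrCreate())

# Dropping an external table removes only the catalog entry;
# the underlying files stay exactly where they are.
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS raw_orders (
        order_id STRING, customer_id STRING, amount DOUBLE, order_ts STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION 'hdfs:///landing/orders/'
""")

spark.sql("CREATE DATABASE IF NOT EXISTS curated")

# The downstream transformation runs in Spark, on demand.
monthly = spark.sql("""
    SELECT customer_id, substr(order_ts, 1, 7) AS month, SUM(amount) AS spend
    FROM raw_orders
    GROUP BY customer_id, substr(order_ts, 1, 7)
""")
monthly.write.mode("overwrite").saveAsTable("curated.monthly_spend")
```

That separation between the catalog entry and the physical files is what makes cataloging in place cheap relative to move-it-all-first architectures.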
So we use all of those open source components, and as the product is being used, as the platform is being used by multiple users (cause it's designed to be an enterprise platform), the data does eventually migrate into the data lake, but we don't require you to get it there as a prerequisite. As I said, this is one of the things that we really talk about a lot: we catalog the data where it is, in place, so you don't have to move it to use it, you don't have to move it to see it. But at the same time, if you want to move it, you can. The fundamental idea of "I've got to move it all first, I've got to put it all in one place first" never works. We've come into so many projects where organizations have tried to do that, and they just can't; it's too complex these days. >> Alright, Chris, what are some of the organizational dynamics you're seeing from your customers? You mention data scientists, the business users. Who is identifying this, who's driving these issues, who's got the budget to try to fix some of these challenges? >> Well, our best implementations tend to be driven, really, almost all of them these days, by use cases. So they're driven by business needs. Some of the big ones (I've sort of talked about customers already) are things like customer 360 views. For instance, there's a very large credit union client of ours that has all of their data organized by accounts, but they can't really look at Stu Miniman as a customer. How do I look at Stu's value to us as a customer? I can look at his mortgage account, I can look at his savings account, I can look at his checking account, I can look at his debit card, but I can't just see Stu. I want to organize my data that way. That type of customer 360, or the marketing analysis I talked about, is a great use case. Another one that we've been seeing a lot of is compliance, where it's just having a better handle on what data there is and where it is. This is where some of the governance aspects of what we do also come into play. Even though we're very much about solving business problems, there's a very strong data governance element, because when you are doing things like data compliance... We're working, for instance, with MoneyGram, a customer of ours. In this day and age in particular, when money flows across borders, oftentimes regulators want to know: wait, that money that went from here to there, tell me where it came from, tell me where it went, tell me the lineage. And they need to be able to respond to those inquiries very, very quickly. Now, the reality is that data sits in all sorts of different places, both inside and outside of the organization. Being able to organize that, and gain the ability to respond more quickly and effectively, is a big competitive advantage: it both helps with avoiding regulatory fines and helps with customer responsiveness. And then you've got things like GDPR, the General Data Protection Regulation, which is being driven by the EU, where it's sort of like the next Y2K. Anybody in data, if they are not paying attention to it, they need to be pretty quick, at least if they're a big enough company doing business in Europe. Because if you are doing business with European companies or European customers, this is going to be a requirement as of May next year. There's a whole 'nother set of rules around how data's kept, how data's stored, and what control customers have over their data, things like the 'Right to Be Forgotten'.
This need to comply has grown as data's gotten more important: as you might imagine, the regulators have gotten more interested in what organizations are doing with data. Having a framework that organizes that data and helps you be more compliant with those regulations is absolutely critical. >> Yeah, my understanding of GDPR is that if you don't comply, there are hefty fines. >> Chris: Major fines. >> Major fines that are going to hit you. Does Unifi solve that, or is there other re-architecture or redesign that customers need to do to be compliant? [speaking at the same time] >> No, no, that's the whole idea again: being able to leave the data where it is, but know what it is and know where it is, and, if and when I need to use it, where it came from, where it's going, and where it went. All of those things. So we provide the platform that enables the customers to use it, or the partners to build the solutions for their customers. >> Curious, with customers and their adoption of public cloud, how does that play into what you are doing? They deploy more SaaS environments. We were having a conversation off camera today talking about the consolidation that's happening in the software world. What do those dynamics mean for your customers? >> Well, public cloud is obviously booming and growing, and just about any organization has some public cloud infrastructure at this point. There are some very heavily regulated areas (actually, healthcare's probably a good example) where there's very little public cloud. But even there we're working with... we're part of the Microsoft Accelerator Program, and we work very closely with the Azure team, for instance. They're working in some healthcare environments where you have to be, for example, HIPAA compliant, so there is a lot of caution around that. But nonetheless, the move to public cloud is certainly happening. I was just reading some stats the other day (I can't remember if they're Wikibon's or others'), and it's still only about 5% of IT spending. The reality is that organizations of any size have plenty of on-prem data. And of course, with all the use of SaaS solutions (Salesforce, Workday, Marketo, all of these different SaaS applications), much of our data is also in somebody else's data center. So it's absolutely a hybrid environment. That's why the report that you guys put out on distributed data really spoke so much to what our value proposition is, and that's why, you know, I'm really glad to be here to talk to you about it. >> Great. Chris, tell us a little bit about the company itself: how many employees you have, what metrics you can share about the number of customers, revenue, things like that. >> Sure. We've got, I believe, about 65 people at the company right now. I joined, like I said, earlier this year, late February, early March; at that point we were like 40 people, so we've been growing very quickly. I can't get too specific about our revenue, but basically we're well into the triple-digit growth phase. We're still a small company, but we're growing quickly. Our number of customers is up in the triple digits as well, so we're expanding very rapidly. And again, we're a platform company, so we serve a variety of industries. Some of the big ones are healthcare and financial services. But even more than the industries, it tends to be driven by those use cases I talked about as well. And we're building out our partnerships also; that's a big part of what I do. >> Can you share anything about funding, where you are?
>> Oh yeah, funding, you asked about that, sorry. Yes, we raised our B round of funding, which closed in March of this year. So we [mumbles], a company called Pelion Venture Partners, who you may know, Canaan Partners, and then most recently Scale Venture Partners are investors. The company's raised a little over $32 million so far. >> Partnerships, you mentioned Microsoft already. Any other key partnerships you want to call out? >> We're doing a lot of work there. We have a very broad partner network, which we're building up, but some of the ones that we are leaning in with the most: Microsoft is certainly one, and we're doing a lot of work with the guys at Cloudera as well. We also work with Hortonworks, we also work with MapR. We're really working almost across the board in the BI space; we have spent a lot of time with the folks at Looker, who were also a partner I was working with very closely during my Vertica days. We're working with Qlik, we're working with Tableau. We're really working with just about everybody in sort of BI and visualization (I don't think people like the term BI anymore), the desktop visualization space. And then on public cloud, also Google and Amazon, so really all the kind of major players. I would say those are the ones that we've worked with the most closely to date. As I mentioned earlier, we're part of the Microsoft Accelerator Program, so we're certainly very involved in the Microsoft ecosystem. I actually just wrote a blog post, which I don't believe has been published yet, about some of what we call the full stack solutions we have been rolling out with Microsoft for a few customers, where we're sitting on Azure, we're using HDInsight, which is essentially Microsoft's cloud Hadoop distribution, visualized in Power BI. So we've really got a lot of deep integration with Microsoft, but we've got a broad network as well. And then I should also mention service providers; we're building out our service provider partnerships also. >> Yeah, Chris, I'm surprised we haven't talked about kind of AI yet at all, machine learning. It feels like everybody that was doing big data has now kind of pivoted in, maybe a little bit early in the buzzword phase. What's your take on that? You've been a part of this for a while. Is big data just old now and we have a new thing, or how do you put those together? >> Well, I think what we do maps very well to it. At least my personal view of what's going on with AI/ML is that it's really part of the fabric of what our product does. I talked before about, once you've found the data you want to use, how do I use it? Well, there's a lot of ML built into that. We do what's called one click functions, and what happens is these one click functions get smarter as more and more people use the product and use the data. So if I've got some table over here, and I've got some SaaS data source over there, we grab the metadata (even though we don't require moving the data), we look at the metadata, and then we'll sort of tell the user: we suggest that you join this data source with that data source and see what it looks like. And if they say, ah, that worked, then, okay, that feeds back into the whole ML infrastructure.
Then we are more likely to advise the next few folks, with the one click function, that hey, if you're trying to do an analysis of sales trends, you might want to use this source and that source, and you might want to join them together this way. So it's a combination of AI and ML built into the fabric of what we do, and also the community aspect of more and more people using it. But, going back to your original question: there was a quote, and I'll misquote it, so I'm not going to say it directly, but I think it might have been John Furrier, who recently was talking about ML and sort of saying, you know, eventually we're not going to talk about ML any more than we talk about the phone business or something. It's just going to become integrated into the fabric of how organizations do business and how organizations do things. So we very much have it built in. You could certainly call us an AI/ML company if you want (it's actually definitely part of our slide deck), but at the same time it's something that will just become a part of doing business over time. But it really depends on large data sets. As we all know, this is why it's so cheap to get Amazon Echoes and such these days: because it's really beneficial, because there's value in that data. There was just another piece (I actually shared it on LinkedIn today, as a matter of fact) talking about Amazon and Whole Foods and asking: why are they getting such a valuation premium? They're getting such a valuation premium because they're smart about using data, and one of the reasons they're smart about using the data is because they have the data. So the more data you collect, the more data you use, the smarter the systems get, the more useful the solutions become. >> Absolutely. Last year at Amazon re:Invent, John Furrier interviewed Andy Jassy, and I had posited that the customer flywheel is going to be replaced by that data flywheel, and enhanced to make things spin even further. >> That's exactly right, and once you get that flywheel going it becomes a bigger and bigger competitive advantage. By the way, that's also why the regulators are getting interested these days too, right? There's sort of that flywheel going back the other way. But from our perspective... I mean, first of all, it just makes economic sense, right? These things could conceivably get out of control (that's at least what the regulators think), so if you're not careful, at least there's some oversight, and I would say that, yes, probably some oversight is a good idea. So you've got kind of flywheels pushing in both directions. But one way or another, organizations need to get much smarter and much more precise and prescriptive about how they use data, and that's really what we're trying to help with. >> Okay, Chris, want to give you the final word. Unifi Software, you're working on kind of the strategic growth pieces. What should we look for from you and your segment through the rest of 2017? >> Well, I think... I've always been a big believer (I've probably cited 'Crossing the Chasm' so many times on theCUBE during my prior HP tenure and such), but, you know, I'm a big believer that we should be talking about customers, we should be talking about use cases. It's not about alphabet-soup technology or data lakes; it's about the solutions, and it's about how organizations are moving themselves forward with data.
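A toy version of the "one click function" join suggestion Chris walks through above: compare column metadata from two sources and propose likely join keys. The naive name-similarity score stands in for the learned feedback loop he describes, and all names here are hypothetical:

```python
from difflib import SequenceMatcher

# Suggest join keys across two sources using only column-name metadata;
# no data is moved or scanned for this step.
def suggest_joins(cols_a, cols_b, min_score=0.8):
    """Return (column_a, column_b, score) pairs worth proposing to a user."""
    suggestions = []
    for a in cols_a:
        for b in cols_b:
            score = SequenceMatcher(None, a.lower(), b.lower()).ratio()
            if score >= min_score:
                suggestions.append((a, b, round(score, 2)))
    return sorted(suggestions, key=lambda s: -s[2])

orders = ["order_id", "cust_id", "order_date", "amount"]
crm = ["customer_id", "cust_id", "region", "signup_date"]
print(suggest_joins(orders, crm))
# [('cust_id', 'cust_id', 1.0)] -- only the exact match clears the threshold
```

In the product he describes, acceptance or rejection by earlier users would re-rank these suggestions over time; the sketch only captures the cold-start heuristic.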
Going back to that Amazon example: so I think, from us, yes, we just released 2.0, and we've got a very active blog; come by unifisoftware.com and visit it. But it's also going to be around what our customers are doing, and that's really what we're going to try to promote. I mean, if you remember, this was also something that, for all the years I've worked with you guys, I've been very much about: you always have to make sure that the customer has agreed to be cited, and it's nice when you can name them and reference them, so we're working on our customer references, because that's what I think is the most powerful in this day and age. Because again, going back to what I said before, this is going throughout organizations now. People don't necessarily care about the technology infrastructure, but they care about what's being done with it. And so, being able to tell those customer stories, I think that's what you're going to probably see and hear the most from us. But we'll talk about our product as much as you let us as well. >> Great. It reminds me of when Wikibon was founded: it was really about IT practitioners being able to share with their peers. Now, in the software economy today, they're often doing things in software that can be leveraged by their peers (that flywheel again), just like when Salesforce first rolled out: they make one change, and then everybody else has that option. We're starting to see that more and more as we deploy SaaS and cloud; it's not shrink-wrapped software anymore. >> I think, to that point, you know, I was at a conference earlier this year, and it was an IT conference, but I was really sort of floored, because when you look at what we're talking about with the enlightened IT folks (and there are more and more enlightened IT folks we're talking to these days), it's the same thing. Right? It's how our business is succeeding by being better at leveraging data. And I think the opportunities for people in IT are there, but they really have to think outside of the box. It's not about Hadoop and Sqoop and SQL and Java anymore; it's really about business solutions. But if you can start to think that way, I think there are tremendous opportunities, and we're just scratching the surface. >> Absolutely; we've found that's really some of the proof points of what digital transformation really is for these companies. Alright, Chris Selland, always a pleasure to catch up with you. Thanks so much for joining us, and thank you for watching theCUBE. >> Chris: Thanks too. (techno music)

Published Date : Aug 2 2017


Itamar Ankorion, Attunity & Arvind Rajagopalan, Verizon - #DataWorks - #theCUBE


 

>> Narrator: Live from San Jose, in the heart of Silicon Valley, it's theCUBE, covering DataWorks Summit 2017. Brought to you by Hortonworks. >> Hey, welcome back to theCUBE, live from the DataWorks Summit day 2. We've been here for a day and a half talking with fantastic leaders and innovators, learning a lot about what's happening in the world of big data, the convergence with Internet of Things, machine learning, artificial intelligence... I could go on and on. I'm Lisa Martin, my co-host is George Gilbert, and we are joined by a couple of guys. One is a CUBE alumnus, Itamar Ankorion, CMO of Attunity; welcome back to theCUBE. >> Thank you very much, good to be here; thank you, Lisa and George. >> And Arvind Rajagopalan, the Director of Technology Services for Verizon; welcome to theCUBE. >> Thank you. >> So we were chatting before we went on, and Verizon, you're actually going to be presenting tomorrow at the DataWorks summit. Tell us about the journey that Verizon has been on, building a Data Lake. >> Oh, Verizon, over the last 20 years, has been a large corporation made up of a lot of different acquisitions and mergers; that's how it was formed, 20 years back. And as we've gone through the journey of the mergers and the acquisitions over the years, we had data from different companies come together and form a lot of different data silos. So the reason we kind of started looking at this is that our CFO started asking questions around being able to answer One Verizon questions, something as simple as having Days Payable, or Working Capital analysis, across all the lines of business. And since we have a three-major-ERP footprint, it is extremely hard to get that data out, and there were a lot of manual data prep activities going into bringing together those One Verizon views. So that's really what was the catalyst to get the journey started for us. >> And it was driven by your CFO, you said? >> Arvind: That's right. >> Ah, very interesting, okay. So what are some of the things that people are going to hear tomorrow from your breakout session? >> Arvind: I'm sorry, say that again? >> Sorry, what are some of the things that the people, the attendees from your breakout session, are going to learn about the steps and the journey? >> So I'm going to primarily be talking about the challenges that we ran into, and share some learnings around that, and also talk about some of the factors, such as the catalysts and what drew us to moving in that direction, as well as getting into some architectural components from a high-level standpoint; talk about certain partners that we work with, the choices we made from an architecture and tools perspective; and kind of close the loop on user adoption and what users are seeing in terms of business value, as we start centralizing all of the data at Verizon from a back-office Finance and Supply Chain standpoint. So that's kind of what I'm looking at talking about tomorrow. >> Arvind, it's interesting to hear you talk about collecting data from essentially back-office operational systems in a Data Lake. Were there... I assume that the data is sort of more refined and easily structured than in the typical stories we hear about Data Lakes. Were there challenges in making it available for exploration and visualization, or were all the early use cases really just production reporting?
So standard reporting within each of the ERP systems is very mature, and those capabilities are there. But then you look across ERP systems, and we have three major ERP systems, one for each of the lines of business; when you want to look at combining all of the data, it's very hard. And to add to that, you pointed at self-service discovery and visualization across all three ERPs' data: that's even more challenging, because it takes a lot of heavy lift to normalize all of the data and bring it into one centralized platform. We started off the journey with Oracle, and then we had SAP HANA; we were trying to bring all the data together. But then we were looking at our non-SAP ERP systems and bringing that data into an SAP kind of footprint: one, the cost was tremendously high, and there was also a lot of heavy lift and challenge in terms of manually having to normalize the data and bring it into the same kind of data models. And even after all of that was done, it was not very self-service oriented for our users in Finance and Supply Chain. >> Let me drill into two of those things. So it sounds like the ETL process of converting it into a consumable format was very complex, and then it sounds like also the discoverability, like where a tool, perhaps like Alation, might help, which is very, very immature right now, or maybe not immature, it's still young. Is that what was missing, or why was the ETL process so much more heavyweight than with a traditional data warehouse? >> With the ETL processes, there's a lot of heavy lifting involved, because of the proprietary data structures of the ERP systems, especially SAP's. The data structures, and how the data is used across cluster and pool tables, are very proprietary. And on top of that, you're bringing in the data formats and structures from a PeopleSoft ERP system, which supports different lines of business, so there is a lot of customization that's gone into place; there are specific things that we use in the ERPs, in terms of the modules and how the processes are modeled in each of the lines of business, and that complicates things a lot. And then you try and bring all these three different ERPs, and the nuances that they have built up over the years, together; it actually makes it very complex. >> So tell us then, help us understand how the Data Lake made that easier. Was it because you didn't have to do all the refinement before it got there? And tell us how Attunity helped make that possible. >> Oh, absolutely. I think that's one of the big things, and it's why we picked Hortonworks as one of our key partners in terms of building out the Data Lake: it's schema-on-read, so you aren't necessarily worried about doing a whole lot of ETL before you bring the data in, and it also provides the tools and technologies from a lot of other partners. There's a lot of maturity now in the self-service discovery capabilities for ad hoc analysis and reporting. This is helpful to the users, because now they don't have to wait for prolonged IT development cycles to model the data, do the ETL, and build reports for them to consume, which sometimes could take weeks and months.
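"Schema-on-read" in miniature: the raw extract lands untouched, and a schema is applied only when someone reads it. The file layout and field names below are hypothetical, not Verizon's actual ERP extracts; the sketch uses PySpark:

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# No up-front ETL: files land as-is, and this schema is applied at read time.
spark = SparkSession.builder.appName("schema-on-read").getOrCreate()

ap_schema = StructType([
    StructField("invoice_id", StringType()),
    StructField("vendor_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("erp_source", StringType()),  # which of the three ERPs
])

payables = (spark.read
            .schema(ap_schema)
            .option("header", "true")
            .csv("hdfs:///lake/landing/accounts_payable/"))

payables.groupBy("erp_source").sum("amount").show()
```

The same landed files can later be re-read with a richer schema without re-ingesting anything, which is what removes the up-front normalization step Arvind describes.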
Now, in a matter of days, they're able to see the data they're looking for and start the analysis, and once they start the analysis and the data is accessible, it's a matter of minutes and seconds, looking at the different tools, how they want to look at it, how they want to model it. So it's actually been a huge value, from the perspective of the users, for what they're looking to do. >> Speaking of value, one of the things that was kind of thematic yesterday: we see enterprises are now embracing big data, they're embracing Hadoop, it's got to coexist within our ecosystem, and it's got to inter-operate. But just putting data in a Data Lake or Hadoop, that's not where the value is; it's being able to analyze that data, in motion, at rest, structured, unstructured, and start being able to glean actionable insights. From your CFO's perspective, where are you now in answering some of the questions that he or she had, from an insights perspective, with the Data Lake that you have in place? >> Yeah, before I address that, I wanted to quickly touch upon and wrap up George's question, if you don't mind, because one of the key challenges, and part of how Attunity helped, is something I was just about to answer before we moved on, so I just want to close the loop on that a little bit. In terms of bringing the data in, the data acquisition or ingestion is a key aspect of it, and again, given the proprietary data structures of the ERP systems, it's very complex, and it normally involves a multi-step process to bring the data into a staging environment, put it in the swamp, and bring it into the Lake. What Attunity has been able to help us with is that it has the intelligence to look at and understand the proprietary data structures of the ERPs, and it is able to bring all the data from the ERP source systems directly into Hadoop, without any stops or staging databases along the way. So it's been a huge value from that standpoint; I'll get into more details around that. And to answer your question around how it's helping from a CFO standpoint, and the users in Finance: as I said, now all the data is available in one place, so it's very easy for them to consume the data and do ad hoc analysis. So if somebody's looking, like I said earlier, to calculate Days Payable, as an example, or they want to look at working capital, we are actually moving data using Attunity's CDC Replicate product, and we're getting data in real-time into the Data Lake. So now they're able to turn things around and do that kind of analysis in a matter of hours, versus overnight or in a matter of days, which was the previous environment. >> And that was kind of one of the themes this morning: it's really about speed, right? It's how fast can you move, and it sounds like, together with Attunity, Verizon is really not only making things simpler, as you talked about, in this kind of model that you have with different ERP systems, but you're also really able to get information into the right hands much, much faster.
>> Absolutely. That's the beauty of the near real-time CDC architecture: we're able to get data in very easily and quickly, and Attunity also provides a lot of visibility while the data is in flight. We're able to see what's happening in the source system and how many packets are flowing through, and to a point, my developers are so excited to work with the product, because they don't have to worry about changes happening in the source systems in terms of DDL; those changes are automatically understood by the product and pushed through to the Hadoop destination. It's been a game-changer, because we have not had any downtime. Historically, when things changed on the source system side, we had to take downtime to change those configurations and scripts and publish them across environments, so that's been huge from that standpoint as well.

>> Absolutely.

>> Itamar, maybe help us understand where Attunity can... It sounds like there's greatly reduced latency in the pipeline between the operational systems and the analytic system, but it also sounds like you still need to essentially reformat the data so that it's consumable. So it sounds like there's an ETL pipeline that's just much, much faster, but at the same time, with Replicate, it sounds like the data goes across without transformations. So help us sort of understand that nuance.

>> Yeah, that's a great question, George. And indeed, in the past few years, customers have been focused predominantly on getting the data to the Lake. I actually think one of the changes in the theme we're hearing here at the show, and over the last few months, is how we move on to start using the data, to create applications on the data. So we're kind of moving to the next step. In the last few years we focused a lot on innovating and creating the solutions that facilitate and accelerate the process of getting data to the Lake, from a large scope of systems, including complex ones like SAP, and also on making that process easier, providing real-time data that can feed both streaming architectures and batch ones. So once we got that covered, to your question, what happens next? One of the things we found, and I think Verizon is also looking at it now and will be coming to it later, is that when you bring data in and adopt a streaming, or continuous, incremental type of data ingestion process, you're inherently building an architecture that takes what was originally a database and, in a sense, breaks it apart into partitions as you load it over time. So when you land the data, and Arvind was referring to a swamp, or some customers refer to it as a landing zone, you bring the data into your Lake environment, but at that first stage the data is not structured, to your point, George, in a manner that's easily consumable. Alright, so the next step is, how do we facilitate the next step of the process, which today is still very manually driven, with custom development and dealing with complex structures?

So we're actually very excited: we've introduced here at the show a new product by Attunity, Compose for Hive, which extends our Data Lake solutions. What Compose for Hive is designed to do is address part of the problem you just described: when the data comes in and is partitioned, Compose for Hive reassembles those partitions and then creates analytic-ready data sets back in Hive. It can create operational data stores, it can create historical data stores, so the data becomes formatted in a manner that's more easily accessible for users who want to use analytic tools, BI tools, Tableau, Qlik, any type of tool that can easily access a database.
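As a rough illustration of what reassembling partitions into an operational data store involves, here is a sketch of the general pattern: collapse a table of change records to the latest change per key. The table and column names are hypothetical, and Compose for Hive generates and manages this kind of logic itself; the sketch only shows the underlying idea.

```python
# Sketch of the compaction pattern behind building an operational data
# store from partitioned change records: keep only the newest change per
# business key, and drop keys whose latest change is a delete.
def latest_state_query(changes_table: str, target_table: str,
                       key: str, ts: str, cols: list) -> str:
    col_list = ", ".join(cols)
    return f"""
    INSERT OVERWRITE TABLE {target_table}
    SELECT {col_list}
    FROM (
        SELECT {col_list}, op,
               ROW_NUMBER() OVER (PARTITION BY {key} ORDER BY {ts} DESC) AS rn
        FROM {changes_table}
    ) ranked
    WHERE rn = 1 AND op != 'D'
    """

# Hypothetical names; in Hive 2.2 the same idea can also be expressed
# with ACID MERGE statements.
print(latest_state_query("finance_changes", "finance_ods",
                         "invoice_id", "change_ts",
                         ["invoice_id", "amount", "currency"]))
```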
>> Would there be, as a next step, whether led by Verizon's requirements or by Attunity's anticipation of broader customer requirements, something where there's, if not near real-time, then very low latency landing and transformation, so that time-sensitive data can join the historical data?

>> Absolutely, absolutely. So what we've done is focus on real-time availability of the data. When we feed the data into the Data Lake, we feed it in two ways: one is directly into Hive, and the other goes through a streaming architecture like Kafka, which in the case of Hortonworks also fits very well with HDF. Then the next step in the process is producing those analytic data sets, or data stores, out of it, which we enable, and we design this together with our partners and with our customers. So again, when we worked on Replicate, and then on Compose, we worked very closely with Fortune companies trying to deal with these challenges, so we could design the product around them. In the case of Compose for Hive, for example, we've done a lot of collaboration at the product engineering level with Hortonworks, to leverage the latest and greatest in Hive 2.2 and Hive LLAP, to be able to push down transformations so they can be done faster, including in real-time, so those data sets can be updated on a frequent basis.

>> You talked about customer requirements, either specific or not, and obviously we're talking to a telecommunications company. Are you seeing, Itamar, from Attunity's perspective, more of this need... alright, the data's in the Lake, or first it comes to the swamp, now it's in the Lake, to start partitioning it; is this need driven by specific industries, or is it really pretty horizontal?

>> That's a good question, and this is definitely a horizontal need; it's part of the infrastructure needs. Verizon is a great customer, and beyond telecommunications we've been working with customers in other industries, from manufacturing to retail, to health care, to automotive and others, and in all of those cases, at a foundation level, the architectural challenges are very similar. You need to ingest the data, you want to do it fast, and you want to do it incrementally or continuously, even if you're loading directly into Hadoop. Naturally, when you're loading the data through Kafka, or a streaming architecture, it's in a continuous fashion, and then you partition the data. So the partitioning of the data is inherent to the architecture, and then you need to help deal with that data for the next step in the process. And we're doing it both with Compose for Hive, and also, for customers using streaming architectures like Kafka, we provide the mechanisms, from supporting or facilitating things like schema evolution and schema decoding, to facilitating the downstream process of working with those partitions of data, so we can make the data available. That works both for analytics and streaming analytics, as well as for scenarios like microservices, where the way in which you partition or deliver the data allows each microservice to pick up the data it needs from the relevant partition.
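To ground the microservices point, here is a small sketch of a service consuming only the keyed events it cares about from a change topic, written to tolerate upstream schema evolution. The topic, group, and field names are assumptions for illustration, not part of any product described above.

```python
import json
from kafka import KafkaConsumer  # kafka-python client

# Hypothetical billing service: because change events are keyed (say, by
# invoice_id), all changes for a given key land in the same partition, so
# the service sees them in order.
consumer = KafkaConsumer(
    "erp.invoices",                       # hypothetical topic
    group_id="billing-service",           # each microservice gets its own group
    bootstrap_servers="broker:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for msg in consumer:
    event = msg.value
    # Tolerate schema evolution: read the fields this service knows about
    # and ignore any columns added upstream later.
    invoice_id = event.get("invoice_id")
    amount = event.get("amount", 0.0)
    print(f"partition={msg.partition} invoice={invoice_id} amount={amount}")
```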
>> Well guys, this has been a really informative conversation. Congratulations, Itamar, on the new announcement that you guys made today.

>> Thank you very much.

>> Lisa: Arvind, great to hear the use case; Verizon really sounds quite pioneering in what you're doing, and we wish you continued success there. We look forward to hearing what's next for Verizon. We want to thank you for watching theCUBE. We are again live on day two of the DataWorks Summit, #DWS17. With me is my co-host George Gilbert, I'm Lisa Martin; stick around, we'll be right back. (relaxed techno music)

Published Date : Jun 14 2017
