Rahul Pathak Opening Session | AWS Startup Showcase S2 E2
>>Hello, everyone. Welcome to theCUBE's presentation of the AWS Startup Showcase, season two, episode two. The theme is data as code, the future of analytics. I'm John Furrier, your host. We have a great lineup for you today: fast-growing startups, a great lineup of companies, founders, and stories around data as code. And we're going to kick it off here with our opening keynote with Rahul Pathak, VP of Analytics at AWS and a CUBE alumni. Rahul, thank you for coming on and being the opening keynote for this awesome event. >>It's great to see you, and it's great to be part of this event. I'm excited to help showcase some of the great innovation that startups are doing on top of AWS. >>We last spoke at AWS re:Invent, and a lot's happened since then. Serverless is at the center of the action, and all these startups, Rockset, Dremio, Cribl, Ahana, Imply, and others, are doing great stuff. Data as code has a lot of traction, so there's still a lot of momentum going on in the marketplace. Pretty exciting. >>It's awesome. There's so much innovation happening, and the wonderful part of working with data is that the demand for services and products that help customers drive insight from data is just skyrocketing, and it shows no sign of slowing down. So it's a great time to be in the data business. >>It's interesting to see the theme of the show getting traction, because you start to see data being treated almost like how developers write software: taking things out of branches, working on them, putting them back in. Machine learning is getting iterated on, you're seeing more models being trained differently, with better insights and actions, all working like code. And this is a whole other way people are reinventing their businesses. It's been a big, huge wave. What's your reaction to that? >>I think it's spot on. The idea of data as code, bringing some of the repeatability of processes from software development into how people build data applications, is absolutely fundamental, and especially so in machine learning, where you need to think about the explainability of a model and what version of the world it was trained on. When you build a better model, you need to be able to explain it and reproduce it. So I think your insights are spot on, and these ideas are showing up in all stages of the data workflow, from ingestion to analytics to machine learning. >>This next wave is about modernization and going to the next level with cloud scale. Thank you so much for coming on and being the keynote presenter for this great event. I'll let you take it away: reinventing businesses with AWS analytics. Rahul, take it away. >>Perfect. Well, folks, we're going to talk about reinventing your business with data. If you think about it, the first wave of reinvention was really driven by the cloud, as customers were able to transform how they thought about technology, and that's well on its way, although if you stop and think about it, I think we're only about five to 10% of the way done in terms of IT spend being on the cloud. So there's lots of work to do there. But we're seeing another wave of reinvention, which is companies reinventing their businesses with data, really using data to transform what they're doing, to look for new opportunities, and to look for ways to operate more efficiently.
And I think the past couple of years of the pandemic have only accelerated that trend. What we're seeing is really the survival of the most informed: the folks with the best data are able to react more quickly to what's happening. >>We've seen customers being able to scale up if they're in, say, the delivery business, or scale down if they were in the travel business at the beginning of all of this, and then use data to find new opportunities and new ways to serve customers. So it's really foundational, and we're seeing this across the board. It's great to see the innovation that's happening to help customers make sense of all of this. Our customers are really looking at ways to put data to work. It's about making better decisions, finding new efficiencies, and really finding new opportunities to succeed and scale. When it comes to good examples of this, FINRA is a great one. You may not have heard of them, but they're the U.S. equities regulator; they keep track of all trading that happens in equities, and they look at about 250 billion records per day. >>They run on EMR, which is our Spark and Hadoop service, and they're processing 20 terabytes of data running across tens of thousands of nodes, looking for fraud and bad actors in the market. So it's been a huge transformation journey for FINRA over the years, a customer I've gotten to work with personally since really 2013 onward, and it's been amazing to see their journey. Pinterest is another great customer I'm sure everyone's familiar with. They're about visual search and discovery and commerce, and they're able to scale their daily log searches by a factor of three X or more while driving down their costs, using the Amazon OpenSearch Service. Really, what we're trying to do at AWS is give our customers the most comprehensive set of services for the end-to-end journey around data, from ingestion to analytics and machine learning. We want to provide a comprehensive set of capabilities for ingestion, cataloging, analytics, and then machine learning, and all of these are things that our partners and the startups that run on us have available to them to build on as they build and deliver value for their customers. >>The way we think about this is that we want customers to be able to modernize what they're doing and their infrastructure, and we provide services for that. It's about unifying data wherever it lives and connecting it, so that customers can build a complete picture of their customers and business. And then it's about innovation, really using machine learning to bring all of this unified data to bear on driving new innovation and new opportunities for customers. What we're trying to do at AWS is provide a scalable and secure cloud platform that customers and partners can build on. Unifying is about connecting data, and it's also about providing well-governed access to data. One of the big trends we see is customers looking for the ability to make self-service data available to their customers and users, and the key to that is good foundational governance. >>Once you can define good access controls, you're more comfortable setting data free. The other part of it is that data lakes play a huge role, because you need to be able to think about structured and unstructured data.
In fact, about 80% of the data being generated today is unstructured, and you want to be able to connect data that's in data lakes with data that's in purpose-built data stores, whether that's databases on AWS, databases outside, SaaS products, as well as things like data warehouses and machine learning systems. Really, connecting data is key. Then there's innovation: how can we bring new technologies like AI and machine learning to bear and reimagine processes with them? AI is also key to unlocking a lot of the value that's in unstructured data. If you can figure out what's in an image or the sentiment of audio, and do that in real time, that lets you personalize and dynamically tailor experiences, all of which are super important to getting an edge in the modern marketplace. And so at AWS, when we think about connecting the dots across sources of data, allowing customers to use data lakes, databases, analytics, and machine learning, we want to provide a common catalog and governance and then use these to help drive new experiences for customers in their apps and on their devices. In an ideal world, this creates a closed loop: you create a new experience, you observe how customers interact with it, that generates more data, and that data becomes a source that feeds back into the system. >>On AWS, when you think about a modern data strategy, really at the core is a data lake built on S3, and I'll talk more about that in a second. Then you've got services like Athena and Glue, and Lake Formation for managing that data, cataloging it, and querying it in place. And then you have the ability to use the right tool for the right job. We're big believers in purpose-built services for data, because that's where you can avoid compromising on performance, functionality, or scale. And then, as I mentioned, there's unification and interconnecting all of that data, so if you need to move data between these systems, there are well-trodden pathways that allow you to do that, and features built into the services that enable it. >>Some of the core ideas that guide the work that we do: scalable data lakes are key, and this is really about providing arbitrarily scalable, high-throughput systems. It's about open-format data for future-proofing. Then we talk about purpose-built systems for the best possible functionality, performance, and cost. From a serverless perspective, this has been another big trend for us; we announced a bunch of serverless services at re:Invent, and the goal there is to take away the need to manage infrastructure from customers so they can focus on driving differentiated business value. Then there's integrated governance, and machine learning pervasively, not just as an end product for data scientists, but machine learning built into data warehouses, visualization, and databases. >>So, scalable data lakes: S3 is really the foundation for this, one of our original services at AWS and really the backbone of so much of what we do, with unmatched durability, availability, and scale. There's a huge portfolio of analytics services, both that we offer and that our partners and customers offer, and really arbitrary scale. We've got individual customers with estates in the exabyte range, many in the hundreds of petabytes, and that's just growing.
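To make the "query in place" idea above concrete, here is a minimal sketch of running an Athena query against data cataloged in Glue and stored in S3, using boto3. The database, table, bucket, and column names are illustrative assumptions, not details from the keynote.

```python
import time
import boto3

# Assumed names: replace with your own Glue database, table, and S3 results bucket.
athena = boto3.client("athena", region_name="us-east-1")

query = """
    SELECT event_type, COUNT(*) AS events
    FROM clickstream_events          -- table registered in the Glue Data Catalog
    WHERE event_date = DATE '2022-03-01'
    GROUP BY event_type
    ORDER BY events DESC
    LIMIT 10
"""

# Start the query; Athena reads the data in place and writes results to S3.
execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "analytics_demo"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results-bucket/queries/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes (a production job would back off and handle errors).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```

The point of the pattern is that the data never moves: the table definition lives in the catalog, the files stay in S3 in an open format, and any engine that understands that catalog can query the same bytes.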
As I mentioned, we see roughly a 10X increase in data volume every five years, so that's an exponential increase in data volumes. From a purpose-built perspective, it's the right tool for the right job: Redshift for data warehousing, Athena for querying all your data, EMR for managed Spark and Hadoop, OpenSearch for log analytics and search, and then Kinesis and MSK for Kafka and streaming. That's been another big trend: real-time data has been exploding, and customers wanting to make sense of that data in real time is another big deal. >>Some examples of how we're able to achieve differentiated performance in purpose-built systems: with Redshift, using managed storage and its latest instance types, we get three X better price performance versus what's out there, available to all our customers and partners. In EMR, with things like Spark, we're able to deliver two X the performance of open source with a hundred percent compatibility, and almost three X on Presto. With Graviton2, our new silicon chips on AWS, there's about 10 to 12% better price performance and 20% lower cost, all compatible, so you drop your jobs in and have them run faster and cheaper. That translates to customer benefits and better margins for partners. From a serverless perspective, this is about simplifying operations, reducing total cost of ownership, and freeing customers from the need to think about capacity management. At re:Invent we announced serverless Redshift, EMR Serverless, and serverless Kinesis and Kafka, and these are all game changers in terms of freeing our customers and partners from having to think about infrastructure and allowing them to focus on data. >>When it comes to serverless options in analytics, we've really got a very full and complete set. Whether that's data warehousing, big data processing, streaming, cataloging, governance, or visualization, we want all of our customers to have an option to run serverless, and if they have specialized needs, instances are available as well. So we're providing a comprehensive deployment model based on the customer's use cases. From a governance perspective, Lake Formation is about easy building and management of data lakes, and this is what enables data sharing and self-service. With it, you get very granular access controls, so row-level security and simple data sharing, and you can tag data. So you can tag data sets and say that a given group of analysts only has access to the data that's been tagged with those tags, and it allows you to very scalably provide different secure views onto the same data without having to make multiple copies, which is another big win for customers and partners. We also support transactions on data lakes. >>So updates and deletes and time travel. John talked about data as code, and with time travel you can query different versions of your data, so that's a big enabler for those types of strategies. And with Glue, you're able to connect data in multiple places, whether that's accessing data on premises, in other SaaS providers or clouds, as well as data that's on AWS, and all of this is serverless and interconnected.
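As a rough illustration of the tag-based governance described above, where analysts only see data carrying a given tag, here is a hedged boto3 sketch. The tag key, tag values, account ID, role ARN, and database name are all assumptions for the example, and the exact parameter shapes should be checked against the current Lake Formation API documentation before use.

```python
import boto3

lf = boto3.client("lakeformation", region_name="us-east-1")

# 1. Define an LF-tag (assumed key and values) that will be attached to data sets.
lf.create_lf_tag(TagKey="classification", TagValues=["public", "restricted"])

# 2. Attach the tag to a Glue database so its tables inherit it (names are illustrative).
lf.add_lf_tags_to_resource(
    Resource={"Database": {"Name": "analytics_demo"}},
    LFTags=[{"TagKey": "classification", "TagValues": ["public"]}],
)

# 3. Grant the analyst role SELECT only on table resources tagged classification=public.
lf.grant_permissions(
    Principal={"DataLakePrincipalArn": "arn:aws:iam::123456789012:role/AnalystRole"},
    Resource={
        "LFTagPolicy": {
            "ResourceType": "TABLE",
            "Expression": [{"TagKey": "classification", "TagValues": ["public"]}],
        }
    },
    Permissions=["SELECT"],
)
```

The design point is that one physical copy of the data can serve differently privileged audiences; the grant follows the tag rather than the individual table, so newly tagged data is covered automatically.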
And really, it's about plugging all of your data into the AWS ecosystem and into our partner ecosystem, so these APIs are all available for integration as well. Then, from an ML perspective, what we're really trying to do is bring machine learning closer to the data. With our databases, warehouses, lakes, and BI tools, we've infused machine learning throughout, powered by the state-of-the-art machine learning that we offer through SageMaker. So you've got ML in Aurora and in Neptune for graphs. You can train machine learning models from SQL, directly from Redshift and Athena, and then use them for inference. QuickSight has built-in forecasting and built-in natural language querying, all powered by machine learning, and the same with anomaly detection. The idea is: how can our systems get smarter and surface the right insights for our customers, so that they don't have to always rely on smart people asking the right questions? Really, it's about bringing data back together and making it available for innovation. Thank you very much, I appreciate your attention. >>Okay, well done. Reinventing the business with AWS analytics. Rahul, that was great, thanks for walking through that. I have to ask you some questions on the end-to-end view of the data; that seems to be a theme, along with serverless and ML integration. But then you also mentioned picking the right tool for the job, so you've got all these things moving. Simplify it for me. From a business standpoint, how do customers modernize? What are the steps clients are taking with analytics, what's the best practice, what's the high-order bit here? >>So the basic hierarchy is this: historically, legacy systems are rigid and inflexible, and they weren't really designed for the scale of modern data or the variety of it. What customers are finding is that as they move to the cloud, they're moving from legacy systems with punitive licensing into more flexible, modern systems, and that allows them to think about building a decoupled, scalable, future-proof architecture. You've got the ability to combine data lakes and databases and data warehouses and connect them using common APIs and common data protection, and that sets you up to deal with arbitrary scale and arbitrary data types. It allows you to evolve as the future changes, since it makes it easy to add in a new type of engine as we invent a better one a few years from now. And then, once you've got your data in the cloud and interconnected in this way, you can build complete pictures of what's going on: you can understand all your touch points with customers, you can understand your complete supply chain. And once you can build that complete picture of your business, you can start to use analytics and machine learning to find new opportunities. So: think about modernizing, moving to the cloud, setting up for the future, connecting data end to end, and then figuring out how to use that to your advantage.
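To illustrate the in-database machine learning Rahul described a moment ago, training a model from SQL directly in Redshift, here is a hedged sketch using the Redshift Data API from Python. The workgroup, database, table, and column names are assumptions for the example; the CREATE MODEL statement follows the general Redshift ML pattern, but the current syntax and required IAM permissions should be confirmed in the Redshift documentation before relying on it.

```python
import boto3

# The Redshift Data API lets you run SQL without managing drivers or connections.
rsd = boto3.client("redshift-data", region_name="us-east-1")

# Assumed: a serverless workgroup "analytics-wg" and a customer_activity table with a
# boolean "churned" column. Redshift ML hands training off to SageMaker behind the
# scenes and registers a SQL function for inference when the model is ready.
create_model_sql = """
CREATE MODEL churn_model
FROM (SELECT age, plan_type, monthly_usage, support_tickets, churned
      FROM customer_activity)
TARGET churned
FUNCTION predict_churn
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftMLRole'
SETTINGS (S3_BUCKET 'my-redshift-ml-artifacts')
"""

rsd.execute_statement(
    WorkgroupName="analytics-wg",
    Database="dev",
    Sql=create_model_sql,
)

# Once training completes, inference is just SQL against the registered function.
rsd.execute_statement(
    WorkgroupName="analytics-wg",
    Database="dev",
    Sql="SELECT customer_id, predict_churn(age, plan_type, monthly_usage, support_tickets) "
        "FROM customer_activity",
)
```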
I know, as you mentioned, a modern data strategy gives you the best of both worlds, and you mentioned something briefly that I want to get a little more insight from you on: open formats. One of the themes that's come out of some of the interviews with the companies we're going to be hearing from today is open source and the role open is playing. How do you see that integrating in? Because again, this is just like software, right? Open source software, open source data, it seems to be a trend. What does open look like to you? How do you see that progressing? >>It's a great question. Open operates on multiple dimensions, John, as you point out. There are open data formats, things like JSON and Parquet for analytics. These allow multiple engines to operate on the data, and that creates option value for customers: if your data is in an open format, you can use it with multiple technologies and it'll be future-proofed; you don't have to migrate your data if you're thinking about using a different technology. So that's one piece. Then there's open source software, which is also a big enabler of innovation for customers. You've got things like Spark and Presto, which are popular, and I know some of the startups we're talking about as part of the showcase use these technologies. This allows the whole world to contribute to innovating on these engines and moving them forward together, and we're big believers in that: we've got open source services, we contribute to open source, and we support open source projects. That's another big part of what we do. And then there are open APIs, things like SQL or Python, common ways of interacting with data that are broadly adopted. These, again, create standardization and make it easier for customers to interoperate and be flexible. So open is really present all the way through, and I think it's a big part of the present and the future. >>Yeah, it's going to be fun to watch and see how that grows; there seems to be a lot of traction there. I want to ask you about the other comment I thought was cool. You had the architectural slides out there: one was data lakes built on S3, with Glue and Lake Formation kind of around S3, and then you had the constellation of Kinesis, SageMaker, and other things around it, and you said, pick the right tool for the right job. And then you had the other slide with analytics at the center and Redshift and all the other services around it, around serverless. So one was more about the data lake, with Athena, Glue, and Lake Formation enabling it, and then you can pick and choose what you need on the serverless side. Explain that a little bit more for me, because I'm trying to understand where that fits. I get the data lake piece. What does analytics in the center mean? >>The idea there is that if you zoom into the analytics use case, everything that we offer within analytics has a serverless option for our customers. So you could look at the bucket of analytics across things like Redshift or EMR or Athena or Glue and Lake Formation: you have the option to use instances or containers, but also to just not worry about infrastructure and think declaratively about the data that you want to work with. >>Oh, so basically you're saying analytics is going serverless everywhere. Talking about volumes, you mentioned 10X volumes. What other stats can you share in terms of volumes? What are people seeing in terms of velocity? I've seen that data warehouses can't move as fast as what we're seeing in the cloud with some of your customers and how they're using data.
What do the volume and velocity numbers look like? Do you have any other insights into those numbers? >>Yeah, from a stats perspective, take Redshift, for example: customers are processing, so reading and writing, multiple exabytes of data across Redshift. And one of the things we've seen as time has progressed, as data volumes have gone up and data types have exploded, is that data warehouses have gotten more flexible. So we've added things like the ability to put semi-structured data and arbitrarily nested data into Redshift. We've also seen the seamless integration of data warehouses and data lakes; Redshift was actually one of the first to enable straightforward querying of data that's sitting in the data lake alongside data that's managed in the warehouse, and those trends will continue. I think you'll continue to see this need to query data wherever it lives, and to allow lakes and warehouses and purpose-built stores to interconnect. >>You know, one of the things I liked about your presentation was the theme of modernize, unify, innovate, and we've been covering a lot of companies that have been, I won't say stumbling, but getting to the future, some faster than others, and they all seem to get stuck in the same spot: the silos. Breaking down the silos, getting into data lakes, and blending in the purpose-built data stores. They get stuck there because they're so used to silos and their teams, and that's holding back the machine learning side of it, because machine learning can't do its job if it doesn't have access to all the data. And that's where we're seeing machine learning become this new iterative model where the models are coming in faster. So silo busting is an issue. What's your take on this part of the equation? >>There are a few things at play. You're absolutely right that the transition from siloed data to interconnected data isn't always straightforward, and it operates on a number of levels. You want to have the right technology, so, for example, we enable things like queries that can span multiple stores. You want good governance so you can connect across multiple sources. Then you need to be able to get data in and out of these things, and Glue plays that role. So there's that interconnection on the technical side. But the other piece is that you also want to think through, organizationally, how you organize: how do you define who owns data and when to share it, what the policies are for enabling that sharing, and what processes need to be put in place to create the right incentives in your company to enable that data sharing. And then the foundational piece is good guardrails. It can be scary to open data up, and the key to that is putting good governance in place, where you can ensure that data can be shared and distributed while remaining protected and adhering to the privacy, compliance, and security regulations you have for it. Once you can assert that level of protection, you can set that data free, and that's when customers really start to see the benefits of connecting all of it together. >>Right?
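To ground the open-format point from the discussion above, here is a small sketch showing data written once as Parquet and read back by two different engines. The file path, column names, and the use of DuckDB as the second engine are made up for the example; the idea is simply that an open format plus a shared catalog is what lets Athena, Spark, Presto, Redshift Spectrum, and others operate on the same bytes without a migration.

```python
import pandas as pd
import pyarrow as pa
import pyarrow.parquet as pq
import duckdb  # stand-in for "any engine that speaks Parquet"

# Write a small dataset once, in an open columnar format.
df = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "segment": ["retail", "enterprise", "retail"],
    "monthly_spend": [120.0, 4300.0, 95.5],
})
pq.write_table(pa.Table.from_pandas(df), "customers.parquet")

# Engine one: pandas/pyarrow reads the file directly.
print(pd.read_parquet("customers.parquet").head())

# Engine two: a SQL engine queries the same file in place, with no copy or conversion.
print(duckdb.query(
    "SELECT segment, SUM(monthly_spend) AS spend FROM 'customers.parquet' GROUP BY segment"
).to_df())
```

In an AWS setting the same Parquet files would typically sit in S3 with their schema registered in the Glue Data Catalog, so the second engine could just as easily be Athena, EMR Spark, or Redshift Spectrum.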
And then we have a batch of startups here on this episode doing a lot of different things. Some have new lakes forming, like observability lakes; there's SQL innovation on the front end, data-tiering innovation at the data tier, just a ton of innovation around this new data as code. As an executive at AWS, you're enabling all of this. Where is the action going? Where are the white spaces? Where are the opportunities as this architecture continues to grow and get traction, given the relevance of machine learning and AI and the fact that apps are now embedding data as code? Where are the opportunities for these startups, and how can they continue to grow? >>The opportunity is amazing, John. We talked a little bit about this at the beginning, but there is no slowdown in sight for the volume of data that we're generating. Pretty much everything we have, whether it's a watch or a phone or the systems we interact with, is generating data, and we talk a lot about the things that will stay the same over time: data volumes will continue to go up, customers are going to want to keep analyzing that data to make sense of it, they're going to want to do it faster and more cheaply than they did yesterday, and they're going to want to make decisions and innovate in a shorter cycle and run more experiments than they were able to before. >>And they're always going to want this data to be secure and well protected. So I think that as long as we, and the startups that we work with, can continue to push on making these things better, asking, can I deal with more data, can I deal with it more cheaply, can I make it easier to get insight, and can I maintain a super high bar on security, investments in these areas will pay off, because the demand side of this equation is in a great place, given what we're seeing in terms of data and the architectures customers are adopting. >>I also love your comment about ML integration being the last leg of the equation here, or the last mile of the journey. You've got that enablement of the AI piece; it solves a lot of problems, people can see benefits from good machine learning, and AI is creating opportunities. And you also mentioned the end to end with the security piece, so data and security are going hand in hand these days; it's not just the governance and the compliance stuff, we're talking about security. So machine learning integration kind of connects all of this. What does it all mean for customers? >>For customers, it means that with machine learning, and by really enabling themselves to use machine learning to make sense of data, they're able to find patterns that can represent new opportunities more quickly than ever before, and they're able to do it dynamically. In a prior version of the world, we'd have built systems that were relatively rigid and then we'd have to improve them. With machine learning, this can be dynamic and near real time, and you can customize it. So that represents an opportunity to deepen relationships with customers, create more value, and find more efficiency in how businesses are run. And your ideas around data as code really come into play, because machine learning needs to be repeatable and explainable.
And that means versioning, keeping track of everything that you've done from a code, data, learning, and training perspective. >>And data sets keep updating the machine learning: you've got data sets growing, and they become code modules that can be reused and interrogated. Security is a big theme. Data is really important, and security is one of the top use cases, certainly now in this day and age, with a lot of breaches and hacks coming in and being defended. It brings up open source, it brings up data as code; security is a good proxy for where this is going. What's your take on that, and your reaction to it? >>On security, we can never invest enough, and one of the things that guides us at AWS is that security, availability, and durability are sort of jobs one, two, and three. It operates at multiple levels. You need to protect data at rest with encryption, good key management, and good practices there. You need to protect data on the wire. You need to have a good sense of what data is allowed to be seen by whom. And then you need to keep track of who did what, and be able to verify and come back and prove that only the things that were allowed to happen actually happened. You can then use machine learning on top of all of this apparatus to ask: can I detect things that are happening that shouldn't be happening, in near real time, so I can put a stop to them? So I don't think any of us can ever invest enough in securing and protecting our data and our systems. It's really fundamental to earning customer trust, and it's just good business. So it's absolutely crucial; we think about it all the time and are always looking for ways to raise the bar. >>Well, I really appreciate you taking the time to give the keynote. Final word here for the folks watching: a lot of these startups that are presenting are doing well. Business-wise, they're being used by large enterprises, people are buying their products and using their services, and customers are implementing more and more of these hot startups' products; they're relevant. What's your advice to the customer out there as they go on this journey, this new data as code, this new future of analytics? What's your recommendation? >>For customers who are out there, I recommend you take a look at what the startups on AWS are building. There's tremendous innovation and energy, and there's really great technology being built on top of a rock-solid platform. So I encourage customers thinking about it to lean forward, to think about new technology and embrace it, move to the cloud, modernize, build a single picture of your data, and figure out how to innovate and win. >>Well, thanks for coming on. Appreciate your keynote, thanks for the insight, and thanks for the conversation. Let's hand it off to the show. Let the show begin. >>Thank you, John. A pleasure, as always.
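Picking up the versioning thread from the close of the conversation, here is a minimal, tool-agnostic sketch of treating data as code: fingerprint the exact training data and record it alongside the model artifact so a result can be reproduced and explained later. The paths, field names, and hyperparameters are illustrative assumptions.

```python
import hashlib
import json
import time
from pathlib import Path

def fingerprint(path: str) -> str:
    """Content hash of a data file, so the training set itself is versioned like source code."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def record_training_run(data_path: str, model_path: str, params: dict) -> dict:
    """Write a small manifest next to the model: which data, when, and which parameters."""
    manifest = {
        "trained_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "data_file": data_path,
        "data_sha256": fingerprint(data_path),
        "model_file": model_path,
        "hyperparameters": params,
    }
    Path(model_path + ".manifest.json").write_text(json.dumps(manifest, indent=2))
    return manifest

# Usage: after training, pin the model to the exact data version it saw, for example:
# record_training_run("customers.parquet", "churn_model.bin", {"max_depth": 6, "eta": 0.1})
```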
Giorgio Vanzini, DXC Technology | AWS re:Invent 2021
(upbeat music) >> Welcome back to Las Vegas. Lisa Martin here live with David Nicholson. We're at AWS re:Invent 2021, and this is an outstanding event; there are tens of thousands of people here, and this is probably one of the most important and largest hybrid tech events we're doing this year with AWS and its massive ecosystem of partners. We're going to be covering this with two live sets, two remote studios, and over 100 guests on theCUBE at this re:Invent, and David and I are pleased to welcome Giorgio Vanzini next, vice president and global head of partners and alliances at DXC. Giorgio, welcome to the program. >> Thank you for having me. >> Talk to us about what's going on at DXC. What are you and AWS doing together? What's the scoop? >> Well, some exciting things are happening between AWS and DXC. We're really focusing on the customers that we have, especially in banking and capital markets, but also automotive. And we were also a launch partner today with AWS mainframe modernization, so we're focusing on mainframes as well. So these are exciting spaces for us to collaborate and work in with AWS for our customers. >> Talk to me about some of the things you've seen. The last 22 months have been quite challenging, quite dynamic, and we've seen such a massive acceleration to the cloud. What have you seen from your perspective? Are you seeing customers in every industry that have really figured out we've got to do this now, because if we don't, we're going to be out of business? >> Yes, you're absolutely correct. We've seen a dramatic acceleration of customers wanting to move to the cloud, public and private, and an acceleration in the assistance they're requesting from a global systems integrator. So what we've seen, as part of our Cloud Right strategy, is really understanding what the customer needs from a strategy perspective, a business value perspective, and a technology perspective, and AWS has been a great partner with us in accommodating all of these kinds of things. The announcements that you had today just substantiate that fact as well. >> Can you double-click on the Cloud Right approach? Talk to us about what that is, why it's important, and some of the outcomes it's helping customers generate. >> Absolutely, love to. Cloud Right is really DXC's strategy to take customers on the journey from the mainframe to the cloud, and to customize this, because every customer is different. They have different requirements, different environments, different business strategies. So the Cloud Right approach is really about customizing it for the customer: what is the right business strategy, what is the right technology strategy, and then migrating them over into the cloud as well, keeping in mind, again, that customers are specific and industries are specific. Data requirements are different, analytics are different, government requirements are different, so you need to keep those in mind when you transition customers over into the cloud. >> Right, from a data residency and data sovereignty perspective, with all of the different rules and regulations popping up that are similar to GDPR, for example; that's a big challenge. But one of the things that's also happening, Giorgio, is that every company, to be competitive these days, has to become a data company, right?
There's no choice: you've got to be data-driven, you've got to have a data strategy at the core of the business, otherwise there's a competitor in the rear-view mirror who's ready to take your place. >> That is absolutely correct, and that's part of our Cloud Right strategy: understanding the customer's business requirements, understanding their competitive edge, and migrating them over. Because in many instances, to your point, they have huge reams of data, petabytes of information, but it's about really making sense of it, running the analytics on it, and having the business insights. So we help customers understand that, and we also help them understand their key business requirements: which applications to migrate and which not to migrate. >> So I'm curious: you mentioned that you're a launch partner for mainframe modernization. That's one slice, and a very important slice, of some organizations' business and migration strategy to the cloud. I'm curious what the DXC blend is between standardized offerings and bespoke services, and how you manage that. Do you have a thought about that? Wouldn't it be great to have small, medium, and large and just have people click on it? >> Yes, here's a T-shirt for you, which size are you? Now, I'm actually glad you asked that question, because that goes exactly to the core of the Cloud Right strategy. Cloud Right really asks: which T-shirt size is correct for you? This is the question we just addressed, which is that it has to be bespoke, because one size does not fit all. So it's about understanding the customer's requirements: do we need to move the data to the cloud, or just a subset of it? Do we need to move part of the business applications, and which ones, and in which order? And that's why I think we bring something unique to the table in AWS mainframe modernization, because we have an end-to-end approach, from planning to implementation to execution and running as well. So I think DXC is uniquely positioned with our Cloud Right strategy. >> One of the things AWS talks about, Giorgio, is not being custom but being purpose-built. Talk to me about how that compares and contrasts with bespoke, industry-specific solutions; obviously customers have specificities. Do you see a difference between purpose-built and bespoke, or are they aligned from your perspective? >> Yes, I do agree that the technology layers are definitely common, horizontal layers, whereas I think you have bespoke implementations at the level of business strategy and business rules. So you have to understand what business the customer is really in and how to implement the business rules into the technology stack, and bring it all together. So while the technology goes horizontal, to your point, compute and storage are the same, the bits are the same wherever you go, how they're utilized and how you use them for your customers and your interactions is completely different from customer to customer and industry to industry, as you guys know as well. >> You know, it can be really disheartening working in this space when you think of 475 different kinds of instances and how important it is to get that right for a customer, and how much they don't care. Ultimately they don't want to hear about it, they don't want to know, but they want you to get it right so that it doesn't matter.
So it's this irony that all of the work that people at places like DXC do is to make those details not matter. Any thoughts on that? Are you at all dejected because of that? >> Well, that is part of the value that we bring, right? >> David: Sure. >> To your point, the customer absolutely doesn't care, in quotes, right? Just make it work for us and run it smoothly. On the other hand, we're on the hook to make sure that all the different partners we integrate, including AWS, run smoothly and coherently and are up 99.999% of the time. So the customers do care about our interaction with them, while AWS is always there. >> One of the things we talked about a little bit ago is that every industry had to pivot pretty dramatically in the last 22 months or so, and we've seen in every industry that cloud is no longer a nice-to-have; you've got to be able to get there. But you mentioned a focus in banking, and I think automotive. I'd love to get your perspective on some of the opportunities DXC sees in those particular industries to modernize. >> Yes, we latched on to banking and automotive because those are ripe for transition, and the customers there are willing to take the steps as well. It doesn't mean that other industries are not relevant, like consumer or retail or technology and manufacturing. However, especially in automotive, I think we have a unique positioning, where we have the majority of the OEM car manufacturers worldwide as customers, and when you think about AWS, you think about the utilization of the information that comes back from telematics and customization: petabytes of information coming back from every device, which is a car, and the kind of service you can provide there. It's an industry, and we talked about Tesla early on as well, that's ripe for software and software updates. Very similarly, you see a lot of things happening in the banking and capital markets space, where they're moving their customer base into new spaces as well. Just think about all the NFTs and all the FinTech that's happening. So the banking and capital markets companies have an evolution going on, and assisting them in that evolution is part of our strategy. >> So you're responsible for global partnerships and alliances, and DXC would be considered a large global systems integrator. The world is obviously moving in the direction of cloud. We've got the three big players, AWS and the other two whose names I can't think of while I'm sitting here in Vegas right now. How do you balance what you do with those, with a variety of providers, for customers? And are you going to market primarily as DXC, with the DXC relationship with the customer, or in support of those cloud vendors that have, essentially, technology that if left unimplemented is essentially worthless? I mean, you bridge the divide between the technology and the true value of the technology. But are you the primary seat holder at the customer table, or is AWS the primary seat holder, or is it a little of both? Long-winded question, I apologize, but I think you understand what I'm saying. It's an interesting world that we live in now.
>> It definitely is, and if I didn't know you better I would say it's a trick question. But in all seriousness, we really are customer driven, just like AWS, so we really are trying to do the right thing for the customer. Hence our Cloud Right strategy, where we don't have a cookie-cutter approach of saying just go do the following five things and you're going to be fine. We really want to look at the customer and ask: what is important to you? What is the timeframe you're looking at? What are the strategic imperatives that you have? What data do you have to move? What systems do you have to leave behind? And then do the right thing for the customer, literally. And in this instance, absolutely, in my role AWS plays a huge role; it's one of our core hyperscaler partners and a very good partner, we love AWS. So making sure that they're always going to be there as part of that infrastructure is part of our strategy. >> You mentioned, oh, sorry, Dave. >> No, I was just saying it makes sense. >> It does make sense in terms of being customer first. We talk with AWS, and you can't really have an interview with one of their folks without talking about that: they work backwards from the customer first. This customer obsession, it sounds like, has pretty strong cultural alignment with DXC. >> Exactly right. I think from that perspective we share the same DNA, where we look first to the customer and then say, okay, how do we work out what is right for the customer and implement it that way? Because in many instances, as you know, and you mentioned the two other hyperscalers that we don't talk about, customers usually don't have a single-source approach; they usually have a dual approach. And while we have to work with that, there are preferred vendors that we engage with, and clearly AWS is one of our preferred vendors. >> Can you share an example? I'd love to know a customer that's taken the Cloud Right approach and applied it in a textbook way that you think really shows the value of DXC. Any customers, even just by industry if you don't want to name them, that come to mind and really show the value of that approach? >> Yeah. We just concluded a major migration for one of the leading insurance companies, a big global company that, you know, hails from my birthplace. What we really did is a Cloud Right approach of migrating them from the legacy mainframe and virtualized systems that they had to a cloud approach. In the process of doing this we reduced their overall operating expenses and their capex, obviously, but also reduced their overall budget, about a 30% reduction, by moving them to the cloud. Again, it was the Cloud Right approach of understanding exactly what to move, in which timeframe, and what to leave behind, because in many instances customers don't have an exit strategy: they rush to the cloud but then leave their old legacy behind and go, oh, what are we going to do with this? So you need to have a comprehensive end-to-end strategy: what do you want to leave behind, when do you want to sunset it, and when do you want to migrate certain things over as well. >> That's got to be quite challenging for, I would assume, a legacy, storied insurance company that's been around for a long time with lots of data, but culturally very different from the cloud mindset.
>> You bring up one of those soft skills, right, which is the cultural aspect of talking with the customers about how we migrate them. And that's why I said it's not just a business decision or a technology decision; in many instances you affect people's lives as well. Think about the old systems administrators who were working on mainframes: if you move everything to the cloud, they become obsolete. So reskilling the workforce and having a comprehensive plan is part of the soft skills, where you think more comprehensively about the customer. It's not just technology, it really is the full 360 experience: what happens to the people, how do we migrate the people, but also setting expectations with top management, for example, and saying, how is this going to change our business? What new opportunities are going to be there? Those are all the soft skills as well. >> One of the things that struck me this morning during the AWS keynote is just all of the innovation that goes on. But AWS really is a flywheel of the customer and all the opportunities that their customers create for AWS, and then the opportunities that AWS technologies create for the customers across industries. I really felt that flywheel this morning when Adam was talking about all of the things that they're revealing. You must feel the same as a partner. >> I do, and I'm a tech geek, so I'm totally excited about this, and it feeds my soul, because I can remember when we first had analytics with Redshift, right, and then customers coming back and going, well, could we do something that's real time? Because we have requirements for this. And then Kafka came out as a new service, and I'm like, okay, great. So we're really there to embrace every new service that comes out from AWS, which is fantastic. The speed and agility that AWS comes out with, we totally embrace that for our customers. >> Awesome. Giorgio, thank you for joining David and me today, talking about what's going on with DXC, your partnership with AWS, Cloud Right, and how you're helping customers get cloud right. We appreciate your insights and your time. >> Thank you, I appreciate it too. Thank you. >> All right. For David Nicholson, I'm Lisa Martin. You're watching theCUBE, the leader in global live tech coverage. (upbeat music)
COMMUNICATIONS V1 | CLOUDERA
>>Hi. Today I'm going to talk about network analytics and what that means for telecommunications as we go forward: thinking about 5G and the impact it's likely to have on network analytics and on the data requirement, not just to run the network and to understand the network a little bit better, but also to inform the rest of the operation of the telecommunications business. So, thinking about where we are in terms of network analytics and what that is: over the last 20 years, the telecommunications industry has evolved its management infrastructure to abstract away from the specific technologies in the network. What do we mean by that? Well, when the initial telecommunications networks were designed, there were management systems built in; eventually fault management systems, assurance systems, provisioning systems, and so on were abstracted away. >>So it didn't matter what network technology you had, whether it was Nokia technology or Ericsson technology or Huawei technology or whatever it happened to be; you could just look at your fault management system and understand what faults had happened. As we got into the last 10 to 15 years or so, telecommunication service providers became more sophisticated in their approach to data analytics, and specifically network analytics, and started asking questions about why and what if in relation to their network performance and network behavior. And so network analytics as a somewhat independent function was born, and over time more and more data began to get loaded into the network analytics function. Today just about every carrier in the world has a network analytics function that deals with vast quantities of data in big data environments that are now being migrated to the cloud, as all telecommunications carriers migrate as many IT workloads as possible to the cloud. >>So what is happening as we migrate to the cloud that drives enhancements in use cases and in scale for telecommunications network analytics? Well, 5G is the big thing. And 5G is not just another G, although in some senses it is: 5G means greater bandwidth, lower latency, and all those good things, so we can watch YouTube videos with less interference and less sluggish bandwidth and so on. But 5G is really about the enterprise and enterprise services transformation. 5G is a more secure kind of network, but it is also a more pervasive network, with a fundamentally different network topology than previous generations. There are going to be more masts, and that means you can have more pervasive connectivity. >>So things like IoT and edge applications, autonomous cars, smart cities, these kinds of things are all much better served because you've got more masts. That of course means you're going to have a lot more data as well, and we'll get to that. The second piece is immersive digital services. With more masts, more connectivity, lower latency, and higher bandwidth, the potential for services innovation is immense. And we don't know what those services are going to be. We know that technologies like augmented reality and virtual reality have great potential.
We have yet to see where those commercial applications are going to be, but the innovation potential for 5G is phenomenal. It certainly means we're going to have a lot more edge devices, and that again is going to lead to an increase in the amount of data we have available. >>And then there's the idea of pervasive connectivity when it comes to smart cities, autonomous cars, integrated traffic management systems and so on. Those kinds of smart environments thrive where you've got this pervasive, persistent connection to the network. Again, that's going to drive more innovation, and because you've got these new connected devices, you're going to get even more data. This exponential rise in data is really what's driving the change in network analytics, and there are four major vectors driving the increase, in terms of both volume and speed. The first is more physical elements. We said already that 5G networks are going to have a different topology: more devices, more and more masts. >>With more physical elements in the network, you're going to get more data coming off those networks, and that needs to be aggregated, collected, managed, stored, analyzed and understood, so we can better understand why things happened the way they did, why the network behaves the way it does, and why the devices connected to the network, and ultimately consumers, whether enterprises or retail customers, behave the way they do in their interactions with it. At the edge nodes and devices, we're going to have an explosion in the number of devices. We've already seen IoT devices, different kinds of trackers and sensors hanging off the edge of the network, whether it's to make buildings smarter, cars smarter or people smarter, in terms of the measurements and the connectivity. >>So the numbers of devices on the edge, and beyond the edge, are going to be phenomenal. One of the things we've been grappling with as an industry over the last few years is where the telco network ends and where the enterprise, or even the consumer, network begins. It used to be very clear that the telco network ended at the router, but it's not that clear anymore, because in the enterprise space, particularly with virtualized networking, which we'll talk about in a second, you start to see end-to-end network services being deployed. In some instances those services are managed by the service provider themselves, and in some cases by the enterprise client, so the line between where the telco network ends and where the enterprise or consumer network begins is not clear. >>So the proliferation of devices at the edge, what those devices are, what the data yield is, and the policies needed to govern those devices in terms of security and privacy, all of that is going to be really important. Virtualized services: we just touched on that briefly.
One of the big trends happening right now is not just the shift of IT operations onto the cloud, but the shift of the network onto the cloud: the virtualization of network infrastructure. That has two major impacts. First, it means you get the agility and scale benefits of migrating workloads to the cloud, the elasticity and the growth. But arguably more importantly for the telco, it means that with a virtualized network infrastructure you can offer entire networks to enterprise clients. >>So if you're selling to a government department, for example, that's looking to stand up a system for export certification or something like that, you can not just sell them the connectivity, you can sell them the networking and the infrastructure to serve that entire end-to-end application. You could, in theory, offer them an entire end-to-end communications network, and with 5G network slicing they can even have their own little piece of the 5G bandwidth that's been allocated to the carrier, and have a complete end-to-end environment. So the kinds of services that telcos can offer, given virtualized network infrastructure, are many and varied, and it's an outstanding opportunity. But what it also means is that the number of network elements, virtualized in this case, is also exploding. >>That means the amount of data informing us as to how those network elements are behaving and performing is going to go up as well. And then finally, AI complexity. On the demand side, historically network analytics and big data have been driven by returns, whether through cost avoidance, service assurance, or revenue generation through data monetization. AI is transforming telecommunications and every other industry, and the potential for autonomous operations is extremely attractive. Understanding how the end-to-end telecommunication service delivery infrastructure works is essential as a training ground for AI models that can help automate a huge amount of telecommunications operating processes. So the AI demand for data is just going through the roof. >>All of these things combine to mean big data is exploding. That's a huge thing that's happening. So as telecommunications companies around the world look at their network analytics infrastructure, which was initially designed primarily for service assurance, and at how they migrate it to the cloud, these things impact those decisions, because you're not just looking at migrating a workload that used to run in the data center so it operates in the cloud; you're also expanding the use cases of that work. And bear in mind, many of those workloads are going to need to remain on prem, so they'll need to sit within a private cloud or at best a hybrid cloud environment in order to satisfy regulatory and jurisdictional requirements. So let's talk about an example. >>LGU plus is a fantastic service provider in Korea, with huge growth in that business over the last 10 to 15 years.
Obviously most people will be familiar with LG, the electronics brand, maybe less so with LGU plus, but they've been doing phenomenal work, and they were the first business in the world to launch commercial 5G, in 2019, a huge milestone. At the same time they deployed the network real-time analytics platform, or NRAP, from a combination of Cloudera and our partner Kamarck. Now, there were a number of things driving the requirement for the analytics platform at the time. Clearly the 5G launch was the big thing they had in mind, but there were other drivers too. Within the 5G launch, they were looking for visibility of services, service assurance and service quality. >>So: what services have been launched? How are they being taken up? What issues are arising, where are the faults happening, where are the problems? Clearly, when you launch a new service you want to understand and be on top of the issues as they arise, so that was really important. The second piece, and this is not a new story to any telco in the world, is that there are silos in operation, so eliminating redundancies through the process of digital transformation was really important; in particular, bringing together the two silos between the wired and wireless sides of the business, so there would be an integrated network management system for LGU plus as they rolled out 5G. Eliminating redundancy and driving cost savings through the integration of those silos was really important. >>And that's a process and people thing every bit as much as it is a systems and data thing. Another big driver, the fourth one, and we've talked a little about some of these things already: 5G brings huge opportunity for enterprise services innovation. Industry 4.0 and digital experience use cases are very important in the South Korean market and in the business of LGU plus, so they were looking at AI and how to apply it to network management. There are a number of really exciting use cases that have gone live in LGU plus since this initial deployment, and they're making fantastic strides there. And then big data analytics for users across LGU plus: it's not just for the immediate application of 5G or support of the 5G network, but also for other data analysts and data scientists across the LGU plus business. While the primary use case for network analytics is around network management, it has applications across the entire business: customer churn, next best offer, understanding customer experience and behavior, digital advertising, product innovation. All sorts of different use cases and departments within the business needed access to this information, so collaboration and sharing across the real-time network analytics platform was very important.
And then finally, as I mentioned, LG Group is much bigger than just LGU plus, with the electronics business and other pieces, and the group had launched a major group-wide digital transformation program in 2019; being part of that also shaped the problems they were looking to address. >>So first of all, the integration of wired and wireless data sources: getting your assurance data sources, your network data sources and so on integrated was really important. Scale was massive for them: they're talking about billions of transactions processed in under a minute, and hundreds of terabytes per day, so phenomenal scale that needed to be available out of the box, as it were. Then real-time indicators and alarms, with lots of KPIs and thresholds set to meet certain criteria and standards; customer-specific, real-time analysis of 5G, particularly for the launch; and root cause analysis and AI-based prediction of service anomalies and service issues, which was a core use case. As I said already, the provision of data services across the organization, and then support for 5G business service impact, was extremely important. >>So it's not just understanding that you have an outage in a particular network element, but what is the impact on the business of LGU plus, and what is the impact on the business of the customer, from an outage or an anomaly or a problem on the network? Being able to answer those kinds of questions was really important too. And as I said, between Cloudera and Kamarck, with LGU plus themselves an intrinsic part of the solution, this is what we ended up building. It's a big, complicated architecture and I don't want to go into too much detail here; you can see these things for yourself, but let me skip through it quickly. First of all, the key data sources: you have all of your wireless network information and other data sources. >>This is really important, because sometimes you skip over this: there are other systems in place, like the enterprise data warehouse, that needed to be integrated as well, along with southbound and northbound interfaces. Southbound, we get our data from the network and from network management applications through file interfaces; Kafka and NiFi are important technologies here; and there are also the RDBMS systems, like the enterprise data warehouse, that we're able to feed into the system. Northbound, we spoke already about making network analytics services available across the enterprise, so having both file and API interfaces available for other systems and consumers across the enterprise is very important. There's a lot going on in the platform itself: two petabytes of persistent storage on Cloudera HDFS, 300 nodes for the raw data storage, and then Kudu for real-time storage, for real-time indicator analysis, alarm generation and other real-time processes.
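To make the real-time indicator and alarm idea a little more concrete, here is a minimal sketch of that style of job in PySpark Structured Streaming. It is not LGU plus's actual pipeline; the Kafka topic, broker address, event fields and the 5% drop-rate threshold are illustrative assumptions, and the real platform would land results in a real-time store such as Kudu rather than the console.

```python
# Minimal sketch of a streaming KPI/alarm job. Assumptions (not from the talk):
# topic "network-events", JSON payloads with cell_id/event_time/dropped_calls/total_calls,
# a 5% drop-rate alarm threshold, and a console sink instead of a real-time store.
# Requires the spark-sql-kafka connector on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType, TimestampType

spark = SparkSession.builder.appName("kqi-alarms").getOrCreate()

schema = StructType([
    StructField("cell_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("dropped_calls", LongType()),
    StructField("total_calls", LongType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
       .option("subscribe", "network-events")             # hypothetical topic
       .load())

events = (raw
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Per-cell call drop rate over one-minute windows; raise an alarm flag past the threshold.
kqi = (events
       .withWatermark("event_time", "2 minutes")
       .groupBy(F.window("event_time", "1 minute"), "cell_id")
       .agg(F.sum("dropped_calls").alias("dropped"), F.sum("total_calls").alias("total"))
       .withColumn("drop_rate", F.col("dropped") / F.col("total"))
       .withColumn("alarm", F.col("drop_rate") > 0.05))

query = (kqi.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination()
```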
>>So that was the core of the solution: Spark processes for ETL, key quality indicators and alarming, plus a bunch of work around data preparation and data generation for transfer to third-party systems through the northbound interfaces; Impala API queries for real-time systems, there on the right-hand side; and then a whole set of clustering, classification and prediction jobs through the machine learning processes, another key use case that we've done a lot of work on. I encourage you to have a look at the Cloudera website for more detail on some of the work we did here; it's some pretty cool stuff. And then finally the upstream services. There are lots more than simply these ones, but service assurance is really important, so SQM and CEM and the like: service quality management, customer experience management, autonomous controllers. These are really important consumers of the real-time analytics platform, and your conventional service assurance functions, like fault and performance management, are as much consumers of the information in the network analytics platform as they are providers of data to it. >>Some of the specific use cases that have been stood up and are delivering value to this day: there are lots more, but these are three we pulled out. First, service-specific monitoring and customer quality analysis and response, growing from the initial 5G launch and then broadening into wider services, understanding where there are issues so that when people complain, when people have a problem, we can answer the concerns of the client in a substantive way. Then AI functions around root cause analysis, understanding why things went wrong when they went wrong, and making recommendations on how to avoid those occurrences in the future, so we know what preventative measures can be taken. And finally the collaboration function across LGU plus, which was extremely important and continues to be to this day, where data is shared throughout the enterprise through the API layer, through file interfaces, and through integrations with upstream systems. >>So that's the real quick run-through of LGU plus. The numbers are just staggering: we've seen upwards of a billion transactions in under 40 seconds being tested, and we've already gone beyond those thresholds. This isn't just a theoretical benchmarking exercise; we expect these kinds of data volumes not too far down the track. With the things I mentioned earlier, the proliferation of network infrastructure in the 5G context, with virtualized elements and all these other pieces, massive volumes of data are being driven towards the network analytics platform. So phenomenal scale. And this is just one example; we work with service providers all over the world, and over 80% of the top 100 telecommunication service providers run on Cloudera.
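As a small illustration of the clustering jobs mentioned above, the sketch below groups cells by a few KPIs with scikit-learn and flags the worst-behaving cluster as candidates for root-cause analysis. The feature names, the synthetic data and the choice of four clusters are assumptions made for the example, not details of the LGU plus deployment.

```python
# Toy per-cell KPI table with synthetic values; real jobs would read from the platform.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
kpis = pd.DataFrame({
    "cell_id": [f"cell-{i}" for i in range(300)],
    "drop_rate": rng.normal(0.02, 0.01, 300).clip(0),
    "avg_latency_ms": rng.normal(35, 10, 300).clip(1),
    "throughput_mbps": rng.normal(120, 40, 300).clip(1),
})

# Standardize the KPIs and cluster cells into four behavioral groups (illustrative choice).
features = StandardScaler().fit_transform(kpis[["drop_rate", "avg_latency_ms", "throughput_mbps"]])
kpis["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)

# Cells in the cluster with the worst average drop rate become candidates for
# root-cause analysis and preventative action.
worst = kpis.groupby("cluster")["drop_rate"].mean().idxmax()
print(kpis.loc[kpis.cluster == worst, "cell_id"].head())
```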
>>They use Cloudera in the network, and we're seeing those customers all migrating legacy platforms onto CDP, the Cloudera Data Platform. They're increasing the jobs they do, so it's not just warehousing, not just ingestion and ETL; they're moving into things like machine learning, and looking at new data sources from places like NWDAF, the network data analytics function in 5G, or the management and orchestration layer in software-defined networking and network function virtualization. So there are new use cases coming in all the time, new data sources coming in all the time, and growth in application scope from, as we say, edge to AI. It's really exciting to see how the footprint is growing and how these applications in telecommunications are making a difference in facilitating network transformation. That's me covered for today. I hope you found it helpful. By all means, please reach out; there are a couple of links here. You can follow me on Twitter, connect to the telecommunications page, or reach out to me directly at Cloudera. I'd love to answer your questions and talk to you about how big data is transforming networks, and how network transformation is accelerating telcos. >>I'm Jamie Sharath with Liga Data. I'm primarily on the delivery side of the house, but I also support our new business teams. I'd like to spend a minute telling you about Liga Data. We're basically a Silicon Valley startup, started in 2014, and our executive team were the data officers at Yahoo before this. We provide managed data services and products focused on telcos; we have some experience in non-telco industries, but our focus for the last seven years or so has been specifically on telco. We're something over 200 employees, with a global presence in North America, Middle East and Africa, Asia and Europe, and we have folks in all of those places. I'd like to call your attention to the middle of the screen there: here is where we have done some partnership with Cloudera. >>If you look at that, you can see we're in Holland and Jamaica, and then a lot throughout Africa as well. Now, the data fabric is the product we're talking about. The data fabric is basically a big-data-style data warehouse with a lot of additional functionality. It's comprised of something called Flare, which we'll talk about in a minute, with the Cloudera Data Platform underneath; that's how we're partnering together. We have this tool and it's functioning and delivering in something over ten opcos. Flare is a piece of Liga Data IP, and what Flare does is basically pull in data and integrate it into an event streaming platform; it's the engine behind the data fabric. >>It's also a decisioning platform, so in real time we're able to pull in data, run analytics on it, and alert or do whatever is needed on a real-time basis. Of course, a lot of clients at this point are still sending data in batch, so it handles that as well. And then there's a piece we call Sacho. Sacho is a very interesting app: an AI analytics app for executives that runs on your mobile phone.
It ties into your data. Now, this could be the data fabric, but it could also be a standalone product. Basically it allows you to ask human-type questions: how are my gross adds last week? How do they compare against the same time the week before, or even the same time 60 days ago? As an executive or an analyst, I can pull it up and look at it instantly in a meeting or anywhere else, without having to think about queries or anything like that. >>So that's pretty much who we are at Liga Data, really just to set the context. Now, this is a traditional telco environment: you see the systems of record, the cloud, the OSS and BSS data. One of the things the next layer up, which we call the system of intelligence, the data fabric, does is merge that BSS and OSS data, so we no longer have silos or anything separated; it all comes into one area that the business, or data scientists, can go into and work with. If you look at the bottom of the system of intelligence, you can see that Flare is the tool that pulls in the data. It provides event streaming capabilities, and it preserves entity states, so you can go back and look at an entity's state at any time. >>It does stream analytics, that is, as the data is coming in it can perform analytics on it, and it also allows real-time decisioning. That's something business users can go in and create: a system of if-thens, which looks very much like a graph, where you can create a rule that will act when a certain condition happens. For instance, a bundle: if a user is about to run out of an ongoing bundle, a real-time offer can be sent to him right on the fly, and that's set up by the business user as opposed to programmers. As for the data infrastructure, the fabric really has three areas where data is persisted. Obviously there's the data lake; it stores that very deep level of granularity, years and years of history. Data scientists like that, and for historical record keeping and government requirements, that data would be stored there. >>Then there's also something we call the business semantics layer, which contains something over 650 specific telco KPIs. These are initially from TM Forum, but they have also grown through the various mobile operators we've delivered at. That layer is there for the business; the data lake is there for data scientists. Analytical stores can be used for many different reasons; a lot of the time RDBMSs are still there, so this platform can tie into analytical data stores as well via Flare. Then access and reporting: graphic visualizations and APIs are a key part of it, and third-party query tools, any kind, can be used; those are of course the ones that are highly optimized and allow search of billions of records. >>And then if you look at the top, it's the systems of engagement, where you might put the use cases: telco reporting, hundreds of KPIs generated for users; segmentation, basically micro to macro segmentation, which will play a key role in a use case we'll talk about in a minute; and monetization.
So this helps telco providers monetize their specific data: how do they make money off of it, but also how might they leverage this data to engage with another client? For instance, in some markets where it's allowed, DPI is used, and the fabric tracks exactly where each person, we call them a subscriber, goes within his internet browsing on 4G or 5G, and all of that data is stored. From that you can tell a lot of things: the segment, the profile being used, the propensity to buy. Do they spend a lot of time on the Coca-Cola page? There are buyers out there that find that information very valuable. And then there's Sacho, which we spoke about briefly before, which sits on top of the fabric or stands alone. >>So the story we really want to tell is one case out of many, a CVM type of case. There was a mobile operator offering packages to subscribers, whether a bundle or a particular tool, with a broad approach that was not very focused. It did not depend on the segments created around the profiling I mentioned earlier, the subscriber usage data was somewhat dated, and this was causing a lot of those offers to simply not be taken up. There were limited segmentation capabilities before the fabric came in, and one of the key things about the fabric is that when you start building segments, you can build in that history. >>All of that data stored in the data lake can be used for segmentation. So what did we do about that challenge? We put the data fabric in, running on the Cloudera Data Platform; that's how we team up. We facilitated the ability to personalize campaigns. What that means is that with the segments that were built, when a user fell within a segment we knew exactly what his behavior most likely was, so those recommendations, those offers, could be created then, and we enabled this in real time, including the real-time ability to go out to the CRM system and gather further information. All of these tools, again, were running on top of the Cloudera Data Platform. What was the outcome? The outcome was a much more precise offer given to the client, an offer that was accepted, and an increase in cross-sell, upsell and subscriber retention. >>Our client came back to us and pointed out a 183% year-on-year revenue increase, so this is probably one of the key use cases. One thing to mention is that there are hundreds and hundreds of use cases running on the fabric, and I would even say thousands. A lot of those have been migrated: when the fabric is deployed, when they bring the Cloudera and Liga Data solution in, there's generally a legacy system with many use cases, and many of those were migrated, virtually all of them, onto the cloud. Another point is that new use cases are enabled as well.
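The real-time, segment-driven offer at the heart of this CVM story boils down to a decisioning rule that a business user configures rather than codes. Below is a minimal sketch of that kind of rule; the event shape, the 90% usage threshold and the offer catalogue are made-up assumptions, not Flare's actual configuration model.

```python
# A toy decisioning rule: when a subscriber's data bundle is nearly exhausted,
# return a segment-appropriate offer to push in real time.
from dataclasses import dataclass

@dataclass
class UsageEvent:
    subscriber_id: str
    bundle_mb_total: float
    bundle_mb_used: float
    segment: str  # produced by the segmentation layer described above

OFFERS_BY_SEGMENT = {          # hypothetical catalogue
    "heavy_streamer": "5GB top-up at 20% off",
    "light_user": "1GB weekend bundle",
}

def decide(event: UsageEvent, threshold: float = 0.9):
    """Return an offer when usage crosses the threshold, else None."""
    if event.bundle_mb_total <= 0:
        return None
    if event.bundle_mb_used / event.bundle_mb_total >= threshold:
        return OFFERS_BY_SEGMENT.get(event.segment, "500MB top-up")
    return None

# Example: this subscriber is at ~95% of the bundle, so an offer fires on the fly.
print(decide(UsageEvent("sub-123", bundle_mb_total=2048, bundle_mb_used=1950,
                        segment="heavy_streamer")))
```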
So when you get this level of granularity, and when you have campaigns that can base their offers on years of history as opposed to 30 days of history, the campaign management and response systems are enabled to be much more precise in their offers. >>Okay, so this is a technical slide. One of the things we normally do when we're out there talking to folks is give an overview that lasts a little while, and then a deep technical dive on all aspects of it; sometimes that deep dive can go a couple of hours. I'm going to do this slide in a couple of minutes. If you look at it, you can see over on the left the sources of the data. They go through this tool called Flare, which runs on the Cloudera Data Platform, and that can be via queues or real-time queues, via a landing zone, or via data extraction. You can take a look at the data quality there; those checks are built in. One of the things Flare does is provide out-of-the-box ability to ingest data sources and apply data quality and validation for telco-type sources. >>One of the reasons this is fast to market is that throughout those 10 or 12 opcos that we've done with Cloudera, we have already built models: models for CCN, for AIR, for most mediation systems. So there's rarely going to be a type of input we haven't already seen, and that speeds up deployment very quickly. Then Flare does the transformations, the metrics, continuous learning, what we call continuous decisioning, and API access; for faster response we use a distributed cache. I'm not going to go too deeply into it, but the layers in the business semantics layer, again, sit on top of the Cloudera Data Platform, and you can see the Kafka queues on the right as well. >>All of that together is what we're calling the fabric: the Cloudera Data Platform and Flare, all running together, and by the way, there have been many hundreds of hours spent testing Flare with Cloudera through the whole process. What are the results? There are four I'm going to talk about. We saw the one called My Pocket, a CVM-type use case, where the subscribers of that mobile operator were 14 million plus. There was a use case for 24 million plus subscribers where year-on-year revenue was up 130%, and 32 million plus for 38%; these are different CVM-type use cases as well as network use cases. And then there was 44% for a telco with 76 million subscribers. There are a lot more use cases we could talk about, but these are the ones we're looking at here, and again, that 183% is something we find consistently; these figures come from our actual end clients. How do we unlock the full potential of this? I think the start is to arrange a meeting; it would be great for you to reach out to me or to Anthony. We're working in conjunction on this, and we can set up an initial meeting and go through it; that's the very beginning. Again, you can get additional information from the Cloudera website and from the Liga Data website. Anthony, that's the story. Thank you. >>No, that's great.
Jamie, thank you so much. It's wonderful to go deep, and I know there are hundreds of use cases being deployed at MTN, but it's great to go deep on one. Like you said, once you get that sort of architecture in place, you can do so many different things. The power of data is tremendous, and it's great to be able to see how you can track it end to end, from collecting the data, processing it and understanding it, to applying it in a commercial context and bringing actual revenue back into the business. There's your ROI straight away, and now you've got a platform you can transform your business on. It's a tremendous story, Jamie, and thank you for your part. That's our story for today. Like Jamie says, please do feel free to reach out to us; the website addresses and our contact details are there, and we'd be delighted to talk to you a little more about some of the other use cases, and maybe about your own business and how we might be able to make it perform a little better. So thank you.
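For readers who want a feel for the ingest-and-validate step described on the technical slide, here is a toy version in Python: records picked up from a landing zone are checked against simple data-quality rules before being loaded, and rejects are set aside. The field names and rules are illustrative stand-ins, not the prebuilt Liga Data source models.

```python
# Toy landing-zone ingest with basic data-quality validation for a CDR-like feed.
import csv
import io

REQUIRED_FIELDS = ["msisdn", "event_type", "timestamp", "bytes_up", "bytes_down"]

def validate(record: dict) -> list:
    """Return a list of data-quality problems for one record (empty list = clean)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            errors.append(f"missing {field}")
    for field in ("bytes_up", "bytes_down"):
        value = record.get(field, "")
        if value and not value.isdigit():
            errors.append(f"{field} not numeric")
    return errors

# Stand-in for a file dropped in the landing zone; the second row is deliberately bad.
landing_zone_file = io.StringIO(
    "msisdn,event_type,timestamp,bytes_up,bytes_down\n"
    "27831234567,data,2021-05-01T10:00:00Z,1024,204800\n"
    ",data,2021-05-01T10:00:05Z,512,abc\n"
)

good, rejected = [], []
for row in csv.DictReader(landing_zone_file):
    problems = validate(row)
    (rejected if problems else good).append((row, problems))

print(f"loaded {len(good)} records, rejected {len(rejected)}")
```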
SUMMARY :
A Cloudera telecommunications session on network analytics: how 5G's denser topology, edge devices, virtualized network functions and AI demand are driving an exponential rise in network data, and how carriers are migrating analytics to the cloud, often in private or hybrid form for regulatory reasons. The LGU plus case study covers the network real-time analytics platform built with Cloudera and Kamarck around the world's first commercial 5G launch, handling billions of transactions per minute and hundreds of terabytes per day for service assurance, root cause analysis and enterprise-wide data sharing. Jamie Sharath of Liga Data then describes the data fabric, Flare and Sacho running on the Cloudera Data Platform, and a CVM use case that delivered a 183% year-on-year revenue increase.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Jamie | PERSON | 0.99+ |
Jeremy | PERSON | 0.99+ |
Holland | LOCATION | 0.99+ |
Jamie Sharath | PERSON | 0.99+ |
Anthony | PERSON | 0.99+ |
Korea | LOCATION | 0.99+ |
38% | QUANTITY | 0.99+ |
Cloudera | ORGANIZATION | 0.99+ |
2014 | DATE | 0.99+ |
2019 | DATE | 0.99+ |
183% | QUANTITY | 0.99+ |
Europe | LOCATION | 0.99+ |
24 million | QUANTITY | 0.99+ |
14 million | QUANTITY | 0.99+ |
LG | ORGANIZATION | 0.99+ |
second piece | QUANTITY | 0.99+ |
30 days | QUANTITY | 0.99+ |
Jamaica | LOCATION | 0.99+ |
Nokia | ORGANIZATION | 0.99+ |
Huawei | ORGANIZATION | 0.99+ |
today | DATE | 0.99+ |
Yahoo | ORGANIZATION | 0.99+ |
130% | QUANTITY | 0.99+ |
32 million | QUANTITY | 0.99+ |
Asia | LOCATION | 0.99+ |
last week | DATE | 0.99+ |
Erickson | ORGANIZATION | 0.99+ |
Finastra | ORGANIZATION | 0.99+ |
three | QUANTITY | 0.99+ |
thousands | QUANTITY | 0.99+ |
Africa | LOCATION | 0.99+ |
north America | LOCATION | 0.99+ |
telco | ORGANIZATION | 0.99+ |
Silicon valley | LOCATION | 0.99+ |
first | QUANTITY | 0.99+ |
each person | QUANTITY | 0.99+ |
Willie | PERSON | 0.99+ |
10 | QUANTITY | 0.99+ |
44% | QUANTITY | 0.99+ |
over 80% | QUANTITY | 0.99+ |
one | QUANTITY | 0.98+ |
76 million subscribers | QUANTITY | 0.98+ |
60 days ago | DATE | 0.98+ |
over 200 employees | QUANTITY | 0.98+ |
LGU plus | ORGANIZATION | 0.98+ |
Cloudera | TITLE | 0.98+ |
Sacho | TITLE | 0.98+ |
middle east Africa | LOCATION | 0.97+ |
First | QUANTITY | 0.97+ |
Liga data | ORGANIZATION | 0.97+ |
four major vectors | QUANTITY | 0.97+ |
under 40 seconds | QUANTITY | 0.97+ |
YouTube | ORGANIZATION | 0.97+ |
one example | QUANTITY | 0.97+ |
One | QUANTITY | 0.97+ |
two silos | QUANTITY | 0.97+ |
each | QUANTITY | 0.96+ |
Karen | PERSON | 0.96+ |
one case | QUANTITY | 0.96+ |
billions of records | QUANTITY | 0.96+ |
three areas | QUANTITY | 0.96+ |
under a minute | QUANTITY | 0.95+ |
CAFCA | ORGANIZATION | 0.95+ |
one thing | QUANTITY | 0.95+ |
both | QUANTITY | 0.94+ |
12 | QUANTITY | 0.94+ |
LG plus | ORGANIZATION | 0.94+ |
ORGANIZATION | 0.94+ | |
one area | QUANTITY | 0.93+ |
fourth one | QUANTITY | 0.93+ |
hundreds and | QUANTITY | 0.92+ |
a year | QUANTITY | 0.92+ |
Savio Rodrigues, IBM | IBM Think 2021
>>From around the globe, with digital coverage of IBM Think 2021, brought to you by IBM. Welcome to theCUBE's coverage of IBM Think 2021. I am Lisa Martin. Today I have Savio Rodrigues here with me, the VP of integration and application platform. Savio, it's great to have you on the program. >>Lisa, really great to be here. Thanks for having me. >>We're going to talk about automation and integration, but one of the things we're going to break down first is hyperautomation. Gartner announced it about a year and a half ago as one of the top 10 strategic technology trends of 2020, and here we are in 2021. Before we talk about automating integrations, give me IBM's perspective on hyperautomation, and what did we see in 2020, in reality? >>Yeah, great question. So at IBM, we believe the next tidal wave to hit organizations will be the task, and frankly the opportunity, to automate the entire enterprise, and by that I really do mean everything in the enterprise. Gartner, when they talk about hyperautomation, is absolutely right, because they're focusing on automating business tasks, but IBM's point of view is broader than that. We want to think about the work that business professionals, IT developers, IT staff and security-focused administrators all do, and we think the real differentiation is going to come to organizations that attack the task of automating work for all three labor types: business, developers and IT. Hyperautomation focuses on the first labor type; IBM's approach looks at all three. Now, you should pick automation projects that are specific to one labor type to begin, right, instead of saying "let's automate everything"; the latter is the strategic statement, the former is tactical. We're seeing clients automating specific business processes like order to cash, others automating the work of IT, such as reducing the number of security vulnerabilities found in production, and others automating the work of developers by automating the approach they take to the integration life cycle. And that's what I'd like to talk to the audience about today. >>All right, so you talked about it in terms of prioritization, because that's one thing I think businesses can struggle with in making automation, and eventually hyperautomation, successful: where do we start? Let's talk, though, about this application sprawl that pretty much every organization is living in. We saw massive adoption of SaaS applications in 2020, which a lot of businesses were dependent on to even facilitate collaboration. Talk to us about the relationship between integration, automation and applications. >>Another great question. I spend most of my day thinking about integration, but I also know that most of my clients, and probably the audience here, think about automation first and then think about integration as a means, not the end. The ultimate goal is digital transformation, i.e., delivering new apps faster with higher quality. If that's the case, think about what an application is today versus what an application was 20 years ago. Today there's definitely some business logic and code that you're writing, but the majority is actually integration logic.
So you have to connect to a SaaS service like Workday to get data, connect to an app that's running on AWS, get other data that's running on IBM Cloud, transform it, and put it into a different database that's running on Azure. There's a little bit of application logic and a ton of integration logic. So if you're a line-of-business owner that controls 50% or more of IT budgets, or you're a CIO beholden to that line of business, and you want applications faster than ever before without sacrificing quality, how are you going to do that? The way you do it is by focusing on the integration tier, because applications are really driven by integration today. If you want faster applications with higher quality, you really need to think about delivering integrations faster with higher quality. >>And integration is absolutely critical as we look at hybrid cloud and the advance of AI. For organizations in this multi- and hybrid-cloud world, what are some of the challenges they face with respect to integrating those applications? To your point, they can pull down data from Workday and align it with data in AWS, for example, to make business decisions in real time. >>One of the biggest challenges is manual effort, right? We started the conversation thinking about automation, and now we're coming back to it, because we believe you have to automate your integrations, and the way you do so is through AI. You can of course use rules-based automations, and that helps to some degree, but things get really interesting when you apply AI and the automation is driven by real-world data that's specific to your organization, in a continuous feedback loop we like to call closed loop, continuously driving efficiency. If you think about the integration life cycle, you've got to create an integration, test it, socialize it, operate it and govern it. That's what we mean by automating integrations: that whole life cycle. For instance, if you can create an integration flow and do a field mapping based on AI best practices, you reduce manual effort, you reduce coding, you reduce the need for integration experts. Or if you're a business user, you're able to describe your intent and have your integration software handle converting that intent into what's required. >>For instance, if you could say "generate a lead score and route the leads based on location to your sales team," you know what you're trying to achieve, so why not get the software to do that for you, based on AI under the covers? Or if you're doing testing, how about letting the AI generate hundreds of new tests for your integrations that reflect real-world usage behavior at your specific company? These tests are based on other APIs running at your company: we take the operational data, we know which parts of the API are being exercised, we know what data is going through your system, so things that are, for instance, personally identifiable shouldn't be used as test data. Or if you're operating your integrations, wouldn't it be great if your AI could uncover optimizations in the integration flow, such as adding in buffering to a message queue so that it prevents overages on your Salesforce account, and have that happen without needing a human in front of a dashboard, i.e., the AI under the covers is doing this for you.
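The "buffer into a message queue so you don't blow through your Salesforce limits" idea can be pictured as a queue that is drained no faster than a fixed rate. Below is a minimal sketch of that pattern; the rate limit, queue and send function are stand-ins, not a real Salesforce client or IBM's implementation.

```python
# Toy rate-limited drain of an outbound request queue.
import queue
import time

def drain(q: "queue.Queue[dict]", max_calls_per_sec: float, send) -> None:
    """Pull payloads off the queue and send them, pacing to stay under the rate limit."""
    interval = 1.0 / max_calls_per_sec
    while True:
        try:
            payload = q.get(timeout=1.0)
        except queue.Empty:
            return
        send(payload)          # in practice: the API call to the SaaS endpoint
        time.sleep(interval)   # simple pacing; a token bucket would allow bursts

outbound = queue.Queue()
for i in range(5):
    outbound.put({"lead_id": i, "region": "EMEA"})

drain(outbound, max_calls_per_sec=2, send=lambda p: print("sent", p))
```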
So for AI to really drive that integration automation, you need the operational data from your specific company, and you use it in a closed-loop fashion so you're continuously improving not just your current integrations but your future integrations. >>I can only imagine how much more important this has become in the last year, as businesses in every industry were pivoting multiple times to survive and then, ultimately, thrive. When I think of integrations, I think of customers I've spoken to, to use your example with respect to sales, who have a CRM and an ERP that aren't in sync and aren't integrated, so there's no one system of record. I can only imagine how much more important having that system of record has been in the last year for supply chains, even for demanding consumers going, "can I get some toilet paper, and if so, where can I find it?" >>Absolutely. And this is where that notion of a closed-loop approach to integration and automation via AI comes in, right? We strongly feel that today is the time clients need to rethink their integration strategy. We do agree with some of the other analysts and vendors talking about automating the integration work, which is part of what we discussed earlier, and that's definitely necessary, but it's not sufficient. Our feeling is that you also have to be thinking about evolving those integrations in a closed-loop fashion, so you're continuously making them better with AI that's powered by your operational data, specific to your company. And then finally, the old approach that integration vendors used to take, where one style of integration fits all problems, is the wrong approach. Instead, what we see today is that customers use multiple forms of integration to solve a specific business problem: Kafka, APIs, messaging, iPaaS. So from an IBM standpoint, we feel that every integration must be automated, closed loop and multi-style, with AI that's informed by your company-specific data and continues to improve, so you end up getting integrations faster, and they're also better. >>When companies have that spectrum of different integration processes, as you just mentioned, one of the things I think about as we look forward, and you mentioned this a minute ago, is wanting to have the foundation so that applications are integrated, communicating well and sharing data not only today but in the future. So talk to me about this closed-loop system you mentioned. How does it enable an organization to establish that now, but also take on applications that haven't even been created yet? >>That's really a foundational aspect that clients need to be thinking about, right? The closed-loop nature of your integrations means you're always looking at operational data and feeding it into your AI to improve your business processes and your integrations today, but also the ones you're going to deliver in the future. So I'm sure your listeners are sitting here thinking, where should I get started? Frankly, I turn it around and say: ask your integration vendor of choice how effectively their solutions can provide an automated, closed-loop and multi-style approach to integration.
And if the answer they give you isn't very detailed, well, I hope you'll ask IBM. When you ask us this question, what you're going to hear about is IBM Cloud Pak for Integration, which is our complete platform for automated, closed-loop >>and multi-style integrations. It's optimized for deployment across clouds with Red Hat OpenShift. With IBM, you'll be able to use natural-language-powered integration flows, AI-powered flow and field mapping, and RPA connectivity, things that really take the manual effort out of integration and replace it with AI-driven automation. Second, you want to think about the data that's feeding the AI, right? This is where the operational, closed-loop aspect comes into play. Sometimes other vendors in the space take operational data from hundreds of customers, put it together, come out with the average, and use that to train the AI. We don't think that's the right approach, because your most important integration processes are shared by no other customer, right? So you want your operational data to feed the AI that's providing things like field mapping, flow creation, creating the API tests automatically, or uncovering the inefficiencies in your production environment. >>And then finally, what I'd tell you is that we've got the broadest set of multi-style integration capabilities, all delivered with a common UI, shared reuse and governance, and unified management across clouds. That's exactly what clients need, because if you think about where you're deploying applications today, the components are running on multiple clouds, so you have to integrate across clouds. And finally, what you'll hear from us is that IBM provides a proven hybrid- and DMZ-ready security gateway that's never been hacked in 15 years, with over 30,000 transactions per second, the performance and security that frankly clients need for their applications today. So: automated, closed loop, multi-style. You'll hear me repeat those over and over, because we feel that's absolutely necessary for listeners when they think about their next-generation applications and the integrations required for them. >>Excellent. Well, Savio, I wish we had more time, but thank you for sharing what's going on with automating integrations, AI, what hyperautomation means and where it is now. We look forward to hearing more about this, and I'm sure the guests will be excited to see what comes at IBM Think. We thank you for your time. >>Thank you very much. >>For Savio Rodrigues, I'm Lisa Martin. You're watching theCUBE's coverage of IBM Think 2021.
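Earlier in the conversation Rodrigues describes generating integration tests from operational data while keeping personally identifiable information out of the test set. A hedged sketch of that idea is below; the field names, the PII list and the recorded payloads are assumptions, not how Cloud Pak for Integration actually does it.

```python
# Turn recorded request payloads into regression test cases, stripping PII-like fields.
import json

PII_FIELDS = {"name", "email", "msisdn", "address"}

recorded_requests = [
    {"endpoint": "/orders", "body": {"sku": "A-100", "qty": 2, "email": "x@example.com"}},
    {"endpoint": "/orders", "body": {"sku": "B-200", "qty": 1, "name": "Jane"}},
]

def to_test_case(req: dict) -> dict:
    """Build a test case from one observed request, dropping PII-like fields."""
    body = {k: v for k, v in req["body"].items() if k not in PII_FIELDS}
    return {
        "request": {"endpoint": req["endpoint"], "body": body},
        # The expected response would come from replaying against a known-good build;
        # here it is just a placeholder.
        "expect": {"status": 200},
    }

tests = [to_test_case(r) for r in recorded_requests]
print(json.dumps(tests, indent=2))
```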
SUMMARY :
Lisa Martin talks with Savio Rodrigues, IBM VP of integration and application platform, about hyperautomation and automating integrations. IBM's view extends automation beyond business tasks to all three labor types: business, developers and IT. Because modern applications are mostly integration logic spanning SaaS and multiple clouds, delivering apps faster means automating the whole integration life cycle with AI in a closed loop, driven by a company's own operational data, across multiple integration styles such as Kafka, APIs, messaging and iPaaS. Rodrigues positions IBM Cloud Pak for Integration, running on Red Hat OpenShift, as the platform for automated, closed-loop, multi-style integration.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Sophia | PERSON | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
50% | QUANTITY | 0.99+ |
Savio Rodriguez | PERSON | 0.99+ |
2020 | DATE | 0.99+ |
Savio Rodrigues | PERSON | 0.99+ |
hundreds | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
15 years | QUANTITY | 0.99+ |
iPad | COMMERCIAL_ITEM | 0.99+ |
Lisa | PERSON | 0.99+ |
today | DATE | 0.99+ |
2021 | DATE | 0.99+ |
Gartner | ORGANIZATION | 0.99+ |
Rodriguez | PERSON | 0.99+ |
last year | DATE | 0.99+ |
Gardner | PERSON | 0.99+ |
one | QUANTITY | 0.98+ |
over 30,000 TPS | QUANTITY | 0.97+ |
20 years ago | DATE | 0.96+ |
One | QUANTITY | 0.96+ |
Salesforce | ORGANIZATION | 0.96+ |
first labor | QUANTITY | 0.95+ |
second | QUANTITY | 0.94+ |
Think 2021 | COMMERCIAL_ITEM | 0.93+ |
about a year and a half ago | DATE | 0.91+ |
OpenShift | TITLE | 0.91+ |
Azure | TITLE | 0.91+ |
one thing | QUANTITY | 0.89+ |
a minute ago | DATE | 0.89+ |
one labor type | QUANTITY | 0.86+ |
10 strategic technology trends | QUANTITY | 0.83+ |
three | QUANTITY | 0.82+ |
Workday | TITLE | 0.79+ |
top 10 things | QUANTITY | 0.78+ |
first | QUANTITY | 0.75+ |
SAS | ORGANIZATION | 0.69+ |
SAS | TITLE | 0.67+ |
Savio | PERSON | 0.67+ |
new tests | QUANTITY | 0.58+ |
CAFCA | TITLE | 0.5+ |
wave | EVENT | 0.43+ |
BOS3 Savio Rodrigues VTT
(upbeat music) >> Advertiser: From around the globe, It's theCUBE, With digital coverage of IBM think 2021, brought to you by IBM. >> Welcome to theCUBES coverage of IBM think 2021, I am Lisa Martin. Today I have Savio Rodriguez here with me, the VP of integration and application platform. Savio, It's great to have you on the program. >> Lisa, really great to be here. Thanks for having me. >> We're going to talk about Automation Integration, but one of the things that we're going to kind of break down first is hyperautomation. Gartner announced that about a year and a half ago, was one of the top 10 top, 10 strategic technology trends of 2020, and here we are in 2021. Before we talk about Automating Integrations, give me IBM's perspective on hyperautomation. And what did we see in 2020? Like, reality? >> Yeah, no, great question. So, an IBM, we believe that the next tidal wave to hit organizations will be really the task, but frankly the opportunity to automate the entire enterprise. And by that, I really do mean everything in the enterprise. So, Gartner, when they talk about hyperautomation, they're absolutely right because they're focusing on automating business tasks. But IBM's point of view is broader than that. And so we want to think about the work that business professionals, IT developers, that IT staff, security focus, administrators, all of that work. And we think that the real differentiation is going to come to organizations that attack the task of automating work for all three labor types; business, developers and IT. So hyperautomation, focuses on the first labour type. IBM's approach is looking at all three labour types. Now, you should pick automation projects that are specific to one labour type to begin, right? Instead of saying, "let's automate everything." but the latter is a strategic statement, the former is tactical. And we're seeing clients automating specific business processes like, order the cash. And then others are automating work of IBM, such as, reducing the number of security vulnerabilities, found in production. And then others are automating the work of developers by automating the approach that they take to be integration life cycle. And that's what I'd like to talk to the audience about today. >> Alright, so look how you talked about it in terms of prioritization, cause that's one thing I think that, businesses can struggle with in terms of making automation and eventually hyper automation successful, As where do we start? Let's talk though, about this application sprawl, that every organization pretty much is living in. We saw this massive adoption as SaaS applications in 2020, which a lot of businesses were dependent on to even facilitate just collaboration. But talk to us about, the relationship between integration, automation, applications. >> Another great question. So, I spend most of my day thinking about integration, but I also know that most of my clients and probably the audience too, thinks about automation first, and then thinks about integration as a means, not the ends, right? The ultimate goal, is digital transformation I.e., delivering new apps faster with higher quality. If that's the case, and you think about what's an application today? Versus, what were an application 20 years ago? So, today, there's definitely some business logic in code that you're writing but the majority is actually integration logic. 
So you have to connect to a SaaS service like, Workday to get data, connect to an app that's running on AWS, get other data that's running on IBM Cloud to transform it, put it into different database that's running on Azure. So, there's a little bit of application logic and a tone of integration logic. So if you're a line of business owner, that controls 50% or more of IT budgets, or you're a CIO that, beholden to that line of business, and you want applications faster, than ever before, and you don't want to sacrifice quality, How are you going to do that? Well, the way you do that, is by focusing on the integration tier because, applications are really driven by integration today. So if you want, a faster applications with higher quality, you really need to think about delivering integrations faster with higher quality. >> An integration is absolutely critical. As we look at the hybrid cloud, the advance of AI, organizations that are in this multi-hybrid cloud world, what are some of the challenges that they face, with respect to integrating these applications at your point? You know, they can pull down data from Workday, align it with data in AWS, for example, to make business decisions in real time. >> One of the biggest challenges is, manual effort, right? So, we started the conversation thinking about automation and we're coming back to it, because we believe that, you have to automate your integrations and the way you do so, is through AI. So you can of course use, the rules-based automations. And that helps to some degree. But things get really interesting, when you apply AI, and the automation is driven by real world data. that's specific to your organization in a continuous feedback loop, we like to call, closed loop and that's continuously driving efficiency. So, if you think about the integration life cycle, you've got to create an integration, test it, socialize it, operate it, govern it. That's what we mean by, Automating Integrations, the whole life cycle. So, for instance, if you can create an integration flow, and do field mapping based on AI best practices, you reduce manual effort, you reduce coding you reduce the need for integration experts or if you're a business user, and you're able to describe your intent, and you have your integration software, handle, converting that intent into universal that's required. So for instance, if you could say, generate a lead score, and wrote the leads based on location to your sales team. Well, you know what you're trying to achieve, when I get the software to do that for you, based on AI under the covers or if you're doing testing, how about letting the AI generate hundreds of new tests for your integrations, that reflect real world usage behavior, at your specific company. And these tests, are based on, other API that are running at your company. So we take the operational data, we know which parts of the API are being exercised, We know what data is going through your system. So things that are for instance, personally identifiable, shouldn't be used to test data, right? Or if you're operating your integrations, and wouldn't it be great if your AI could uncover optimizations in the integration flow? such as, adding in maybe buffering to a message queue so that, it prevents you from overages on your Salesforce account and having that happen, without needing a human in front of a dashboard I.e., the AI under the covers is doing this for you. 
So, for AI to really drive that integration automation, you need the operational data from your specific company, and using that in a closed-loop fashion you're continuously improving not just your current integrations, but your future integrations. >> I can only imagine how much more important this has become in the last year, as businesses in every industry were pivoting, multiple times, to survive and then ultimately thrive. When I think of integrations, I think of customers that I've spoken to; you gave the right example with respect to sales. They've got a CRM, they've got an ERP, and they're not in sync and not integrated, so that I can't... there's no one system of record. I can only imagine how much more important having that system of record has been in the last year of supply chains, even for demanding consumers going, "can I get some toilet paper?" And if so, "where can I find it?" >> Absolutely. And this is where that notion of a closed-loop approach to integration and the automation via AI comes in, right? So we strongly feel that today is the time clients need to rethink their integration strategy. And we do agree with some of the other analysts and vendors that are talking about automating the integration work, and that's part of what we've discussed earlier. And that's definitely necessary, but it's not sufficient. >> Go ahead, sorry. >> Sorry, well, yes, so our feeling here is that you also have to be thinking about evolving those integrations in a closed-loop fashion, so you're continuously making those integrations better with AI that's powered by your operational data, that's specific to your company. And then finally, you know, the old approach that integration vendors used to have, in terms of one style of integration fits all problems, is the wrong approach. Instead, what we're starting to see today is that customers are using multiple forms of integration to solve a specific business problem. So they're using Kafka, APIs, messaging, iPaaS. So, from an IBM standpoint, we feel that every integration must be automated, closed loop, and multistyle, with AI that's informed by your company-specific data to continue to improve, so that you end up getting integrations faster, but they're also better. >> When companies have that spectrum of different integration processes, as you just mentioned, one of the things that I kind of think about, as we look forward, and you mentioned this a minute ago, is wanting to have the foundation so that applications are integrated, communicating well and sharing data, not only today but in the future. So, talk to me about this closed-loop system that you mentioned. How does that enable an organization to establish that now, but be able to take on applications that are not even created yet? >> That's really a foundational aspect that clients need to be thinking about, right? Because the closed-loop nature of thinking about your integrations means that you're always looking at operational data and feeding it into your AI to improve your business processes and your integrations today, but also the ones that you're going to be delivering in the future, right? So, I'm sure your listeners are sitting here thinking, you know, where should I get started? And frankly, for me, I turn around and say, you probably should ask your integration vendor of choice how effectively their solutions can provide an automated, closed-loop, and multistyle approach to integration.
And if the answer that they give you isn't very detailed, well, I hope you ask IBM. And when you ask us those questions, what you're going to hear about is IBM's Cloud Pak for Integration, which is our complete platform for automated, closed-loop, and multistyle integrations. It's optimized for deployment across clouds with Red Hat OpenShift. And with IBM, you'll be able to use natural language powered integration flows, AI-powered flow and field mapping, RPA connectivity, things that really take the manual effort of integration out and replace it with AI-driven automation. Second, you want to think about the data that's feeding the AI, right? This is where the operational, closed-loop aspect comes into play. Sometimes the other vendors in the space are taking operational data from hundreds of customers, putting it together, coming out with the average, and using that to train the AI. We don't think that's the right approach, because your most important integration processes are shared by no other customer, right? So, you want your operational data to feed the AI that's providing things like field mapping, flow creation, creating the API tests automatically, or uncovering the inefficiencies that are running in your production environment. And then finally, what I'd tell you is we've got the broadest set of multistyle integration capabilities, all delivered with a common UI and shared reuse and governance, with unified management across clouds. And that's exactly what clients need, because if you think about where you're deploying applications today, the components are running on multiple clouds, so you have to integrate across clouds. And then finally, what you'll hear from us is that IBM provides a proven, hybrid, DMZ-ready security gateway that's never been hacked in 15 years, over 30,000 TPS, with the performance and security that frankly clients need for their applications today. So automated, closed loop, multistyle, you'll hear me repeat those over and over because we feel that's absolutely necessary for listeners when they think about the next generation of applications and the integrations that will be required for that. >> Excellent, well, Savio, I wish we had more time, but thank you for sharing what's going on with automating integrations, AI, what hyperautomation means, and kind of where it is now. We look forward to hearing more about this, and I'm sure the guests will be excited to see what comes at IBM Think. We thank you for your time. >> Thank you very much. >> For Savio Rodriguez, I'm Lisa Martin. You're watching theCUBE's coverage of IBM Think 2021. (upbeat music)
Nima Badiey, Pivotal | Dell Boomi World 2018
(upbeat techno music) >> Live from Las Vegas, it's theCUBE. Covering Boomi World 2018. Brought to you by Dell Boomi. >> Good afternoon, welcome back to theCUBE's continuing coverage of Boomi World 2018 from Las Vegas. I'm Lisa Martin with John Furrier, and we're welcoming back to theCUBE one of our alumni, Nima Badiey, Head of Technology Ecosystems at Pivotal. Nima, welcome back. >> Thank you for having me back. >> So Pivotal, part of the Dell Technologies family of companies, >> Yeah. >> You guys IPOd recently. And I did read that in the first half of 2018, eight of the 10 tech IPOs were powered by Boomi. >> Well, I don't know about that specifically. I know that tech IPOs are making a big comeback. We did IPO on the 20th of April, so we've passed our six-month anniversary, if you can say that. But it's been a distinct privilege to be part of the overall Dell family of businesses. I think what you have in Michael as a leader is someone who has a specific vision, but he's left the independent operating units to work on their own, to find their path through that journey, and to help each other as brethren, like sisters and brothers. And the fact that Pivotal is here supporting Boomi, that Boomi is within our conference supporting our customers, that we're working together, really speaks volumes. I think if you take a look at it, a lot of things happened this week, right? So a couple weeks ago, IBM's acquiring Red Hat; this morning, VMware's acquiring Heptio. That's a solid signal that the enterprise transformation and adoption of the cloud native model is really taking off. So the new middleware is really all about the cloud native, polyglot, multicloud environment. >> And what's interesting, I want to get your thoughts on this, because first of all, congratulations on the IPO. Some were saying Pivotal's never going to go public, and they did, you guys were spectacular, great success. But what's going on now is interesting. We're hearing here at this show, as at other shows, that cloud scale and data are really at the center of this horizontally scalable cloud proposition. Okay great, you mention Kubernetes and Heptio and VMware, that's all great. The question now is how do you compete when ecosystems become the most important thing. You worked at VMware, you're at Pivotal. Dell knows ecosystems. Boomi's got an ecosystem. Partners, which is also suppliers and integrators. >> Yeah. >> They integrate, and also developers. This is a key competitive advantage. What's your take on that here? >> So I think you touched on the right point. You compete because of your ecosystem, not despite your ecosystem. We can't be completely hegemonic like Microsoft or Cisco or Amazon can afford to be. And I don't think customers really want that. Customers actually want choice. They want the best options, but from a variety of sources. And that's one of the reasons that we not only invest in the Dell ecosystem but also in Pivotal's own ecosystem, to cultivate the right technologies that will help our customers on that journey. And our philosophy is always find the leaders in the quadrant, the Cadillac vendors, the Lexus vendors, onboard them, and the most important thing you can do is ensure a pristine customer experience. We're not measuring whether feature A from one partner is better than feature B from another partner. We really don't care. What we care about is that we can hand-wire and automate what would have been a very manual process for customers, so that, let's say, Boomi with Cloud Foundry works perfectly out of the box.
So the customer doesn't have to go through and hire consultants and additional external resources just to figure out how two pieces of software should work together; they just should. So when they make that buying decision, they know that the day after that buying decision, everything's going to be installed and their developers and their app dev teams and their ops teams can be productive. So that's the power of the ecosystem. >> Can you talk about the relationship between Pivotal and Boomi? Because Boomi was born in the cloud as a startup, acquired eight years ago. You're part of the Dell Technologies family. VMware's VMware, we know about VMware doing great. You guys are doing great. Now Boomi's out there. So how do they factor in, and what's the relationship you have with them, and how does that work, how do you guys work together? >> Perfect question. So, my primary role at Pivotal is to manage all of our partner ecosystems, specifically the technology partners. And what I look for are any force multipliers, essentially any ISVs who can help us accomplish more together than we could on our own. Boomi's a classic example of that. What do they enable? So take your classic customer. A classic customer has, let's say, 100 applications in inventory that they have built, managed, and purchased or procured from off-the-shelf components. And roughly 20 or 30% are newish, greenfield applications, perfect for the cloud native transformation. Most of them, 70 or 80%, are going to be older, brownfield applications that will have to be refactored. But there's always going to be that 15% towards the end that's legacy mainframe. It can't be changed, you cannot afford to modernize it, to restructure it, to refactor it. You're going to have to leave it alone, but you need it. Your inventory systems are there. >> These are critical systems; people think of legacy as outdated, but it's actually just valued. >> No, they're critically valuable. >> Yes. >> They just cannot be modernized. >> Bingo. >> So a partner like Boomi will allow you to access the full breadth of those resources without having to change them. So I could potentially put Boomi in front of any number of older business applications and effectively modernize them by bridging those older legacy systems with the new systems that I want to build. So let's do an example. I am the Gap and I want to build a new version of our in-store procurement system that runs on my iPhone, so that I can just point to a garment and it will automatically put it in my, ya know, checkout box. How do I do that? Well, I can build all the intelligence, and I can use AI and functions, and I can build everything out of containers, that's great. But I still have to connect to the inventory system. Inventory system... >> Which is a database. All these systems are out there. >> Somewhere, something. And my developers don't know enough about the old legacy database to be able to use it. But if I put a RESTful interface using Boomi in front of it, and a business connector that's not older XML or kind of inflexible, whatever, SOAP gateways, then I have enabled my developer to actually build something that is real, that is customer focused, that is appropriate for that market, without being hamstrung by my existing legacy infrastructure. And now my legacy infrastructure is not an anchor that's holding me back.
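A minimal sketch of the pattern described above: the new mobile app never touches the legacy database directly, it calls a REST facade of the kind an integration layer such as Boomi can expose in front of it. The endpoint URL, field names, and SKU format here are hypothetical, purely for illustration.

```python
import requests

# Hypothetical REST facade exposed in front of the legacy inventory system.
INVENTORY_API = "https://integration.example.com/api/v1/inventory"


def check_stock(sku, store_id):
    """Ask the facade whether an item is in stock at a given store.

    The new app speaks plain JSON over HTTP; the facade handles whatever
    protocol the legacy inventory system actually requires.
    """
    resp = requests.get(
        f"{INVENTORY_API}/{sku}",
        params={"store": store_id},
        timeout=5,
    )
    resp.raise_for_status()
    item = resp.json()  # e.g. {"sku": "...", "quantity": 3}
    return item.get("quantity", 0) > 0


if __name__ == "__main__":
    if check_stock("GAP-TEE-M-BLUE", store_id="SF-014"):
        print("Add to checkout basket")
    else:
        print("Out of stock at this store")
```

The design point is that the legacy system stays untouched; only the facade knows how to reach it, so the client code above stays simple and modern.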
>> You had mentioned force multipliers; Lisa and I talk about this all the time on theCUBE, and that scenario's totally legit and relevant, because in the old version of IT you would have to essentially build inventory management into the new app. You'd have to essentially kill the old to bring in the new. I think what containers and cloud native have shown is you can keep the old and sunset it if you want, on your own timetable, or keep it there and make it productive. Make the data exposable, but you can bring the cool, relevant new stuff in. >> Yeah. >> I think that is what I see and we see from customers, like, okay cool, I don't have to kill the old. I'll take care of it on my own timetable, versus a complete switching cost analysis. Take down a production system. >> Exactly. >> Build something new, will it work? Ya know, cross your fingers. Okay, again, this is a key dynamic that's different about IT. >> It is, and it's a realization that there are things you can move and things that are immutable. They're simply just monolithic and will never move, and you're going to work within those confines. You can have the best of both worlds. You can maintain your legacy applications. They're still fine, they run most of your business. And still invent the new and explore new markets and new industries and new verticals, and just new capabilities through and through, without having to touch your back-end systems. Without having to bring the older vendors in and say, can you please modernize your stuff, because my business is dependent on it and I am going to lose that. I'm going to become the new Sears, I'm going to become the new Woolworth or whoever, the Blockbuster that has missed an opportunity to vector into a new way of delivering their services. >> When you're having customer conversations, Nima, I'm curious, talking with enterprise organizations who have tons of data, all the systems including the legacy, which I'm glad you brought up, that it's not just old systems. There's a lot of business critical, mission critical applications running on 'em. Where do you start that conversation with the large enterprise, who doesn't want to become a Blockbuster, to your point, and is going, this is the suite of applications we have, where do we start? Talk to us about that customer journey that you help enable. >> That's great, 'cause in most cases the customers already know exactly what they want. It's not the what that you have to have the conversation around, it's the how do I get there. I know what I want, I know what I want to be, I know what I want to design. And it's how do I transform my business, fundamentally do an app transformation, enterprise transformation, digital transformation? Where do I begin? And so, ya know, our perspective at Pivotal is, ya know, we're diehard adopters of agile methodology. We truly, truly believe that you can be an agile development organization. We truly believe in Marc Andreessen's vision of software eating the world. Let's unpack what that means. It just means that if you're going to survive the next 10 years, you have to fundamentally become a software company, right? So look at all the companies we work with. Are you an insurance company, or are you delivering an insurance product through software? Are you a bank, or are you delivering a banking product through software? Well, when was the last time you talked to a bank teller? Or the ATM? Most of your banking's done online, on your computer or your mobile device. Even my check cashing, I don't have to talk to anyone. It's wonderful.
Ford Motor Company, do they bend sheet metal and put wheels on it, or are they a software company? Well, consider that your modern pickup truck has... >> They're an IOT company now. (laughing) (crosstalking) Manufacturing lines. >> That's what's crazy. You have 150 million lines of code in your pickup truck. Your car, your pickup truck, your whatever, is more software than it is anything else. >> But also data's key. I want to get your thoughts, since this is super important. Michael Dell brought up on the keynote today here at Boomi World, okay, the data's got to stay in the car. I don't need to have a latency issue of, hey, I need to know results in nanoseconds. With data, cloud has become a great use case. With multicloud on the horizon, some people are going to throw data in multiple clouds, and that's a clear use case, and everyone can see the benefits of that. How do you guys look at this? 'Cause now data needs to be addressable across horizontal systems. You mentioned the Gap example. >> That's great, so, one of the biggest trends we see in data is really event streaming. It's the idea that the ability to generate data far exceeds the ability to consume it. So, what if we treated data as just a river? And I'm going to cast my line and only pick up what I want out of that stream. And this is where Kafka and companies like Solace, and eventing networks, and Spring Cloud Function and Spring Cloud Data Flow are really coming into play. It's an acknowledgement that we are not in a world where we can store all of the data all the time and figure out what to do with it after the fact. We need timely, and timely is within milliseconds if not seconds, action taken on an event or data coming through. So why don't we modernize around, ya know, that type of data structure and data event and data horizon. So that's one of the trends we see. The second is that there is no one database to rule them all anymore. I can't get away with having Oracle and that's my be-all, end-all. I now have MySQL and SQL and Mongo and Cassandra and Redis and any number of other databases that are form, fit, and function specific for a utility, and they're perfect for that. I see graph databases, I see key value stores, I see distributed data warehouses. And so my options as a developer, as a user, are really expanding, which means the total types of data components that I can use are also expanding exponentially. And that gives me a lot more flexibility on the types of products that I can build and the services that I can ultimately deliver. >> And that highlights the microservices trend, because you now have a multitude of databases, it's not the one database to rule them all. There'll be literally thousands of databases, on sensors, so microservices have become the key element to connect all these systems. >> All of it together. And microservices are really a higher level of abstraction. So we started with virtual machines, and then we went to containers, and then we went to functions and microservices. It's not necessarily an upward trend as much as it is an expansion into different ways of being able to do work. So some of my work products are going to be very, very small. They can afford to be ephemeral, but there may be many of them. How do I manage a cluster of millions of these potential workloads? Backing off, I can have ephemeral applications that run inside of containers, or I can have rigid, fixed applications that have to run inside a virtual machine. I'm going to have all of them.
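A minimal sketch of the "cast a line into the stream" idea mentioned above, using the kafka-python client: subscribe to an event stream and keep only the events you care about as they flow past, rather than storing everything and sorting it out later. The topic name, broker address, and event schema are assumptions, not taken from the conversation.

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic and broker; the event payload is assumed to carry a "type" field.
consumer = KafkaConsumer(
    "vehicle-telemetry",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="latest",
)

# "Cast the line": iterate over the river of events and pick out only what we want.
for message in consumer:
    event = message.value
    if event.get("type") == "brake_anomaly":  # ignore everything else in the stream
        print(f"act on {event.get('vehicle_id')}: {event.get('details')}")
```

The filter runs in milliseconds per event, which is the timeliness point: act on the event as it arrives instead of warehousing it first.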
What I need is a platform that delivers all of this for me, without me having to figure out how to hand-wire these bits and pieces from various proprietary or open source kits just to make it work. Otherwise I'm going to need a 60 to 100 or 200 person team just to maintain this very bespoke thing that I have developed. I'll just pull it off the shelf, 'cause this is a solved problem. Right, Pivotal has already solved this problem. Other companies have already solved this problem. Let me start there, and so now I'm here. I don't have to worry about all this leftover plumbing. Now I can actually build on top of it, for my business. The analogy I'd use is, you don't bring furniture with you every time you check into a hotel. And we're telling customers, every time you want to move to a different city just for a business meeting or a work trip, we're going to build you a house and you need to furnish it. Well, that's ridiculous. I'm going to check into a hotel, and my expectation is I can check into any room and they'll all be the same; it doesn't really matter what floor I'm on, what room I'm in. But they'll have the same facilities, the same bed, the same, ya know, restroom facilities. That's what I want. That's what containers are. Eventually all the services surrounding that hotel room experience will be microservices. >> And we're the workload, the people. >> And we are the workload, and we're the most important thing. We are the application, you're right. >> I love that. That's probably the best analogy I've heard for containers. Nima, thanks so much for stopping by theCUBE, joining John and me today, and talking to us about what's going on with Pivotal and how you guys are really helping, as part of the Dell family of businesses, to dramatically transform. >> Been my pleasure. Thank you both. >> Thank you. >> Thank you. Thank you for watching theCUBE. I'm Lisa Martin with John Furrier. We are in Las Vegas at Boomi World '18. Stick around, John and I will be right back with our next guest. (light techno music)