Mike Flasko, Microsoft | Microsoft Ignite 2018
>> Live from Orlando, Florida, it's theCUBE, covering Microsoft Ignite. Brought to you by Cohesity and theCUBE's ecosystem partners. >> Welcome back everyone to theCUBE's live coverage of Microsoft Ignite. I'm your host Rebecca Knight, along with my co-host Stu Miniman. We are joined by Mike Flasko. He is the Principal Group Product Manager here at Microsoft. Thanks so much for returning to theCUBE, you are a CUBE alumnus. >> I am, yeah, thanks for having me back. I appreciate it. >> So you oversee a portfolio of products. Can you let our viewers know what you're working on right now? >> Sure, yeah. I work in the area of data integration and governance at Microsoft, so everything around data integration, data acquisition, transformation, and then pushing into the governance angles of, you know, once you acquire data and analyze it, are you handling it properly per industry guidelines or whatever enterprise initiatives you might have? >> You mentioned the magic word, transformation. I would love to have you define it. It's become a real buzzword in this industry. How do you define digital transformation? >> Sure, I think it's a great discussion because we're talking about this all the time, but what does that really mean? And for us, the way I see it, it's starting to make more and more data-driven decisions all the time. And so it's not like a light switch, where you weren't and then you were. Typically what happens is, as we start working with customers, they find new and interesting ways to use more data to help them make a more informed decision. And it starts from a big project or a small project and then just kind of takes off throughout the company. And so really, I think it boils down to using more data and having that guide a lot of the decisions you're making, and typically that starts with tapping into a bunch of data that you may already have that just hasn't been part of your traditional data warehousing or BI loop, and thinking about how you can do that.
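The kind of decision he's describing, folding a previously untapped data source into an existing BI signal, can be sketched with toy data. Everything here (names, fields, thresholds) is hypothetical, for illustration only, not any specific Microsoft product or API:

```python
# A toy illustration: combining a traditional BI metric (sales by account)
# with an untapped source (open support tickets) produces a signal that
# neither data set supports on its own.

def flag_at_risk_accounts(sales_by_account, open_tickets_by_account,
                          min_sales=10_000, max_tickets=5):
    """Flag accounts with healthy revenue but many unresolved tickets:
    a more informed decision than looking at revenue alone."""
    at_risk = []
    for account, sales in sales_by_account.items():
        tickets = open_tickets_by_account.get(account, 0)
        if sales >= min_sales and tickets > max_tickets:
            at_risk.append(account)
    return sorted(at_risk)

sales = {"acme": 50_000, "globex": 8_000, "initech": 20_000}
tickets = {"acme": 9, "globex": 12, "initech": 2}
print(flag_at_risk_accounts(sales, tickets))  # ['acme']
```

The point is not the logic itself, which is trivial, but that the second dictionary represents data most shops already have and never joined into the decision loop.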
>> Mike, bring us inside the portfolio a little bit. You know, everybody knows Microsoft. We think about our daily usage of all the Microsoft products that my business data runs through, but when you talk about your products, they're specifically around the data. Help us walk through that a little bit. >> Sure, yeah. So we have a few kind of flagship products in the space, if you will. The first is something called Azure Data Factory, and the purpose of that product is fairly simple. It's really for data professionals. They might be integrators or warehousing professionals, et cetera, and it's to facilitate making it really easy to acquire data from wherever it is: your business data on-prem, from other clouds, SaaS applications. And it allows a really easy experience to bring data into your cloud, into our cloud, for analytics, and then build data processing pipelines that take that raw data and transform it into something useful, whatever your business domain requires. Whether that's training a machine learning model or populating your warehouse based on more data than you've had before. So first one, Data Factory, all about data integration, kind of a modern take on it. Built for the cloud, but fundamentally supports hybrid scenarios. And then other products we've got are things like Azure Data Catalog, which is more in the realm of aiding the discovery and governance of data. So once you start acquiring all this data and using it more productively, you start to have a lot, and how do you connect those who want to consume data with the data professionals or data scientists that are producing these rich data sets? So how do you connect your information workers with your data scientists or your data engineers that are producing data sets? Data Catalog's kind of the glue between the two. >> Mike, wondering if you can help connect the dots to some of the waves we've been seeing.
There was a traditional kind of BI and data warehousing, then we went through kind of big data, the volumes of data and how can I, even if I'm not some multi-national or global company, take advantage of the data? Now there's machine intelligence. Machine learning, AI, and all these pieces. What's the same and what's different about the trend and the products today? >> Sure, I think the first thing I've learned through this process, being in our data space for a while and then working on big data projects, is that for a while we used to talk about them as different things. Like, you do data warehousing, and now that kind of has an old connotation to it. It's got an old feel to it, right? And then we talk about big data, and you have a big data project, and I think the realization we've come to is that it's really those two things starting to come together. And if you think about it, everybody has been doing some form of analytics and warehousing for a while. And if we start to think about what big data technologies have brought, it's a couple of things, in my opinion, that bring these two things together. With big data we started to be able to acquire data of significantly larger size and varying shape, right? But at the end of the day, the task is often: acquire that data, shape that data into something useful, and then connect it up to the business decision makers who need to leverage that data on a day-to-day basis. We've been doing that process in warehousing forever. It's really about how easily we can marry big data processing with traditional data warehousing processes so that our warehouses, our decision making, can scale to large data and different shapes of data. And so probably what you'll see, actually, at the Ignite conference in a lot of our sessions, you'll hear our speakers talking about something called modern data warehousing, and like, it really doesn't matter what label is associated with it.
But it's really about how do you use big data technologies like Spark and Databricks naturally alongside warehousing technologies and integration technologies, so they really form the modern data warehouse: one that naturally handles big data, that naturally brings in data of all shapes and sizes, and that provides an experimentation ground as well for data science. I think that's the last piece that comes in: once you've got big data and warehousing working together to expand your analytics beyond traditional approaches, the next is opening up some of that data earlier in its life cycle for experimentation by data science. It's kind of the new angle, and we think about this notion of modern data warehousing as almost one thing supporting them all going forward. I think the challenge we've had is when we try to separate these into net new deliverables, net new projects, we're starting to bifurcate, if you will, the data platform to some degree. And things were getting a little too complex, and so I think what we're seeing is that people are learning what these tools are good at and what they're not good at, and now how to bring them together to really get back some of the productivity that we've had in the past. >> I want to ask you about those business decision makers that you referenced. I mean, there's an assumption that every organization wants to become more data driven. And I think that most companies would probably say yes, but then there's another set of managers who really want to go by their gut. Have you found that to be a conflict in terms of how you are positioning the products and services? >> Yeah, absolutely. In a number of customer engagements we've had, where you start to bring in more data, you start to evolve the analytics practice, there is a lot of resistance at times: you know, we've done it this way for 20 years, business is pretty good. What are we really fixing here?
And so what we've found is the best path through this, and in a lot of cases the required path, has been to show people the art of the possible: run experiments, show them side-by-side examples, and typically with that comes a comfort level in what's possible. Sometimes it exposes new capabilities and options; sometimes it also shows that there are other ways to arrive at decisions. But we've certainly seen that, and almost like anything, you kind of have to start small, create a proving ground, and be able to do it in a side-by-side manner to show comparison as we go. But it's a conversation that I think is going to carry forward for the next little while, especially as some of the work in AI and machine learning is starting to make its way into business-critical settings, right? Pricing your products. Product placement. All of this stuff that directly affects bottom lines, you're starting to see these models do a really good job. And I think what we've found is it's all about experimentation. >> Mike, when we listen to (mumbles) and to Dell and we talk about, you know, how things are developed inside of Microsoft, we usually hear things like open and extensible; you've got to have APIs in any of these modern pieces. It was highlighted in the keynote on Monday, talking about the open data initiative. You've got companies like Adobe and SAP out there, they have a lot of data, so the question is, of course, Microsoft has a lot of data that customers flow through, but there's also this very large ecosystem we see at this show. What's the philosophy? Is it just, you know, oh, I've got some APIs and people plug into it? How does all the data get so that the customers can use it? >> Yeah, it's a great question. That one I work a lot on, and I think there's a couple of angles to it. One is, I think as big data's taken off, a lot of the integration technology that we've used in the past really wasn't made for this era. Where you've got data coming from everywhere.
It's different shapes and it's different sizes, and so at least within some of our products, we've been investing a lot into how we make it really easy to acquire all the data you need. Because, you know, like you hear in all these cases, you can have the best model in the world, but if you don't have the best data sets, it doesn't matter. Digital transformation starts with getting access to more data than you had before, and so I think we've been really focused on this; we call it the ingestion of data. Being able to really easily connect and acquire all of the data, and that's the starting point. The next thing that we've seen from companies that have gone down that journey with us is, once you've acquired it all, you quickly have to understand it. You have to be able to search over it and understand it through the lens of, potentially, business terms if you're a business user trying to understand, what are all these data sets? What do they mean? And so I think this is where you're starting to see the rise of data cataloging initiatives. Not necessarily master data, et cetera, of the past, but this idea of, wow, I'm acquiring all of this data, how do I make sense of it? How do I catalog it? How do all of my workers or my employees easily find what they need and search for the data through the lens that makes sense to them? Data scientists are going to search through a very technical lens; your business users through a business glossary, business domain terms, that way. And so for me it all starts with the acquisition, which I think is still far too hard, and then it becomes kind of a cataloging initiative, and then the last step is, how do we start to get some form of standards or agreement around the semantics of the data itself? Like, this is a customer, this is a place.
This is what, you know, a rating is. And I think with that, you're going to start to see a whole ecosystem of apps develop, and one of the things that we're pretty excited about with the open data partnerships is how we can bring in data and, to some degree, auto-classify it into a set of terms that allow you to just get on with the business logic, as opposed to spending all the time in the acquisition phase that most companies do today. >> You mentioned that AI is becoming increasingly important and mission critical, or at least bottom-line critical, in business models. What are some of the most exciting new uses of AI that you're seeing and that you hope expand into the larger industry? >> Sure. It really does cross a number of domains. We work with a retailer, ASOS. Every time we get to chat with them, it's a very interesting use: how they have completely customized the shopping experience, from how they lay out the page based on your interests and preferences, through to how the search terms come back based on the seasonality of what you're looking at, based on what they've learned about your purchase patterns over time, your sex, et cetera. And so I think this notion of intensely customized customer experiences is playing out everywhere. We've seen it on the other side in engine design and preventative maintenance, where we've got certain customers now that are selling engine hours as opposed to engines themselves. And so if there's an engine hour that they can't provide, that's a big deal, and so they want to get ahead of any maintenance issue they can, and they're using models to predict when a particular maintenance event is going to be required and getting ahead of that, through to athletes and injury prevention.
We're now seeing it all the way down to connected clothing and athletic gear, where, not just at the professional level but starting to come down to the club level, athletes as they're playing are starting to realize that, oh, something's not quite right, I want to get ahead of this before I have a more serious injury. And so we've seen it in a number of domains; with almost every new customer I'm talking with, I'm excited by what they're doing in this area. >> Well, you bring up an interesting challenge. I've heard Microsoft is really, I guess, verticalizing around certain industries to put solutions together. One of the challenges we saw, you know, we saw surveys of big data, and the number one use case that came back was always custom, and it was like, oh, okay, well how do I templatize and allow hundreds of customers to do this, so that not every single project is a massive engagement? What are you seeing that we're learning from the past? It feels like we're getting over that hump a little bit faster now than we were a few years ago. >> Yeah, so if I heard you correctly, it's a little bit loud, you're saying everything started as custom? And how do we get past that? And I think it actually goes back to what we were talking about earlier with this notion of a common understanding of data. Because what was happening is everybody felt they had bespoke data, or we had data that was speaking about the same domains and terms but we didn't agree on anything, so we spent a ton of time in the bespoke or custom arena of integrating, cleaning, transforming, before we could even get to model building or any kind of innovation on the data itself. And so I think one of the things is realizing that across a lot of these domains we're trying to solve similar problems; we all have similar data.
The more we can get to a common understanding of the data that we have, the more you can see higher-level reusable components being built, saying, "Ah, I know how to work on customer data," "I know how to work on sales data," "I know how to work on, you know, oil and gas data," whatever it might be, and you'll probably start to see things come up in industry verticals as well. And I think it's that motion; like, we had the same problem years ago when we talked about log files. Before there were logging standards, everything was a custom solution, right? Now we have very rich solutions for understanding IT infrastructure, et cetera, and that usually came about because we had a better baseline understanding of the data we had. >> Great. Mike, thank you so much for coming on theCUBE. It was a pleasure having you. >> Thank you for having me. >> I'm Rebecca Knight for Stu Miniman, we will have more of theCUBE's live coverage of Microsoft Ignite coming up just after this. (techno music)
Andrew Liu, Microsoft | Microsoft Ignite 2018
>> Live from Orlando, Florida. It's theCUBE. Covering Microsoft Ignite. Brought to you by Cohesity, and theCUBE's ecosystem partners. >> Welcome back to theCUBE's live coverage of Microsoft Ignite here in Orlando, Florida. I'm your host, Rebecca Knight, along with my co-host Stu Miniman. We're joined by Andrew Liu. He is the senior product manager at Azure Cosmos DB. Thanks so much for coming on the show, Andrew. >> Oh, thank you for hosting. >> You're a first-timer, so this will be a lot of fun. So, talk to me a little bit. Azure Cosmos DB is a database for building blazing fast, planet-scale applications. Can you tell our viewers a little bit about what you do and about the history of Azure Cosmos DB? >> Sure, so Azure Cosmos DB started about eight years ago, when we were outgrowing a lot of our own database needs with what we had previously built. And a lot of the challenges that we had were really around partitioning, replication, and resource governance. So I'll talk a little bit about each one. Partitioning is really about solving the problem of scale, right? I have so much data that it doesn't fit on a single machine, and I have so many requests per second that they also can't be served out of a single machine. So how do I build a system, a database, that can elastically scale over a cluster of machines, so that as a user I don't have to manually shard a database across many, many instances? I really want to be able to scale just seamlessly. The velocity problem is, we also wanted to build something that can respond in a very fast manner in terms of latency. So it's great and all that we can serve lots of requests per second, but what is the response time of each one of those requests? And the resource governance was there to really build this as a cloud-native database, in which we wanted to exploit the properties of our cloud.
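The partitioning idea above, spreading data over a cluster by hashing a key so that nobody shards by hand, can be illustrated with a minimal routing function. This is a generic sketch of hash partitioning, not Cosmos DB's actual placement logic:

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Deterministically map a partition key to a physical partition.

    Every writer and reader computes the same partition for the same key,
    so requests for one key always land on the same slice of the cluster.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions

# The same key always routes to the same partition...
assert partition_for("device-42", 8) == partition_for("device-42", 8)

# ...while a large key space spreads across all partitions.
spread = {partition_for(f"device-{i}", 8) for i in range(1000)}
print(sorted(spread))  # all 8 partitions receive traffic
```

Real systems layer more on top (range splitting, rebalancing when partitions grow), but the deterministic key-to-partition mapping is the piece that removes manual sharding.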
We wanted to use the economies of scale from having data centers built all around the world, and build this as a truly multi-tenant service. And by doing so we can also lower the total cost of ownership for us, as well as offer guaranteed, predictable performance for the tenants. Now we did this initially for our first-party tenants at Microsoft, where we have made a bet on everything from our Microsoft Live platform, to Office, to Azure itself being built on Azure Cosmos DB. And about four years ago we found that, hey, this is not really just a Microsoft problem that we're solving; it's an everybody problem, it's become universal, and so we've launched it out into the open. >> Yeah, Andrew, that's a great point, and I want you to help unpack that for us a little bit. Because, you know, we've been saying on theCUBE for many years, distributed architectures are some of the toughest challenges of our time. But if I'm a Facebook, or a Google, or a Microsoft, I understand some of the challenges, and I understand why I need it. But when you talk about scale, well, scale means a lot of different things to a lot of different people. So, how does Cosmos? What does that mean to your users, end users? Why do they need this? You know, haven't they just built some microservices architecture? And they'll just leverage, ya know, what's in Azure. And things like that. How does this global scale impact the typical user? >> So I'm actually seeing this come in different types of patterns for different types of industries. So for example, in manufacturing we're commonly seeing Cosmos DB used really for the write scalability, having many, many concurrent writes per second. Typically this is in an IoT telemetry or an IoT device registry case. So let's use one of our customers as an example, Toyota. Each year they're shipping millions of vehicles on the road, and they're building a big connected car platform.
The connected car platform allows them to do things like, whenever an airbag gets deployed, make sure to call their driver: hey, I saw the airbag was deployed, are you okay? And if the user doesn't pick up their phone, immediately notify emergency services. But the challenge here is, if each year I'm shipping millions of vehicles on the road, and each of 'em has a heartbeat every second, I'm dealing with millions of writes per second, and I need a database that can scale to that. In contrast, in retail I'm actually seeing very different use cases. They're using more of the replication side of our stack, where they have a global user base and they're trying to expand an eCommerce shop. So for example, ASOS is a big fashion retailer; they ship to 200 different countries globally, and they want to make sure that they can deliver real-time experiences, like real-time personalization: based off of who the user is, recommend a set of products that is tailored to that user. Well, now what I need is a data set that can expand to my shoppers across 200 countries around the globe, and deliver that with very, very low latency so that my web experience is also very robust. So what they use is our global distribution and our multi-mastering technology, where we can actually have a database presence similar to what a CDN does for static content, but for dynamic, evolving content. In a database your workload, typically your data set, is evolving, and you want to be able to run queries with consistency over that, as opposed to a CDN, which typically serves static assets. Well, here we can actually support that dynamic content and then build these low-latency experiences for users all around the globe. The other area we see a lot of usage is ISVs with mission-critical workloads. And the replication actually gets us two awesome properties, right?
One is the low latency, by shipping data closer to where the user is, but the other property you get is a lot of redundancy. And so we actually also offer industry-leading SLAs, where we guarantee five nines of availability, and the way we're able to do so is with a highly redundant architecture: you don't care if, let's say, a machine were to bomb out at any given time, because we have multiple redundant copies in different parts of the globe. You're guaranteed that your workload is always online. >> So my question for you is, when you have these, you just described some really, really interesting customer use cases in manufacturing, in retail, do you then create products and services for each of these industries? Or do you say, hey, other retail customers, we've noticed this really works for this customer over here? How do you go out to the community with what you're selling? >> Ah, got it. So we actually have found that this can be a challenging space for some of our customers today, 'cause we have so many products. The way we kind of view it is, we want to have a portfolio so that you can always choose the right tool for the right job. And I think a lot of how Microsoft has evolved as a business is actually around this. Previously we would sell a hammer, and we'd tell you, don't worry, everything's a nail; even if it looks like a screw, let's just pretend it's a nail and whack it down. But today we've built this big, vast toolbox, and you can think of Cosmos DB as just one of many tools in it. So if you have a screw, maybe you pick up a screwdriver and screw that in. And the way Azure works is, since we have a very comprehensive toolbox, depending on what precise scenario you have, you can mix and match the tools that fit your problem. So think of them as like individual Lego blocks, and whether you're building a Death Star or an X-wing, you can go and assemble the right pieces for your application.
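A back-of-envelope sketch of the redundancy math behind an availability guarantee like the five nines mentioned above. This assumes independent replica failures, which real SLA calculations do not; it is illustrative only, not Microsoft's actual math:

```python
# With n independent replicas, each available with probability a, a request
# fails only if every replica is down at once: 1 - (1 - a)^n.

def composite_availability(single_replica: float, replicas: int) -> float:
    """Availability of a system that succeeds if any one replica is up."""
    return 1 - (1 - single_replica) ** replicas

one = composite_availability(0.999, 1)   # three nines with a single copy
four = composite_availability(0.999, 4)  # four redundant copies
print(f"{one:.6f} -> {four:.12f}")
```

Even under this idealized independence assumption, the direction of the effect is the point: each added replica multiplies the probability of total unavailability by another small factor.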
>> Andrew, some news at the show around Cosmos DB. Share with us what the updates are. >> Oh sure, we're really excited to launch a few new features. The highlights are multi-master and the Cassandra API. So multi-master really exploits the replicated nature of our database. Before multi-master, we would allow you to have a globally distributed database in which write requests go to a single region, with reads being served out of any of the other locations. With multi-master, we've made it so that each of those replicas we've deployed around the globe can also accept write requests. What that translates to from a user point of view is, number one, your write requests are a lot faster: super low latency, single-digit-millisecond latency in fact, no matter where the user is around the globe. And number two, you also get much higher write availability. So even if, let's say, we're having a natural disaster, like the nasty hurricane that passed through the east coast last week, with a globally distributed database the nice thing is, even if you have a power disruption in one region of the world, it doesn't matter, because you can then just fail over and talk to another data center where you have a live replica already located. So we just came out with multi-master; the short summary is low-latency writes, as well as highly available writes. The other feature that we launched is the Cassandra API, and as you know, this is a multi-model, multi-API database. What that means is, we're trying to meet our users where they are, as opposed to pushing our proprietary software on them, and we take the whole concept of vendor lock-in very, very seriously. Which is why we make such a big bet on the open source ecosystem.
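Accepting writes in every region implies that two regions can update the same item concurrently, so a multi-master store needs a conflict-resolution policy. Cosmos DB offers last-writer-wins and custom policies; the sketch below shows the generic last-writer-wins idea only, not Cosmos DB's implementation (the field names are hypothetical):

```python
# Last-writer-wins (LWW): keep the version with the higher timestamp, and
# break ties deterministically (here, by region name) so that every replica,
# applying the same rule to the same pair, converges on the same winner.

def resolve_lww(version_a: dict, version_b: dict) -> dict:
    ka = (version_a["ts"], version_a["region"])
    kb = (version_b["ts"], version_b["region"])
    return version_a if ka >= kb else version_b

east = {"id": "cart-1", "items": 2, "ts": 105, "region": "eastus"}
west = {"id": "cart-1", "items": 3, "ts": 107, "region": "westeurope"}

# Order of arguments doesn't matter: both replicas pick the same winner.
assert resolve_lww(east, west) == resolve_lww(west, east)
print(resolve_lww(east, west)["items"])  # 3
```

The deterministic tie-break is what makes this safe: without it, two replicas comparing the same pair of concurrent writes could each keep a different version and never converge.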
If you already have, let's say, a MongoDB application or a Cassandra application, but you'd really love to take advantage of some of the novel properties that we've built, like a fully managed multi-master database, well, what we've done is implement this as a wire-level protocol on the server side. So you can take an existing application, not change a single line of code, point it to Cosmos DB as a back-end, and then take advantage of Cosmos DB as your database. >> One of the interesting things, if you look at the changing face of databases, is how users are able to leverage their data. You talk about everything from, you know, Cassandra back in some of the big data discussions; today everything's AI, which I know is near and dear to Microsoft's heart, and Satya Nadella's. How do you think of the role of data in this solution set? >> Sorry, can you say that one more time? >> So, how customers think about leveraging data, how things like Cosmos allow them to really extract the value out of data, not just be some database that's kind of stuck in the back-end somewhere. >> Yeah, yeah. I mean, a lot of it is the new novel experiences people are building. So for example, with the connected car platform, I'm seeing people take advantage of new novel territory that a traditional automobile manufacturer used not to touch. Not only are they building experiences around how they provide value to their end users, like the airbag scenario, but they're also using this as a way of building value for their business: making sure that, hey, next time you're up for an oil change, they can send a helpful reminder and say, hey, I noticed you're due for an oil change in terms of mileage.
Why don't I just go set up an appointment for you? As well as other experiences, for things like fleet management and partnerships with either ride-sharing companies like Uber and Lyft, or rental car companies like Avis, Hertz, et cetera. I've also seen people build new novel experiences through databases, through AI and machine learning. So for example, product recommendations. Historically, when I wanted to do recommendations a decade ago, maybe I had some big, beefy data lake running somewhere in the back-end; it might take a week to munch through that data, but that's okay, a week later, once I'm ready, I'll send out some mail, maybe some email to you. But today, when I want to show recommendations live, right when the user is browsing my website, my website has to load fast, right? If my goal is to increase conversions on sales, a slow-running website is the fastest way to get my user to click the back button. But if I want to build real-time personalization and generate, let's say, a recommendation within 200-millisecond latency, well, now that I have databases that can guarantee me single-digit-millisecond latency, it gives me ample time to actually run the business logic for those recommendations. >> I want to ask you a question about culture, because you are based at the mothership in Redmond, Washington. We heard Satya Nadella on the main stage today talk about tech intensiveness, tech intensity, sorry, this idea that we need to not only be adopting technology, but also building the latest and greatest. I'm curious about how that translates on Microsoft's campus, and how this idea infuses how you work with your colleagues, and also how you work with your customers and partners? >> I think one of the biggest positive changes I've seen over the last decade has been how much more of a customer focus we have today than ever.
And I think a lot of things have led to that. One is just the ability to ship much faster. As we've moved to cloud services, we're no longer in these big box-product release cycles of building a product and waiting one or two years to ship it to our users. Now we can actually get real-time feedback. So as we go and ship and deploy software, we actually deploy on a weekly cadence over here. What that allows us to do is experiment a lot more and get real-time feedback. So if we have an idea, rather than going through a long, lengthy vetting process, spending years building and hoping that it really pays off, we can just go talk to our users and say, hey, you know, we have an idea for a feature. We'd love to get your feedback. Or, a lot of times, honestly, our customers actually come to us; we're so tightly engaged these days that users even come to us and say, hey, what do you think about this idea? It would really add a lot of value to my scenario. We go and try to root-cause that, really get an idea of exactly what they need. But then we can turn that around in blazing fast time. And I think a lot of that is the shift to cloud services, and being able to avoid the overhead of, well, we've got to wait for this ship train, and then wait for the right operations personnel to go and deploy the updates. Now that we can control our own destiny and just ship on a very, very fast cadence, we're closer to our users, and we experiment a lot more, and I think it's a beautiful thing. >> Great, well, Andrew, thank you so much for coming on theCUBE, it was fun talking to you. >> Oh yeah, thank you for hosting. >> I'm Rebecca Knight, for Stu Miniman, we will have more from theCUBE's live coverage of Microsoft Ignite coming up just after this. (techno music)