
Search Results for Mondal:

Thomas Henson and Chhandomay Mandal, Dell Technologies | Dell Technologies World 2020


 

>> From around the globe, it's theCUBE, with digital coverage of Dell Technologies World, the Digital Experience, brought to you by Dell Technologies.
>> Welcome to theCUBE's coverage of Dell Technologies World 2020, the Digital Experience. I'm Lisa Martin, and I'm pleased to welcome back a Cube alumni and a new Cube member to the program today. Chhandomay Mandal is back with us, Director of Solutions Marketing for Dell Technologies. Chhandomay, it's great to see you at Dell Technologies World, even though we're very socially distant.
>> Happy to be back. Thank you, Lisa.
>> And Thomas Henson is joining us for the first time, Global Business Development Manager for AI and Analytics. Thomas, welcome to theCUBE.
>> I am excited to be here. It's my first virtual Cube.
>> Well, you'd better make it a good one. All right, we're talking about AI. So much has changed, Chhandomay; the last time I saw you we were probably sitting a lot closer together. So much has changed in the last six to seven months, and a lot has changed with the adoption of AI. Thomas, kick us off: what are some of the big things fueling AI adoption right now?
>> I would have to say the two biggest things right now are, first, accelerated compute, and by accelerated compute we're not just talking about the continuation of Moore's Law, but how in data analytics we're actually doing more processing now with GPUs, which gives us faster insights. So now we have the ability to get quicker insights on jobs that may have taken weeks or even months. And the second portion is the innovation going on in the software and framework world. No longer do you have to know C++ or a lower-level language; you can actually do it in Python and even pull it off of GitHub, and it's all part of that open-source community. So we're seeing more and more folks in the field of data science and deep learning who can actually implement some code, and then we've got faster compute to be able to process it.
>> Chhandomay, what are your thoughts?
>> What I want to add is the explosive growth of data, and that's actually fueling AI adoption. Think of all the devices we have: the IoT and edge devices are pumping data into the pipeline, high-resolution satellite imagery, all the social media generating data. All of this data is helping the adoption of AI, because now we have very granular data to train the AI models and make them much better. So the combination of both the data and the power of GPU-powered servers, coupled with efficient AI software and tools, is driving the AI growth we're seeing today.
>> Chhandomay, one of the things we've known for a while now is that for AI to be valuable, it's about extracting value from that data. You talked about the massive explosion in data, and yet we've been talking about AI for decades, and initiatives can fail. What can Dell Technologies do now to help companies have a successful AI project?
>> Yes. As you were saying, Lisa, what we're seeing is that companies are trying to adopt AI technologies to drive value and extract value from their data sets. Now, the way it needs to be framed is this: there is a business challenge that customers are trying to solve. That business challenge gets transformed into a data science problem.
The data scientists are going to work with the AI technology and train the models on it; that data science problem gets to a data science solution, and then it needs to be mapped to a production deployment as a business solution. What happens a lot of the time is that companies do not plan for the transition from the small-scale proof of concept that data scientists are playing with, on a smaller set of data, to the large production deployment dealing with terabytes upon terabytes of data. That's where we come in. At Dell Technologies we have end-to-end AI solutions for the customer's journey, from proof of concept to production, and it is all seamless and very scalable.
>> So some of the challenges start with those first iterations. Thomas, a question for you as Business Development Manager: those folks that Chhandomay talked about, the data scientists and the business, how are you helping them come together from the beginning, so that when the POC is initiated it can actually go on the right trajectory to be successful?
>> That's a great point, and just to build off what Chhandomay was talking about, we call it that last mile, right? Hey, I've got a great POC, how do I get into production? Well, if you have executive sponsorship and everybody is on board, but it's going to take six months to a year, you're going to lose some momentum. So where we help our customers is by partnering with them to show them how to build, from an IT and infrastructure perspective, what that AI architecture looks like. We have multiple solutions around that, and at the end of the day, just like Chhandomay was saying, we may start off with a project that's only half a terabyte, maybe 10 terabytes, but once you go into production, if it turns out to be three or four petabytes, nobody really has the infrastructure built unless they built on those solid practices. That's where our solutions come in: we can go from small-scale laboratory all the way to large-scale production without having to move any of that data. At the heart of that is PowerScale, giving you the ability to scale your data with no more data migration, so you can handle one POC or multiple POCs as those models continue to improve and you start to move into production.
>> And sticking with you... oh, sorry, Chhandomay, go ahead.
>> I was going to add that, just like Thomas said, if you are a data scientist you are working with your data science workstations, but getting the data from PowerScale, this scale-out platform. As it grows from POC to large-scale production, the data can stay in place on the PowerScale platform: you can add nodes and it can grow to petabytes. And you can add in not just the workstations but also our Dell PowerEdge servers and our switches, building out our Ready Solutions for AI for your production, giving a very seamless experience from the data scientist to IT.
>> So Chhandomay, we'll stick with you then. I'm curious to know, in the last six to seven months, since 2020 has gone in a very different direction than any of us would have predicted at our last Dell Technologies World together, what are you seeing, Chhandomay, in terms of acceleration, or maybe different industries?
What are customers' needs, and how have they changed, I should say, in 2020?
>> In 2020 we're seeing the adoption of AI happen even more rapidly. If you think about customers ranging from, say, the media and entertainment industry, to the customer service function of any organization, to healthcare and life sciences with lots of genome analysis going on, in all of these places we are dealing with large data sets, and we're seeing a lot of adoption and faster processing with AI technologies, for example in all the research happening at these bioscience organizations. Thomas, I know you are working with customers like that, so can you give us a bit more of an example there?
>> Yes. One of the things we're seeing more and more in 2020 is the expansion of the need for customer support. There are more folks working remotely, more folks learning remotely; I know my child is going through virtual school. So think about your IT organization and how many calls you're now having to handle. This is a great area where we're starting to see innovation within AI and model building, to have, let's call it, the next generation of chatbots. You can actually build these models off the data to augment those support systems, because you have two choices, right? You can either expand out your call center, for we're not sure how long, or you can use AI and analytics to help augment it and answer some of those first baseline questions. The great thing about customers who are choosing PowerScale and Dell Technologies as their partner is that they already have the resources to hold on to the data that's going to help them train those models.
>> Thomas, whenever we're talking about data explosions it brings to mind compliance, protection, security. We've seen ransomware really skyrocket in 2020; just the other week the VA was hit, and I think there was also a social media breach of Facebook, Instagram and TikTok data for 235 million users because of an unsecured cloud database. So that threat vector is expanding. How can you help customers accelerate their AI projects while ensuring compliance, protection and security of that data?
>> Really, that's the sweet spot for PowerScale when we're talking with customers: it's built on OneFS with all the security features in mind. I, too, came from the analytics world, so I remember the early days of Hadoop, where as a software developer we didn't need security, right? We were doing research. But then when we took it to the customer and pushed to production, what about all the security features? We need the same thing for artificial intelligence: we want to make sure we're building those security features and compliance in. And that's where, from an AI architecture perspective, by starting with OneFS at the heart of the solution, you know you're covered with all the enterprise features you need, whether it's compliance, data strategy, or backup and recovery as well.
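Thomas's point above about augmenting support teams with model building is the kind of work the open-source Python ecosystem he mentioned earlier makes very approachable. What follows is a small, hypothetical sketch, not from the interview and not a Dell product: a baseline text classifier that triages incoming support questions into known categories so that only the uncertain ones reach an agent. The example tickets, categories and confidence threshold are invented.

```python
# Hypothetical sketch: triage incoming support questions with a small text
# classifier, so agents only see the ones the model is unsure about.
# Requires scikit-learn; the tickets and categories below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Historical tickets (text) and the category a human agent assigned to each
tickets = [
    "I cannot log in to the VPN from home",
    "How do I reset my password?",
    "My laptop will not connect to the classroom portal",
    "Request to install the statistics package on my workstation",
    "Password expired and the reset link does not arrive",
    "VPN drops every few minutes on home Wi-Fi",
]
labels = ["vpn", "password", "portal", "software", "password", "vpn"]

# TF-IDF features plus logistic regression is a common, lightweight baseline
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(tickets, labels)

new_ticket = "I forgot my password and need it reset"
probs = model.predict_proba([new_ticket])[0]
best = probs.argmax()
if probs[best] >= 0.5:   # confident: answer from a canned response
    print("auto-route to:", model.classes_[best])
else:                    # not confident: escalate to a human agent
    print("escalate to the call center")
```

In practice the training data would come from the historical tickets held on the consolidated storage the guests describe, and the confidence threshold would be tuned against real traffic rather than picked by hand.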
>> So when we're talking about big data volumes, Chhandomay, we have to talk about the hyperscalers. Talk to us about that: Azure, AWS and Google Cloud each offer hundreds of AI services. How does Dell help customers use the public cloud, and the data that's created outside of it, and use the right AI services to extract that value?
>> Yes. As you mentioned, all of these hyperscalers differentiate with what they offer in AI, ML and deep learning technologies, and as a customer you want to leverage the best of what every cloud has to offer and not be stuck with one particular cloud provider. However, we're talking about terabytes of data, right? So if you are happy with, say, service A from one cloud provider, say Google, but you want to take advantage of another service from Azure, that comes with a very high egress cost, migration risk, and the time it will take to move the data itself. That's not good. As a customer, you should be able to leverage best-of-breed cloud services for AI, and for that matter for anything, across the board. How we help customers is this: you can have all of your data with a managed cloud service provider, running on PowerScale, and then connect from that managed cloud service provider directly to any of the hyperscalers. You can connect to AWS, Azure and Google Cloud, and with the in-place analytics that PowerScale offers, you can run those clouds' AI services directly on that data, simultaneously, from all three. And I'll add one more thing: these deep learning technologies need GPU-powered servers, and even a single cloud is not a homogeneous environment; sometimes you'll find US East has the GPU-powered servers but you are in the West, and the same goes for other providers. Now, with Dell Technologies Cloud PowerScale for Multi-cloud, the PowerScale sits outside those hyperscalers, connected directly to any of them, and then you can burst into different clouds, take advantage of spot instances, and leverage all the GPUs, not from one particular service provider but across all of those hyperscalers. Those are some examples of the work we're doing in the multi-cloud world for AI.
>> So that's PowerScale for Multi-cloud, for data that's created outside the public cloud. But Thomas, what about data that's created inside the cloud? How does Dell help with that?
>> Yes. This year we actually released a solution in conjunction with GCP, so within Google Cloud you can have PowerScale with OneFS natively. It goes through all the compliance and all the features of being part of GCP natively, so it counts towards your credits and your Google billing as well, but it still has all the features you're used to. And we've been running some benchmarks, so we've got a couple of white papers out there that detail what we can do from an artificial intelligence perspective. Back to Chhandomay's example we were just talking about, being able to use more and more GPUs, we've done that to run some of our AI benchmarks against it, and we've also jumped into the Hadoop space, because that's one area where, from a PowerScale perspective, customers were really interested.
And they have been for years. And really, the awesome part about this is for customers that are looking for a hybrid solution, or maybe it's their first kickoff into it. So, Lisa, back to those compliance features we were talking about: those are still inherent within that native Google Cloud OneFS version, but for customers that have it on-prem, you can also use those same features to burst your data into your Isilon cluster, using all the same native tools you've been using for years within your enterprise.
>> Got it; so that's PowerScale for Google Cloud. Chhandomay, back to you to kind of wrap things up here: what are some of the things we're going to see next from Dell from an AI solutions perspective?
>> Yes. We are working on many different interesting projects, ranging to the latest NVIDIA servers they have announced, the DGX A100. In fact, two weeks ago at GTC, NVIDIA made announcements around the A100 servers, and we are part of that ecosystem. And we are working with the leading solutions to benchmark our AI environments for all the storage, ensuring we are providing all the throughput and scalability we have to offer.
>> Thomas, finishing with you from the customer perspective: as we've talked about, so much has changed this year alone. As we approach calendar year 2021, what are some of the things Dell is doing with its customers and with its partners, the hyperscalers and NVIDIA, for example, so that customers are really going to be able to truly accelerate successful AI projects?
>> Yes. The first thing I'd like to talk about is what we're doing with the DGX A100. This month at GTC you saw our solution, a reference architecture for the DGX A100 plus PowerScale. You talk about speed and how we can move customers to insights; some of the numbers we're seeing off of that are really amazing. This gives customers the ability to keep all the features and use Isilon and OneFS like they have in the past, but now, combined with the speed of the A100, speed up how fast they're building out those deep learning models. And then secondly, it gives them the ability to scale, too. There are features inherent in this reference architecture that let you bring more data scientists and more modelers onto the GPUs, because that's one thing you don't see: data scientists turning GPUs away. It's always, hey, this project here needs a GPU. So from a PowerScale and OneFS perspective, we want to make sure we're supporting that, so that as the data continues to grow, which we're seeing is one of the biggest factors whenever we're talking about artificial intelligence, the scale of the data, they can continue to build out that data consolidation area for all these different workloads that are coming in.
>> Excellent, Thomas, thanks for sharing that. Hopefully next time we get to see you both in person and we can talk about a customer who has done something very successful with you. Chhandomay, always great to talk to you. Thank you both for joining us.
>> Thank you.
>> Thank you.
>> For Chhandomay Mandal and Thomas Henson, I'm Lisa Martin. You're watching theCUBE's coverage of Dell Technologies World 2020.
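To make the closing reference-architecture discussion a little more concrete: the pattern Thomas describes is GPU servers training against data that stays in place on a shared, scale-out file system. Below is a minimal, hypothetical PyTorch sketch of that pattern. The mount point, file layout and tiny model are invented for illustration; this is not the published DGX A100 plus PowerScale reference architecture, just the shape of a training job that reads from a shared NFS mount.

```python
# Hypothetical sketch: a training job that streams samples straight from a
# shared NFS mount (e.g. a scale-out NAS export mounted at /mnt/datalake),
# so the dataset stays in place as it grows. Paths and model are made up.
from pathlib import Path

import torch
from torch import nn
from torch.utils.data import DataLoader, Dataset


class SharedStoreDataset(Dataset):
    """Reads one tensor per .pt file under the shared mount."""

    def __init__(self, root="/mnt/datalake/train"):  # assumed mount point
        self.files = sorted(Path(root).glob("*.pt"))

    def __len__(self):
        return len(self.files)

    def __getitem__(self, idx):
        sample = torch.load(self.files[idx])  # assumed dict with "x" and "y"
        return sample["x"], sample["y"]


def train_one_epoch():
    device = "cuda" if torch.cuda.is_available() else "cpu"  # use GPU if present
    loader = DataLoader(SharedStoreDataset(), batch_size=64,
                        shuffle=True, num_workers=4)  # parallel reads from the share
    model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    for x, y in loader:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()


if __name__ == "__main__":
    train_one_epoch()
```

The design point is that adding more data, or another data scientist, just means pointing another job at the same mount rather than copying data onto local drives.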

Published Date : Oct 21 2020

David Nguyen & Chhandomay Mandal, Dell Technologies | VMworld 2019


 

>> Live from San Francisco, celebrating 10 years of high-tech coverage, it's theCUBE, covering VMworld 2019, brought to you by VMware and its ecosystem partners.
>> Welcome back. We're here at Moscone North for VMworld 2019, the 10th year of theCUBE covering VMworld. I'm Stu Miniman and my co-host is John Troyer, and welcome to the program to two guests from Dell Technologies. Sitting to my right is Chhandomay Mandal, who's the Director of Storage Solutions, and sitting to his right is David Nguyen, Senior Director of Server Product Planning and Management, also with Dell. Gentlemen, thanks so much for joining us. All right, so we've got server and storage here to talk about something we've been discussing for a while; on the server side it's been delivered for a bit, and on the storage side it's now rolling out. Everybody's favorite topic: Non-Volatile Memory Express, or NVMe as it rolls off the tongue, storage class memory, or SCM, and lots of other things down there, really a big transformational wave that changes how our applications interact with the infrastructure. Chhandomay, bring us up to date on the latest.
>> Sure, and let's start where you ended. We're seeing an explosion of applications. In fact, in this morning's keynote, Pat Gelsinger had a stat: there are 352 million enterprise applications today, and there will be 792 million in three years. Now, as applications are growing exponentially, we cannot keep growing the infrastructure at that rate, so NVMe is the way we can consolidate a lot of the infrastructure. If we think about end-to-end NVMe, starting from the server, with NVMe over Fabrics through to the storage array, down to the back end with NVMe SSDs, this can put together a great platform where you can consolidate a lot of the applications while delivering the high performance and low latency they need and meeting your service level objectives. We can go over a little bit of the detail, but I think it all starts from NVMe over Fabrics coming from the server to the storage array, so that's probably the first piece we need to consider.
>> David, I love this discussion when we get to talk at the application layer, because flash changed the market a lot: much better on energy, and much faster at everything. But this inflection point we're talking about for application modernization, NVMe is one of those enablers there, and something I know your team has been working on for a while.
>> Yes, actually, on the PowerEdge side we've been embracing the benefits of NVMe for many years now. We started out by introducing NVMe in our 12th-generation servers with front-loaded, hot-serviceable drives, and then of course we branched out from there, and today a lot of the servers from the PowerEdge family all support NVMe devices. The benefit there is really giving customers choices in terms of what kind of storage tiering they want for the application's needs. Now, one of the things that's great about NVMe over Fabrics is that it's more than just the flash storage itself.
It's about enabling the standard across the host, across the data fabric, down to the storage, really to deliver the overall performance that the applications need. And by improving IOPS and lowering latency overall, from a server perspective this just means we're releasing more CPU cycles back to the applications so they can run different types of workloads. For us this is a great story, with PowerEdge and PowerMax coming together to enable NVMe over Fabrics.
>> You know, I'm kind of slow about some of these things, but if you squint at the history, we went from the PC revolution, and then we had SANs and arrays and centralized shared storage; the last couple of years there's been a lot of interest in scale-out and hyperconverged, and you had a lot of pizza boxes with the storage right there. If I'm following the thread, we can now have a rack with, again, a fabric, and we can focus our NVMe-over-Fabrics-driven solid-state storage somewhere below the servers that are handling compute somewhere else. Is that the future we're headed towards now?
>> Yes. Everything has its place, but to give you the perspective: it's not just about coming down to the storage array, but how this is enabling the storage of the future as well, and storage class memory is the perfect example. As David said, let's take PowerMax as an example. PowerMax is end-to-end NVMe ready: you get NVMe over Fabrics at the front end, and then we have NVMe SSDs at the back end. Now, NVMe is also enabling technologies like storage class memory, which brings very high performance and very low latency; latency goes down to the order of tens of microseconds. That is as close as you can get to DRAM with persistent storage. However, you need a balance, because these drives are an order of magnitude costlier. So in PowerMax, what we're doing is, first, NVMe done right; what I mean by that is a multi-controller architecture that can actually deliver that level of parallel processing and concurrency. Then we have both SCM, the storage class memory, and NVMe SSDs, and we do intelligent tiering based on the built-in machine learning engine, which looks at 40 million data sets in real time to decide which workloads should go on the SCM drives and which should go on the NVMe SSDs. On top of that, you add quality of service, so the platform gives you service level objectives: you can choose from Diamond, Platinum, Gold, Silver or Bronze, and you can consolidate a lot of those 352 million different types of applications on this array, guaranteeing you are going to meet all of your SLOs no matter what types of applications were consolidated onto it.
>> Okay, I wonder if you could both bring us into what this means for VMware customers, and break it into two pieces. One is the traditional virtualized shop; and secondly, a lot of time in this morning's keynote was spent talking about the cloud-native, containerized type of environment. Will there be any difference from both of your worlds?
>> Yeah, absolutely, I'm glad you brought that up, because from our perspective that's what we've seen with the enablement of NVMe platforms. John, you brought up a very interesting point: it seems like in the past couple of years we went from moving storage onto the host, and now with NVMe over Fabrics we're actually taking the storage away from the host again. And that's exactly true, because the first point you brought up, Stu, is about how flash enabled different applications to run better on the host. We still see that, and with NVMe we see the lower response times enabling our customers to run more jobs and more VMs per server. That's one aspect of it, and we've seen it benefit a lot of our platforms today across various applications and solutions. You talk about VxRail, that's a vSAN story for Dell; you talk about vSAN Ready Nodes for customers who want to build it themselves. These platforms are enabled with NVMe backplanes, and the storage allows them to use NVMe, SAS or SATA, whatever they want. But the point here is that when they're using NVMe flash, and I'll talk a little bit about PowerEdge with an all-NVMe flash backplane, in a case study that we did with vSAN running an OLTP type of workload, we saw the response time with NVMe improve by 56% over traditional SAS from our competitors, which means that for the same solution build-out we were able to add 44% more VMs on the platform. At the same time, we increased the overall orders per minute by roughly 600,000 for that type of benchmark over our nearest competitor. So that right there is the benefit we see for virtualization from a VMware perspective.
>> I'll add to that from the storage perspective in two ways. In fact, at last VMworld in EMEA we demonstrated end-to-end NVMe over Fibre Channel, with a special build of vSphere supporting NVMe over Fabrics, and storage class memory alongside NVMe drives. What it gives you in a regular vSphere-based environment is the ability to move your VMs around, so the applications where the highest performance and lowest latency are critical can sit on those special service levels and special datastores. In fact, that demonstration had SCM datastores and NVMe SSD datastores in the same fabric with PowerMax, and we moved things around, whether over regular Fibre Channel or NVMe over Fibre Channel. And the other part I want to add: in the morning keynote we saw the announcement that Kubernetes is built in, or will be built in, to the ESXi platform, and ESXi is bread and butter for all the storage customers we have. With those capabilities built into the vSphere platform, think about how many applications, how many virtualized workloads you can run, whether on-premises or on VMware Cloud on AWS, all of that consolidation, as well as the performance needs, while reducing your footprint. Those are the benefits the VMware shops and the VM admins are going to see from the storage side.
>> Again, I'm not following the parts that closely, but we're not talking about a couple of megabytes here anymore, right? What size of parts are shipping these days?
>> So, from our perspective, up to about seven terabytes, actually.
Seven-terabyte drives are available on the market today for NVMe. Now, whether customers buy those drives depends on economic factors, but yes, it's something that's available from Dell.
>> And I'll add to what David said: on our side, the SCM drives are 750 gigabytes to 1.5 terabytes, dual-ported Optane drives, that will be available in PowerMax, as well as 15-terabyte NVMe SSDs. So that's the capacity we're talking about, and again, the latency at the application level, from the storage, is going to be less than 300 microseconds. That's the power we're bringing to the market with this technology.
>> Give us a little look forward. As we talked about, NVMe has been shipping for a bit on the servers and is now really rolling out on the storage side, and I see a lot of startups in the space; one recent acquisition got people talking. What should we be looking for from both of you over the next 6 to 12 months?
>> Over the next 6 to 12 months you will see a lot of innovation, in this case from the storage side. Today one single array supports, say, 10 million IOPS at less than 500 microseconds of latency. I cannot give you the exact details, but within a short time those numbers are going to go up by more than 50%, latency is going to come down, and the throughput we can drive will actually more than double. So you'll see a lot of innovation and evolution in terms of drive capacities, both from the SCM drives and the NVMe SSDs; those will continue to expand, leading to faster performance and better consolidation for all the workloads.
>> Yeah, from our perspective, data growth is going to continue; we all know that. For us, it's about designing systems based on what the customers need and what the applications need, and that's why we have different types of storage available today. While we're doing a lot from a direct-attached storage perspective, customers continue to have a need for shared storage, and NVMe over Fabrics just provides a better end-to-end story for us, really from a PowerEdge and PowerMax perspective. But in the future, you asked what we're going to do: we see the need to probably decouple storage class memory from the host again. And really, what's preventing us from doing that today? It's having the right fabric in place to deliver the performance level that applications need, whether that's NVMe over Fabrics on Fibre Channel, Ethernet, iSCSI, or, I'm sorry, InfiniBand, whatever it may be. Those are some of the things we're looking forward to in the future to make that leap.
>> All right, well, it's really been great to see technology that I know the people who build your products have been excited about for many years now rolling out into real-world deployments for customers that will transform what they're doing. For John Troyer, I'm Stu Miniman; we'll be back with lots more coverage here from VMworld 2019. Thanks for watching theCUBE.
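As a small aside for readers who want to see what NVMe devices a Linux host actually exposes, here is a hypothetical Python sketch that reads the standard sysfs entries, roughly the same information the nvme-cli tooling reports. The sysfs layout shown is typical of modern kernels but can vary, so treat it as illustrative rather than definitive.

```python
# Hypothetical sketch: list NVMe controllers and namespaces visible to a
# Linux host by reading sysfs. Assumes a reasonably modern kernel; no
# third-party packages required.
from pathlib import Path


def read(path: Path) -> str:
    try:
        return path.read_text().strip()
    except OSError:
        return "unknown"


def list_nvme() -> None:
    base = Path("/sys/class/nvme")
    if not base.is_dir():
        print("no NVMe controllers found (or not a Linux host)")
        return
    for ctrl in sorted(base.glob("nvme*")):
        model = read(ctrl / "model")
        firmware = read(ctrl / "firmware_rev")
        # Namespaces appear as block devices named e.g. nvme0n1
        for ns in sorted(Path("/sys/block").glob(f"{ctrl.name}n*")):
            sectors = read(ns / "size")  # reported in 512-byte sectors
            size_gb = int(sectors) * 512 / 1e9 if sectors.isdigit() else 0
            print(f"{ns.name}: {model} (fw {firmware}), {size_gb:.0f} GB")


if __name__ == "__main__":
    list_nvme()
```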

Published Date : Aug 26 2019
