09_19_18 Peter & Dave DDN Signal Event


 

>> Dave Vellante, welcome to theCUBE!
>> Thank you, Peter. Good to see you.
>> Good to see you too. So, Dave, a lot of conversation about AI. What is it about today that is making AI so important in so many businesses?
>> Well, I think there's three things, Peter. The first is the data. We've been on this decade-long Hadoop bandwagon, and what that did is it really focused organizations on putting data at the center of their business, and now they're trying to figure out, okay, how do we get more value out of that? The second piece is that the technology is now becoming available. AI, of course, has been around forever, but the infrastructure to support it, the GPUs, the processing power, flash storage, deep learning frameworks like TensorFlow and Caffe, has started to come to the marketplace, so the technology is now available to act on that data. And I think the third is, people are trying to get digital right. This is about digital transformation. Digital means data, we talk about that all the time, and every corner office is trying to figure out what its digital strategy should be. They're trying to remain competitive, and they see automation and artificial intelligence, machine intelligence applied to that data, as a linchpin of their competitiveness.
>> So, a lot of people talk about the notion of data as a source of value, and there's been some presumption that it's all going to the cloud. Is that accurate?
>> (laughs) Funny you say that, because, as you know, we've done a lot of work on this, and I think the thing organizations have realized in the last 10 years is that the idea of bringing five megabytes of compute to a petabyte of data is far more viable. As a result, the pendulum is swinging in many different directions, one being the edge, where data is going to stay, and certainly the cloud is a major force. But most of the data still lives on premises today, and that's where most of it is likely going to stay. So, no, all the data is not going to go into the cloud.
>> At least not the central cloud.
>> That's right, the central public cloud. You can maybe redefine the boundaries of the cloud. I think the key is, you want to bring that cloud-like experience to the data. We've talked about that a lot in the Wikibon and CUBE communities, and that's all about simplification and cloud business models.
>> So that suggests pretty strongly that there is going to continue to be a relationship between choices about hardware infrastructure on premises and the success at making some of these advanced, complex workloads run, and scream, and really drive some of those innovative business capabilities. As you think about that, what is it about AI technologies, or AI algorithms and applications, that has an impact on storage decisions?
>> Well, the characteristics of the workloads are going to be, oftentimes, largely unstructured data. There are going to be small files, there are going to be a lot of those small files, and they're going to be kind of randomly distributed, and as a result that's going to change the way people design systems to accommodate those workloads. There's going to be a lot more bandwidth and a lot more parallelism in those systems in order to accommodate and keep those CPUs busy. You know, we're going to talk more about that, but the workload characteristics are changing, so the fundamental infrastructure has to change as well.
>> And so our goal, ultimately, is to ensure that we can keep these new, high-performing GPUs saturated by flowing data to them without a lot of spiky performance throughout the entire subsystem. Have I got that right?
>> Yeah, I think that's right. That's what I was talking about with parallelism, that's what you want to do: you want to be able to load up that processor, especially these alternative processors like GPUs, and make sure they stay busy. You know, the other thing is, when there's a problem, you don't want to have to restart the job. So you want to have real-time error recovery, if you will. That's been crucial in the high-performance world for a long, long time, because these jobs, as you know, take a long, long time, so to the extent that you don't have to restart a job from ground zero, you can save a lot of money.
>> Yeah, and especially, as you said, as we start to integrate some of these AI applications with some of the operational implications, they're actually recording the results of the work that's being performed, or the prediction that's being made, or the recommendation that's being proffered. So I think, ultimately, if we start thinking about this crucial role that AI workloads are going to have in business, and that storage is going to have on AI, moving more processing close to the data, et cetera, that suggests there are going to be some changes in the offing for the storage industry. What are you thinking about how the storage industry is going to evolve over time?
>> Well, there's certainly a lot of hardware stuff going on. We always talk about software-defined, but hardware still matters, right? So obviously flash storage changed the game from spinning mechanical disk, and that's part of this. You're also, as I said before, seeing a lot more parallelism, and high bandwidth is critical. A lot of the discussion we're having in our community is about the affinity between HPC, high-performance computing, and big data, and I think that was pretty clear, and now that's evolving to AI. So the internal network, things like InfiniBand, are pretty important, and NVMe is coming onto the scene. Those are some of the things that we see. I think the other one is file systems. NFS tends to deal really well with unstructured data and data that is sequential. When you have all this--
>> Streaming, for example.
>> Exactly, and when you have all this, what we just described, this sort of random nature, and you have the need for parallelism, you really need to rethink file systems. File systems are, again, a linchpin of getting the most out of these AI workloads. And I think the other thing is, we talked about the cloud model, you've got to make this stuff simple. If we're going to bring AI and machine intelligence workloads to the enterprise, it's got to be manageable by enterprise admins. You're not going to be able to have a scientist deploy this stuff, so it's got to be simpler, cloud-like.
>> Fantastic. Dave Vellante, Wikibon, thanks very much for being on theCUBE.
>> My pleasure.
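
The exchange above turns on two practical points: keeping a GPU saturated when the data arrives as many small, randomly distributed files, and checkpointing so a long job never has to restart from ground zero. Below is a minimal sketch of both ideas, assuming PyTorch purely for illustration; the directory layout, model, and checkpoint path are hypothetical and not anything the speakers or DDN prescribe.

```python
# Sketch: parallel data loading over many small files plus periodic checkpointing.
import glob
import os

import torch
from torch import nn, optim
from torch.utils.data import Dataset, DataLoader


class SmallFileDataset(Dataset):
    """Treats a directory full of small per-sample files as one dataset (hypothetical layout)."""

    def __init__(self, root):
        self.paths = sorted(glob.glob(os.path.join(root, "*.pt")))

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        sample = torch.load(self.paths[idx])   # one small file per sample, assumed {"x": ..., "y": ...}
        return sample["x"], sample["y"]


def train(root="data/", ckpt="checkpoint.pt", epochs=10):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = nn.Linear(128, 10).to(device)      # stand-in model for illustration
    opt = optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # Several worker processes read the small files in parallel so the GPU is not
    # left waiting on storage; pinned memory speeds host-to-device copies.
    loader = DataLoader(SmallFileDataset(root), batch_size=256,
                        shuffle=True, num_workers=8, pin_memory=True)

    start_epoch = 0
    if os.path.exists(ckpt):                   # resume from the last checkpoint instead of restarting
        state = torch.load(ckpt, map_location=device)
        model.load_state_dict(state["model"])
        opt.load_state_dict(state["opt"])
        start_epoch = state["epoch"] + 1

    for epoch in range(start_epoch, epochs):
        for x, y in loader:
            x = x.to(device, non_blocking=True)
            y = y.to(device, non_blocking=True)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
        # Periodic checkpoint: a failure now costs at most one epoch of work.
        torch.save({"model": model.state_dict(),
                    "opt": opt.state_dict(),
                    "epoch": epoch}, ckpt)


if __name__ == "__main__":
    train()
```

In practice the batch size and worker count would be tuned against the storage system's bandwidth and the accelerator's appetite, which is exactly the balance the conversation describes.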

Published Date: Sep 28, 2018

SUMMARY:

Peter and Dave Vellante of Wikibon discuss why AI has become so important to businesses: a decade of putting data at the center of the organization, newly available infrastructure such as GPUs, flash storage, and deep learning frameworks like TensorFlow and Caffe, and the push to get digital transformation right. Dave argues that most data will stay on premises and at the edge rather than moving wholesale to the central public cloud, and that AI workloads, with their many small, randomly distributed files, demand more bandwidth and parallelism, rethought file systems, technologies like InfiniBand and NVMe, real-time error recovery, and cloud-like simplicity that enterprise admins can manage.

SENTIMENT ANALYSIS:

ENTITIES

Entity | Category | Confidence
Dave Vellante | PERSON | 0.99+
Peter | PERSON | 0.99+
five megabytes | QUANTITY | 0.99+
Dave | PERSON | 0.99+
third | QUANTITY | 0.99+
first | QUANTITY | 0.99+
second piece | QUANTITY | 0.99+
three things | QUANTITY | 0.99+
today | DATE | 0.96+
09_19_18 | DATE | 0.95+
CUBE | ORGANIZATION | 0.9+
Wikibon | ORGANIZATION | 0.78+
last 10 years | DATE | 0.77+
InfiniBand | ORGANIZATION | 0.75+
petabyte | QUANTITY | 0.67+
TensorFlow | TITLE | 0.64+
Wikibon | PERSON | 0.63+
ground zero | QUANTITY | 0.6+
one | QUANTITY | 0.59+
Signal Event | EVENT | 0.56+
Caffe | ORGANIZATION | 0.54+
DDN | ORGANIZATION | 0.41+