
Search Results for James Kobielus:

Jim HPE DCE 3 Segment


 

Thanks, Peter. I'm here with James Kobielus, Wikibon's lead analyst for data science. Jim, we've been hearing a lot about data science and how machine learning is coming into this environment. Give us a little guidance: how does this whole space fit into data science, and how does the infrastructure fit in with data science today?

Well, Stu, data science is a set of practices for building and training statistical models, often known as machine learning models, to be deployed into applications to do things like predictive analysis, automating next-best offers in marketing, and so forth. Machine learning is all about the statistical model, and those models are built by a category of professionals known as data scientists. But data scientists operate in teams: there are data engineers who manage your data lake, there are data modelers who build the models themselves, and there are professionals who specialize in training and deploying the models; training is like quality assurance. These functions are increasingly being combined into workflows, and they have to conform with DevOps practices, because this is an important set of application development capabilities that is absolutely essential for deploying machine learning into AI, and AI is really the secret sauce of so many apps nowadays.

All right, Jim, as we look at data science ops, walk us through the tech, the process, and the people.

OK. Data science ops is what we at Wikibon have often referred to as DevOps for data science, and we really start with the people; I've already begun to sketch them out. In terms of the professionals involved in building, training, deploying, evaluating, and iterating machine learning models, there are the data scientists, the statistical modelers. You might call them the algorithm jockeys, though that may be regarded as a pejorative; nonetheless, these are the high-powered professionals who know which algorithm is correct for which challenge, and who build the models. There are the data engineers, who not only manage your data lakes, where the data for building and training the models is maintained, but also handle data preparation, transformation, and cleansing, getting the data clean and correct so it can be used to build high-quality models. There are other functions that are absolutely essential. There are what some call ML architects, machine learning architects; I like to think of them as subject-matter experts who work with the data scientists to build what are called the feature sets, the predictors that need to be built into machine learning models for those models to perform their function correctly, whether that's a prediction, face recognition, or natural language processing for your chatbots. You need the subject-matter experts to guide the data scientists on which variables to build into these models. There are also coders; a lot of coding is done in data science and ML ops, in Python, Java, Scala, and a variety of other languages. There are other functions as well, but these are the core functions that need to be performed in a team environment, really in a workflow, and that is where the process comes in. The workflow for data science in teams is DevOps: the continuous integration of different data sets, different models, and different features into the building and training of AI. These functions need to be performed in a workflow that's highly structured, with checkpoints, governance, transparency, and auditability. All of this needs to happen in a DevOps environment where you have the data lake, which is the source of the data, and of course a source repository for managing the current and past versions of the models themselves, where you also do governance on the code builds that go with each of the models deployed into your application environment. That's the process side. The platform, or tech, side revolves around what some call a data science workbench or data science platform; there are a variety of terms for it, but essentially it is a development environment that enables a high degree of automation across all these functions, because automation is absolutely essential for speed and consistency in how models are built and trained. There's also a need for strong collaboration capabilities within these platforms, so the different human roles can work together cohesively, like a well-oiled machine. And there's a need for repositories, like I said, to manage and govern the current versions of all the artifacts, be they data, models, or code builds. All of these people, processes, and technologies go into building high-quality AI.

Yeah, so Jim, I noticed you call it DevOps for data science, so there's a real emphasis on getting all of these new things aligned with the DevOps process. Maybe help us put a point on why that's so important.

Because DevOps is how applications are built and deployed now, everywhere. It's a standard workflow in a scalable organization where code is built, managed, and governed through standard repositories, with checkpoints and transparency, as a way of continually ensuring that high-quality code is deployed into working applications through factory-style automation, an industrialized workflow. Data science is a development discipline, and as a workflow it needs to conform with the established DevOps practices that your application developers, your coders, have already put in place. In fact, most AI and machine learning applications involve code and machine learning models, but they also involve containers and Kubernetes and, increasingly, serverless interfaces and so forth. So data science is not separate from the other aspects of the DevOps workflow; it needs to be a unified and integrated piece of your operations, and it needs to be managed as such.

All right. Well, Jim, I appreciate you walking through the evolution on that. I know you've written quite a bit about this topic on the Wikibon website. And Peter, we'll send it back to you.
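To make the "DevOps for data science" workflow Jim describes a bit more concrete, here is a minimal sketch of a single checkpointed training step: train a model, gate it on a quality threshold, and record a versioned artifact plus metadata that a repository could govern. Everything here is a hypothetical illustration; the registry path, dataset, algorithm, and accuracy gate are stand-ins, not anything prescribed in the conversation.

```python
"""Minimal sketch: train, gate on a QA checkpoint, and version the artifact."""
import json
import time
from pathlib import Path

import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

REGISTRY = Path("model_registry")   # stand-in for a governed model repository
ACCURACY_GATE = 0.90                # hypothetical quality-assurance checkpoint


def train_and_register() -> Path:
    # In practice the data engineers would supply curated training data from
    # the data lake; a bundled scikit-learn dataset stands in for that here.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))
    if accuracy < ACCURACY_GATE:
        # The "training as quality assurance" checkpoint: fail this run.
        raise RuntimeError(f"Model failed QA gate: accuracy={accuracy:.3f}")

    # Version the artifact and its metadata so the build is auditable.
    version = time.strftime("%Y%m%d-%H%M%S")
    run_dir = REGISTRY / version
    run_dir.mkdir(parents=True, exist_ok=True)
    joblib.dump(model, run_dir / "model.joblib")
    (run_dir / "metadata.json").write_text(json.dumps({
        "version": version,
        "algorithm": "RandomForestClassifier",
        "accuracy": round(accuracy, 4),
    }, indent=2))
    return run_dir


if __name__ == "__main__":
    print(f"Registered model at {train_and_register()}")
```

In a team setting, a step like this would typically run inside the CI pipeline, with the registry backed by shared, governed storage rather than a local directory.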

Published Date : Sep 6 2019

**Summary and Sentiment Analysis are not shown because of an improper transcript**

ENTITIES

Entity | Category | Confidence
Jim | PERSON | 0.99+
Peter | PERSON | 0.99+
Java | TITLE | 0.99+
Python | TITLE | 0.99+
Scala | TITLE | 0.99+
DevOps | TITLE | 0.99+
Texas | LOCATION | 0.98+
today | DATE | 0.96+
James kabila Sawicki | PERSON | 0.96+
each | QUANTITY | 0.91+
christie | PERSON | 0.91+
apps | QUANTITY | 0.66+
wiki bond | ORGANIZATION | 0.66+
Ops | ORGANIZATION | 0.55+

Jim Kobelius HPE2 1


 

From our studios in the heart of Silicon Valley, Palo Alto, California, this is a CUBE Conversation. Hi, I'm Peter Burris, and welcome back to another CUBE Conversation. We're talking today about some of the new challenges that enterprises face as they try to get more out of their data. Specifically, we've got ten-plus years of building advanced analytic pipelines through things like data warehousing and, more recently, big data and data lakes, and we've got new storage technologies, flash and others, that have been essential to improving the productivity of a lot of those systems. But as we establish that baseline, enterprises are finding new and more complex ways to weave together the toolchains that are essential to deploying and sustaining these very complex, very rich, strategic AI and related analytic applications. How will the relationship of storage, AI, and analytics co-evolve? To have that conversation I'm joined by my colleague James Kobielus of Wikibon. Jim, welcome to theCUBE.

Thank you, Peter.

So let's start with the problem. I laid it out generally, but let's start with this observation: am I right that there is a coevolution taking place between the applications people want, the storage technologies they require, how toolchains are going to weave this all together, and the new demands that's going to place on some of these storage technologies?

Yes, there is indeed a coevolution in what's often called the AI or data science pipeline, which is simply an industrialized process whereby data is turned into machine learning models, which are turned into inferences, which drive amazing results in real-world applications. To make this pipeline work you need specialists, people like data scientists and data engineers; you need specialized processes and workflows, sometimes called DevOps or continuous integration and continuous deployment of models and apps; and storage, compute, and networking resources are fundamentally important at every stage of the process. Data lakes are fundamental in this pipeline, because the data lake is where the data ingested from various sources is stored, and that data is then used to build the models, machine learning being the heart of AI. Storage resources are used to train the models, to make sure that they fit, that they're highly predictive for whatever the task is, such as recognizing a cat in a picture. So storage is fundamentally important as a resource enabling high-volume, highly parallel ingest, transformation, cleansing, stewardship, modeling, and training throughout the pipeline, but it becomes most important in the process of preparing and training the models. Storage gives way to high-volume, highly parallel compute resources as you go further toward inferencing, which increasingly is real time.

And when you talk about inferencing, that's where it gets closer to the need for things like flash and other in-memory technologies to take the next step in the pipeline. So the storage required to construct the models from large volumes of often small files is likely to be different from the storage associated with very high-performance, automated inferencing. Have I got that right?

Right. You have bulk storage, petabytes and beyond of data, in the back-end process of data engineering, and then you have real-time, in-memory data persistence increasingly at the forefront of serving and inferencing the models that are created. So yes, different types of storage are needed at different points in the pipeline to make it work as an efficient end-to-end process.

But let's broaden the conversation, because if we go up the chain from the devices, we're also now talking about new classes of data services that are going to be required to support these very complex applications and the rich toolchains necessary to drive them: encryption, new types of security, new types of backup and restore, new types of data protection. Give us a sense, as these applications get more deeply embedded within the business and become an essential feature of the revenue-producing side, not just the cost side, of how that's going to change the need for rethinking data services.

Yeah, well, what happens now with AI and machine learning is that it's not just the underlying data that becomes important but the derived assets, the machine learning models that you build from the data. All of that needs to be stored, persisted, and governed end to end throughout the lifecycle of a given application development effort. So you need repositories that can manage all of those assets as a collection to drive the whole DevOps process, to enable check-in, check-out, rollback, and transparency for the end-to-end process by which a machine-learning-driven inference is generated. The storage resources need to become increasingly oriented toward object storage, because you have complex objects, but you also need end-to-end stream computing to drive a real-time workflow; it needs to be highly parallel, and you need to be able to manage multiple streams in parallel from a common set of data. So data governance needs to be baked into this whole end-to-end data persistence and storage architecture. Increasingly, storage management resources have to evolve and take on attributes of what we used to call data management: they have to know a little more about the applications and the quality of the data, and be able to discern patterns in that data, so you can both protect the data where it is and assure its availability where it needs to be, and sustain security across the entire set of pipelines and execution resources.

Have I got that right, Jim?

That's right, and that demands a degree of end-to-end auditing, and logging of the audit trails, for all these processes and for the state of every resource involved in the construction and serving of the machine learning model.

In other words, you need not just data lakes to store the data; you also increasingly need vast, petabyte-scale logs for end-to-end transparency of this process.

And the transparency quite often will be for legal and compliance reasons as well. We have mandates like GDPR that demand a high degree of transparency into the data, and into how the data-derived assets are built and trained throughout the lifecycle of a given application, because there are real consequences if the machine learning model fails or makes the wrong decision; that might be legally actionable. So the storage architecture also has to support highly queryable archives, to be able to manage, search, and roll up a complete audit trail of why a given AI model had a given result in the field.

Jim, this has been a great conversation. Thanks a lot for talking about this crucial relationship between analytics and AI-driven applications and storage, and crucially the evolving role and impacts on storage of the very rich data pipelines and toolchains that are going to make all of that possible. Once again, I'm Peter Burris. You've been listening to another CUBE Conversation. Thanks very much.
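One way to picture the governance pattern Jim describes is to persist the model artifact together with an audit and lineage record in object storage, so every deployed model can be traced back to the data and code that produced it. The sketch below is a hypothetical illustration: the bucket name, key layout, and metadata fields are assumptions, and boto3/S3 stands in for whichever object store is actually in place.

```python
"""Minimal sketch: archive a trained model plus an audit/lineage record."""
import io
import json
import time

import boto3
import joblib

BUCKET = "ml-model-archive"        # hypothetical bucket name
s3 = boto3.client("s3")


def archive_model(model, training_data_uri: str, code_commit: str,
                  metrics: dict) -> str:
    """Store the serialized model and an audit record; return the version key."""
    version = time.strftime("%Y%m%d-%H%M%S")

    # Serialize the model into memory and push it to object storage.
    buf = io.BytesIO()
    joblib.dump(model, buf)
    s3.put_object(Bucket=BUCKET,
                  Key=f"models/{version}/model.joblib",
                  Body=buf.getvalue())

    # The audit record ties the artifact to its inputs so a later query can
    # answer "which data and code produced the model behind this decision?"
    audit_record = {
        "model_version": version,
        "training_data_uri": training_data_uri,   # e.g. a data-lake path
        "code_commit": code_commit,               # e.g. a git commit SHA
        "metrics": metrics,
    }
    s3.put_object(Bucket=BUCKET,
                  Key=f"models/{version}/audit.json",
                  Body=json.dumps(audit_record, indent=2).encode("utf-8"))
    return version
```

In a production setting these audit records would typically also be indexed into a searchable log store so the end-to-end trail remains queryable at scale, as discussed above.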

Published Date : May 1 2019

**Summary and Sentiment Analysis are not shown because of an improper transcript**

ENTITIES

Entity | Category | Confidence
Peter Burris | PERSON | 0.99+
IBM | ORGANIZATION | 0.99+
Keith | PERSON | 0.99+
John | PERSON | 0.99+
2018 | DATE | 0.99+
Dave Vellante | PERSON | 0.99+
AWS | ORGANIZATION | 0.99+
November 1st | DATE | 0.99+
SUSE | ORGANIZATION | 0.99+
Peter Burris | PERSON | 0.99+
Rancher | ORGANIZATION | 0.99+
Amazon | ORGANIZATION | 0.99+
2020 | DATE | 0.99+
Rancher Government Services | ORGANIZATION | 0.99+
2021 | DATE | 0.99+
DOD | ORGANIZATION | 0.99+
John Furrier | PERSON | 0.99+
James Kabila | PERSON | 0.99+
Keith Basil | PERSON | 0.99+
Hypergiant | ORGANIZATION | 0.99+
Vegas | LOCATION | 0.99+
SUSE RGS | ORGANIZATION | 0.99+
Home Depot | ORGANIZATION | 0.99+
Microsoft | ORGANIZATION | 0.99+
2022 | DATE | 0.99+
Brandon Gulla | PERSON | 0.99+
HPE | ORGANIZATION | 0.99+
Teleco | ORGANIZATION | 0.99+
10 plus years | QUANTITY | 0.99+
Red Hat | ORGANIZATION | 0.99+
Last week | DATE | 0.99+
Jim Kobelius | PERSON | 0.99+
Peter | PERSON | 0.99+
Kate | PERSON | 0.99+
Silicon Valley | LOCATION | 0.99+
HPE Discover | ORGANIZATION | 0.99+
Edrick | PERSON | 0.99+
seven people | QUANTITY | 0.99+
Edge | ORGANIZATION | 0.99+
Jim | PERSON | 0.99+
one customer | QUANTITY | 0.99+
first time | QUANTITY | 0.99+
Tesla | ORGANIZATION | 0.99+
Melissa | PERSON | 0.99+
One | QUANTITY | 0.99+
thousands | QUANTITY | 0.98+
two things | QUANTITY | 0.98+
over 2,600 | QUANTITY | 0.98+
Linux | TITLE | 0.98+
US government | ORGANIZATION | 0.98+
K3s | COMMERCIAL_ITEM | 0.98+
three business units | QUANTITY | 0.97+
one | QUANTITY | 0.97+
Metro | ORGANIZATION | 0.96+
two halves | QUANTITY | 0.96+
Kubernetes | TITLE | 0.96+
SLE Micro | TITLE | 0.96+
SLE Micro | COMMERCIAL_ITEM | 0.96+
Edge Solutions | ORGANIZATION | 0.96+
each | QUANTITY | 0.95+
Akri | ORGANIZATION | 0.95+
first | QUANTITY | 0.94+
Edge | LOCATION | 0.94+

Yaron Haviv, Iguazio | CUBEConversation, April 2019


 

>> From our studios in the heart of Silicon Valley, Palo Alto, California, this is a CUBE Conversation. >> Hello, and welcome to CUBE Conversations. I'm James Kobielus, lead analyst at Wikibon. Today we've got an excellent guest who is a CUBE alumnus par excellence: Yaron Haviv, co-founder and CTO of Iguazio. Hello, Yaron, welcome in. I think you're coming in from Tel Aviv, if I'm not mistaken. >> Right, really close by. Thanks, Jim; nice seeing you again. >> Yeah, nice to see you again. I'm here in our Palo Alto studios, and I'm always excited when I can meet with Yaron, because he always has something interesting and new to share about what Iguazio is doing in the areas of cloud, serverless, real-time streaming analytics, and now data science. I wasn't aware of how deeply they're involved in the whole data science pipeline. So, Yaron, it's great to have you. My first question is: can you sketch out the emerging marketplace requirements that you at Iguazio are seeing in the convergence of all these spaces, especially real-time streaming analytics, edge computing, serverless, and data science and AI? Can you give us a broad perspective and outlook on that convergence, and on the new opportunities it enables for enterprises that are making deep investments? >> Yeah, I think we sort of anticipated what's happening now; we just called it by different names, and we'll probably get into that in a minute. What you see is that traditional analytics, and even data science, started in research labs, with people exploring cancer or predicting people's moods and so on. Now people are trying to get real ROI from AI and data science, so they have to plug it into business applications. It's not just a data scientist sitting in a silo with a bunch of data he got from his friend the data engineer, scanning it, then running to the boss and saying we could have made some money a year ago if we'd done something; that doesn't make much impact on the business. The impact on the business happens when you actually integrate AI into chatbots, into recommendation engines, into predictive analytics that analyzes failures, prevents failures, and even saves people's lives. Those kinds of use cases require a tighter integration between the application and the data and the algorithms that come from the AI, and that's where we started to think about our platform. We worked on real-time data, which is what you need when you're going into more of a production environment: very good, very fast integration with data. And we have this fast computation layer, which is based on microservices; now everyone talks about microservices, but we started with this early on. That is what allows people to build intelligent applications that are integrated into business applications. The biggest challenge we see today for organizations is moving from this process of notebooks and research on historical data to translating that into a business application, into impact on the business. That is where people can spend a year.
I've seen a tweet saying: we built the machine learning model in a few weeks, and then we waited eleven months for the productization. >> Yes, that's what we're seeing at Wikibon, which is that AI is the heart of modern applications in business, and the new generation of application developers are, in many ways, data scientists, or have the skills and tools of data science. Now, looking at Iguazio's portfolio, you've evolved very rapidly to address a broader range of use cases. Over the years you've positioned yourselves as a continuous data platform, an intelligent edge platform, a serverless platform, and now I see that you're a bit of a data science workbench or pipeline tooling provider as well. Could you connect those dots and explain what Iguazio fully is? >> Well, there are all sorts of names for the technology we've built over the years. Four years ago, when we started, we had to call it something, and the term we came up with was continuous analytics. By continuous analytics we meant essentially feeding data in and continuously producing results, as opposed to the traditional approach, which was: throw data into a data lake, run batch analytics over it, and then ask whether you have some insight. Continuous analytics was the term we came up with to describe ingesting data from different sources, crunching it through algorithms, and generating triggers and actions or responding to user requests. That was pretty unique in this industry, even before people called it streaming or real-time data science or whatever. Now, if you look at our architecture, it's comprised of three components. The first is a real-time, multi-model database; you know about it, and it's really exceptional in its performance and its other capabilities. The second is a serverless functions engine that allows us to essentially inject applications of various kinds. Initially we started with applications that do analytics: grouping, joining, correlating. Then we kept adding more functions and other things like inference, face recognition, and analysis. Because we have this function engine, we get a lot of flexibility, and by tying a really fast functions engine to a really fast data layer you get remarkable results. This is what people now call serverless; we were there even before the serverless gang. And the third element of our platform is a fully managed PaaS, where all those microservices and data are exposed through a self-service interface, so it's effectively a mini cloud. Over the last two years we've shifted to working with Kubernetes versus the proprietary microservices orchestration we used originally. So those are the three major technologies, and they fit into different applications. If you think about the edge, or serving across many clouds, you need a variety of data sources and databases: key-value, streaming, files, time series, and so on.
We support all of them in one integrated platform. Then you need the microservices that were developed in the cloud to shift out to the enforcement point at the edge, and you need orchestration there, because you want to do software upgrades and you need to protect security. Having all of that integrated has created an opportunity for us to work with providers at the edge. You may have noticed our joint announcement with Google around an edge solution for retailers and IoT; we've made announcements with Microsoft in the past; we're going to make some very interesting announcements very soon; and we've made joint announcements with Samsung and NVIDIA, all around the edge. It's not that we're limited to the edge; it just happens that because we have an extremely high-density, power-efficient, and very well integrated data platform, it's a great fit for the edge. But it's also the same platform that we sell in the cloud as a service, or sell to on-prem customers, so they can run the same things they run in the cloud, and it happens to be the fastest, most real-time platform, which at the edge is a feature you cannot ignore.

>> So, Yaron, Iguazio is a complete cloud-native development and runtime platform, and serverless in many ways seems to be the core capability in your platform: Nuclio, the technology you've open sourced. It's built for on-prem and private clouds, but it's also extensible to broader hybrid cloud scenarios. Give us a sense for how Nuclio and serverless functions become valuable for data science, or for executing services and functions of the data science pipeline. Can you connect the dots between Nuclio, data science, and AI from the development standpoint? >> Sure. The two pillars of our technology that matter most are the data layer, where we have a dozen patents and a data engine that is very high performance, and the Nuclio functions, and they're very well integrated. Usually serverless is stateless, so if you want stateful data processing you have some challenges with serverless; with Nuclio you can run stateful use cases, you can mount files, and you have real-time connections to data, which makes it a lot more interesting than plain functions. The other thing about Nuclio is that it's extremely high performance, about two hundred times faster than Lambda, which means you can actually build things like stream processing and real-time joins on top of database activities. You can build collectors, agent-like functions that go fetch information from weather services, from routers for cybersecurity analysis, from all sorts of sensors. Those functions become like the nanobots in the movies: you just send them out to do things for you, whether it's data collection and crunching or inferencing engines, things that, for example, take a picture, run it through the model, and decide what's in the picture. That's where Nuclio really comes into play. Another important thing you see now is the emergence of serverless patterns in data science. There are many companies doing model inferencing as a service: they launch a URL endpoint, a server runs the model inside, you send it a vector of numeric values and you get back the inference. That's not really different from serverless, it's just more limited, because I don't just want to send a vector of numbers; usually what I have is something like the geolocation of my cell phone and a user ID, and I need the function to cross-correlate that with other information about the user and the location, and then come back with a recommendation for which product to offer. And those functions also have all sorts of dependencies on different packages, different software environments, environment variables, and build structures. This is where serverless technologies are much more suitable. It's interesting that if you go to Amazon, they have a product called SageMaker, which is their data science platform. Although you would say that's an ideal use case for functions, SageMaker doesn't actually use Amazon's Lambda functions; they tell you to use Lambda as the glue logic around SageMaker, and that's because Lambda doesn't fit the use case: Lambda can't hold large content, and machine learning models can be hundreds of megabytes, and Lambda is too slow for high-concurrency inferencing. So they essentially had to create another serverless-like engine under a different name, although if they had just improved Lambda, maybe it would have been one service instead of two. We took the other approach; we don't have the resources they have, so we created one serverless engine that does batch processing, stream processing over lots of data, and even interactive API services, all the different computation patterns with a single engine. Then we started attacking all the surrounding friction: you need to version your code, you need to record all your package dependencies, and so on, which serverless handles, so we just had to tie it more tightly into the existing frameworks. You may have looked at our product called Nuclio-Jupyter: a data scientist writes some code in a Jupyter notebook, and then one command, nuclio deploy, automatically packages the data science artifact in the notebook and converts it into a real cloud function that can listen on an endpoint, listen on streams, run on a schedule, and many other things. The interesting point is that data scientists are not programmers, which means they face a bigger barrier to writing production code. So if you give them a framework that also automates the auto-scaling, the security, the provisioning of data, the versioning of everything, and the package dependencies, so they just need to focus on writing algorithms, it's actually a bigger bang for the buck for them. If you take serverless to app developers, they'll tell you, yeah, we know how to package things in Docker, we know all that, so the value in their eyes is smaller than in the eyes of the data scientists. That's why we're seeing this appeal: the people who focus on math and algorithms and all those sophisticated things don't want to deal with coding, maintenance, and refactoring. By operationalizing their code through serverless, you improve time to market, you address scalability, and you avoid rewriting code, all the big challenges organizations are facing.

>> That's great. You have the tools to help customers build serverless functions inside Jupyter notebooks, and you mentioned SageMaker, which is an up-and-coming AWS solution for supporting a full data science toolchain for pipeline development among teams. You have high-profile partnerships with Microsoft and Google and others. Do you incorporate, integrate with, or support either of those cloud providers' own data science workbench offerings, or third-party offerings, of which there are dozens in this space? What are you doing in terms of partnerships in that area? >> Yeah, obviously we don't want to lock ourselves out of any of those, and if a customer already has a workbench, we don't want them to feel locked into ours. Our Jupyter environment is really nice because it has real-time connections to the database and other cool features, so you get a huge speed boost, and we have an integration with NVIDIA where we create a shared pool of GPUs for the data scientists, so instead of each one owning GPUs, which are extremely expensive, they launch jobs against the shared pool. But because the technology beside the actual data engine is open source, we can essentially go and install packages anywhere, and we've demonstrated to Google and others that we can load a bunch of packages into their workbench and make it very close to what we provide in our managed platform; not with the same performance levels, but functionality-wise, the same functions. >> So can you name some reference customers that are using Iguazio inside high-performance data science workflows, or are you still testing the waters in that market? Your technology is already fairly mature. >> As I told you before, although we've changed the messaging along the way, we've always done the same thing. When we were "continuous analytics" we spoke about use cases like telco operators running real-time predictive health monitoring of their networks, and those use cases all rely on algorithms. We work with financial customers where we feed in a lot of their real-time data and do fraud detection. Across all of those use cases we noticed that everything we were working on involved data science. In some cases, by the way, because of politics, once we said "analytics" or "continuous analytics" we were routed to the analytics groups in the organization, which are more focused on the data warehouse, and that's not really our use case. The people who build those data science applications and real-time AI into business applications are more the development and business people. That's also why we changed our name and messaging: we wanted to make it very clear that Iguazio is about building new applications; it's not about data warehousing or faster queries on a data lake, it's about generating value for the business. If you want a specific example, we announced two weeks ago the investment by Samsung in Iguazio, which essentially has two pillars beyond the few million dollars: one is that they've adopted Nuclio as the serverless layer for their internal clouds, and the second is that we're working with them on a bunch of use cases. One of them was even quoted in the announcement; others I can't name. But a real business application is really a combination of those things: intercepting data from your systems and customers, doing real-time analytics, and responding very quickly. One thing we have announced, because of the use of Nuclio, is work with NVIDIA where we actually benchmarked and published the performance.

>> You see a fair number of customers embedding machine learning inside real-time stream computing backbones. This is the week of Flink Forward here in San Francisco; I was at the event earlier this week, and I saw presenters describing a fair amount of uptake of ML inside stream computing. Do you see that becoming mainstream best practice? >> Streaming is still in the analytics bucket. What we're looking at are use cases that are more interactive; think about a chatbot or doing predictive analytics on a request. Streaming is a faster flow of data, but it still has delays; the throughput is higher in streaming, but not necessarily the response time. Think about Spark Streaming: it's good at crunching a lot of data, but it's definitely not built for low-latency responses; you wouldn't put Spark in the path of responding to a user request on the internet. So we do streaming and we see that growing, but where we see the real growth is reacting to live events: a customer logs in and sends a request, or, in scenarios we're working on with telcos, technicians are out in their trucks sending all sorts of information over a real-time event stream, and then a customer calls and says, I need a second box, and the system can say, no, this technician needs to go to that other customer instead. How many times has a technician come to your house and said, I don't have the right part, they'll have to send a different guy? So how do you impact the business? There are three pillars: one is improving churn and reducing risk, one is reducing your costs, and the other is upselling, making the customer more successful, which is around front-end applications, whether it's bots or recommendations, and growing your market with recommendations in real time. All of those need AI incorporated into your business applications; in a few years, without it, you're probably going to be dead. I don't see any business sustaining the competition without incorporating AI, the ability to integrate real-time data with customer data and react.

>> Changing the subject slightly: you mentioned NVIDIA as a partner; of course, you announced that a few weeks ago at their event. They have recently acquired Mellanox, and I believe you used to be with Mellanox, so I'd like to get your commentary on that acquisition, or merger. >> Right, yes, I was VP of Data Center at Mellanox; that was my last job, and I'm good friends with the leadership there, including the CEO and the rest of the team. Last week I was in Israel and talked to some of them. I think it's a great merger. Mellanox has sort of the best networking and storage technology on the silicon side, and NVIDIA has the best GPU technologies; NVIDIA has also acquired some compute chip technologies, and they have very nice photonics technologies as well. Mellanox is bought by all the cloud providers, so they have strong technical engagements with Azure and the rest of the big players. Now NVIDIA, with its computation engines, plus Mellanox contributing the storage and networking pieces, makes a very strong player, and I think it threatens Intel, because Intel hasn't really kept up in high-speed networking recently and hasn't really come up with GPUs that can compete. So I think that makes NVIDIA a pretty strong vendor going forward.

>> Another question, not related to that. You're in Tel Aviv, and Israel is famous for its startups in machine learning, especially with a focus on cybersecurity; Israel is near the top of the world in terms of the amount of brainpower focused on cybersecurity. What are the hot machine-learning-related developments or innovations you see coming out of Israel recently related to cybersecurity and distributed cloud environments, anything in terms of basic R&D or technology we should be aware of that will find its way into mainstream cloud, Kubernetes, and serverless environments going forward? >> Yes, there are different areas. The teams in Israel also watch what happens in the US and play in all the different spaces, but what's unique about us as a small country is that we're always trying to think outside the box, because we know we cannot compete in a very large market without innovation. That's what drives this pace of innovation, along with all the deep technical expertise in the country, and there's a lot of cyber talent. I've seen one cool startup, also backed by our VC, selling something like face un-recognition: technology that takes your picture and modifies it so that machine learning will not be able to recognize it, sort of a defense against image-recognition attacks. That's something pretty unique I've heard about. But there are other startups working on all aspects of data ops and information security, AI and ML, and automated cybersecurity, and other curious areas. >> Right, right. Thank you very much.
Yaron, this has been an excellent conversation, and we've really enjoyed hearing your comments. Iguazio is a great company, quite an innovator, and it's always a pleasure to have you on theCUBE. With that, I'm going to sign off. This is James Kobielus of Wikibon, with Yaron Haviv, and we bid you all a good day. >> Thank you.
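The inferencing-with-enrichment pattern Yaron describes, where a small request is enriched with stored state before the model is scored, can be sketched as a serverless handler. The example below follows the general shape of a Nuclio Python function, handler(context, event), but the event fields, the feature lookup, and the model are hypothetical stand-ins, not Iguazio's actual API; in the workflow described above, a notebook-based version of this would then be packaged and shipped with a single nuclio deploy command.

```python
"""Minimal sketch: a serverless inference handler with feature enrichment."""
import json

import joblib

# In the platform described above these would come from the real-time data
# layer; a local artifact and an in-memory table stand in for them here.
MODEL = joblib.load("model_registry/latest/model.joblib")   # hypothetical path
USER_FEATURES = {
    "user-123": {"avg_basket_value": 42.0, "visits_last_30d": 7},
}


def handler(context, event):
    # The payload typically arrives in event.body as bytes or str.
    body = event.body.decode() if isinstance(event.body, bytes) else event.body
    request = json.loads(body)

    # Enrich the sparse request (user ID plus context) with stored state.
    features = USER_FEATURES.get(request["user_id"], {})
    vector = [
        features.get("avg_basket_value", 0.0),
        features.get("visits_last_30d", 0),
        request.get("hour_of_day", 12),
    ]

    # Score with a binary classifier and return a next-best-offer decision.
    score = float(MODEL.predict_proba([vector])[0][1])
    context.logger.info(f"scored user={request['user_id']} score={score:.3f}")
    return json.dumps({
        "offer": "premium_upgrade" if score > 0.5 else "none",
        "score": score,
    })
```

The same handler body could be triggered by an HTTP endpoint, a stream, or a schedule, which is the flexibility the single-engine approach in the conversation is aiming at.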

Published Date : Apr 4 2019

SUMMARY :

James Kobielus of Wikibon talks with Yaron Haviv of Iguazio about the convergence of real-time streaming analytics, edge computing, serverless, and data science. Haviv describes Iguazio's platform as three components: a real-time multi-model database, the open-source Nuclio serverless functions engine, and a managed PaaS running on Kubernetes, deployable in the cloud, on premises, and at the edge. The two discuss why serverless suits data science workloads better than general-purpose function services, Nuclio's Jupyter integration and one-command deployment, partnerships with Google, Microsoft, Samsung, and NVIDIA, the recent Samsung investment, reference use cases in telco and financial services, streaming versus interactive inferencing, NVIDIA's acquisition of Mellanox, and machine learning and cybersecurity innovation coming out of Israel.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Microsoft | ORGANIZATION | 0.99+
Samsung | ORGANIZATION | 0.99+
Israel | LOCATION | 0.99+
Google | ORGANIZATION | 0.99+
San San Francisco | LOCATION | 0.99+
April 2019 | DATE | 0.99+
James Kabila | PERSON | 0.99+
Iguazu | LOCATION | 0.99+
Silicon Valley | LOCATION | 0.99+
eleven months | QUANTITY | 0.99+
Amazon | ORGANIZATION | 0.99+
Tel Aviv | LOCATION | 0.99+
Yaron Haviv | PERSON | 0.99+
Wicked Bond | ORGANIZATION | 0.99+
two weeks | QUANTITY | 0.99+
twelve batons | QUANTITY | 0.99+
Palo Alto | LOCATION | 0.99+
first question | QUANTITY | 0.99+
Haviv | PERSON | 0.99+
three pillars | QUANTITY | 0.99+
third element | QUANTITY | 0.99+
last week | DATE | 0.99+
two | QUANTITY | 0.99+
Brian | PERSON | 0.99+
Kansas | LOCATION | 0.99+
Today | DATE | 0.99+
WS | ORGANIZATION | 0.99+
Jupiter | LOCATION | 0.99+
Eros | ORGANIZATION | 0.98+
both | QUANTITY | 0.98+
India | LOCATION | 0.98+
Ox | ORGANIZATION | 0.98+
second thing | QUANTITY | 0.98+
hundreds of megabytes | QUANTITY | 0.98+
VP Data Center | ORGANIZATION | 0.98+
one | QUANTITY | 0.98+
earlier this week | DATE | 0.98+
second box | QUANTITY | 0.98+
Europe | LOCATION | 0.98+
U. S. | LOCATION | 0.98+
four years | QUANTITY | 0.98+
two pillars | QUANTITY | 0.98+
Iguazu | PERSON | 0.98+
Melon Axe | ORGANIZATION | 0.97+
two version | QUANTITY | 0.97+
today | DATE | 0.97+
first event | QUANTITY | 0.97+
Tel Aviv, Israel | LOCATION | 0.97+
each | QUANTITY | 0.96+
three | QUANTITY | 0.95+
One command | QUANTITY | 0.95+
Ellis | PERSON | 0.94+
a year ago | DATE | 0.94+
Iguazio | PERSON | 0.94+
Della | PERSON | 0.94+
Flink | ORGANIZATION | 0.94+
One thing | QUANTITY | 0.94+
Landa | TITLE | 0.94+
second one | QUANTITY | 0.93+
three major technologies | QUANTITY | 0.93+
few weeks ago | DATE | 0.92+
Silver | ORGANIZATION | 0.92+
single engine | QUANTITY | 0.92+
a year or | DATE | 0.91+
about two hundred times | QUANTITY | 0.9+
two ago | DATE | 0.88+
Derrickson Namesake | ORGANIZATION | 0.88+
Earl | PERSON | 0.88+
Guider | ORGANIZATION | 0.88+
HOLLOWAY ALTO, California | LOCATION | 0.86+
Americans | LOCATION | 0.86+
three components | QUANTITY | 0.85+
Frost | PERSON | 0.81+
Lambda Stage Maker | TITLE | 0.8+
Tokyo Jupiter | ORGANIZATION | 0.8+
china | LOCATION | 0.78+
Arab | OTHER | 0.77+
dozens | QUANTITY | 0.77+
Epstein | PERSON | 0.76+
last two years | DATE | 0.75+