Florian Berberich, PRACE AISBL | SuperComputing 22
>>We're back at Supercomputing 22 in Dallas, winding down day four of this conference. I'm Paul Gillan, my co-host Dave Nicholson. We've been talking supercomputing all week, and you hear a lot about what's going on in the United States, what's going on in China, Japan. What we haven't talked a lot about is what's going on in Europe. Did you know that two of the top five supercomputers in the world are actually from European countries? Well, our guest has a lot to do with that. Florian Berberich, I hope I pronounced that correctly, my German is not my strength, is the operations director for PRACE AISBL. And let's start with that. What is PRACE? >>So, hello and thank you for the invitation. I'm Florian, and PRACE is the Partnership for Advanced Computing in Europe. It's a non-profit association with its seat in Brussels, in Belgium. And we have 24 members. These are representatives from different European countries dealing with high performance computing at their sites. So far we provided the resources for our European research communities, but this changed in the last years with the EuroHPC Joint Undertaking, which put a lot of funding into high performance computing and co-funded five petascale and three pre-exascale systems. And two of the pre-exascale systems you mentioned already: these are LUMI in Finland and Leonardo in Bologna in Italy, which were placed three and four on the TOP500 list. >>So why is it important that Europe be in the top list of supercomputer makers? >>I think Europe needs to keep pace with the rest of the world, and simulation science is a key technology for society. We saw this very recently with the pandemic, with COVID. We were able to help the research communities to find vaccines very quickly and to understand how the virus spread around the world. And all this knowledge is important to serve society. Another example is climate change. With these new systems, we will be able to predict the changes in the future more precisely. The more compute power you have, the smaller the grid and the finer the resolution you can choose, and the lower the error will be for the future. So with these systems, the big challenges we face can be addressed: climate change, energy, food supply, security. >>Who are your members? Do they come from businesses? Do they come from research, from government? All of the >>Above. Yeah. Our members are public organizations: universities, research centers, compute sites and data centers, but public institutions. And we provide these services for free to the research community, via a peer review process with excellence as the most important criterion. >>So 40 years ago, when the idea of an EU, and maybe I'm getting the dates a little bit wrong, when it was just an idea, and the idea of a common currency. Yes. Reducing friction between borders to create a trading zone. Yes. There was a lot of focus there. Fast forward to today, would you say that these efforts in supercomputing, would they be possible if there were not an EU superstructure? >>No, I would say this would not be possible to this extent. European initiatives are needed, and the European Commission is supporting these initiatives very well. And before PRACE, for instance in 2008, there were research centers and data centers operating high performance computing systems, but they were not talking to each other.
So it was isolated. PRACE created a community of operating sites and facilitated the exchange between them, and it also enabled us to align investments and to get the most out of the available funding. And also at this time, and still today, it is very hard for one single country in Europe to provide all the different architectures needed for all the different kinds of research communities and applications. If you always want to offer the latest technologies, this is hardly possible. So with this joint action and by opening the resources to research groups from other countries, we were able to give different communities access to the latest technology at any given time. >>So the fact that the two systems that you mentioned are physically located in Finland and in Italy, if you were to walk into one of those facilities and meet the people that are there, they're not just Finns in Finland and Italians in Italy. Yeah. This is very much a European effort. So this is true. So in that sense, the geography is sort of abstracted. Yeah. And the issues of sovereignty that might take place in the private sector don't exist, or are there issues? What are the requirements for a researcher to have access to a system in Finland versus a system in Italy? If you've got an EU passport, are you good to go? >>I think you are good to go. But EU passport, now it becomes complicated and political. Well, first, let me start with PRACE. PRACE was inclusive and there were no constraints; we even had users from the US and Australia. We wanted just to support excellence in science, and we did not look at the nationality of the organization, of the PI and so on. There were quotas, but these quotas were very generously interpreted. Now, with the EuroHPC Joint Undertaking, it's a question of which European funds these systems were procured from, and if a country is associated to this funding, the researchers also have access to these systems. And this addresses basically the UK and Switzerland, which are not in the European Union, but they were associated to the Horizon 2020 research framework, and so they can access the systems now available, LUMI and Leonardo, and the petascale systems as well. How this will develop in the future, I don't know. It depends on which research framework they will be associated with or not. >>What are the outputs of your work at PRACE? Are they reference designs? Is it actual semiconductor hardware? Is it the research? What do you produce? >>So the applications we run, or the simulations we run, cover all different scientific domains. So it's science, but also we have industry-led projects with more application-oriented targets. Aerodynamics, for instance, for cars or planes or something like this. But also fundamental science like elementary particle physics, for instance, or climate change, biology, drug design, protein folding, all these >>Things. Can businesses be involved in what you do? Can they purchase your research? Do they contribute to it? I'm sure there are many technology firms in Europe that would like to be involved. >>So for involving industry, our calls are open, and if they want to do open R&D, they are invited to submit proposals as well.
They will be evaluated, and if they qualify, they will get access and they can do their jobs and simulations. It's a little bit more tricky if it's in production, if they use these resources for their business and do not publish the results. There are some sites who are able to deal with these requests, some more than others, but this is on a smaller scale, definitely. Yeah. >>What does the future hold? Are there other countries who will be joining the effort, other institutions? Do you plan to expand your scope? >>Well, I think the EuroHPC Joint Undertaking, with 36 member states, already covers even more than Europe. And clearly, if there are other states interested in joining, there is no limitation, although the focus lies on the European area and on the Union. >>When you interact with colleagues from North America, do you feel that there is a sort of European flavor to supercomputing that is different, or are we so globally entwined? No. >>So research is not national, it's not European, it's international. This is very clear. We have a longstanding collaboration with our US colleagues and also with Japan, South Africa and Canada. And when COVID hit the world, we were able within two weeks to establish regular seminars inviting US and European colleagues to talk to each other, exchange results, find new collaborations and boost the research activities. And I have other examples as well. We already did joint calls between XSEDE in the US and PRACE in Europe, and it was a very interesting experience. We received applications from different communities, and we decided that we would review them on our side with European experts, and the US did it with their experts. And you can guess what the result was: at the meeting when we compared our results, it was matching one by one. It was exactly the same result. >>It's refreshing to hear a story of global collaboration. Yeah. Where people are getting along and making meaningful progress. >>I have to point out, you did not mention China as a country you were collaborating with. Is that intentional? >>Well, with China we definitely have fewer links, but collaboration also exists. There was an initiative to look at the development of the technologies, and the group meets on a regular basis, and there are also Chinese colleagues involved. It's on a lower level. >>Yes, but the conversations are occurring. We're out of time. Florian Berberich, operations director of PRACE, the European supercomputing collaborative. Thank you so much for being with us. I'm always impressed when people come on theCUBE and submit to an interview in a language that is not their first language. Yeah, >>Absolutely. >>Brave to do that. Yeah. Thank you. You're welcome. Thank you. We'll be right back after this break from Supercomputing 22 in Dallas.
SUMMARY :
Well, our guest has a lot to do with that. And we have 24 members. And we saw this very recently with excellence as the most important criteria to the research Fast forward to today, would you say that these the exchange between them and also enabled to So, so the fact that the two systems that you mentioned are physically located in Finland nationality of the organization, of the PI and and so on. But also fundamental science like the physical Do they contribute to their, I'm sure, I'm sure there are many technology firms in business and do not publish the results. Although the focus lies on European area is different or are we so globally entwined? so we have a longstanding collaboration with our US colleagues and That it, it's, it's refreshing to hear a story of global I have to mention you, I have to to point out, you did not mention China as a country you the development of the technologies and the group meet Yes, but is is the con conversations are occurring. Brave to do that.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Dave Nicholson | PERSON | 0.99+ |
Paul Gillan | PERSON | 0.99+ |
Florian Berberich | PERSON | 0.99+ |
Brussels | LOCATION | 0.99+ |
Finland | LOCATION | 0.99+ |
Europe | LOCATION | 0.99+ |
US | LOCATION | 0.99+ |
European Commission | ORGANIZATION | 0.99+ |
Dallas | LOCATION | 0.99+ |
Italy | LOCATION | 0.99+ |
Bologna | LOCATION | 0.99+ |
two | QUANTITY | 0.99+ |
24 members | QUANTITY | 0.99+ |
Florian | PERSON | 0.99+ |
United States | LOCATION | 0.99+ |
two systems | QUANTITY | 0.99+ |
North America | LOCATION | 0.99+ |
2008 | DATE | 0.99+ |
Belgium | LOCATION | 0.99+ |
Australia | LOCATION | 0.99+ |
four | QUANTITY | 0.99+ |
three | QUANTITY | 0.99+ |
today | DATE | 0.99+ |
last year | DATE | 0.99+ |
EU | ORGANIZATION | 0.99+ |
Covid | PERSON | 0.99+ |
pandemic | EVENT | 0.99+ |
first language | QUANTITY | 0.98+ |
two weeks | QUANTITY | 0.98+ |
first | QUANTITY | 0.98+ |
Canada | LOCATION | 0.98+ |
South Africa | LOCATION | 0.97+ |
European | OTHER | 0.97+ |
36 member states | QUANTITY | 0.97+ |
Chap | ORGANIZATION | 0.97+ |
40 years ago | DATE | 0.97+ |
Horizon 2020 | TITLE | 0.96+ |
HPC | ORGANIZATION | 0.96+ |
Flon | ORGANIZATION | 0.96+ |
European | LOCATION | 0.96+ |
day four | QUANTITY | 0.94+ |
Chinese | OTHER | 0.93+ |
Switzerland | LOCATION | 0.92+ |
UK | LOCATION | 0.92+ |
ais | ORGANIZATION | 0.91+ |
one of those facilities | QUANTITY | 0.86+ |
five supercomputers | QUANTITY | 0.86+ |
European Union | ORGANIZATION | 0.85+ |
Lumi and | ORGANIZATION | 0.8+ |
Leonardo | ORGANIZATION | 0.79+ |
three preis scale systems | QUANTITY | 0.78+ |
one single country | QUANTITY | 0.78+ |
China, | LOCATION | 0.78+ |
Price | ORGANIZATION | 0.76+ |
Finland | ORGANIZATION | 0.69+ |
Europe | ORGANIZATION | 0.68+ |
22 | OTHER | 0.67+ |
500 | QUANTITY | 0.66+ |
China | LOCATION | 0.65+ |
five PET | QUANTITY | 0.64+ |
S B L. | PERSON | 0.6+ |
price | ORGANIZATION | 0.6+ |
scale | OTHER | 0.58+ |
Petascale | TITLE | 0.57+ |
Benoit Dageville and Florian Douetteau V1
>> Hello everyone, welcome back to theCUBE's wall to wall coverage of the Snowflake Data Cloud Summit. My name is Dave Vellante and with me are two world-class technologists, visionaries, and entrepreneurs. Benoit Dageville co-founded Snowflake and he's now the president of the Product division, and Florian Douetteau is the co-founder and CEO of Dataiku. Gentlemen, welcome to theCUBE, two first timers, love it. >> Great time to be here. >> Now Florian, you and Benoit have a number of customers in common. And I've said many times on theCUBE that the first era of cloud was really about infrastructure, making it more agile, taking out costs. And the next generation of innovation is really coming from the application of machine intelligence to data, with the cloud as the scale platform. So is that premise relevant to you, do you buy that? And why do you think Snowflake and Dataiku make a good match for customers? >> I think it's because our values are aligned. Today it's all about lowering complexity for customers, closing the gap, democratizing the access to data and the access to technology. It's not only about data, data is important, but it's also about the impact of data. How can you make the best out of data as fast as possible, as easily as possible, within an organization? And another value is about the openness of the platform, building the future together. I think a platform that is not just about the platform but also about the ecosystem of partners around it, bringing the level of accessibility and flexibility you need for the 10 years ahead. >> Yes, so that's key, but it's not just data. It's turning data into insights. Now Benoit, you came out of the world of very powerful, but highly complex databases. And we all know that you and the Snowflake team get very high marks for really radically simplifying customers' lives. But can you talk specifically about the types of challenges that your customers are using Snowflake to solve? >> Yeah, so really the challenge before Snowflake, I would say, was to put all the data in one place and run all the computes, all the workloads that you wanted to run, against that data. And of course, existing legacy platforms were not able to support that level of concurrency, many workloads. We talk about machine learning, data science, data engineering, data warehouse, big data workloads; running all of them in one place didn't make sense at all. And therefore, what customers did is to create silos, silos of data everywhere, with different systems having a subset of the data. And of course now you cannot analyze this data in one place. So Snowflake, we really solved that problem by creating a single architecture where you can put all the data in the cloud. It's really cloud native. We really thought about how to solve that problem, how to leverage cloud and the elasticity of cloud to really put all the data in one place, but at the same time, not run all workloads in the same place. So each workload that runs in Snowflake has dedicated compute resources to run. And that makes it very agile, right. Florian talked about data scientists having to run analysis. So they need a lot of compute resources, but only for a few hours, and with Snowflake, they can add this new workload to the system, get the compute resources that they need to run this workload. And then when it's over, they can shut down their system. It will automatically shut down.
Therefore they will not pay for the resources that they don't use. So it's a very agile system, where you can do this analysis when you need it, and you have all the power to run all these workloads at the same time. >> Well, it's profound what you guys built. To me, I mean, because everybody's trying to copy it now. I remember the notion of bringing compute to the data in the Hadoop days. And I think that, as I say, everybody is sort of following your suit now, or trying to. Florian, I got to say, the first data scientist I ever interviewed on theCUBE was the amazing Hilary Mason, right after she started at Bitly. And she made data science sound so compelling, but data science is hard. So same question for you. What do you see as the biggest challenges for customers that they're facing with data science? >> The biggest challenge, from my perspective, is that once you solve the issue of the data silo with Snowflake, you don't want to bring another silo, which would be a silo of skills. And essentially there is that talent gap between the talent available on the labor market and how hard it is to actually find, recruit and train data scientists for what needs to be done. And so you actually need to simplify the access to technology, such that every organization can make it, whatever the talent, by bridging that gap. And to get there, there is a need for actually breaking up the silos: a collaborative approach where technologists and business work together and actually all put some of their hands into those data projects together. >> Yeah, it makes sense. So Florian, let's stay with you for a minute, if I can. Your observation space is pretty global. And so, you have a unique perspective on how companies around the world might be using data and data science. Are you seeing any trends, maybe differences between regions or maybe within different industries? What are you seeing? >> Yeah, definitely, I do see trends that are not geographic that much, but much more in terms of maturity of certain industries and certain sectors. Certain industries invested a lot in terms of data, data access, and the ability to store data over the last few years, and now reach a level of maturity where they can invest more and get to the next steps. And it really relies on the ability of certain industries, certain organizations, to have built this long-term data strategy a few years ago, and now they start reaping the benefits. >> You know, a decade ago, Florian, Hal Varian famously said that the sexy job in the next 10 years will be statisticians. And then everybody sort of changed that to data scientists. And then everybody, all the statisticians became data scientists and they got a raise. But data science requires more than just statistics acumen. What skills do you see as critical for the next generation of data science? >> Yeah, it's a good question, because I think the first generation of data scientists became data scientists because they could learn some Python quickly and be flexible. And I think the skills of the next generation of data scientists will definitely be different. It will be first about being able to speak the language of the business, meaning how you translate data insight, predictive modeling, all of this into actionable insights or business impact. And it will be about how you collaborate with the rest of the business. It's not just how fast you can build something, how fast you can do a notebook in Python or build models of some sort.
It's about how you actually build this bridge with the business. And obviously those things are important, but we also must be cognizant of the fact that technology will evolve in the future. There will be new tools and technologies, and they will still need to keep this level of flexibility and understand quickly what are the next tools they need to use, or new languages or whatever, to get there. >> Thank you for that. Benoit, let's come back to you. This year has been tumultuous to say the least for everyone, but it's a good time to be in tech, ironically. And if you're in cloud, it's even better. But you look at Snowflake and Dataiku, you guys have done well despite the economic uncertainty and the challenges of the pandemic. As you look back on 2020, what are you thinking? What are you telling people as we head into next year? >> Yeah, I think it's very interesting, right. This crisis has told us that the world really can change from one day to the next. And this has dramatic and profound effects. For example, some companies all of a sudden saw their revenue line dropping and they had to do less with data. And for some of the companies it was the reverse, right? All of a sudden, they were online, like Instacart, for example, and their business completely changed from one day to the other. So this agility of adjusting the resources that you have to the task at hand, a need that can change, using a solution like Snowflake really helps with that. And we saw both in our customers. Some customers, from one day to the next, were growing big time, because they benefited from COVID and their business benefited, but others, as you know, saw their business drop. And what is nice with cloud, it allows you to adjust compute resources to your business needs and really adjust them in hours. The other aspect is understanding what is happening, right? You need to analyze. So we saw all our customers basically wanted to understand, what is going to be the impact on my business? How can I adapt? How can I adjust? And for that, they needed to analyze data. And of course, a lot of data which are not necessarily data about their business, but also data from the outside. For example, COVID data: what is the state of it, what is the impact, the geographic impact of COVID, all the time. And access to this data is critical. So this is the promise of the data cloud, right? Having one single place where you can put all the data of the world. So, our customers all of a sudden started to consume the COVID data from our data marketplace, and we literally had thousands of customers looking at this data, analyzing this data, to make good decisions. So this agility and this adapting from one hour to the next is really critical, and that goes with data, with cloud, with adjusting resources, and that doesn't exist on premise. So, indeed, I think the lesson learned is, we are living in a world which is changing all the time, and we have to understand it. We have to adjust, and that's why cloud, in some way, is great. >> Excellent, thank you. You know, in theCUBE, we like to talk about disruption, of course, who doesn't. And also, I mean, you look at AI and the impact that it's beginning to have, and kind of pre-COVID, you look at some of the industries that were getting disrupted by, everybody talks about digital transformation, and you had on the one end of the spectrum industries like publishing, which are highly disrupted, or taxis, and you can say, "Okay well, that's bits versus atoms, the old Negroponte thing."
But then the flip side of this says, "Look at financial services, that hadn't been dramatically disrupted, certainly healthcare, which is ripe for disruption, defense." So there were a number of industries that really hadn't leaned into digital transformation: if it ain't broke, don't fix it, not on my watch. There was this complacency. And then of course COVID broke everything. So Florian, I wonder if you could comment, what industry or industries do you think are going to be most impacted by data science and what I call machine intelligence or AI in the coming years and decades? >> Honestly, I think it's all of them, or at least most of them. Because for some industries, the impact is very visible, because we are talking about brand new products, drones, flying cars, or whatever it is that is very visible to us. But for others, we are talking about changes across the spectrum in the way you operate as an organization. Even if the financial industry itself doesn't seem to be so impacted when you look at it from the consumer side or the outside, in fact internally it's very much impacted, just because of the way you use data, the flexibility you need, and the kind of cost gains you can get by leveraging the latest technologies, which are just enormous. And so it actually comes from inside the industry as well. And overall, I think that 2020 is a year where, from the perspective of AI and analytics, we understood this idea of maturity and resilience. Maturity, meaning that when you've got a crisis, you actually need data and AI more than before; you need to actually call the people from data into the room to take better decisions and look forward and not backward. And I think that's a very important learning from 2020 that will tell things about 2021. And resilience: data analytics today is a function consumed by every industry, and it is so important that it's something that needs to work. So the infrastructure needs to work, the infrastructure needs to be super resilient, probably no downtime, not even partial downtime at some point, and the kind of resilience where you need to be able to plan for literally anything, like no hypothesis in terms of behaviors can be taken for granted. And that's something that is new, and which is just signaling that we are getting into the next step for all data analytics. >> I wonder, Benoit, if you have anything to add to that. I mean, I often wonder, you know, when are machines going to be able to make better diagnoses than doctors? Some people say already. Will the financial services, traditional banks, lose control of payment systems? You know, what's going to happen to big retail stores? I mean, maybe bring us home with some of your final thoughts. >> Yeah, I would say, I don't see that as a negative, right? The human being will always be involved very closely, but the machine and the data can really help see correlations in the data that would be impossible for a human being alone to discover. So, I think it's going to be a complement, not a replacement, and everything that has made us faster doesn't mean that we have less work to do. It means that we can do more. And we have so much to do that I would not be worried about the effect of being more efficient and better at our work. And indeed, I fundamentally think that data, processing of images and doing AI on these images and discovering patterns and potentially flagging disease way earlier than was possible before, is going to have a huge impact in healthcare.
And as Florian was saying, every industry is going to be impacted by that technology. So, yeah, I'm very optimistic. >> Great. Guys, I wish we had more time. We've got to leave it there, but thanks so much for coming on theCUBE. It was really a pleasure having you. >> [Benoit & Florian] Thank you. >> You're welcome. But keep it right there, everybody. We'll be back with our next guest right after this short break. You're watching theCUBE.
SUMMARY :
And he's now the president And the next generation of the access to data, the And we all know that, you all the workloads that you the notion of bringing the access to technology such as And so, you have a unique And it's really reliant to reach out Hal Varian famously said that the sexy job And it will be about who you collaborate and the challenges of the pandemic. adjusting the resources that you have end of the spectrum, of the way you use data to I mean, I often wonder, you know, So, I think it's going to be a compliment, We got to leave it there right after this short break.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Dave Vellante | PERSON | 0.99+ |
Florian | PERSON | 0.99+ |
Benoit | PERSON | 0.99+ |
Florian Douetteau | PERSON | 0.99+ |
Benoit Dageville | PERSON | 0.99+ |
2020 | DATE | 0.99+ |
10 years | QUANTITY | 0.99+ |
Dataiku | ORGANIZATION | 0.99+ |
Hilary Mason | PERSON | 0.99+ |
Python | TITLE | 0.99+ |
Hal Varian | PERSON | 0.99+ |
next year | DATE | 0.99+ |
Snowflake | ORGANIZATION | 0.99+ |
one place | QUANTITY | 0.99+ |
both | QUANTITY | 0.99+ |
one hour | QUANTITY | 0.99+ |
Bitly | ORGANIZATION | 0.99+ |
Snowflake Data Cloud Summit | EVENT | 0.99+ |
a decade ago | DATE | 0.98+ |
one day | QUANTITY | 0.98+ |
theCUBE | ORGANIZATION | 0.98+ |
first | QUANTITY | 0.98+ |
each level | QUANTITY | 0.98+ |
Snowflake | TITLE | 0.98+ |
2021 | DATE | 0.97+ |
today | DATE | 0.97+ |
first generation | QUANTITY | 0.97+ |
pandemic | EVENT | 0.97+ |
few years ago | DATE | 0.93+ |
thousands of customers | QUANTITY | 0.93+ |
single architecture | QUANTITY | 0.92+ |
first era | QUANTITY | 0.88+ |
Negroponte | PERSON | 0.87+ |
first data scientist | QUANTITY | 0.87+ |
Instacart | ORGANIZATION | 0.87+ |
This year | DATE | 0.86+ |
one single place | QUANTITY | 0.86+ |
two | QUANTITY | 0.83+ |
two world- | QUANTITY | 0.78+ |
each workload | QUANTITY | 0.78+ |
one | QUANTITY | 0.76+ |
Adam | PERSON | 0.74+ |
next 10 years | DATE | 0.69+ |
first timers | QUANTITY | 0.52+ |
COVID | OTHER | 0.51+ |
COVID | ORGANIZATION | 0.43+ |
COVID | EVENT | 0.37+ |
decades | DATE | 0.29+ |
Boost Your Solutions with the HPE Ezmeral Ecosystem Program | HPE Ezmeral Day 2021
>> Hello. My name is Ron Kafka, and I'm the senior director for Partner Scale Initiatives for HPE Ezmeral. Thanks for joining us today at Analytics Unleashed. By now, you've heard a lot about the Ezmeral portfolio and how it can help you accomplish objectives around big data analytics and containerization. I want to shift gears a bit and discuss our Ezmeral Technology Partner Program. I've got two great guest speakers here with me today, and together we're going to discuss how jointly we are solving data analytic challenges for our customers. Before I introduce them, I want to take a minute to provide a little bit more insight into our ecosystem program. We've created this program based on the realization, from customer feedback, that even the most mature organizations are struggling with their data-driven transformation efforts. It turns out this is largely due to the pace of innovation with application vendors, or ISVs, supporting data science and advanced analytic workloads. Their advancements are simply outpacing organizations' ability to move workloads into production rapidly. Bottom line, organizations want a unified experience across environments where their entire application portfolio in essence provides a comprehensive application stack and not piece parts. So, let's talk about how our ecosystem program helps solve for this. For starters, we're leveraging HPE's long track record of forging technology partnerships, and we created a best in class ISV partner program specific to the Ezmeral portfolio. We're doing this by developing an open concept marketplace where customers and partners can explore, learn, engage and collaborate with our strategic technology partners. This enables our customers to adopt and deploy validated applications from industry leading software vendors on HPE Ezmeral with a high degree of confidence. Also, it provides a very deep bench of leading ISVs for other groups inside of HPE to leverage for their solutioning efforts. Speaking of industry leading ISVs, it's about time I introduce you to two of those industry leaders right now. Let me welcome Daniel Hladky from Dataiku, and Omri Geller from Run:AI. So I'd like to introduce Daniel Hladky. Daniel is with Dataiku. He's a great partner for HPE. Daniel, welcome. >> Thank you for having me here. >> That's great. Hey, would you mind just talking a bit about how your partnership journey has been with HPE? >> Yes, with pleasure. So the journey started about five years ago, and in 2018 we signed a worldwide reseller agreement with HPE. And in 2020, we actually started to work jointly on the integration between the Dataiku Data Science Studio, called DSS, and integrated that with the Ezmeral Container Platform, and it was a great success. And it was on behalf of some clear customer projects. >> It's been a long partnership journey with you for sure with HPE, and we welcome your partnership extremely well. Just a brief question about the Container Platform and really what that's meant for Dataiku. >> Yes, Ron. Thanks. So, basically I'd like to quote here Florian Douetteau, who is the CEO of Dataiku, who said that the combination of Dataiku with the HPE Ezmeral Container Platform will help customers to successfully scale and put machine learning projects into production. And this basically is going to deliver real impact for their business. So, the combination of the two of us is a great success. >> That's great.
Can you talk a bit more about what Dataiku is doing and how the HPE Ezmeral Container Platform fits into the solution offering? >> Great. So basically Dataiku DSS is our product, which is an end-to-end data science platform, and it brings value to the projects of customers on their path to enterprise AI. In simple terms, it could be as simple as building data pipelines, but it could also be very complex, with machine and deep learning models at scale. So the fast track to value is by having collaboration, orchestration of the underlying technologies, and the models in production. All of that is part of the Data Science Studio, and Ezmeral fits perfectly into the part where we design and then basically put those projects at scale and into production. >> That's perfect. Can you be a bit more specific about how you see HPE and Dataiku really tightening up a customer outcome and value proposition? >> Yes. So what we see is the challenge of the market that probably about 80% of the use cases really never make it to production. And this is of course a big challenge, and we need to change that. And I think the combination of the two of us is actually addressing exactly this need. As part of the MLOps approach, Dataiku and the Ezmeral Container Platform provide a frictionless approach, which means that without scripting and coding, customers can put all those projects into the production environment and don't have to worry any more, and can be more business oriented. >> That's great. So you mentioned you're seeing customers be a lot more mature with their AI workloads and deployment. What do you suggest for the other customers out there that are just starting this journey or just thinking about how to get started? >> Yeah. That's a very good question, Ron. So what we see there is actually the challenge that people need to go along a path of maturity. This starts with simple data pipelines, et cetera, and then basically moves up the ladder to build large, complex projects. And here I see a very interesting offer coming now from HPE which is called D3S, the data science startup pack. That's something I discussed together with HPE back in early 2020. And basically, it covers three stages, which are explore, experiment and evolve, and builds quick MVPs for the customers. By doing so, you basically address the business objectives, lay out the proper architecture and also set up the proper organization around it. So, this is a great combination by HPE and Dataiku through the D3S. >> And it's a perfect example of what I mentioned earlier about leveraging the ecosystem program that we built to do deeper solutioning efforts inside of HPE, in this case with our AI business unit. So, congratulations on that and thanks for joining us today. I'm going to shift gears. I'm going to bring in Omri Geller from Run:AI. Omri, welcome. It's great to have you. You guys are killing it out there in the market today. And I just thought we could spend a few minutes talking about what is so unique and differentiated about your offerings. >> Thank you, Ron. It's a pleasure to be here. Run:AI creates a virtualization and orchestration layer for AI infrastructure. We help organizations to gain visibility and control over their GPU resources and help them deliver AI solutions to market faster. And we do that by managing granular scheduling, prioritization, and allocation of compute power, together with the HPE Ezmeral Container Platform. >> That's great.
And your partnership with HPE is a bit newer than Daniel's, right? Maybe about the last year or so we've been working together a lot more closely. Can you just talk about the HPE partnership, what it's meant for you and how you see it impacting your business? >> Sure. First of all, Run:AI is excited to partner with the HPE Ezmeral Container Platform and help customers manage GPUs for their AI workloads. We chose HPE since HPE has years of experience partnering on AI use cases and outcomes with vendors who have a strong footprint in these markets. HPE works with many partners that are complementary to our use case, such as Nvidia, and the HPE Ezmeral Container Platform together with Run:AI and Nvidia delivers world class solutions for AI accelerated workloads. And as you can understand, for AI, speed is critical. Companies want to get their important AI initiatives into production as soon as they can. And the HPE Ezmeral Container Platform, running our GPU orchestration solution, enables that through dynamic provisioning of GPUs, so that resources can be easily shared, efficiently orchestrated and optimally used. >> That's great. And you talked a lot about the efficiency of the solution. What about from a customer perspective? What is the real benefit that our customers are going to be able to gain from an HPE and Run:AI offering? >> So first, it is important to understand how data scientists and AI researchers actually build solutions. They do it by running experiments. And if a data scientist is able to run more experiments per given time, they will get to the solution faster. With the HPE Ezmeral Container Platform, Run:AI and users such as data scientists can actually do that, and seamlessly and efficiently consume large amounts of GPU resources, run more experiments per given time, and therefore accelerate their research. Together, we actually saw a customer that is running almost 7,000 jobs in parallel over GPUs, with efficient utilization of those GPUs. And by running more experiments, those customers can be much more effective and efficient when it comes to bringing solutions to market. >> Couldn't agree more. And I think we're starting to see a lot of joint success together as we go out and tell the story. Hey, I want to thank you both one last time for being here with me today. It was very enlightening for our team to have you as part of the program. And I'm excited to extend this customer value proposition out to the rest of our communities. With that, I'd like to close today's session. I appreciate everyone's time. And keep an eye out on our ISV marketplace for Ezmeral. We're continuing to expand and add new capabilities and new partners to our marketplace. We're excited to do a lot of great things and help you guys all be successful. Thanks for joining. >> Thank you, Ron. >> What a great panel discussion. And these partners really do have a good understanding of the possibilities of working on the platform, and I hope and expect we'll see this ecosystem continue to grow. That concludes the main program, which means you can now pick one of three live demos to attend and chat live with experts. Now those three include a day in the life of an IT admin, a day in the life of a data scientist, and even a day in the life of the HPE Ezmeral Data Fabric, where you can see the many ways the data fabric is used in your life today. Wish you could attend all three? No worries. The recordings will be available on demand for you and your teams.
Moreover, the show doesn't stop here, HPE has a growing and thriving tech community, you should check it out. It's really a solid starting point for learning more, talking to smart people about great ideas and seeing how Ezmeral can be part of your own data journey. Again, thanks very much to all of you for joining, until next time, keep unleashing the power of your data.
SUMMARY :
and how it can help you Hey, would you mind just talking a bit and integrated that with the and really what that's meant for Dataiku. So, basically I'd like the quote here Florian Douetteau, and how HPE Ezmeral Container Platform and the models in production. about how you see HPE and and the Ezmeral Container Platform or just thinking about how to get started? and builds quickly MVPs for the customers. and differentiated from your offerings. and control over their GPO resources and how do you see it and HPE Container Platform together with Run:AI efficiency of the solution. So first, it is important to understand for our team to have you and even a day in the life of
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Daniel | PERSON | 0.99+ |
Ron Kafka | PERSON | 0.99+ |
Ron | PERSON | 0.99+ |
Omri Geller | PERSON | 0.99+ |
Florian Douetteau | PERSON | 0.99+ |
HPE | ORGANIZATION | 0.99+ |
Daniel Hladky | PERSON | 0.99+ |
Dataiku | ORGANIZATION | 0.99+ |
two | QUANTITY | 0.99+ |
2020 | DATE | 0.99+ |
Nvidia | ORGANIZATION | 0.99+ |
2018 | DATE | 0.99+ |
DSS | ORGANIZATION | 0.99+ |
one | QUANTITY | 0.99+ |
last year | DATE | 0.99+ |
today | DATE | 0.99+ |
three | QUANTITY | 0.99+ |
early 2020 | DATE | 0.99+ |
first | QUANTITY | 0.98+ |
Data Science Studio | ORGANIZATION | 0.98+ |
Ezmeral | PERSON | 0.98+ |
Ezmeral | ORGANIZATION | 0.98+ |
Dataiku Data Science Studio | ORGANIZATION | 0.97+ |
three live demos | QUANTITY | 0.97+ |
both | QUANTITY | 0.97+ |
about 80% | QUANTITY | 0.96+ |
HPEs | ORGANIZATION | 0.95+ |
three stages | QUANTITY | 0.94+ |
two great guest speakers | QUANTITY | 0.93+ |
Omri | PERSON | 0.91+ |
Analytics Unleashed | ORGANIZATION | 0.91+ |
D3S | TITLE | 0.87+ |
almost 7,000 jobs | QUANTITY | 0.87+ |
HPE Container Platform | TITLE | 0.86+ |
HPE Ezmeral Container Platform | TITLE | 0.83+ |
HBE Ezmeral | ORGANIZATION | 0.83+ |
Run | ORGANIZATION | 0.82+ |
Ezmeral Container Platform | TITLE | 0.81+ |
about five years ago | DATE | 0.8+ |
Platform | TITLE | 0.71+ |
Ezmeral | TITLE | 0.7+ |
Run:AI | ORGANIZATION | 0.7+ |
Ezmeral Data | ORGANIZATION | 0.69+ |
2021 | DATE | 0.68+ |
Ezmeral Ecosystem Program | TITLE | 0.68+ |
ICS | ORGANIZATION | 0.67+ |
Run | TITLE | 0.66+ |
Partner Scale Initiatives | ORGANIZATION | 0.66+ |
You look at some of the industries that were getting disrupted by, you know, we talked about digital transformation and you had on the one end of the spectrum industries like publishing which are highly disrupted or taxis. And you could say Okay, well, that's, you know, bits versus Adam, the old Negroponte thing. But then the flip side of that look at financial services that hadn't been dramatically disrupted. Certainly healthcare, which is ripe for disruption Defense. So the number number of industries that really hadn't leaned into digital transformation If it ain't broke, don't fix it. Not on my watch. There was this complacency and then, >>of >>course, co vid broke everything. So, florian, I wonder if you could comment? You know what industry or industries do you think you're gonna be most impacted by data science and what I call machine intelligence or a I in the coming years and decades? >>Honestly, I think it's all of them artist, most of them because for some industries, the impact is very visible because we're talking about brand new products, drones like cars or whatever that are very visible for us. But for others, we are talking about sport from changes in the way you operate as an organization, even if financial industry itself doesn't seems to be so impacted when you look it from the consumer side or the outside. In fact, internally, it's probably impacted just because the way you use data on developer for flexibility, you need the kind off cost gay you can get by leveraging the latest technologies is just enormous, and so it will actually transform the industry that also and overall, I think that 2020 is only a where, from the perspective of a I and analytics, we understood this idea of maturity and resilience, maturity, meaning that when you've got a crisis, you actually need data and ai more than before. You need to actually call the people from data in the room to take better decisions and look for a while and not background. And I think that's a very important learning from 2020 that will tell things about 2021 and the resilience it's like, Yeah, Data Analytics today is a function consuming every industries and is so important that it's something that needs to work. So the infrastructure is to work in frustration in super resilient. So probably not on prime on a fully and prime at some point and the kind of residence where you need to be able to plan for literally anything like no hypothesis in terms of behaviors can be taken for granted. And that's something that is new and which is just signaling that we're just getting to the next step for the analytics. >>I wonder, Benoit, if you have anything to add to that. I mean, I often wonder, you know, winter machine's gonna be able to make better diagnoses than doctors. Some people say already, you know? Well, the financial services traditional banks lose control of payment systems. Uh, you know what's gonna happen to big retail stores? I mean, maybe bring us home with maybe some of your final thoughts. >>Yeah, I would say, you know, I I don't see that as a negative, right? The human being will always be involved very closely, but the machine and the data can really have, you know, see, Coalition, you know, in the data that that would be impossible for for for human being alone, you know, you know, to to discover so So I think it's going to be a compliment, not a replacement on. Do you know everything that has made us you know faster, you know, doesn't mean that that we have less work to do. 
It means that we can doom or and and we have so much, you know, to do, uh, that that I would not be worried about, You know, the effect off being more efficient and and and better at at our you know, work. And indeed, you know, I fundamentally think that that data, you know, processing off images and doing, you know, I ai on on on these images and discovering, you know, patterns and and potentially flagging, you know, disease, where all year that then it was possible is going toe have a huge impact in in health care, Onda and And as as as Ryan was saying, every you know, every industry is going to be impacted by by that technology. So So, yeah, I'm very optimistic. >>Great guys. I wish we had more time. I gotta leave it there. But so thanks so much for coming on. The Cube was really a pleasure having you.
SUMMARY :
And Wa Dodgeville is the he co founded And I have said many times on the Cube that you know, the first era of cloud was really about infrastructure, So you close the gap or the democratizing access to data And we know we all know that you and the snowflake team you get very high marks for Yeah, so So the really the challenge, you know, be four. And, you know, And so you need actually to simplify the access to you know it's pretty, pretty global, and and so you have a unique perspective on how companies the ability of certain medial certain organization actually to have built this long term strategy You know, a decade ago, Florian Hal Varian, we, you know, famously said that the sexy job in the next And it would be about you collaborate with the rest of the business. So our customers, all the Children you know, started to consume the cov you know, we talked about digital transformation and you had on the one end of the spectrum industries You know what industry or industries do you think you're gonna be most impacted by data the kind of residence where you need to be able to plan for literally I mean, I often wonder, you know, winter machine's gonna be able to make better diagnoses that data, you know, processing off images and doing, you know, I ai on I gotta leave it there.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Dave Volonte | PERSON | 0.99+ |
Florian Duetto | PERSON | 0.99+ |
Hilary Mason | PERSON | 0.99+ |
Florian Hal Varian | PERSON | 0.99+ |
Florian | PERSON | 0.99+ |
Benoit | PERSON | 0.99+ |
Ryan | PERSON | 0.99+ |
Ben Wa | PERSON | 0.99+ |
Data Aiko | ORGANIZATION | 0.99+ |
2020 | DATE | 0.99+ |
10 years | QUANTITY | 0.99+ |
Lee | PERSON | 0.99+ |
Wa Dodgeville | PERSON | 0.99+ |
next year | DATE | 0.99+ |
python | TITLE | 0.99+ |
Snowflake | ORGANIZATION | 0.99+ |
first | QUANTITY | 0.99+ |
one place | QUANTITY | 0.99+ |
one hour | QUANTITY | 0.98+ |
a decade ago | DATE | 0.98+ |
Floyd | PERSON | 0.98+ |
2021 | DATE | 0.98+ |
one day | QUANTITY | 0.98+ |
both | QUANTITY | 0.97+ |
today | DATE | 0.97+ |
first generation | QUANTITY | 0.96+ |
Adam | PERSON | 0.93+ |
Onda | ORGANIZATION | 0.93+ |
one single place | QUANTITY | 0.93+ |
florian | PERSON | 0.93+ |
each workload | QUANTITY | 0.92+ |
one | QUANTITY | 0.91+ |
four | QUANTITY | 0.9+ |
few years ago | DATE | 0.88+ |
thousands of customers | QUANTITY | 0.88+ |
Cube | COMMERCIAL_ITEM | 0.87+ |
first data scientist | QUANTITY | 0.84+ |
single | QUANTITY | 0.83+ |
Asai | PERSON | 0.82+ |
two world | QUANTITY | 0.81+ |
first era | QUANTITY | 0.74+ |
next 10 years | DATE | 0.74+ |
Negroponte | PERSON | 0.73+ |
Zaveri | ORGANIZATION | 0.72+ |
Dataiku | ORGANIZATION | 0.7+ |
Cube | ORGANIZATION | 0.64+ |
Ajai | ORGANIZATION | 0.58+ |
years | DATE | 0.57+ |
covitz | PERSON | 0.53+ |
decades | QUANTITY | 0.52+ |
Cube | PERSON | 0.45+ |
Snowflake | TITLE | 0.45+ |
Seidel | ORGANIZATION | 0.43+ |
snowflake | EVENT | 0.35+ |
Seidel | COMMERCIAL_ITEM | 0.34+ |
Democratizing AI & Advanced Analytics with Dataiku x Snowflake | Snowflake Data Cloud Summit
>> My name is Dave Vellante. And with me are two world-class technologists, visionaries and entrepreneurs. Benoit Dageville, he co-founded Snowflake and he's now the President of the Product Division, and Florian Douetteau is the Co-founder and CEO of Dataiku. Gentlemen, welcome to theCUBE, two first-timers, love it. >> Yup, great to be here. >> Now Florian, you and Benoit have a number of customers in common, and I've said many times on theCUBE that the first era of cloud was really about infrastructure, making it more agile, taking out costs. And the next generation of innovation is really coming from the application of machine intelligence to data, with the cloud as the scale platform. So is that premise relevant to you, do you buy that? And why do you think Snowflake and Dataiku make a good match for customers? >> I think it's because our values are aligned. Today it's all about removing complexity for our customers, so you close the gap: you democratize the access to data, the access to technology. It's not only about data. Data is important, but it's also about the impact of data. How can you make the best out of data as fast as possible, as easily as possible, within an organization? And another value is about just the openness of the platform, building the future together. Having a platform that is not just about the platform, but also about the full ecosystem of partners around it, bringing the level of accessibility and flexibility you need for the 10 years ahead. >> Yeah, so that's key, that it's not just data. It's turning data into insights. Now Benoit, you came out of the world of very powerful, but highly complex databases. And we all know that you and the Snowflake team get very high marks for really radically simplifying customers' lives. But can you talk specifically about the types of challenges that your customers are using Snowflake to solve? >> Yeah, so the challenge before Snowflake, I would say, was really to put all the data in one place, and run all the computes, all the workloads that you wanted to run against that data. And of course existing legacy platforms were not able to support that level of concurrency, many workloads. We talk about machine learning, data science, data engineering, data warehouse, big data workloads; all running in one place didn't make sense at all. And therefore what customers did is to create silos, silos of data everywhere, with different systems having a subset of the data. And of course now, you cannot analyze this data in one place. So Snowflake, we really solved that problem by creating a single architecture where you can put all the data into the cloud. So it's really cloud native. We really thought about how to solve that problem, how to leverage cloud and the elasticity of cloud to really put all the data in one place, but at the same time not run all workloads at the same place. So each workload that runs in Snowflake has its dedicated compute resources to run. And that makes it agile, right? Florian talked about data scientists having to run analysis, so they need a lot of compute resources, but only for a few hours. And with Snowflake, they can run these new workloads, add this workload to the system, get the compute resources that they need to run this workload. And then when it's over, they can shut down their system, it will automatically shut down. Therefore they would not pay for the resources that they don't use.
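To make that per-workload pattern concrete, here is a minimal sketch using the snowflake-connector-python package: one team gets its own warehouse that resumes on demand and suspends itself when idle, so idle compute is not billed. The account, credentials, warehouse, and table names are placeholders invented for this example, not anything from the conversation.

```python
# Sketch: a dedicated, auto-suspending warehouse for one workload.
# All connection details and object names below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="ds_user",         # placeholder user
    password="***",         # use a proper secret manager in practice
)
cur = conn.cursor()

# Dedicated compute for the data-science workload only; it resumes on demand
# and suspends after 60 idle seconds, so unused time is not paid for.
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS DS_WH
      WITH WAREHOUSE_SIZE = 'LARGE'
           AUTO_SUSPEND = 60
           AUTO_RESUME = TRUE
           INITIALLY_SUSPENDED = TRUE
""")
cur.execute("USE WAREHOUSE DS_WH")

# Run the heavy analysis against the shared data; other workloads keep their
# own warehouses, so nothing here competes with them for compute.
cur.execute("SELECT COUNT(*) FROM MY_DB.PUBLIC.EVENTS")  # placeholder table
print(cur.fetchone()[0])

# Explicit suspend is optional; AUTO_SUSPEND would stop the billing anyway.
cur.execute("ALTER WAREHOUSE DS_WH SUSPEND")
conn.close()
```

Each team can create its own warehouse this way, which is what keeps one group's heavy queries from slowing another's.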
So it's a very agile system, where you can do this analysis when you need, and you have all the power to run all these workloads at the same time. >> Well, it's profound what you guys built. I mean, of course everybody's trying to copy it now. I remember the notion of bringing compute to the data in the Hadoop days, and I think that, as I say, everybody is sort of following your suit now, or trying to. Florian, I've got to say the first data scientist I ever interviewed on theCUBE was the amazing Hilary Mason, right after she started at Bitly, and she made data science sound so compelling, but data science is hard. So same question for you, what do you see as the biggest challenges for customers that they're facing with data science? >> The biggest challenge from my perspective is that once you solve the issue of the data silo with Snowflake, you don't want to bring another silo, which would be a silo of skills. And essentially there is the talent gap between the talent available on the market, how hard it is to actually find, recruit, and train data scientists, and what needs to be done. And so you need actually to simplify the access to technologies such that every organization can make it, whatever the talent, by bridging that gap. And to get there, there is a need of actually breaking up the silos. Having a collaborative approach, where technologists and the business work together, and actually all put their hands into those data projects together. >> It makes sense. Florian, let's stay with you for a minute, if I can. Your observation space is pretty, pretty global. And so you have a unique perspective on how companies around the world might be using data, and data science. Are you seeing any trends, maybe differences between regions, or maybe within different industries? What are you seeing? >> Yeah, definitely I do see trends, not geographic that much, but much more in terms of maturity of certain industries and certain sectors. Certain industries invested a lot in terms of data, data access, the ability to store data in the last few years, and they now reach a level of maturity where they can invest more and get to the next steps. And it really relies on the ability of certain leaders, certain organizations, to have built this long-term data strategy a few years ago, and they now start reaping the benefits. >> A decade ago, Florian, Hal Varian famously said that the sexy job in the next 10 years would be statisticians. And then everybody sort of changed that to data scientist, and then all the statisticians became data scientists, and they got a raise. But data science requires more than just statistics acumen. What skills do you see as critical for the next generation of data science? >> Yeah, it's a great question, because I think the first generation of data scientists became data scientists because they could do some Python quickly and be flexible. And I think that the skills of the next generation of data scientists will definitely be different. It will be, first of all, being able to speak the language of the business, meaning how you translate data insights, predictive modeling, all of this into actionable insights or business impact. And it will be about how you collaborate with the rest of the business. It's not just how fast you can build something, how fast you can do a notebook in Python, or build predictive models of some sort.
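As a toy illustration of that translation step, and not anything either guest walked through, here is a hedged sketch of the kind of notebook work Florian alludes to: fit a quick churn model in Python, then report it as a business number rather than a model metric. The input file, column names, and revenue arithmetic are all invented for the example.

```python
# Sketch of the "notebook in Python" step, plus the translation into a
# business-facing figure. The data, columns, and dollar math are made up.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv")  # hypothetical extract, e.g. pulled from Snowflake
features = ["monthly_spend", "support_tickets", "tenure_months"]  # invented columns
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The "speak the language of the business" part: turn predicted churn
# probabilities into expected revenue at risk, not an accuracy score.
churn_prob = model.predict_proba(X_test)[:, 1]
revenue_at_risk = (churn_prob * X_test["monthly_spend"] * 12).sum()
print(f"Estimated annual revenue at risk: ${revenue_at_risk:,.0f}")
```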
It's about how you actually build this bridge with the business, and obviously those things are important, but we also must be cognizant of the fact that technology will evolve in the future. There will be new tools, new technologies, and they will still need to keep this level of flexibility to understand quickly what are the next tools they need to use, the new languages, or whatever, to get there. >> As you look back on 2020, what are you thinking? What are you telling people as we head into next year? >> Yeah, I think it's very interesting, right? This crisis has told us that the world really can change from one day to the next. And this has dramatic and profound aspects. For example, some companies all of a sudden saw their revenue line dropping, and they had to do less with data. And for some other companies it was the reverse, right? All of a sudden they were online, like Instacart, for example, and their business completely changed from one day to the other. So this agility of adjusting the resources that you have to the task, a need that can change, using a solution like Snowflake really helps with that. And we saw both in our customers. Some customers, from one day to the next, were growing big time, because they benefited from COVID and their business benefited. But others had to drop. And what is nice with cloud, it allows you to adjust compute resources to your business needs, and really adjust them hour by hour. The other aspect is understanding what is happening, right? You need to analyze. We saw all our customers basically wanted to understand: what is going to be the impact on my business? How can I adapt? How can I adjust? And for that, they needed to analyze data. And of course, a lot of data, which are not necessarily data about their business, but also data from the outside. For example, COVID data: where it is, what is the impact, the geographic impact of COVID over time. And access to this data is critical. So this is the promise of the data cloud, right? Having one single place where you can put all the data of the world. So our customers obviously then started to consume the COVID data from our data marketplace. And we had literally thousands of customers looking at this data, analyzing this data, to make good decisions. So this agility, and this adapting from one hour to the next, is really critical. And that goes with data, with cloud, with adjusting resources, and that doesn't exist on premise. So indeed, I think the lesson learned is we are living in a world which is changing all the time, and we have to understand it. We have to adjust, and that's why cloud, in some ways, is great. >> Excellent, thank you. On theCUBE we like to talk about disruption, of course, who doesn't? And also, I mean, you look at AI and the impact that it's beginning to have, kind of pre-COVID. You look at some of the industries that were getting disrupted, and everyone talks about digital transformation. And you had, on the one end of the spectrum, industries like publishing, which are highly disrupted, or taxis. And you can say, okay, well that's bits versus atoms, the old Negroponte thing. But then on the flip side, you say, look at financial services, which hadn't been dramatically disrupted, certainly healthcare, which is ripe for disruption, defense. So there are a number of industries that really hadn't leaned into digital transformation. If it ain't broke, don't fix it, not on my watch. There was this complacency. And then of course COVID broke everything.
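Benoit's data cloud example above, blending shared COVID data from the marketplace with a company's own numbers, is easy to picture as a single query. The sketch below assumes a hypothetical shared database and an internal sales table; the real marketplace listing names and schemas would differ.

```python
# Sketch: joining externally shared marketplace data with internal data.
# COVID_SHARE and MY_DB names, schemas, and columns are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="analyst", password="***")
cur = conn.cursor()

cur.execute("""
    SELECT c.report_date,
           c.region,
           c.new_cases,
           s.orders,
           s.revenue
    FROM COVID_SHARE.PUBLIC.DAILY_CASES AS c   -- data shared via the marketplace
    JOIN MY_DB.SALES.DAILY_SALES AS s          -- the company's own data
      ON s.sale_date = c.report_date
     AND s.region = c.region
    ORDER BY c.report_date
""")

# Print the first few joined rows: outside data and business data side by side.
for report_date, region, new_cases, orders, revenue in cur.fetchmany(10):
    print(report_date, region, new_cases, orders, revenue)

conn.close()
```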
So Florian, I wonder if you could comment, what industry or industries do you think are going to be most impacted by data science, and what I call machine intelligence, or AI, in the coming years and decades? >> Honestly, I think it's all of them, or at least most of them, because for some industries the impact is very visible, because we are talking about brand new products, drones, flying cars, or whatever, that are very visible to us. But for others, we are talking about profound changes in the way you operate as an organization. Even if the financial industry itself doesn't seem to be so impacted when you look at it from the consumer side, or the outside, internally it's probably impacted just because of the way you use data and the level of flexibility you need. The kind of cost gains you can get by leveraging the latest technologies is just enormous, and so it will actually transform that industry as well. And overall, I think that 2020 is a year where, from the perspective of AI and analytics, we understood this idea of maturity and resilience. Maturity meaning that when you've got a crisis, you actually need data and AI more than before; you need to actually call the people from data into the room to take better decisions, and look forward and not backward. And I think that's a very important learning from 2020, that will tell things about 2021. And the resilience, it's like, data analytics today is a function transforming every industry, and is so important that it's something that needs to work. So the infrastructure needs to work, the infrastructure needs to be super resilient, so probably not on prem, or not fully on prem, at some point. And the kind of resilience where you need to be able to plan for literally anything; no hypothesis in terms of behaviors can be taken for granted. And that's something that is new, and which is just signaling that we are just getting to a next step for data analytics. >> I wonder, Benoit, if you have anything to add to that. I mean, I often wonder, when are machines going to be able to make better diagnoses than doctors? Some people say already. Will financial services, traditional banks, lose control of payment systems? What's going to happen to big retail stores? I mean, maybe bring us home with some of your final thoughts. >> Yeah, I would say I don't see that as a negative, right? The human being will always be involved very closely, but the machine and the data can really help see correlations in the data that would be impossible for a human being alone to discover. So I think it's going to be a complement, not a replacement. And everything that has made us faster doesn't mean that we have less work to do. It means that we can do more. And we have so much to do that I would not be worried about the effect of being more efficient and better at our work. And indeed, I fundamentally think that data, processing of images and doing AI on these images, discovering patterns, and potentially flagging disease way earlier than was possible before, is going to have a huge impact in healthcare. And as Florian was saying, every industry is going to be impacted by that technology. So, yeah, I'm very optimistic. >> Great, guys, I wish we had more time. I've got to leave it there, but thanks so much for coming on theCUBE. It was really a pleasure having you.
SUMMARY :
Dave Vellante talks with Benoit Dageville, co-founder of Snowflake and President of its Product Division, and Florian Douetteau, co-founder and CEO of Dataiku, about democratizing AI and advanced analytics. Dageville explains how Snowflake's cloud-native architecture puts all data in one place while giving each workload its own elastic, auto-suspending compute, so teams pay only for what they use. Douetteau argues that closing the data-science talent gap requires breaking up skill silos, collaborating with the business, and translating models into business impact. Both reflect on 2020: the pandemic rewarded organizations that could adjust resources quickly and blend outside data, such as COVID datasets from the data marketplace, with their own. They close by predicting that AI will complement rather than replace people and will touch every industry, from healthcare to financial services.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Dave Vellante | PERSON | 0.99+ |
Benoit | PERSON | 0.99+ |
Florian Douetteau | PERSON | 0.99+ |
Florian | PERSON | 0.99+ |
Benoit Dageville | PERSON | 0.99+ |
Dataiku | ORGANIZATION | 0.99+ |
2020 | DATE | 0.99+ |
Hillary Mason | PERSON | 0.99+ |
Hal Varian | PERSON | 0.99+ |
10 years | QUANTITY | 0.99+ |
Python | TITLE | 0.99+ |
Snowflake | ORGANIZATION | 0.99+ |
Germany | LOCATION | 0.99+ |
one hour | QUANTITY | 0.99+ |
both | QUANTITY | 0.99+ |
next year | DATE | 0.99+ |
Bitly | ORGANIZATION | 0.99+ |
one day | QUANTITY | 0.98+ |
2021 | DATE | 0.98+ |
A decade ago | DATE | 0.98+ |
one place | QUANTITY | 0.97+ |
Snowflake Data Cloud Summit | EVENT | 0.97+ |
Snowflake | TITLE | 0.96+ |
each workload | QUANTITY | 0.96+ |
today | DATE | 0.96+ |
first generation | QUANTITY | 0.96+ |
Benoir | PERSON | 0.95+ |
snowflake | EVENT | 0.94+ |
first era | QUANTITY | 0.92+ |
COVID | OTHER | 0.92+ |
single architecture | QUANTITY | 0.91+ |
thousand customers | QUANTITY | 0.9+ |
first data scientist | QUANTITY | 0.9+ |
one | QUANTITY | 0.88+ |
one single place | QUANTITY | 0.87+ |
few years ago | DATE | 0.86+ |
Negroponte | PERSON | 0.85+ |
Florain | ORGANIZATION | 0.82+ |
two world | QUANTITY | 0.81+ |
first | QUANTITY | 0.8+ |
Instacart | ORGANIZATION | 0.75+ |
next 10 years | DATE | 0.7+ |
hours | QUANTITY | 0.67+ |
Snowflake | EVENT | 0.59+ |
a minute | QUANTITY | 0.58+ |
theCUBE | ORGANIZATION | 0.55+ |
Adam | PERSON | 0.49+ |