Jeff Erhardt, GE | CUBEConversation, May 2018
(upbeat orchestral music) >> Welcome back everybody. Jeff Frick here with the CUBE. We're at our Palo Alto studios having a CUBE conversation about digital transformation, industrial internet, AI, ML, all things great, and we're really excited to have a representative of GE, one of our favorite companies to work with because they're at the cutting edge of old industrial stuff and new digital transformation and building a big software organization out in San Ramon. So we're so happy to have here, for the first time, Jeff Erhardt. He is the VP of Intelligent Systems at GE Digital. Jeff, great to see you. >> Pleasure to be here. Thanks for having me. >> Absolutely, so how did you get into GE? You're actually a creature of the valley, you've been here a little while. How did you end up at GE? >> I have. I'm a new guy, so I've been here about a year and a half. I came in via the acquisition of a company called Wise.io where I was the CEO, so I've spent the last 10 years or so of my life building two different analytics startups. One was based around a very popular and powerful open source language called R, and spent a lot of time working with much of the Fortune 500. Think the really data-driven companies now that you would think of, the Facebooks, the Goldman Sachs, the Mercks, the Pfizers, helping them go through this data-driven journey. Anyway, that company was acquired by Microsoft and is embedded into their products now. But the biggest thing I learned from that was that even if you have really good data science teams, it's incredibly hard to go from whiteboard into production. How do you take concepts and make them work reliably, repeatably, scalably over time? And so, Wise.io was a machine learning company that was a spin-out from Berkeley, and we spent time building what I now refer to as intelligent systems for the purposes of customer support automation within things like the Salesforce and Zendesk ecosystems, and it was really that capability that drew us to GE, or drew GE to approach us, to think about how do we bridge that gap, not just from algorithms, but into building true intelligent applications? >> Right, so GE is such a great company. They've been around for a hundred years, original Dow component, Jeff Immelt's not there now, but he was the CEO I think for 16 years. A long period of time. Beth Comstock, fantastic leader. Bill Ruh building this great organization. But it's all built around these industrial assets. But they've started, they did the industrial internet launch. We helped cover it in 2013. They have the Predix Cloud, their own kind of industrial internet cloud, had a big developer conference. But I'm curious, coming from kind of a small Silicon Valley startup situation, when you went into GE, what's kind of the state of their adoption, you know, kind of how had Bill's group penetrated the rest of GE, and were they making progress? Were people kinda getting it, or were you still doing some evangelical work out in the field? >> Absolutely both, meaning people understood it and were implementing it, yet I think there were maybe misunderstandings about how to think about software and data, in particular analytics and AI, machine learning. And so a big part of my first year at the company was to spend the time coming in really from the top down, from sort of the CEO and CDO levels across the different businesses, understanding what was the state of data and data-driven processes within their businesses.
And what I learned really quickly was that the core of this business, and this is all public information, it's been well publicized, is in things like GE Aviation. It's not necessarily the sale of the engine that is incredibly profitable, but rather it's maintaining and servicing that over time. >> Right. >> And what organizations like them, like our oil and gas division with things like their inspection capabilities, like our power division, had really done is they had created as-a-service businesses where they were taking data across the customer base, running it through a data-driven process, and then driving outcomes for our customers. And all of a sudden the aha moment was, wow, wait a minute. This is the business model that every startup in the valley is getting funded to take down the traditional software players for. It's just not yet modern, scalable, repeatable, with AI and machine learning built in, but that's the purpose and the value of building these common platforms with these applications on top that you can then make intelligent. >> Right. >> So, once we figured that out it was very easy to know where to focus and start building from that. >> So it's just, it's kinda weird I'm sure for people on the outside looking in to say data-driven company. We all want to drive data-driven companies. But then you say, well wait a minute, now GE builds jet engines. There's no greater example that's used at conferences as to the number of terabytes of data an engine throws off on a transcontinental flight. Or you think of a power plant or locomotives and you think of the control room with all this information, so it probably seems counterintuitive to most that, didn't they have data, weren't they a data-driven organization? How has the onset of machine learning and some of the modern architectures actually turned them into a data-driven company, where before I think they were, but really not to the level that we're specifying here? >> Yah, I-- >> What would be your objective, what are you trying to take on with this? >> Absolutely, machine learning, AI, whatever buzzwords you want to use, is a fascinating topic. It's certainly come into vogue. Like many things that are hyped, it gets confused, gets misused, and gets overplayed. But it has the potential to be both an incredibly simple technology as well as an incredibly powerful technology. So, one of the things I've most often seen cause people to go awry in this space is to try to think about what are the new things that I can do with machine learning? What is the greenfield opportunity? And whenever I'm talking to somebody at whatever level, but particularly at the higher levels of the company, I like to take a step back and I like to say, "What are the value-producing, data-driven workflows within your business?" And I say define for me the data that you have, how decisions are made upon it, and what outcome you are driving for. And if you can do that, then what we can do is we can overlay machine learning as a technology to intelligently automate or augment those processes. And in turn what that's gonna do is it's gonna force you to standardize your infrastructure, standardize those workflows, quantify what you're trying to optimize for your customers. And if you do that in a standardized and incremental way, you can look backward having accomplished some very big things. >> Right, and those are such big foundational pieces that most people I think discount, again, just the simple question of where is your data. >> That's right.
>> What form is it in? So another interesting concept that we cover all the time with all the shows we go to is democratization, right? So it seems to me pretty simple, actually. How do you drive innovation: democratize the data, democratize the tools to manipulate the data, and democratize the ability to actually do something about it. That said, it's not that easy. And this kind of concept that we see evolving from citizen developer to citizen integrator to citizen data scientist is kinda where we all want to go, but as you've experienced firsthand, it's not quite as easy as maybe it appears. >> Yah, I think that's a very fair statement, and you know, one of the things, again, I spend a lot of time talking about is I like to think about getting the right people in the right roles, using the right tools. And the term data scientist has evolved over the past five-plus years, going from, to give Drew Conway some credit for his Venn diagram, a programmer, math person, kinda domain expert, into meaning anybody that's looking at data. And there's nothing wrong with that, but the concept of taking anybody that has the ability to look at data within something like a BI or a Tableau tool, that is something that should absolutely be democratized, and you can think about creating citizens for those people. On the flip side, though, how do you structure a true intelligent system that is running reliably, robustly, and particularly, in our field, in mission-critical, high-risk, high-stakes applications? There are bigger challenges than simply are the tools easy enough to use. It's very much more a software engineering problem than it is a data access or algorithmic problem. >> Right. >> And so we need to build those bridges and think about where do we apply the citizens for that understanding, and how do we build robust, reliable software over time? >> Right, so many places we can go, and we're gonna go to a lot of them. But one of the things you touched on, which also is now coming into vogue, is kind of ML that you can buy, somebody else's ML, right? >> Mhmm. >> As you would buy an application at an app store, now there's all kinds of algorithms out there that you can purchase and participate in. And that really begs an interesting question of kinda the classic buy versus build, or as you said before we turned on the cameras, buy versus consume, because with the API economy and all these connected applications, it really opens up an opportunity that you can use a lot more than was produced inside your own four walls. >> Absolutely. >> For those applications. >> Yep. >> And are you seeing that? How's that kinda playing out? >> So we can parse that in a couple of different ways. So the first thing that I would say is there's a Google paper from a few years back that we love, and it's required reading for every new employee that we bring on board. And the title of it was "Machine Learning: The High Interest Credit Card of Technical Debt." And one of the key points within that paper is that the algorithm piece is something like five percent of an overall production machine learning implementation. And so it gets back to the citizen piece. It's not just about making algorithms easier to use, but it's also about where do you consume things from an API economy? So that's the first thing I would think about. The second thing I would think about is there's different ways to use algorithms or APIs or pieces of information within an overall intelligent system.
So you might think of speech-to-text or translation as capabilities. That's something where it probably absolutely makes sense to call an API from an Amazon or a Microsoft or a Google to do that, but then knowing how to integrate that reliably, robustly into the particular application or business problem that you have is an important next step. >> Right. >> The third thing that I would think about is, it very much matters what your space is. And there's a difference between doing things like image classification on things like ImageNet, which is publicly available images that are well documented. Is it a dog versus a cat? Is it a hot dog versus not? Versus some of the things that we face in an industrial context, which aren't really publicly available. So we deal with things like, within our oil and gas business, we have a very large pipeline inspection and integrity business where the purpose of that is to send the equivalent of an MRI machine through the pipes and collect spectral images across 14 different sensors. The ability to think that you're gonna take a pre-trained algorithm based on deep learning and publicly available images, apply it to something that is noisy, dirty, has 14 different types of sensors on it, and get a good answer-- >> Right. >> Is ridiculous. >> And there's not that many, right? >> And there's not that many. >> That's the other thing, I think people underestimate the advantage that Google has, we're all taking pictures of dogs and blueberries-- >> Correct. >> So that it's got so much more data to work with. >> That's right. >> As opposed to these industrial applications which are much smaller. >> That's right. >> Let's shift gears again. In terms of digital transformation, one of the other often-cited examples is when will the day come that GE doesn't sell just engines but actually sells propulsion miles? >> Yep. >> To really convert to a service. >> Yah. >> And that's ultimately where it needs to go, cause it's kinda the next step beyond maintenance. >> Yep. >> How are you seeing that digital transformation play out? Do people kinda get it? Do the old-line guys that run the jet engines see that this is really a better opportunity? >> Mhmm. >> Cause you guys have, and this is the broader theme, very unique data and very unique expertise that you've aggregated, across the jet engine space, from all of your customers in all of the flying conditions and all of the types of airplanes, where one individual mechanic or one individual airline just doesn't have that expertise. >> Yep. >> Huge opportunity. >> That's exactly right, and you can say the same thing in our power space, in our power generation space. You can say the same thing in the one we were just talking about, you know, things like our inspection technology spaces. That's what makes the opportunity so powerful at GE, and it's exactly the reason why I'm there, because we can't get that any place else. It's both that history, it's that knowledge tied to the data, and very importantly, it's what you hinted at, and it bears repeating: the customer relationships and the customer base upon which you can work together to aggregate all that data together. And if you look at what things are being done, they're already doing it. They are effectively selling efficiency within a power plant. They are selling safety within certain systems, and again, coming back to why create a platform, why create standardized applications, why put these on top?
It's that if you standardize that, it gives you the ability to create derivative and adjacent products very easily, very efficiently, in ways that nobody else can match. >> Right, right. And I love the whole, for people who aren't familiar with the digital twin concept, but really leveraging this concept of a digital twin not to mimic kinda the macro level, but to mimic the micro level of a particular part, unit, or engine in a particular ecosystem, where you can now run simulations, you can run tests, you can do all kinds of stuff without actually having that second big piece of capital gear out there. >> That's right, and it's really hard to mimic those if you didn't start from the first phase of how did you design, build, and put it into the field. >> Right, right. So, I want to shift gears a little bit just onto philosophical things that you've talked about, and that I found doing some research. One of them is that tech is the means to an end, and I know people talk about that all the time, but we're in the tech business. We're here in Silicon Valley. People get so enamored with the technology that they forget that it is a means to an end. It is not the end, and you have to stay focused. >> That's right. >> How are you seeing that kind of play out in GE Digital? Obviously Bill built this humongous organization. I'm super impressed he was able to hire that many people within the last like four years in San Ramon. >> Yah. >> Originally I think just to build the internal software workings within the GE business units, but now really to go much further in terms of industrial internet connectivity, etc. So how do you see that really kinda playing out? >> Yah, I think one of my favorite quotes, that I forget who it came from but I'll borrow it, is, "Customers don't want to buy a one-inch drill bit, they want to buy a one-inch hole." >> Right. >> And I think there is both an art and a science and a degree of understanding that needs to go into what is the real customer problem that they are trying to solve for, and how do you peel the onion to understand that, versus just giving them what they ask for? >> Right. >> And I think there's an organizational design element to how you get that right. So we had a visitor from Europe, the chairman of one of our large customers, who is going through this data-driven journey, and they were at the stage of simply just collecting data off of their equipment. In this case it was elevators and escalators. And then understanding how was it being used? What does it mean for field maintenance, etcetera? But his guys wanted to move right to the end stage, and they wanted to come in and say, "Hey, we want to build AI and machine learning systems." And we spent some time talking them through how this is a journey, how you step through it. And you could see the light bulb go off. That yes, I shouldn't try to jump right to that end state. There's a process of going through it, number one. And then the second thing we spent some time talking about was how he can think about structuring his company to create that bridge between the new technology people, who are building and doing things in a certain way, and the people who have the legacy knowledge of how things are built, run, and operated. >> Right. >> And it's many times those organizational aspects that are as challenging, or as big a barrier, to getting it right as a specific technology. >> Oh, for sure. I mean, people, process, and tech, it's always the people that are the hard part.
It's funny you bring up the elevator and escalator story. We did a show at Splunk many moons ago, and we had a person on from an elevator company, and the amazing insight, once they connected Splunk to it, was that they could actually tell the health of a building by the elevator traffic. >> Yah. >> Not the health of its industrial systems and its HVAC, but whether some of the tenants were in trouble. >> Yep. >> By watching the patterns that were coming off the elevator. A whole different kinda data-driven value proposition than they had before. >> Yep. >> So again, if you could share some best practices, really from your experiences with R and now kinda what you're doing at GE, about how people should start those first couple of steps in being data-driven, beyond kinda the simple terms of getting your house in order, getting your data in order, where is it. >> Yah. >> Can you connect to it? Is it clean? >> Yah. >> How should they kinda think about prioritizing? How do they look for those easy wins, cause at the end of the day it's always about the easiest wins to get the support to move to the next level. >> Yah, so I've sorta got a very simple, high-level playbook, and you know, the first step is you have to know your business. And you have to really understand and prioritize. Again, sometimes I think about not the build-buy decision per se, but maybe the build-consume decision. And again, where is it worth the effort to go through hiring the people, understanding and building those solutions, versus where is it just best to say, "I'm best to consume this product or service from somebody else." So that's number one, and you have to understand your business really well to do that. The second one is, and we touched on this before, which is getting the right people in the right seats on the bus. Understanding who those citizen data scientists are versus who your developers are, who your analytics people are, who your machine learning people are, and making sure you've got the right people doing the right thing. >> Right. >> And then the last thing is to make sure, to understand, that it is a journey. And we like to think about the journey that we go through in sort of three phases, right? Or sort of three swim lanes that could happen, both in parallel, but also as a journey. And we think about those as sort of basic BI and exploratory analytics. How do I learn is there any there there? And fundamentally you're saying, I want to ask and answer a question one time. Think about traditional business reporting. But once you've done that, your goal is always to put something into production. You say, "I've asked and answered once, now I want to ask and answer hundreds, millions, billions of times-- >> Right, right. >> In a row." And the goal is to codify that knowledge into a statistic, an analytic, a business rule. And then, how do you start running those within a consistent system? And that's gonna force exactly what you just said. Do I have my data in one place? Is it scalable? Is it robust? Is it queryable? Where is it being consumed? How do I capture what's good or bad? And once I start to then define those, I can then start to standardize that within an application workflow, and then move into, again, these complex, adaptive, intelligent systems powered by AI and machine learning. And so, that's the way we think about it. Know your business, get the people right, understand that it's a systematic journey. >> Right, and then really bake it into the application. >> That's right.
>> That's the thing, we don't want to make the same mistake that we did with big data, right? >> Yep. >> Just put it into the application. It's not this standalone-- >> Correct. >> You know, kinda funny thing. >> Exactly. >> Alright, Jeff, I'll give you the last word before we wrap for the day. So you've been with GE now for about a year and a half, about halfway through 2018. What are your priorities for the next 12 months? If we sit down here, you know, June 1 next year, what are you working on, what's kinda top of mind for you going forward? >> Yah, so top of mind for me, so as I mentioned, sort of our first year here was really surveying the landscape, understanding how this company does business, where the opportunities are. Again, where those data-driven workflows are. And we have an idea of that with the core industrial businesses. And so what we've been doing is getting that infrastructure right, getting those people right, getting the V1s of some very powerful systems set up. And so, what I'm gonna be doing over the next year or so is really working with them to scale those out within those core parts of the business, understand how we can create derivative and adjacent products over those, and then how we can take them to market more broadly, based upon, exactly as you said earlier, that large-scale data that we have available, that customer insight, and that knowledge of how we've been building the stuff, so. >> Alright, I look forward to it. >> I look forward to being back in a year. >> All right, Jeff Erhardt. Thanks for watching. I'm Jeff Frick. You're watching the CUBE from our Palo Alto studios. See you next time. (upbeat orchestral music)