StrongyByScience Podcast | Bill Schmarzo Part One
Produced from theCUBE Studios, this is Strong By Science: in-depth conversations about science-based training, sports performance, and all things health and wellness. Here's your host, Max Schmarzo. [Music] [Applause] [Music]

>> All right, thank you guys for tuning in today. I have the one and only Dean of Big Data, the man, the myth, the legend, Bill Schmarzo, who is also my dad. He is the CTO of IoT and Analytics at Hitachi Vantara, and he has a very interesting background, because he's known as the Dean of Big Data but also, in our household, as the king of the court and all things basketball-related. Unlike most people in the data world, and I say "most" as an umbrella term, Bill had an illustrious sports career playing at Coe College, the Harvard of the Midwest, my alma mater as well. I think having that background, where it's not just computer science but multiple disciplines, your jazz career, your basketball career, and obviously the career you're in now, plays a huge role in being able to interpret multiple domains and bring them together. So thank you for being here, Dad.

>> Yeah, thanks, Max. That's a great introduction; I appreciate it.

>> It's wonderful to have you. For our listeners who aren't aware, Bill, and referring to him as Bill rather than "my dad" the whole time is going to drive me crazy, has a mind that doesn't think like most. He sees things not just in terms of the single trajectory that could be taken, but across the multiple domains where things can go, both vertically and horizontally. And when we talk about data, data is so commonly brought up in sports, so commonly brought up in performance and athletic development. Big data is probably one of the biggest catchphrases or hot words people use nowadays, but it doesn't always have a lot of meaning to it, because a lot of times we get the words "big data" and then we don't get action out of big data. Bill's specialty is not just big data, it's getting action out of big data. So going forward, I think a lot of this talk will be about how to utilize big data, how to use data in general, how to organize it, and how to put yourself in a situation to get actionable insights. Just to start it off, can you talk a little bit about your background, some of the things you've done, and how you developed the insights that you have?

>> Thanks, Max. I have a broad background, and I've been doing data and analytics a long time. I was very fortunate; it was one of those Forrest Gump moments in life. In the late 1980s I was involved in a project at Procter & Gamble, I ran the project, where we brought Walmart's point-of-sale data for the first time into what we would now call a data warehouse. For many, this became the launching point of the data warehousing and BI marketplace, and we can trace the origins of many of the BI players to that project at Procter & Gamble in '87 and '88. I spent a big chunk of my life as a big believer in business intelligence and data warehousing, trying to amass data together, use that data to report on what's going on, and derive insights. I did that for 20, 25 years of my life until, as you probably remember, Max, I was recruited out of Business Objects, where I was the vice president of analytic applications, by Yahoo.
Yahoo had a very interesting problem: they needed to build analytics to help their advertisers optimize their spend across the Yahoo ad network. What I learned there, in fact what I unlearned there, was that everything I had learned about BI and data warehousing, how you constructed data warehouses, how schema-centric you were, how everything revolved around tabular data, was handled with an entirely different approach at Yahoo. That was my first introduction to Hadoop and the concept of a data lake, and my first real introduction to data science and how to do predictive and prescriptive analytics. In fact, it was such a huge change for me that I was asked to come back to TDWI, The Data Warehousing Institute, where I had been teaching for many years, to do a keynote after being at Yahoo for a year or so and share what I had observed and learned. I remember standing up in front of about 600 people and starting my presentation by saying, "Everything I've taught you the past 20 years is wrong." Well, I didn't get invited back for 10 years, so that probably tells you something. But it really was about unlearning a lot of what I had learned before.

And, Max, probably one of the aha moments for me was this: BI was very focused on understanding the questions people were trying to ask and answer; data science is about understanding the decisions they're trying to take action on. Questions, by their very nature, are informative, but decisions are actionable. So what we did at Yahoo, in order to help our advertisers optimize their spend across the Yahoo ad network, was focus on identifying the decisions the media planners and buyers and the campaign managers had to make around running a campaign: how much money to allocate to which sites, how many conversions do I want, how many impressions do I want. Those decisions are what we built predictive analytics around, so that we could deliver prescriptive actions to these two classes of stakeholders, the media planners and buyers and the campaign managers, who had no aspirations of being analysts. They were trying to be the best digital marketing executives they could possibly be; they didn't want to be analysts.

That leads me to where I am today. My teaching, my books, my blogs, everything I do is very much around how we take data and analytics and help organizations become more effective. Everything I've done since then, the books I've written, the teaching I do with the University of San Francisco and, next week, at the National University of Ireland Galway, and all the clients I work with, is really about how we take data and analytics and help organizations become more effective at driving the decisions that optimize their business and operational models. It's really about decisions, and how we leverage data and analytics to drive those decisions.

>> So how would you define the difference between a question that someone's trying to answer versus a decision that they're trying to be better informed on?

>> Here's how I'd put it. I call it the SAM test: is it Strategic, is it Actionable, is it Material? You can ask questions that are provocative, but you might not be asking questions that are strategic to the problems you're trying to solve. You may not be able to ask questions that are actionable, in the sense that you know what to do with the answer. And you don't necessarily ask questions that are material, in the sense that the value of answering the question is greater than the cost of answering it.
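As a hedged illustration, here is a minimal sketch of the SAM test applied as a filter over candidate insights. The insight records, field names, and thresholds are invented for this example; they are not taken from Schmarzo's actual methodology materials.

```python
# Hypothetical sketch: screening candidate insights with the SAM test.
# Field names and values are illustrative assumptions only.

candidate_insights = [
    {"name": "Bench unit allows 8% more points per possession",
     "supports_decision": "lineup selection",   # Strategic: tied to a target decision
     "actionable": True,                        # Actionable: we know what to change
     "estimated_value": 120_000,                # projected value of acting on it
     "cost_to_act": 15_000},                    # cost of implementing the action
    {"name": "Attendance dips on rainy Tuesdays",
     "supports_decision": None,
     "actionable": False,
     "estimated_value": 5_000,
     "cost_to_act": 20_000},
]

def passes_sam_test(insight, target_decisions):
    strategic = insight["supports_decision"] in target_decisions    # S
    actionable = insight["actionable"]                              # A
    material = insight["estimated_value"] > insight["cost_to_act"]  # M
    return strategic and actionable and material

target_decisions = {"lineup selection", "playing time"}
keepers = [i["name"] for i in candidate_insights
           if passes_sam_test(i, target_decisions)]
print(keepers)  # only the strategic, actionable, material insight survives
```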
So if I think about the SAM test when I apply it to data science and decisions: when I start mining the data, I already know which decisions are most important, because I've gone through a process to identify, validate, value, and prioritize those decisions. Now, when I start to dig through the data, all this structured and unstructured data across a number of different data sources, I'm trying to codify the patterns and relationships buried in that data, and I'm applying the SAM test against those insights: is it strategic to the problem I'm trying to solve, can I actually act on it, and is it material, in the sense that it's more valuable to act on it than it costs to create the action around it? To me, that's the big difference. Decisions, by their very nature, mean I'm actually going to take an action. Questions, by their nature, are informative, interesting, maybe very provocative; questions have an important role, but ultimately they do not necessarily lead to actions.

>> So if I'm a sport coach running a professional basketball team, some of the decisions I'm trying to make are: what program best develops my players, and what metrics will help me decide who the best prospect is. Is that the right way of looking at it?

>> Yeah. We did an exercise at USF where the students worked through: what decisions does Steve Kerr need to make over the next two games? We went through an exercise of identifying, especially, the in-game decisions: how often are you going to play somebody, how long are they going to play, what are the right combinations, what kinds of offensive plays are you going to try to run. So there are a bunch of decisions that Steve Kerr, as coach of the Warriors, needs to make in a game, to not only try to win the game but also minimize wear and tear on his players. And by the way, that's a really good point to think about: good decisions are always a conflict of competing objectives. Win the game while minimizing wear and tear on my players. All the important decisions in life have two, three, or four different variables that may not pull in exactly the same direction, which is where data science comes in. Data science is going to look across those three or four different metrics against which you're going to measure success and try to figure out the right balance, given the situation you're in.

Going back to the decision about playing time, think about all the data you might want to look at in order to optimize it: when is the next game, how far are we into the season, where do we currently sit ranking-wise, how many minutes per game has player X been playing, what does that look like over the past few years, what's their maximum? There are not a lot of decisions people are trying to make, and by the way, the beauty of the decisions is that the decisions really haven't changed in years. What's changed is not the decisions, it's the answers, and the answers have changed because we have this great abundance of data available to us, in-game performance data, health data, DNA data, all kinds of other data, plus all these great advanced analytic techniques, neural networks, supervised and unsupervised machine learning, all this great technology that can help us uncover the relationships and patterns buried in the data, which we can use to help individualize those decisions.
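As a hedged illustration of how that abundance of data can inform a playing-time decision, here is a small sketch that fits a model to synthetic workload data and compares two candidate minute allocations. The features, target, and data are invented; a real team would use its own tracking, health, and performance sources.

```python
# Hypothetical sketch: using historical workload data to inform a playing-time
# decision. All data here is synthetic and for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
minutes_last_game = rng.uniform(12, 40, n)
days_rest = rng.integers(1, 4, n)
season_minutes = rng.uniform(200, 2500, n)
# Synthetic target: next-game efficiency dips with heavy recent minutes and low rest.
next_game_efficiency = (20 - 0.15 * minutes_last_game
                        + 1.2 * days_rest
                        - 0.002 * season_minutes
                        + rng.normal(0, 1.5, n))

X = np.column_stack([minutes_last_game, days_rest, season_minutes])
X_train, X_test, y_train, y_test = train_test_split(
    X, next_game_efficiency, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

# Score a candidate decision: play a heavily used starter 36 minutes vs. 28 minutes.
for proposed_minutes in (36, 28):
    pred = model.predict([[proposed_minutes, 1, 2300]])[0]
    print(f"proposed {proposed_minutes} min -> predicted efficiency {pred:.1f}")
```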
One last point there. When people talk about big data, they get fixated on the "big" part, the volume part. It's not the volume of big data that I'm going to monetize, it's the granularity. What I mean by that is I now have the ability to build very detailed profiles. Going back to our basketball example, I can build a very detailed performance profile on every one of my players. For every player on the Warriors, I can build a profile that details out their optimal playing time, how much time they should spend on the court before a break, and what the right combinations of players are in order to generate the most offense or the best defense. I can build these very detailed individual profiles and then start meshing them together to find the right combinations. So when we talk about "big," it's not the volume that's interesting, it's the granularity.

>> Gotcha. And what's interesting from my world is this: in marketing and business, whether it's a company trying to find out more about its customers or a startup trying to learn what product to develop, there are tons of unknowns, and from my understanding, big data can help you find patterns within customers and figure out how to market. In your book you talk about needing to increase sales at Chipotle because we understand X, Y, and Z about the current environment around us. Now, in the sports science world we have our friend called science, and science has already helped us identify certain metrics that are very important and correlated with different physiological outcomes. So it almost gives us a shortcut, because in the big data world, especially with the data you deal with when trying to understand customer decisions, each customer is an individual and you're trying to compile them all together to find patterns; no one's doing science on that. It's not like lab work where someone is studying muscle protein synthesis and the amount of nutrients you need to recover from it. So in my position, I have all these pillars that already exist where I can begin my search, but there's still a bunch of unknowns. With that kind of environment, do you take a different approach, or do you still go with the large, all-encompassing "collect everything you can and siphon through it afterward"? Maybe I'm totally wrong; I'll let you take it away.

>> No, it's a good question. What's interesting about that, Max, is that the human body is governed by a series of laws, we'll say, in kinesiology, and the things you've talked about, physics, they have laws. Humans as buyers, shoppers, travelers, we have propensities; we don't have laws. I have a propensity to fly United because I get easier upgrades, but I might fly Southwest because of schedule or convenience. I have propensities, I don't have laws. So you have laws that work to your advantage. What's interesting about laws is that when you start going into the world of IoT and this concept called digital twins, those are governed by the laws of physics. I have a compressor or a chiller or an engine, and it's got a bunch of components in it that have been engineered together, and I can actually apply the laws; I can run simulations against my digital twin to understand exactly when something is likely to break, what the remaining useful life of that product is, and what the severity of the maintenance is that I need to do on it.
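As a hedged illustration of that "governed by laws" point, here is a toy Monte Carlo sketch of a digital twin estimating remaining useful life for a single component. The wear law, rates, and failure threshold are invented; a real twin would encode the actual engineering model of the device.

```python
# Hypothetical sketch: a toy "digital twin" of one component, where a known
# degradation law lets us simulate remaining useful life (RUL).
import numpy as np

rng = np.random.default_rng(42)

def simulate_remaining_life(current_wear, load_profile, wear_rate=0.002,
                            failure_threshold=1.0, n_runs=2_000):
    """Monte Carlo estimate of hours until wear crosses the failure threshold."""
    hours_to_failure = []
    for _ in range(n_runs):
        wear = current_wear
        hours = 0
        while wear < failure_threshold and hours < 50_000:
            load = rng.choice(load_profile)                     # sampled duty cycle
            wear += wear_rate * load * rng.lognormal(0, 0.1)    # stochastic wear step
            hours += 1
        hours_to_failure.append(hours)
    return np.percentile(hours_to_failure, [10, 50, 90])

# Component already at 60% of its wear budget, running a mixed duty cycle.
p10, p50, p90 = simulate_remaining_life(0.6, load_profile=[0.5, 1.0, 1.5])
print(f"RUL estimate (hours): p10={p10:.0f}, median={p50:.0f}, p90={p90:.0f}")
```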
So the human body, unlike the human psyche, is governed by laws. Human behaviors are really hard; Las Vegas is built on the fact that human behaviors are so flawed. But body physics, like the physics that runs these devices, is something you can actually build models and run simulations against, to figure out the wear and tear and the limits of what you can operate within.

>> Gotcha. Yeah, so in our world you start looking at subsystems, and you say: okay, this is your muscular system, this is your autonomic nervous system, this is your central nervous system, and these are ways we can begin to measure them. I wrote a blog on this, a stress response model, where you understand these systems, and their inferences for the most part, and then you apply a stress and you see how the body responds. And you can determine that, if I know the body, it can only respond in a certain number of ways: it's either compensatory, it's returning to baseline, or it's maladaptation. There are only so many ways, when you look at a cell at the individual level, that that cell can actually respond, and it's the aggregation of all these cellular responses that ends up manifesting as a change in a subsystem, and that subsystem can be measured, inferentially, through certain technology that we have. But I also think that at the same time we make a huge leap, and that leap is the word "inference." We're making an assumption, and sometimes those assumptions are dangerous, because if that assumption is unknown and we're wrong about it, then we sway and miss a little bit on our whole projection. So I like the idea of looking at patterns and at the probabilistic nature of it. I've actually changed my view a little bit recently; when I first talked about this, I was much more hardwired on laws, but now I think it's a law with some level of variation or standard deviation, where we have guardrails instead. That's how I think about it personally. Is that something you'd say is on the right track, or how would you approach it?

>> Yeah, actually there are a lot of similarities, Max. Your description of the human body as a set of subsystems: when we talk to organizations about things like smart cities or smart malls or smart hospitals, a smart city is made up of a series of subsystems. I've got subsystems regarding water and wastewater, traffic, safety, local development, things like that. There's a bunch of subsystems that make a city work, and each of those subsystems is comprised of a series of decisions, or clusters of decisions we call use cases, around what you're trying to optimize. So if I'm trying to improve traffic flow, if one of my subsystems is traffic flow, there are a bunch of use cases there: where do I do maintenance, where do I expand the roads, where do I put HOV lanes. So you start taking apart the smart city into the subsystems, and the subsystems are comprised of use cases, and that puts you in a really good position.
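To make that decomposition concrete, here is a minimal sketch of one way to represent an entity as subsystems, use cases, and the decisions each use case must support. The subsystems, use cases, and decisions listed are illustrative examples drawn loosely from the conversation, not an actual client model.

```python
# Hypothetical sketch: entity -> subsystems -> use cases -> decisions.
from dataclasses import dataclass, field

@dataclass
class UseCase:
    name: str
    decisions: list[str] = field(default_factory=list)

@dataclass
class Subsystem:
    name: str
    use_cases: list[UseCase] = field(default_factory=list)

smart_city = [
    Subsystem("traffic flow", [
        UseCase("road maintenance", ["where to repave next quarter"]),
        UseCase("capacity planning", ["where to add HOV lanes",
                                      "where to expand roads"]),
    ]),
    Subsystem("water and wastewater", [
        UseCase("leak detection", ["which mains to inspect this week"]),
    ]),
]

for subsystem in smart_city:
    for use_case in subsystem.use_cases:
        for decision in use_case.decisions:
            print(f"{subsystem.name} -> {use_case.name} -> {decision}")
```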
Now, here's something we did recently with a client who is trying to think about building the theme park of the future: how do we make certain that we really have a holistic view of the use cases we need to go after? It's really easy to identify the use cases within your own four walls, but digital transformation in particular happens outside the four walls of an organization. So what we're doing is a process where we build journey maps for all their key stakeholders: a journey map for the customer, a journey map for operations, a journey map for partners, and so on. You build these journey maps and you start thinking about, for example, I'm a theme park, and at some point in time my guest, my customer, is going to decide they want to go do something, they want to go on vacation. At that point in time, that theme park is competing not only against all the other theme parks; it's competing against Major League Baseball, against going to the beach on Sanibel Island, against just hanging around. They're competing at that point, and if they only start engaging the customer once the customer has actually contacted them, they've missed a huge part of the market, they've missed a huge chance to influence that person's agenda. So one of the things to think about, and I don't know exactly how this applies to your space, Max, is that as we started thinking about smart entities, we used design thinking and customer journey maps as a way to make certain we're not fooling ourselves by only looking within the four walls of our organization, that we're knocking those walls down, making them very porous, and looking at what happens before somebody engages with us and even afterwards. Again, going back to the theme park example: once they leave the theme park, they're probably posting on social media about the fun they had, or didn't have, they're probably making plans for next year, they're talking to friends. So there's a bunch of stuff, we'll call it afterglow, that happens after the event, and you want to make certain you're part of influencing that. So again, I don't know what combining the data science of use cases and decisions with the design thinking of journey maps might mean for your business, but for us, in thinking about smart cities, it's opened up all kinds of possibilities, and most importantly, for our customers it's opened up all kinds of new areas where they can create new sources of value.

>> Anyone listening to this needs to understand that when the word client or customer is used, it can be substituted for athlete. What I think is really important, hearing you talk about the amount of infrastructure you build around an idea when you approach a situation, is that this is something sports science, in my opinion, especially across multiple domains, is truly lacking. What happens is we get a piece of technology and someone says, "Go do science," while you're taking the approach of: let's actually think out what we're doing beforehand, let's determine our key performance indicators, let's understand the journey that this piece of technology is going to take with the athlete, or how the athlete is going to interact with this piece of technology throughout their four years. If you're in the private sector, that afterglow effect might be something you refer to as client retention, their ability to come back over and over and spread your word for you. If you're in the sector with student-athletes, maybe it's those athletes talking highly about your program to help with recruiting, and understanding that developing athletes is going to help make that college, or that program, or that organization, more enticing to go to.
But what really stood out was the fact that you have this infrastructure built beforehand. The example I give, and I've spoken with a good number of organizations and teams about data utilization, is this: if you were all of a sudden dropped in the middle of the woods and someone said, "Go build a cabin," and it's a giant forest, I could use as much wood as I want. I could just keep chopping down trees until I had something that worked as a shelter of some sort; even I could probably do that. But if someone said, "You have three trees to cut down to make a cabin," you would become very efficient. You're going to think about each chop and each piece of wood, how it's going to be used, your interaction with that wood, and that wood's interaction with you. So when we start looking at athlete development, client retention, or general health and wellness, it's not just "oh, this is a great idea, we want to make the world's greatest theme park" or "we want to make the world's greatest training facility"; it's: what infrastructure and steps do you need to take? And you said stakeholders: which individuals am I working with? Am I talking with the physical therapist, the athletic trainer, the skill coach? How does the skill coach want the data presented to them? Maybe that's different from how the athletic trainer wants the data presented. Maybe the sport coach doesn't want to see the data unless a red flag comes up. So now you have all these different entities, just like how you're talking about developing the customer journey throughout the theme park and making sure they have an experience that's memorable, that causes an afterglow, and that gives the experience meaning. How can we now take data and apply it in the same way, so we get the most value, like you said, out of the granular aspect of data, and really turn it into something valuable?

>> Max, you said something really important there. Let me share one of many horror stories that comes up in my daily life, which is somebody walking up to me and saying, "Hey, I've got a client, here's their data, go do some science on it." Well, what the heck, right? So we created this thing called the hypothesis development canvas. Our sales teams hate it because of the time it takes; our data science teams love it, because we do all this pre-work. We make sure we understand the problem we're going after, the decisions they're trying to make, the KPIs against which we're going to measure success and progress, the operational and financial business benefits, and the data sources we want to consider. And here's something, by the way, that's important, that maybe I wish Boeing had thought more about: what are the costs of false positives and false negatives? Do you really understand where your risk points are? The reason false positives and false negatives are so important in data science is that data science is about making predictions, and by virtue of making predictions, we are never 100% certain we're right. Predictions are built on "good enough." Well, when is good enough good enough? A lot of that determination is really about the cost of false positives and false negatives.
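Since that point about false positives and false negatives is quantitative, here is a minimal sketch of how those costs can drive the "good enough" decision threshold for a predictive model. The probabilities, labels, and dollar costs are invented for illustration.

```python
# Hypothetical sketch: choosing a decision threshold by expected cost of error.
import numpy as np

# Say a model predicts the probability that a player is over-trained.
predicted_risk = np.array([0.05, 0.20, 0.35, 0.55, 0.80, 0.95])
actually_overtrained = np.array([0, 0, 1, 0, 1, 1])

COST_FALSE_NEGATIVE = 500_000   # miss an over-trained star; he's out for the playoffs
COST_FALSE_POSITIVE = 20_000    # rest a healthy player unnecessarily

def expected_cost(threshold):
    flagged = predicted_risk >= threshold
    false_positives = np.sum(flagged & (actually_overtrained == 0))
    false_negatives = np.sum(~flagged & (actually_overtrained == 1))
    return (false_positives * COST_FALSE_POSITIVE
            + false_negatives * COST_FALSE_NEGATIVE)

thresholds = np.linspace(0.05, 0.95, 19)
best = min(thresholds, key=expected_cost)
print(f"lowest-cost threshold: {best:.2f}, cost: {expected_cost(best):,.0f}")
```

Because the cost of a false negative dwarfs the cost of a false positive in this toy setup, the lowest-cost threshold sits low: the model flags players aggressively, which is exactly the kind of trade-off the canvas is meant to surface before any modeling starts.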
Think about a professional athlete: the ramifications of overtraining someone like Kevin Durant or Steph Curry so that they're out for the playoffs carry huge financial implications for them personally and for the organization. So you really need to make sure you understand exactly what the cost of being wrong is. With this hypothesis development canvas, we do a lot of that work before we ever put science to the data.

>> Yeah, it's something that's lacking, not just in sports science but across many fields. What I mean by that, especially with the hypothesis canvas you referred to, is that it's a piece of paper that provides a common language. For listeners who aren't aware, the hypothesis development canvas is something Bill has developed with his team. It's about 13 different squares and boxes, and you can adapt it to your own profession and whatever you're diving into, but essentially it walks through the infrastructure you need to have set up in order for a hypothesis, idea, or decision to actually be worth a damn. What I mean by that is that so many times, and I hate this, but I'm going to go on a little bit of a rant and I apologize, people think, "Oh, I have an idea," and they think Thomas Edison just had an idea and made a light bulb. Thomas Edison is famous for saying, in effect, that he didn't simply make a light bulb; he learned 9,000 ways to not make a light bulb. He set up an environment that allowed for failure and allowed for learning. But what often happens is people think the idea comes not just in a flash, because it usually doesn't, it might come from some research, but that it also comes with legs, with the infrastructure to support it already built around it. That's the same way I see a lot of the data work going in our field: we get an idea, we immediately implement it, and we hope it works, as opposed to setting up a learning environment that lets you say, okay, here's what I think might happen, here's my hypothesis, here's how I'm going to apply it. And now, if I fail, because I have the infrastructure mapped out ahead of time, I can look at that infrastructure and say, you know what, that support beam, that individual box, was the weak link, we made a mistake there, and we can go back and fix it.
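To make that canvas discussion concrete, here is a minimal sketch of the kind of record such pre-work could produce. The field names paraphrase the elements mentioned above (decisions, KPIs, benefits, data sources, costs of being wrong, stakeholders); they are not the actual layout of Schmarzo's canvas, and the example values are invented.

```python
# Hypothetical sketch: capturing hypothesis-canvas pre-work as a structured record.
from dataclasses import dataclass, field

@dataclass
class HypothesisCanvas:
    hypothesis: str
    decisions_supported: list[str]
    kpis: list[str]                    # how success and progress are measured
    business_benefits: list[str]       # operational and financial benefits
    data_sources: list[str]
    cost_of_false_positive: str
    cost_of_false_negative: str
    stakeholders: list[str] = field(default_factory=list)

canvas = HypothesisCanvas(
    hypothesis="In-season load management reduces soft-tissue injuries",
    decisions_supported=["per-game minutes caps", "scheduled rest days"],
    kpis=["soft-tissue injuries per 1,000 minutes", "games missed"],
    business_benefits=["star availability in playoffs", "roster durability"],
    data_sources=["wearable load data", "game minutes", "injury history"],
    cost_of_false_positive="healthy player rested; small on-court cost",
    cost_of_false_negative="over-trained star lost for the playoffs",
    stakeholders=["sport coach", "athletic trainer", "physical therapist"],
)
print(canvas.hypothesis, "->", ", ".join(canvas.decisions_supported))
```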
Stewart Mclaurin, White House Historical Association | AWS Public Sector Summit 2018
>> Live, from Washington, D.C. It's theCUBE, covering the AWS Public Sector Summit 2018. Brought to you by Amazon Web Services, and its ecosystem partners. (futuristic music) >> Hey, welcome back everyone. We're live in Washington, D.C. for Amazon Web Services Public Sector Summit. This is their big show for the public sector. It's like a mini reinvent for specifically the public sector. I'm John Furrier, your host, with Stu Miniman, my co-host this segment, and Stewart Mclaurin, president of the White House Historic Association, is our guest. I heard him speak last night at a private dinner with Teresa Carlson and their top customers. Great story here, Amazon success story, but I think something more we can all relate to. Stewart, thank you for joining us and taking the time, appreciate it. >> Thanks John, it's just great to be with you. >> Okay, so let's jump into it; what's your story? You work for the White House Historical Association, which means you preserve stuff? Or, you provide access? Tell the story. >> Well, we have a great and largely untold story, and a part of our partnership with Amazon Web Services is to blow that open so more people know who we are and what we do, and have access to the White House, because it's the people's house. It doesn't belong to any one particular president; it's your house. We were founded in 1961 by First Lady Jacqueline Kennedy, who realized that the White House needed a nonprofit, nonpartisan partner. We have no government funding whatsoever, completely private. So we fund the acquisition of art, furnishings, decorative arts for the White House, if a new rug is needed, or new draperies are needed on the State Floor, or a frame needs to be regilded. We also acquire the china, the presidential and first lady portraits that are done; we fund those. But more importantly, in my view, is our education mission that Mrs. Kennedy also started, to teach and tell the stories of White House history going back to 1792, when George Washington selected that plot of land and the architect to build that house that we know today. So we unpack those stories through publications, programs, lectures, symposia, and now this new multifaceted partnership with AWS. >> Let's talk about, first of all, a great mission. This is the people's house; I love that. But it's always the secret cloak and dagger, kind of what's going on in there? The tours are not always, they're probably packed when people go through there, but the average person on the street doesn't have access. >> Sure, well, your cable news channels handle the politics and the policy of the place. We handle the building and the history, and all that's taken place there, including innovation and technology. If you think of Thomas Edison and Alexander Graham Bell, and others that evolved their early technologies through the White House, about 500,000 people get a chance to go through the White House every year. And when you think about in that small space, the president and his family lives, the president and his staff work, it's the ceremonial stage upon which our most important visitors are received, and then about 500,000 people schlep through, so you imagine 500,000 people that are going through your house, and all of that takes place. But it's very important to us for people to be able to see up close and personal, and walk through these spaces where Lincoln walked, and Roosevelt worked. 
>> Is that what the book you have, and share the book 'cause it's really historic, and the app that you have with Amazon, I think this is a great-- >> Sure, this is a real prize from our office. Mrs. Kennedy wanted us to teach and tell the stories of White House history, and so the first thing she wanted was a guide book, because the White House never had one. So in 1962, she published this guide book with us, and this is her actual copy. Her hands held this book. This was her copy of the book. Now, we continue to update this. It's now in its 24th edition, and each new edition has the latest renovations and updates that the latest president has added. But it's now 2018. So books are great, but we want to be able to impart this information and experience to people not only around Washington, who are going through the White House, but across the country and around the world. So this app that we've developed, you get through WHExperience at the App Store, you have three different tours. If you're walking through the White House, tours are self-guided, so unless you know what you're looking at, you don't know what you're looking at. So you can hold up an image, you can see, it brings to life for you everything that you're looking at in every room. Two other types of tours; if you're outside the White House in President's Park, it will unpack and open the doors of these rooms for you virtually, so you can see the Oval Office, and the Cabinet Room, and the Blue Room, and the Green Room. If you're around the world, there's a third tour experience, but the best part of it is, empowered by Amazon recognition technology, and it allows people to take a selfie, and it analyzes that selfie against all presidential portraits and first lady portraits, and the spatial features of your face, and it will tell you you're 47% Ronald Reagan, or 27% Jackie Kennedy, and people have a lot of fun with that part of the app. >> (laughs) That's awesome. >> Stewart, fascinating stuff. You know, when I go to a museum a lot of times, it's like, oh, the book was something you get on the way home, because maybe you couldn't take photos, or the book has beautiful photos. Can you speak a little bit about how the technology's making the tours a little bit more interactive? >> Sure, well we love books, and we'll publish six hardbound books this year on the history of the White House, and those are all available at our website, whitehousehistory.org. But the three facets of technology that we're adapting with Amazon, it's the app that I've spoken about, and that has the fun gamification element of portrait analysis, but it also takes you in a deeper depth in each room, even more so than the book does. And we can update it for seasons, like we'll update it for the Fall Garden Tour, we'll update it for the Christmas decorations, we'll update it for the Easter Egg Roll. But another part of the partnership is our digital library. We have tens of thousands of images of the White House that have literally been in a domestic freezer, frozen for decades, and with AWS, we're unpacking those and digitizing them, and it's like bringing history to life for the first time. We're seeing photographs of Kennedy, Johnson, other presidents, that haven't been seen by anybody in decades, and those are becoming available through our digital library. 
And then third, we're launching here a chatbot, so that through a Lex and Polly technology, AWS technology, you'll be able to go to Alexa and ask questions about White House history and the spaces in the White House, or keyboard to our website and ask those questions as well. >> It's going to open up a lot of windows to the young folks in education too. >> It is. >> It's like you're one command away; Hey, Alexa! >> It takes a one-dimensional picture off of a page, or off of a website, and it gives the user an experience of touring the White House. >> Talk about your vision around modernization. We just had a conversation with the CEO of Tellus, when we're talking about government has a modernization approach, and I think Obama really put the stake in the ground on that; former President Obama. And that means something to a lot of people, for you guys it's extending it forward. But your digital strategy is about bringing the experience digitally online from historical documents, and then going forward. So is there plans in the future, for virtual reality and augmented reality, where I can pop in and-- >> That's right. We're looking to evolve the app, and to do other things that are AR and VR focused, and keep it cool and fun, but we're here in a space that's all about the future. I was talking at this wonderful talk last night, about hundreds of thousands of people living and working on Mars, and that's really great. But we all need to remember our history and our roots. History applies to no matter what field you're in, medicine, law, technology; knowing your history, knowing the history of this house, and what it means to our country. There are billions of people around the world that know what this symbol means, this White House. And those are billions of people who will never come to our country, and certainly never visit the White House. Most of them won't even meet an American, but through this app, they'll be able to go into the doors of the White House and understand it more fully. >> Build a community around it too; is there any online social component? You guys looking around that at all? >> All of this is just launched, and so we do want to build some interactive, because it's important for us to know who these people are. One simple thing we're doing with that now, is we're asking people to socially post and tag us on these comparative pictures they take with presidents and first ladies. So there's been some fun from that. >> So Stewart, one of the things I've found interesting is your association, about 50 people, and what you were telling me off-camera, there's not a single really IT person inside there, so walk us through a little bit about how this partnership began, who helps you through all of these technical decisions, and how you do some pretty fun tech on your space. >> Unfortunately, a lot of historical organizations are a little dusty, or at least perceived to be that way. And so we want to be a first mover in this space, and an influencer of our peer institutions. Later this summer, we're convening 200 presidential sites from around the country, libraries, birthplaces, childhood homes, and we're going to share with them the experience that we've had with AWS. We'll partner or collaborate with them like we're already doing with some, like the Lincoln Library in Illinois, where we have a digitization partnership with them. So with us, it's about collaboration and partnership. 
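The selfie-matching feature Stewart describes earlier is attributed to Amazon Rekognition ("Amazon recognition technology" in the transcript). As a hedged sketch, this is roughly what a face comparison against digitized portraits could look like with boto3; the bucket name, object keys, and similarity threshold are invented, and the association's production app surely differs in detail.

```python
# Hypothetical sketch: comparing a visitor selfie against digitized portraits
# with Amazon Rekognition. Bucket and keys are placeholders, not real resources.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def best_portrait_match(selfie_key, portrait_keys, bucket="whha-demo-images"):
    """Return the portrait key with the highest facial similarity to the selfie."""
    best = (None, 0.0)
    for portrait_key in portrait_keys:
        response = rekognition.compare_faces(
            SourceImage={"S3Object": {"Bucket": bucket, "Name": selfie_key}},
            TargetImage={"S3Object": {"Bucket": bucket, "Name": portrait_key}},
            SimilarityThreshold=0,  # return a score even for weak matches
        )
        for match in response.get("FaceMatches", []):
            if match["Similarity"] > best[1]:
                best = (portrait_key, match["Similarity"])
    return best

portrait, similarity = best_portrait_match(
    "visitors/selfie.jpg",
    ["portraits/reagan.jpg", "portraits/jackie-kennedy.jpg"],
)
print(f"You most resemble {portrait} ({similarity:.0f}% similarity)")
```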
We are content rich, but we are reach-challenged, and a way to extend our reach and influence is through wonderful partnerships like AWS, and so that's what we're doing. Now another thing we get with AWS is we're not just hiring an IT vendor of some type. They know our mission, they appreciate our mission, and they support our mission. Teresa Carlson was at the White House with us last Friday, and she had the app, and she was going through and looking at things, and it came to life for her in a new real and fresh way, and she'd been to the White House many times on business. >> That's great; great story. And the thing is, it's very inspirational on getting these other historic sites online. It's interesting. It's a digital library, it's a digital version. So, super good. Content rich, reach-challenged; I love that line. What else is going on? Who funds you guys? How do you make it all work? Who pays the bills? Do you guys do donations, is it philanthropy, is it-- >> We do traditional philanthropy, and we'd love for anybody to engage us in that. During the Reagan Administration in 1981, someone had the brilliant idea, now if I'd been in the room when this happened, I probably would have said, "Okay, fine, do that." But thank goodness we did, because it has funded our organization all these years. And that's the creation of the annual, official White House Christmas ornament, and we feature a different president each year sequentially so we don't have to make a political decision. This year, it's Harry Truman, and that ornament comes with a booklet, and it has elements of that ornament that talk about those years in the White House. So with Truman, it depicts the south balcony, the Truman Balcony on the south portico. The Truman seal that eventually evolved into being the Presidential Seal. On the reverse is the Truman Blue Room of the White House. So these are teaching tools, and we sell a lot of those ornaments. People collect them; once you start, you can't stop. A very traditional thing, but it's an important thing, and that's been a lifeblood. Actually, Teresa Carlson chairs our National Council on White House History. John Wood, that you just had on before me, is on our National Council on White House History. These are some of our strong financial supporters who believe in our mission, and who are collaborating it with us on innovative ways, and it's great to have them involved with us because it brings life in new ways, rather than just paper books. >> Stewart, I had a non-technical question for you. According to your mission, you also obtained pieces. I'm curious; what's the mission these days? What sort of things are you pulling in? >> Well, there's a curator in the White House. It's a government employee that actually manages the White House collection. Before President and Mrs. Kennedy came into the White House, a new president could come in and get rid of anything they wanted to, and they did. That's how they funded the new, by selling the old. That's not the case anymore. With the Kennedys, there's a White House collection, like a museum, and so we'll work with the White House and take their requests. For example, a recent acquisition was an Alma Thomas painting. Alma Thomas is the first African American female artist to have a work in the White House collection; a very important addition. And to have a work in the White House collection, the artist should be deceased and the work over 25 years old, so we're getting more of the 21st century. 
The great artists of the American 20th century are becoming eligible to have their works in the collection. >> Stewart, thanks so much for coming on theCUBE and sharing your story. It's good to see you speak, and thanks for the ornament we got last night. >> Sure. Well, you've teased this ornament. Everybody's going to want and need one now, so go to whitehousehistory.org. >> John, come on, you have to tell the audience who you got face matched recognition with on the app. >> So who did you get face matched with? >> I think I'm 20% James Buchanan, but you got the Gipper. >> I'm Ronald Reagan. Supply-side economics, trickle-down, what do they call it? Voodoo economics, was his famous thing? >> That's right. >> He had good hair, John. >> Well, you know, our job is to be story tellers, and thank you for letting us share a little bit of our story here today. We love to make good friends through our social channels, and I hope everyone will download this app and enjoy visiting the White House. >> We will help with the reach side and promote your mission. Love the mission, love history, love the digital convergence while preserving and maintaining the great history of the United States. And a great, good tool. It's going to open up-- >> Amazon gave us these stickers for everybody who had downloaded the app, so I'm officially giving you your downloaded app sticker to wear. Stu, this is yours. >> Thank you so much. >> Thanks guys, really appreciate it. >> Thank so much, great mission. Check out the White House-- >> Historical Association. >> Historicalassociation.org, and get the White House app, which is WHExperience on the App Store. >> That's right. >> Okay, thanks so much. Be back with more, stay with us. Live coverage here at AWS, Amazon Web Services Public Sector Summit. We'll be right back. (futuristic music)
Western Digital Taking the Cloud to the Edge, Panel 2 | DataMakesPossible
>> They are disruptive technologies. And if you think about the disruption that's happening in business, with IoT, with OT, and with big data, you can't get anything more disruptive to the whole of the business chain than this particular area. It's an area that I focused on myself, asking the question: should everything go to the cloud? Is that the new future? Is 90% of the computing going to go to the cloud, with just little mobile devices right on the edge? It felt wrong when I did the math on it. I worked through some examples of real-world environments, wind farms, et cetera, and it clearly was not the right answer; things need to be near the edge. And I think one of the areas that solidified it for me was when you looked at an area like video. Huge amounts of data, really important decisions being made on the content of that video, for example, recognizing a face, a white hat or a black hat. If you look at the technology, sending that data somewhere else to do that recognition just does not make sense. Where is it going? It's going into the camera itself, right next to the data, because that's where you have the raw data, that's where you have the maximum granularity of data, that's where you need to do the processing of which faces are which, right close to the edge itself. And then you can send the other data back up to the cloud, for example, to improve those algorithms within that camera, to do all that sort of work on a batch basis over time. That's what I was looking at, along with the cost justification for doing that sort of work. So today we've got a set of people here on the panel, and we want to talk about coming down one level, to where IoT and IT are going to have to connect together. So on the panel I've got, and I'm going to get these names really wrong, Sanjeev Kumar?

>> Yes, that's right.

>> From FogHorn. Could you introduce yourself and what you're doing where the data is meeting the people and the machines?

>> Sure, sure. My name is Sanjeev Kumar, and I run engineering for a company called FogHorn Systems. We are bringing analytics and machine learning to the edge, so our goal and motto is to take computing to where the data is, rather than the other way around. It's a two-year-old company that was incubated in The Hive, and we are in the process of getting our second release of the product out shortly.

>> Excellent. So let me start at the other end. Rohan, can you talk about your company and what contribution you're focusing on?

>> Sure. I'm head of product marketing for Maana. Maana is a startup, about three years old. What we're doing is offering an enterprise platform for large enterprises; we're helping the likes of Shell and Maersk and Chevron digitally transform, and that simply means putting the focus on subject matter experts, putting the focus on the people. Data's definitely an important part of it, but we're allowing them to bring their expertise into the decision flows, so that ultimately the key decisions driving the revenue for these behemoths are made at a higher quality and faster.

>> Excellent. Well, two software companies; we also have a practitioner here who is actually doing fog computing, doing it for real, and has been doing it for some time. So could you, Janet George from Western Digital, introduce yourself and say something from the trenches about what's really going on?

>> Okay, very good, thank you.
I actually build infrastructure for the edge to deal with fog computing, and for Western Digital we're very lucky, because we are the largest storage manufacturer, and we have what we call the Internet of Things and the Internet of Test Equipment. I process petabytes of data that come out of the Internet of Things, which is basically our factories, and then I take these petabytes of data and process them both in the cloud and on the edge, but primarily to be able to consume that data. The way we consume that data is by building very high-profile models through artificial intelligence and machine learning, and I'll talk a lot more about that. But at the end of the day, it's all about consuming the data that you collect from anywhere, the Internet of Things, computer equipment, data that's being produced through products. You have to figure out a way to compute on that, and the cloud has many advantages and many trade-offs, so we're going to talk about the trade-offs; that's where fog computing comes into play.

>> Excellent, thanks very much. And last but not least, we have Val, and I can never pronounce your surname.

>> Bercovici.

>> Thank you. (chuckling) You are in the midst of a transition yourself, so talk about where you have been and where you're going.

>> For the better part of this century, I've been with NetApp, working in various functions, obviously enterprise storage, and around 2008 my developer instinct kind of fired up, and this thing called cloud became very interesting to me. So I became a self-anointed cloud czar at NetApp, and I ended up initiating a lot of the projects which we know today as the NetApp Data Fabric. That culminated about 18 months ago in the acquisition of SolidFire, and I'm now the acting CTO of SolidFire, but I plan to retire from the storage industry at the end of our fiscal year, at the end of April. I'm spending a lot of time with the Cloud Native Computing Foundation in particular, that is, the open-source home of Google's Kubernetes technology and about seven other related projects, we keep adding some almost every month, and I'm starting to lose track, and I'm spending a lot of time on the data gravity challenge. It's a challenge in the cloud, and it's a particularly new and interesting challenge at the edge, and I look forward to talking about that.

>> Okay, and data gravity is absolutely key, isn't it; it's extremely expensive and extremely heavy to move around.

>> And the best analogy is that workloads are like electricity, they move fairly easily and lightly; data's like water, it's really hard to move, particularly in large bodies.

>> Great. I want to start with one question on the core problem, particularly in established industries: how do we get change to work? In an IT shop, we have enough problems dealing with operations and development. In the industrial world, we have the IT and the OT people, who look at each other with less than pleasure, and mainly disdain. How do we solve the people problem in trying to put together solutions? You must be right in the middle of it; would you like to start with that question?

>> Absolutely. So we are 26 years old, probably more than that, and we have a very old and new mix of manufacturing equipment. It's the storage industry, and in our storage industry we are used to doing things a certain way. We have existing data, we have historical data, we have trend data; you can't get rid of what you already have.
The goal is to make connectors such that you can move from where you're at to where you're going, and so you have to be able to take care of the shift that is happening in the market. At the end of the day, if you look five years out, it's all going to be machine learning and AI, right? Agent technology is already here, it's proven; we can see Siri out there, we can see Alexa, we can see these agent technologies, so machine learning is getting a lot of momentum, deep learning and neural networks, things like that. So we've got to be able to look at that data and tap into our data in near real time, which is very different, and the way to do that is really making these connections happen, tapping into old versus new. For example, if you look at storage, you have file storage, you have block storage, and then you have object storage. We've not really tapped into the field of object storage, and the reason is that if you are going to process one trillion objects, like Amazon is doing right now with S3, you can't do it with file-system-level storage or with block-level storage; you have to go to objects. Think Internet of Things: how many trillions of objects are going to come out of these Internet of Things? So one, you have to be positioned from an infrastructure standpoint. Two, you have to be positioned from a use-case prototyping perspective. And three, you've got to be able to scale that very rapidly, very quickly. And that's how change happens. Change does not happen because you ask somebody to change their behavior; change happens when you show value, and people are so eager to get that value out of what you've shown them in real life that they are quick to adapt.

>> That's an excellent--

>> If I could comment on that as well: we just got through training a bunch of OT guys on our software, and two analogies actually work very well. One is that operational people are very familiar with circuit diagrams, with the flow of things through what are essentially black boxes; you can think of these as something that has a bunch of inputs and a bunch of outputs. So that's one thing that worked very well. The second thing that works very well is the PLC model, and there are direct analogies between PLCs and analytics which people on the floor can relate to. So if you have software that's based on data streams, with time as a first-class citizen, the PLC model again works very well in terms of explaining the new software to the OT people.

>> Excellent. Okay, would you want to come in on that as well?

>> Sure. A couple of points to add to what Janet said. I couldn't agree more in terms of the result. Maana did a few projects, a few pilots, to convince customers of their value, and we typically focus very heavily on operationalizing the output, making sure there is some measurable value that comes out of it, and it wasn't until the end users started seeing that value that they were willing and open to adopt the newer methodologies.
A second point to that is, a lot of the more recent techniques available to solve certain challenges, there are deep learning neural nets, there's all sorts of sophisticated AI and machine learning algorithms out there, and a lot of these are very sophisticated in their ability to deliver results, but not necessarily in the transparency of how you got there, and I think that's another thing that Maana's learned, is yes, we have this arsenal of fantastic algorithms to throw at problems, but we try to start with the simplest approach first, we don't unnecessarily try to brute force, because I think in an enterprise, they are more than willing to have that transparency in how they're solving something, so if they're able to see how we were able to get to it, how the software was able to get to a certain conclusion, then they are a lot happier with that approach. >> Could you maybe just give one example, a real-world example, make it a little bit real? >> Right, absolutely, so we did a project for a very large organization for collections; they have a lot of outstanding capital locked up, with customers not paying, it's a standard problem, you're going to find it in pretty much any industry, and so for that outstanding invoice problem, what we did was we went ahead and we worked with the subject matter experts, we looked at all the historical accounts receivable data, we took data from a lot of other sources, and we were able to come up with models to predict when certain customers are likely to pay, and when they should be contacted. Ultimately, what we wanted to give the collection agent was a list of customers to call. It was fairly straightforward; of course, the solution was not very, very easy, but at least on a holistic level, it made a lot of sense to us. When we went to the collection agents, many of them actually refused to use that approach, and this is part of change management in some sense, they were so used to doing things their way, they were so used to trying to target the customers with the largest outstanding invoice, or the ones that hadn't paid for the longest amount of time, that it actually took us a while, because initially, the feedback we got was that your approach is not working, we're not seeing the results. And when we dug into it, it was because it wasn't being used, so that would be one example. >> So again, proof points that you will actually get results from this. >> Absolutely, and the transparency, I think we actually sent some of our engineers to work with the collections agents to help them understand what approach it is that we're taking, and we showed them that this is not magic: instead of looking at the final dollar value, we're calculating time value lost, so we are coming up with a metric that allows us to incorporate not just the outstanding amount, or the time that they haven't paid for, but a lot of other factors as well. >> Excellent, Val.
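The exact metric isn't spelled out in the conversation, but a minimal sketch of what a "time value lost" style prioritization could look like, with an assumed cost of capital and purely hypothetical invoices, might be:

```python
from dataclasses import dataclass
from datetime import date, timedelta

ANNUAL_COST_OF_CAPITAL = 0.08  # assumed; the real figure would come from the business

@dataclass
class Invoice:
    customer: str
    amount: float           # outstanding amount
    due_date: date
    pay_probability: float  # model-estimated chance a contact leads to payment

def time_value_lost(inv: Invoice, today: date) -> float:
    """Carrying cost accrued while the invoice sits unpaid, weighted by how
    likely a call is to actually recover it."""
    days_overdue = max((today - inv.due_date).days, 0)
    carrying_cost = inv.amount * ANNUAL_COST_OF_CAPITAL * days_overdue / 365
    return carrying_cost * inv.pay_probability

today = date.today()
invoices = [  # illustrative numbers only
    Invoice("acme", 120_000, today - timedelta(days=30), 0.35),
    Invoice("globex", 8_000, today - timedelta(days=240), 0.90),
]
# Rank the call list by expected value recovered, not by raw size or age alone.
call_list = sorted(invoices, key=lambda i: time_value_lost(i, today), reverse=True)
print([i.customer for i in call_list])
```

The point is only that the ranking blends amount, age, and other factors, rather than sorting on a single column the way the agents were used to doing.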
>> When you asked that question, I immediately went to more of a nontechnical, business side of my brain to answer it, so my experience over the years has been, particularly during major industry transitions, I'm old enough to remember the mainframe to client-server transition, and now client-server to virtualization and cloud, and really, sales reps have that well-earned reputation of being coin-operated, and it's remarkable how much you can adjust compensation plans for pretty much anyone in a capitalist environment, and the IT/OT divide, if you will, is pretty easy to solve from a business perspective when you take someone with an IT-supporting-the-business mentality and you compensate them on new revenue streams, new business; all of a sudden, the world perspective changes, sometimes overnight, or certainly when that contract is signed. That's probably the number one thing you can do from a people perspective, is incent them and motivate them to focus on these new things; the technology, particularly nowadays, is evolving to support them in these new initiatives, but nothing motivates like the right compensation plan. >> Excellent, a great series of different viewpoints. So the second question I have, again coming down a bit to this level, is how do we architect a solution? We heard you've got to architect it, and it seems to me that that's pretty difficult to do ahead of where you're going, that in general, you take smaller steps, one step at a time, you solve one problem, you go on to the next. Am I right in that? If I am, how would you suggest people go about this decision-making of putting architectures together, and if you think I'm wrong and you have a great new way of doing it, I'd love to hear about it. >> I can take a shorter route. So we have a number of customers that are going through a phased way of adopting our technology and products, and so it begins with first gathering the data and replaying it back, to build the first level of confidence, in the sense that the product is actually doing what you're expecting it to do. So that's more from a monitoring and administration standpoint. The second stage is you begin to capture analytical logic in the product, where it can start doing prediction for you, so from operational, you go into a predictive maintenance, predictive models standpoint. The third part is prescriptive, where you actually help create a machine learning model; now, it's still in flux in terms of where the model gets created, whether it's on the cloud, in a central fashion, or in the right place, the right context in a multi-level hierarchical fog layer, and then you sort of operationalize that as close to the data again as possible, so you go through this operational to predictive to prescriptive adoption of the technology, and that's how people actually build confidence in terms of adopting something new into, let's say, a manufacturing environment, or things that are pretty expensive. So I'll give you another example, where you have the case of capacitors being built on an assembly line in manufacturing, and so can you look at data across different stations in manufacturing on an assembly line? And can you predict at the second station that it's going to fail at the eighth one? By that, what you're doing is you are actually reducing the scrap that's coming off of the assembly line.
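As an illustration of that second-station-to-eighth-station idea, here is a small, hedged sketch using synthetic data and a generic scikit-learn classifier; the real features, labels, and model choice would come from the line itself.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in: rows are units on the line, columns are station-2 measurements
# (e.g. capacitance, leakage current, temperature); label is "failed at station 8".
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 3))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=5000) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Units flagged here can be pulled after station 2 instead of being scrapped at station 8.
flagged = model.predict_proba(X_test)[:, 1] > 0.8
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}, units flagged: {flagged.sum()}")
```

Units flagged with a high failure probability can be pulled or reworked right after station two, which is where the scrap reduction comes from.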
So that's the kind of usage that you're getting to in the second and third stages. >> Host: Excellent. Janet, do you want to go on? >> Yeah, I agree, and I have a slightly different point of view also. I think architecture's very difficult; it's like Thomas Edison, he spent a lot of time creating negative knowledge to get to that positive knowledge, and so that's kind of the way it is in the trenches. We spend a lot of time trying to think through, the keyword that comes to mind is abstraction layers, because where we came from, everything was tightly coupled: compute and storage are tightly coupled, structured and unstructured data are tightly coupled, they're tightly coupled with the database, the schema is tightly coupled, so now we are going into this world of everything being decoupled. In that world, multiple things, multiple operating systems should be able to use your storage. Multiple models should be able to use your data. You cannot structure your data in any kind of way that is customized to one particular model. Many models have to run on that data on the fly, retrain themselves, and then run again, so when you think about that, you think about what suits best to stay in the cloud; maybe large amounts of training data, schema that's already processed, can stay in the cloud. Schema that is very dynamic, schema that is on the fly, that you need to read, and data that's coming at you from the Internet of Things that's changing, I call it heteroscedastic data, which is very statistical in nature and highly variable in nature, you don't have time to sit there and create rows and columns and structure this data and put it into some sort of a structured set; you need to have a data lake, you need to have a stack on top of that data lake that can then adapt, create metadata, process that data and make it available for your models. And then over time, I totally believe that now we're running into a near-realtime compute bottleneck, processing all this pattern processing for the different models and training sets, so we need a stack that we can quickly replace with GPUs, which is where the future is going, with pattern processing and machine learning, so your architecture has to be extremely flexible: high layers of abstraction, the ability to train and grow and iterate. >> Excellent. Do you want to go next? >> So I'll be a broken record, back to data gravity. I think in an edge context, you've really got to look at the cost of processing data, which is orders of magnitude less than moving it or even storing it, and so I think the real urgency, I don't know, maybe 90% of the data at the edge is kind of waste, you can filter through it and find that signal through the noise, so processing data to make sure that you're dealing with really good data at the edge first, figuring out what's worth retaining for future steps. I love the manufacturing example, I have lots of customer examples myself where, for quality control on a high-moving assembly line, you want to take thousands if not millions of images and compare frame by frame exactly, according to the schematics, where the device is compared to where it should be, or where the components on the device are compared to where they should be. Processing all of that data locally and making sure you extract the maximum value before you move data to a central data lake to correlate it against other anomalies or other similarities, that's really key, so really focus on that cost of moving and storing data, yeah.
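A toy sketch of that edge-side filtering idea, with synthetic frames standing in for the real inspection images and an arbitrary deviation threshold, might look like this:

```python
import numpy as np

THRESHOLD = 0.02  # fraction of differing pixels that marks a frame as anomalous (assumed)

def should_forward(frame: np.ndarray, reference: np.ndarray) -> bool:
    """Edge-side filter: forward a frame only if it deviates enough from the schematic."""
    diff_ratio = np.mean(np.abs(frame.astype(int) - reference.astype(int)) > 10)
    return diff_ratio > THRESHOLD

# Synthetic stand-ins for the reference schematic and a batch of inspection frames.
reference = np.zeros((480, 640), dtype=np.uint8)
frames = [reference.copy() for _ in range(999)] + [np.full((480, 640), 255, dtype=np.uint8)]

forwarded = [f for f in frames if should_forward(f, reference)]
print(f"{len(forwarded)} of {len(frames)} frames forwarded")  # the rest never leave the edge
```

Only the anomalous frames ever cross the network; the bulk of the imagery stays at the edge, which is the data gravity argument in miniature.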
>> Yes, do you want the last word? >> Sure, Maana takes an interesting approach, and I'm going to up-level a little bit. Whenever we are faced with a customer or a particular problem for a customer, we try to go with the question-answer approach, so we start with taking a very specific business question; we don't look at what data sources are available, we don't ask them whether they have a data lake, we literally get their business leaders, their subject matter experts, we literally lock them up in a room and we say, "You have to define a very specific problem statement from which we start working backwards." Each problem statement can then be broken down into questions, and what we believe is any question can be answered by a series of models. You talked about models; we go beyond just data models, we believe anything in the real world, in the case of, let's say, manufacturing, since we're talking about it, any smallest component of a machine should be represented in the form of a concept, relationships between people operating that machinery should be represented in the form of models, and even the physics equations that go into predicting behavior should be able to be represented in the form of a model, so ultimately, what that allows us is that granularity, that abstraction that you were talking about, that it shouldn't matter what the data source is; any model should be able to plug into any data source, or any more sophisticated, bigger model. I'll give you an example of that: we started solving a problem of predictive maintenance for a very large customer, and while we were solving that predictive maintenance problem, we came up with a number of models to go ahead and solve that problem. We soon realized that within that enterprise, there are several related problems, for example, replacement-part inventory management, so now that you've figured out which machine is going to fail at roughly what instant in time from now, we can also figure out what parts are likely to fail, so now you don't have to go ahead and order a ton of replacement parts, because you know what parts are likely to fail, and then you can take that a step further by figuring out which equipment engineer has the skillset to go ahead and solve that particular issue. Now, all of that, in today's world, is somewhat happening in some companies, but it is actually a series of point solutions that are not talking to each other; that's where our pattern technology graph comes into play, where each and every model is actually a node on the graph, including computational models, so once you build 10 models to solve that first problem, you can reuse some of them to solve the second and third, so it's a time-to-value advantage. >> Well, you've been a fantastic panel, and I think these guys would like to get to a drink at the bar, and there's an opportunity to talk to you people; I think this conversation could go on for a long, long time, there's so much to learn and so much to share on this particular topic. So with that, over to you! >> I'll just wrap it up real quick, thanks everyone, give the panel a hand, great job.
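The "every model is a node on a graph" idea can be sketched very simply; the node names below are hypothetical and just mirror the predictive maintenance example, not Maana's actual implementation.

```python
import networkx as nx

# Models and data sources as nodes; an edge means "feeds into".
g = nx.DiGraph()
g.add_edge("sensor_data", "failure_prediction_model")
g.add_edge("maintenance_logs", "failure_prediction_model")
g.add_edge("failure_prediction_model", "parts_demand_model")    # reused for inventory
g.add_edge("parts_inventory", "parts_demand_model")
g.add_edge("parts_demand_model", "engineer_dispatch_model")     # reused again for scheduling
g.add_edge("skills_matrix", "engineer_dispatch_model")

# Everything upstream of a new model is what it can reuse instead of rebuilding.
print(sorted(nx.ancestors(g, "engineer_dispatch_model")))
```

Solving the dispatch problem reuses the failure prediction and parts models built for the first problem, which is the time-to-value argument being made here.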
Thanks for coming out, we have drinks for the next hour or two here, so feel free to network and mingle, great questions to ask them privately, one-on-one, or just have a great conversation, and thanks for coming, we really appreciate it. For our Big Data SV event, livestreamed out, it'll be on demand at YouTube.com/siliconangle; all the video, if you want to go back and look at the presentations, go to YouTube.com/siliconangle, and of course, siliconangle.com, and Wikibond.com for the research and content coverage, so thanks for coming, one more time, big round of applause for the panel, enjoy your evening, thanks so much.