Dr. Shannon Vallor, Santa Clara University | Accenture Technology Vision 2018
>> Hey, welcome back, everybody. Jeff Frick here with theCUBE. We're at the Accenture Technology Vision 2018, actually the preview event, about 200 people. The actual report comes out in a couple of days. A lot of interesting conversations about what the big trends in 2018 are at Accenture. Paul Daugherty and team ran the survey, and we're really excited. We just had a panel discussion to get into a little bit of, not exactly a technology, but really the trust and ethics conversations. We're joined by Dr. Shannon Vallor. She's a professor at Santa Clara University. Dr. Vallor, great to see you.
>> Great to be here, thank you!
>> So you were just on the panel, and of course there was a car guy on the panel, so everybody loves to talk about cars and autonomous vehicles. You didn't get enough time. (chuckles) So we've got a little more time, which is great.
>> Great!
>> But one of the things that you brought up that I think was pretty interesting is really a higher-level view of what role technology plays in our lives. You said before, it was ancillary: it was a toy, it was a gimmick, a cool new car, a status symbol, or whatever. But now technology is really defining who we are, what we do, how we interact, not only with the technology but with other people. It's taken on a much more fundamental role, with a whole set of new challenges.
>> Yeah, and fundamentally that means that these new technologies are helping to determine how our lives go, not just whether we have the latest gadget or status symbol. Previously, as I said, we tended to take on technologies as ornaments to our life, as luxuries to enrich our life. Increasingly, they are the medium through which we live our lives, right? They're the ways that we find the people we want to marry. They're the ways that we access resources, capital, healthcare, knowledge. They're the ways that we participate as citizens in a democracy. They are entering our bodies. They're entering our homes. And the level of trust that's required to really welcome technology in this way, without ambivalence or fear, is a kind of trust that many technology companies weren't prepared to earn.
>> Jeff: Right, right.
>> Because it goes much deeper than simply having to behave in a lawful manner or satisfy your shareholders, right? It means actually having to think about whether your technologies are helping people live better lives, and whether you're earning the trust that your marketing department, your engineers, your salespeople are out there trying to get from your customers.
>> Right. And it's really interesting. When you talked about a refrigerator, I just love that example, 'cause most people would never let their next-door neighbor look into their refrigerator.
>> Shannon: Or their medicine cabinet, right?
>> Or their medicine cabinet, right. And now you want to open that up to automatic replenishment. And it's interesting, 'cause I don't think a lot of companies came into the business with the idea that they were going to have an intimate relationship with their customers to this degree, and a personal responsibility to that data. They just want to sell them some good stuff and move on
>> Sure.
>> to the next customer.
>> Yes.
>> So it's a very different mindset. Are they adjusting? How are the legacy folks dealing with this?
>> Well, the good news is that there are a lot more conversations happening about technology and ethics within industry circles.
And you even see large organizations coming together to try to lead an effort to develop more ethical approaches to technology design and development. So, for example, the big five leaders in AI have come together to form the Partnership on AI, and this is a really groundbreaking movement that could potentially lead other industry participants to say, "Hey, we need to get on board with this, and we have to start thinking
>> Right.
>> about what ethical leadership looks like for us," as opposed to just a sort of PR kind of thing, where we throw the word "ethics" on a few websites or slides and then we're good, right?
>> Right.
>> It has to go much deeper than that, and that's going to be a challenge. It has to be at a level where rank-and-file workers and project managers have procedures they know how to go through, procedures that involve ethical analysis, prediction, and preparing ethical responses to failures or conflicts that might arise.
>> Right, there are just so many layers to this that we could go on for a long time.
>> Sure.
>> But the autonomous band has kicked up.
>> Yes, yes!
>> But one of the things is, when you're collecting the data for a specific purpose, and you put all the specifics in as to why and how you're going to treat it, what you don't know is how that data might be used by someone else next week,
>> Yes.
>> next year,
>> Yes.
>> ten years from now.
>> Absolutely.
>> And you can't really know, because there may be things that you aren't aware of. So it's a very difficult challenge.
>> And I think we have to just start thinking in terms of different kinds of metaphors. So data, up until now, has been seen as something that had value and very little risk associated with it. Now our attitudes are starting to shift, and we're starting to understand that data carries not just value, not just the ability to be monetized, but immense power. And that power can be both constructive and destructive. Data is like jet fuel, right? It can do great things.
>> Right.
>> But you've got to store it carefully. You have to make sure that the people handling it are properly trained, that they know what can go wrong.
>> Right.
>> Right? That they've got safety regimes in place. No one who handles jet fuel treats it the way that some companies treat data today. But today, data can cause disasters on a scale similar to a chemical explosion. People can die, lives can be ruined, and people can lose their life savings over a breach or a misuse of data that causes someone to be unjustly accused of fraud or a crime. So we have to start thinking about data as something much more powerful than we have in the past.
>> Jeff: Right.
>> And you have the responsibility to handle it appropriately.
>> Right, but we're still so far away, right? We're still sending money to the Nigerian prince who needs help getting out of Newark Airport. I mean, even just the social,
>> Yes.
>> the social factors still haven't caught up. And then you've got this kind of whole API economy, where so many apps are connected to so many apps.
>> Right.
>> So even, where is the data?
>> Yeah.
>> And that's before you even get into a plane flying over international borders while you send an email, I mean.
>> Right, yes.
>> The complexity is crazy!
>> Yep, and we're never going to get a handle on all of it. So one of the things I like to tell people is, it's important not to let the perfect become the enemy of the good, right?
>> Jeff: Right.
>> So the idea is, yes, the problem is massive.
Yes, it's incredibly complex. Can we address every possible risk? Can we forestall every possible disaster? No. Can we do much better than we're doing now? Absolutely. So I think the important thing is not to focus on how massive the problem or the complexities are, but to think about how we can move forward from here to get ourselves into a better and more responsible position. And there are lots of ways to do that. Lots of companies are already leading the way in that direction. So I think there's so much progress to be made that we don't have to worry too much about the progress that we might never get around to making.
>> Right, right. But then there's this other interesting thing going on that we've seen with kind of the whole "fake news," right? Which is, algorithms are determining what we see.
>> Shannon: Yes.
>> And if you look at the ad tech model as kind of where the market has taken over the way that that operates,
>> Shannon: Yep.
>> there are no people involved. So then you have things happen like what happened with YouTube, where advertisers' stuff is getting put into places where they don't want it.
>> Yeah.
>> But there are really no people, there's no monitoring.
>> Yes.
>> So how do you see that kind of evolving? 'Cause on one hand, you want more social responsibility and keeping track of things. On the other hand, so much is moving to software, automation, and giving people more of what they want, not necessarily what they need.
>> Well, that means we have to do a much better job of investing in human intelligence. For every new form of artificial intelligence, we need an even more powerful provision of human intelligence to guide it, to provide oversight. So what I like to say is, AI is not ready for solo flight, right? And a lot of people would like that to be the case because, of course, you can save money if you can put an automated adjudication system in there and take the people out. But we've seen over and over again that that leads to disaster and to huge reputational losses for companies, often huge legal liabilities, right? So we have to get companies to understand that they are really protecting themselves and their long-term health if they invest in human expertise and human intelligence to support AI, to support data, to support all of the technologies that are giving these companies greater competitive advantage and profitability.
>> But does the delta between machine scale and human scale just become unbearable? Or can we use the machine scale to filter out the relatively small number of things that need a person to get involved? I mean,
>> Yeah, and the--
>> how do you see some kind of best practices?
>> Yeah, so the answer depends on the industry, depends upon the application. So there's no one-size-fits-all solution. But what we can often do is recognize that typically human and AI function best together, right? So we can figure out the ways in which the AI can amplify the human expertise and wisdom, and the human expertise can fill in some of the gaps that still exist in artificial intelligence, some of the things that AIs just don't see, just don't recognize, just aren't able to value or predict. And so when we figure out the ways that human and artificial intelligence can complement each other in a particular setting, then we can get the most reliable results, and often the fairest and safest results.
They might not always be the most efficient from the narrow standpoint of speed and profit, right?
>> Jeff: Right, right.
>> So we have to be able to step back and say, at the end of the day, quality matters, trust matters. Just as a shoddy product put together on the cheap and put out there is going to come back to bite us, if we put shoddy AI in place of important human decisions that affect human lives, it's going to come back to bite us. So we need to invest in the human expertise and the human wisdom, which has that ethical insight, to round out what AI still lacks.
>> So do you think the execution of that trust building becomes the next great competitive advantage? I mean,
>> Yeah.
>> nobody talks about that, right? Data's the new oil,
>> Sure!
>> and blah, blah, blah, blah, blah, and software-defined, AI-driven automation. But that's not necessarily the only road to the goal, right? There are issues.
>> Right.
>> So is trust, you think,
>> Absolutely.
>> the next great competitive differentiator?
>> Absolutely. I think in the long run it will be. Look, for example, at the way that companies like Facebook and Equifax have really damaged, in pretty profound ways, the public perception of them as trustworthy actors, not just in the corporate space, right? But in the political space for Facebook, in the economic space for Equifax. And we have to recognize that those associations of a major company with that level of failure are really lasting, right? Those things don't get forgotten in one news cycle. So I think we have to recognize that today people don't know who to trust, right? It used to be that you could trust the big names, the big Fortune 500 companies.
>> The blue chips, right.
>> The blue chips, right.
>> Right.
>> And then it was the little fly-by-night companies that you didn't really know whether you could trust, and maybe you'd be more cautious in dealing with them. Now the public has no way of understanding which companies will genuinely fulfill the trust in the relationship
>> Right.
>> that the customer gives them. And so there's a huge opportunity from a competitive standpoint for companies to step up and actually earn that trust and say, in a way that can be backed up by action and results, "Your data's safe with us," right? "Your property's safe with us. Your bank account is safe with us. Your personal privacy is safe with us. Your votes are safe with us. Your news is safe with us."
>> Right.
>> Right? And that's the next step.
>> But everyone is so cynical. Unfortunately, Walter Cronkite is dead, right?
>> Sure.
>> We don't trust politicians anymore. We don't trust news anymore. And now, more and more, we don't trust the companies. So it's a really kind of rough world in the trust space.
>> Yeah!
>> So do you see any kind of (chuckles) silver lining? I mean, how do we execute in this kind of crazy world where you just don't know?
>> Well, what I like to say is that you have to be cautiously optimistic about this, because society simply doesn't keep going without some level of trust, right? Markets depend on trust. Democracy depends on trust. Neighborhoods depend on trust, right?
>> Jeff: Right.
>> So either trust comes back into our lives at some deep level, or everything falls apart. Frankly, those are the only choices. So if nature abhors a vacuum, and right now we have a vacuum of trust, then there's a huge opportunity for people to start stepping into that space and filling that void.
So I'd like to focus on the positive potential here rather than the worst-case scenario, right? The worst-case scenario is, we keep going as things have been going, and trust in our most important institutions continues to crumble. Well, that just ends in societal collapse
>> Right, right.
>> one way or the other. If we don't want to do that, and I presume that if there's anything we can all agree on, it's that that's not where we want to go,
>> Right.
>> then now is the time for companies, if need be, to come together and say, "We have to step into this space and create new trusted institutions and practices that will help stabilize society and drive progress in ways that aren't just reflected in GDP, but are reflected in human wellbeing, happiness, a sense of security, a sense of hope. A sense that technology actually does give us a future that we want to be happy about moving into."
>> Right, right.
>> Right?
>> So I'll give you the last word.
>> Sure.
>> We'll end on a positive note. What are some examples of companies or practices that you see out there as kind of shining lights that other people should either be aware of or emulate? Let's talk about the positive before we
>> Sure.
>> cut you loose.
>> Well, one thing that I mentioned already is the AI partnership that has come together, with companies that are really leading the conversation, along with a lot of other organizations like AI Now, which is an organization on the East Coast that's doing a lot of fantastic work. There are a lot of companies supporting research into ethical development, design, and implementation of new technologies. That's something we haven't seen before, right? This is something that's only happened in the last two or three years. It's an incredibly positive development. Now we just have to make sure that the recommendations that are developed by these groups are actually taken on board and implemented. And it'll be up to many of the industry leaders to set an example of how that can be done, because they have the resources
>> Right.
>> and the ability to lead in that way. I think one of the other things we can look at is that people are starting to become less naive about technology. Perhaps the silver lining of the loss of trust is the ability of consumers to be a little wiser, a little more appropriately critical and skeptical, and to figure out ways that they can, in fact, protect their interests. That they can actually seek out and determine who earns their trust,
>> Right.
>> where their data is safest. And so I'm optimistic that there will be a sort of meeting, if you will, of the public interest and the interests of technology developers, who really need the public to be on board, right?
>> Jeff: Right.
>> You can't make a better world if society doesn't want to come along with you.
>> Jeff: Right, right.
>> So my hope is, and I'm cautiously optimistic about this, that these forces will come together and create a future for us that we actually want to move into.
>> All right, good. I don't want to leave on a sad note!
>> Great, yes.
>> Dr. Shannon Vallor, she's positive about the future. It's all about trust. Thanks for taking a few minutes.
>> Thank you.
>> I'm Jeff Frick, she's Dr. Shannon Vallor. Thanks for watching. We'll catch you next time. (upbeat techno music)