Jaron Lanier, Author | PTC LiveWorx 2018
>> From Boston, Massachusetts, it's theCUBE, covering LiveWorx 18, brought to you by PTC. (upbeat music)
>> Welcome back to the Boston Seaport everybody. My name is David Vellante, I'm here with my co-host Stu Miniman and you're watching theCUBE, the leader in live tech coverage. We're at LiveWorx, PTC's big IoT conference. Jaron Lanier is here, he's the father of virtual reality and the author of Dawn of the New Everything. Papa, welcome.
>> Hey there.
>> What's going on?
>> Hey, how's it going?
>> It's going great. How's the show going for you?
>> It's cool, it's cool. It's fine. I'm actually here talking about this other book a little bit too, but, yeah, I've been having a lot of fun. It's fun to see HoloLens applied to engines and factories. It's been really cool to see people seeing the demos. Mixed reality.
>> Well, your progeny is being invoked a lot at the show. Everybody's sort of talking about VR and applying it, and it's got to feel pretty good.
>> Yeah, yeah. It seems like VR, IoT, and blockchain are sort of the three things.
>> Wrap it all with digital transformation.
>> Yeah, digital transformation, right. So what we need is a blockchain VR IoT solution to transform something somewhere. Yeah.
>> So tell us about this new book, what's it called?
>> Yeah. This is called Ten Arguments for Deleting Your Social Media Accounts Right Now. And I realize most people aren't going to do it, but what I'm trying to do is raise awareness of how the psychological manipulation algorithms behind these systems are having an effect on society, and I think, I love the industry but I think we can do better, and so I'm kind of agitating a bit here.
>> Well Jaron, I was reading up a little bit getting ready for the interview here, and people often will attack the big companies, but you point at the user as, you know, we need to kind of take back, and we have some onus ourselves as to what we use, how we use it, and therefore can have impact on that.
>> Well, you know, what I've been finding is that within the companies in Silicon Valley, a lot of the top engineering talent really, really wants to pursue ethical solutions to the problem, but feels like our underlying business plan, the advertising business plan, keeps on pulling us back, because we keep on telling advertisers we have yet new ways to kind of do something to tweak the behaviors of users, and it kind of gradually pulls us into this darker and darker territory. The thing is, there's always this assumption, oh, it's what users want. They would never pay for something the way they pay for Netflix, they would never pay for social media that way, or whatever it is. The thing is, we've never asked users, nobody's ever gone and really checked this out. So I'm kind of putting it out there as a proposition, and I think in the event that users turn out to really want more ethical social media and other services by paying for them, you know, I think it's going to create this enormous sigh of relief in the tech world. I think it's what we all really want.
>> Well, I mean with ad-based business models there's a clear incentive to keep taking our data and doing whatever you want with it, but perhaps there's a better way. I mean, you're sort of proposing, okay, maybe users would be willing to pay for various services, which is probably true, but what if you were able to give users back control of their data and let them monetize their data? What are your thoughts on that?
>> Yeah, you know, I like a lot of different solutions. Like personally, if it were just up to me, if I ran the world, which I don't, but if I ran the world, I could make every single person in the world into a micro-entrepreneur where they can package, sell and price their data the way they want. They can form into associations with others to do it. And they can also purchase data from others as they want. And I think what we'd see is this flowering of this giant global marketplace that would organize itself and would actually create wonders. I really believe that. However, I don't run the world, and I don't think we're going to see that kind of perfect solution. I think we're going to see something that's a bit rougher. I think we might see something approximating that, or getting like a few steps towards that, but I think we are going to move away from this thing where, like right now, if two people want to do anything online together, the only way that's possible is if there's somebody else who's around to pay to manipulate them sneakily, and that's stupid. I mean we can be better than that, and I'm sure we will.
>> Yeah, I'm sure we will too. I mean we think blockchain and smart contracts are a part of that solution, and obviously a platform that allows people to do exactly what you just described.
>> And you know, it's funny, a lot of things that sounded radical a few years ago are really not sounding too radical. Like you mentioned smart contracts. I remember like 10 years ago for sure, but even five years ago when you talked about this, people were saying, oh no, no, no, the world is too conservative. Nobody's ever going to want to do this. And the truth is people are realizing that if it makes sense, you know, it makes sense. And so I think we're really seeing like the possibilities opening up. We're seeing a lot of minds opening, so it's kind of an exciting time.
>> Well, something else that I'd love to get your thoughts on, and we think a part of that equation is also reputation: that if you develop some kind of reputation system that is based on the value that you contribute to the community, that affects your reputation, and you can charge more if you have a higher reputation, or you get dinged if you're promoting fake news. That reputation is a linchpin to a successful community like that.
>> Well, right now the problem is, because of the free model, there's this incredible incentive to just sort of get people to do things, instead of normal capitalism where you say, buy my thing. It's like, you don't have to buy anything, but I'm going to try to trick you into doing something, whatever it is. And if you have a direct commercial relationship, then the person who's paying the money starts to be a little more demanding. And the reason I'm bringing that up is that right now there's this huge incentive to create false reputation. Like in reviews, a lot of the reviews are fake; followers, a lot of them are fake, for instance. And so there's like this giant world of fake stuff.
So the thing is, right now we don't have reputation, we have fake reputation, and the way to get real reputation instead of fake reputation is not to hire an army of enforcers to go around, because the companies are already doing that; it's to change the financial incentives so you're not incentivizing criminals, you know what I mean? The incentives come first, and then you can do the mop up after that, but you have to get the incentives aligned with what you want.
>> You know, and I love the title of the book. We interviewed James Scott, and if you know James Scott, he's one of the principals at ICIT; we interviewed him last fall, and we asked him, he's a security expert, what's the number one risk to our country? And he said, the weaponization of social media. Now this is before fake news came out, and he said 2020 is going to be a you-know-what show, and so, okay.
>> Yeah, you know, and I want to say there's a danger that people think this is a partisan thing. Like, you know, it's not about that. It's like, even if you happen to support whoever has been on the good side of social media manipulation, you should still oppose the manipulation. You know, I was just in the UK yesterday, and they had the Brexit vote, where there was manipulation by Russians and others. And you know, the point I've made over there is that it's not about whether you support Brexit or not. That's your business, I don't even have an opinion; I'm an American, that's something that's for somebody else. But the thing is, if you look at the way Brexit happened, it tore society apart. It was nasty, it was ugly, and there have been tough elections before, but now they're all like that. And there was a similar question when Czechoslovakia broke apart, and they didn't have all the nastiness, and it's because it was before social media. That was called the Velvet Divorce. So the thing is, it's not so much about what's being supported, whatever you think about Donald Trump or anything else, it's the nastiness. It's the way that people's worst instincts are being used to manipulate them, that's the problem.
>> Yeah, manipulation denial is definitely a problem no matter what side of the aisle you're on, but I think you're right that if the economic incentive is there, it will change behavior. And frankly, without it, I'm not sure it will.
>> Well, you know, in the past we've tried to change the way things are in the world by running around outlawing things. For instance, we had Prohibition, we outlawed alcohol, and what we did is we created this underground criminal economy, and we're doing something similar now. What we're trying to do is, we're saying we have incentives for everything to be fake, everything to be phony, everything to be about manipulation, and we're creating this giant underground of people trying to manipulate search results or trying to manipulate social media feeds, and these people are getting more and more sophisticated. And if we keep on doing this, we're going to have criminals running the world.
>> Wonder if I could bring the conversation back to virtual reality.
>> Absolutely.
>> I'm sorry about that.
>> So, but you know, you have some concerns about whether virtual reality will be something used for good, or if it could send us off the deep end.
>> Oh yeah, well. Look, there's a lot to say about virtual reality. It's a whole world after all.
So there is a danger that, if the same kinds of games that are being played on smartphones these days were transferred into virtual reality or mixed reality modalities, you could really have a poisonous level of mind control, and I do worry about that, I've worried about that for years. What I'm hoping is that the smartphone era is going to force us to fix our ways and get the whole system working well enough so that by the time technologies like virtual reality are more common, we'll have a functional way to do things. And it won't all be turned into garbage, you know, because I do worry about it.
>> I heard a positive segment on NPR saying that one of the problems is we all stare at our phones, and maybe when I have VR I'll actually be talking to actual people, so it'll actually help connections, and I'm curious to hear your thoughts on that.
>> Well, you know, most of the mixed reality demos you see these days are a person looking at the physical world, and then there's extra stuff added to the physical world. For instance, at this event, just off camera over there, there are some people looking at automobile engines and seeing them augmented, and that's great. But there's this other thing you can do, which is augmenting people, and sometimes it can be fun. You can put horns or wings or long noses or something on people. Of course, you still see them with the headsets, all that's great. But you can also do other stuff. You can have people display extra information that they have in their mind. You can have more sense of what each other are thinking and feeling. And I actually think as a tool of expression between people in real life, it's going to become extremely creative and interesting.
>> Well, I mean, we're seeing a lot of applications here. What are some of your favorites?
>> Oh gosh. Of the ones right here?
>> Yes.
>> Well, you know, the ones right here are the ones I described, and I really like them. There's a really cool one of some people getting augmentation to help them maintain and repair factory equipment. And it's clear, it's effective, it's sensible. And that's what you want, right? If you ask me personally what really charms me, a lot of the stuff my students have done really charms me. There was just one project a student intern made where you can throw virtual goop, like paint and stuff, around on the walls, and it sticks and starts running down, and this is on the real world, so you can spray paint the real world, so you can be a bit of a juvenile delinquent basically without actually damaging anything. And it was great, it was really fun, and, you know, stuff like that. There was this other thing another student did where you can fill a whole room with these representations of mathematical objects called tensors, and I'm sorry to geek out, but you had this thing where all these people could work together, manipulating tensors in this social environment. And it was like math coming alive in this way I hadn't experienced before. That really was kind of thrilling. And I also love using virtual reality to make music, that's another one of my favorite things.
>> Talk more about that.
>> Well, this is something I've been doing forever, since the '80s. I've been at this for a while, but you can make imaginary instruments and play them with your hands, and you can do all kinds of crazy things.
I've done a lot of stuff with, like, oh, I made this thing that was halfway between a saxophone and an octopus once, and I'll just--
>> Okay.
>> ...all this crazy stuff. I love that stuff, I still love it. (mumbling) It hasn't gotten old for me. I still love it as much as I used to.
>> So I love, you mentioned before we came on camera that you worked on Minority Report, and you made a comment that there were things in that that just won't work, and I wonder if you could explain a little bit more, you know, because I have to imagine there's a lot of things that you talked about in the eighties that, you know, we didn't think would happen that probably are happening.
>> Well, I mean Minority Report was only one of a lot of examples of people who were thinking about technology in past decades trying to send warnings to the future, saying, you know, like, if you try to make a society where there are algorithms predicting what'll happen, you'll have a dystopia, and that's essentially what that film is about. It uses a sort of biocomputer, these sort of bioengineered brains in these weird creatures instead of silicon computers doing the predicting. But there are a lot of different things we could talk about with Minority Report. In the old days, one of the famous VR devices was these gloves that you'd use to manipulate virtual objects. And so I put a glove in a scene mockup idea, which ended up in the film, and I didn't design the final production glove, that was done by somebody in Montreal, but the idea of putting a glove on the hero's hand was that glove interfaces give you arm fatigue. So the truth is, if you look at those scenes, they're physically impossible, and what we were hoping to do is to convey that this is a world that has all this power, but it's actually not designed for people. It actually wouldn't work. Of course, it kind of backfired, because what happened is the production designers made these very gorgeous things, and so now every year somebody else tries to make the Minority Report interface, and then you discover, oh my God, this doesn't work, you know. But the whole point was to indicate a dystopian world with that UI, and that didn't quite work, and there are many other examples I could give you from the movie that have that quality.
>> So you just finished the book. When did this go to print?
>> Yeah, so this book is just barely out. It's fresh from the printer. In fact, I have this one because I noticed a printing flaw. I'm going to call the publisher and say, oh, you've got to talk to the printer about this, but this is brand new. What happened was, last year I wrote a kind of big book about virtual reality that's for real aficionados, and it's called Dawn of the New Everything, and then when I would go and talk to the media about it, they'd say, well yeah, but what about social media? And then all this stuff, and this was before Cambridge Analytica, but people were still interested. So I thought, okay, I'll do a little quick book that addresses what I think about all that stuff. And so I wrote this thing last year, and then Cambridge Analytica happened, and all of a sudden it seems a little bit more, you know, well timed--
>> Than I could have imagined.
>> Relevant. So, what other cool stuff are you working on?
>> I have to tell you something.
>> Go ahead.
>> This is a real cat. This is a black cat who was rescued from a parking lot in Oakland, California and belongs to my daughter. And he's a very sweet cat named Potato.
>> Awesome.
You're based in Northern California?
>> Yeah, yeah, yeah.
>> Awesome.
>> And he was an extra on the set of the Black Panther movie. He was a stand-in for, like, a little mini black panther.
>> What other cool stuff are you working on? What's next for you?
>> Oh my God, there's so much going on, I hardly even know where to begin. Well, one of the things I'm really interested in is, there's a certain type of algorithm that's really transforming the world, which is usually called machine learning. And I'm really interested in making these things more transparent and open, so it's less like a black box.
>> Interesting.
>> Because this has been something that's been bugging me, you know. Most kinds of programming, it might be difficult programming, but at least the general concept of how it works is obvious to anyone who's programmed, and more and more we send our kids to coding camps and there's just a general societal awareness of what conventional programming is like. But machine learning has still been this black box, and I view that as a danger. Like, you can't have society run by something that most people feel is like this black box, because it'll create a sense of distrust and, I think, could be, you know, potentially quite a problem. So what I want to try to do is open the black box and make it clear to people. So that's one thing I'm really interested in right now, and, oh, well, there's a bunch of other stuff. I hardly even know where to begin.
>> The black box problem in machine intelligence is a big one. I mean, I always use the example, I can describe to you how I know that's a dog, but I really can't tell you how I really know it's a dog. I look at a dog, that's a dog, but I can't really in detail tell you how I did that. But isn't AI kind of the same way? A lot of AI.
>> Well, not really. It's a funny thing. Right now in the tech world, there are certain individuals who happen to be really good at getting machine learning to work, and they get very, very well paid. They're sort of like star athletes. But the thing is, even so, there's a degree of almost like folk art to it, where we're not exactly sure why some people are good at it. But even having said that, it's wrong to say that we have no idea how these things work; we can certainly describe what the difference is between one that fails and one that's at least pretty good, you know? And so I think any ordinary person, if we can improve the user interface and improve the way it's taught, any normal person who can learn even a tiny bit of programming, like at a coding camp, making the turtle move around or something, we should be able to get to the point where they can understand basic machine learning as well. And we have to get there. In the future, I don't want it to be a black box. It doesn't need to be.
>> Well, basic machine learning is one thing, but how the machine made that decision is increasingly complex, right?
>> Not really, it's not a matter of complexity. It's a funny thing. It's not exactly complexity. It has to do with getting a bunch of data from real people and then massaging it and coming up with the right transformation so that the right thing is spit out on the other side.
And to me it's a little bit more, it's almost like, I know this is going to sound strange, but it's almost like learning to dress. Like, you take this data and then you dress it up in different ways, and all of a sudden it turns functional in a certain way. Like, if you get a bunch of people to tag, that's a cat, that's a dog, now you have this big corpus of cats and dogs and now you want to tell them apart. You start playing with these different ways of working with it that had been worked out maybe in other situations; you might have to tweak it a little bit, but you can get it to where it's very good. It can even be better than any individual person, although it's always based on the discrimination that people put into the system in the first place. In a funny way, it's like a cross between a democracy and a puppet show or something. Because what's happening is you're taking this data and just kind of transforming it until you find the right transformation that lets you get the right feedback loop with the original thing, but it's always based on human discrimination in the first place, so it's not really cognition from first principles. It's kind of leveraging data gotten from people and finding out the best way to work with it. You can start to get a feel for it.
>> We're looking forward to seeing the results of that work. Jaron, thanks for coming on theCUBE. You're a great guest.
>> Really appreciate it.
>> I really appreciate you having me here. Good luck to all of you. And hello out there in the land of those who are manipulated.
>> Thanks again. The book, one last plug if I may.
>> The book is Ten Arguments for Deleting Your Social Media Accounts Right Now, and you might be watching this on one of them, so I'm about to disappear from your life if you take my advice.
>> All right, thanks again.
>> All right. Okay, keep it right there everybody. We'll be back with our next guest right after this short break. You're watching theCUBE from LiveWorx in Boston. We'll be right back. (upbeat music)
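A brief illustration of the labeled-data loop Lanier describes above, where people tag examples as a cat or a dog and a transformation is adjusted until the right answers come out the other side. The sketch below is purely illustrative, not Lanier's code or any product's pipeline: the "features", the tags, and the tiny logistic-regression fit are all made up to show the feedback loop between human-supplied tags and a fitted transformation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up stand-ins for features extracted from photos; a real system would
# derive these from the images themselves.
n = 200
cats = rng.normal(loc=[-1.0, -1.0], scale=0.7, size=(n, 2))
dogs = rng.normal(loc=[1.0, 1.0], scale=0.7, size=(n, 2))
X = np.vstack([cats, dogs])
y = np.array([0] * n + [1] * n)  # human tags: 0 = "cat", 1 = "dog"

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# The "transformation": weights nudged by gradient descent until the model's
# outputs agree with the tags people supplied.
w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)               # current guesses
    w -= lr * (X.T @ (p - y)) / len(y)   # move the guesses toward the human tags
    b -= lr * np.mean(p - y)

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"agreement with the human-supplied tags: {accuracy:.0%}")
```

The sketch also shows the point Lanier closes on: the model never decides what a cat or a dog is from first principles; it only reproduces the discrimination the human taggers put into the system in the first place.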
Emer Coleman, Disruption - Hadoop Summit 2016 Dublin - #HS16Dublin - #theCUBE
>> Narrator: Live from Dublin, Ireland, it's theCUBE, covering Hadoop Summit Europe 2016. Brought to you by Hortonworks. Now your hosts, John Furrier and Dave Vellante.
>> Okay, welcome back here, we are here live in Dublin, Ireland. It's theCUBE, SiliconANGLE's flagship program where we go out to the events and extract the signal from the noise. I'm John Furrier, with my cohost Dave Vellante. Our next guest is Emer Coleman, who's with Disruption Limited, the Open Data Governance Board in Ireland, and Transport API, a growing startup built as a self-sustaining business on open data. Loved that keynote here at Hadoop Summit, a very compelling discussion around digital goods and the digital future. Emer, welcome to theCUBE.
>> It's great to be here.
>> So what was your keynote? Let's just quickly talk about what you talked about, and then we can get into some awesome conversation.
>> Sure. So the topic yesterday was, we need to talk about techno ethics. So basically, over the last couple of months, I've been doing quite a lot of research on ethics and technology, and many people have different interpretations of that, but yesterday I said it's basically about three things. It's about people, it's about privacy, and it's about profits. So it's asking questions about how we look at holistic technology development that moves away from a pure technocratic play and looks at the deep societal impacts that technology has.
>> One of the things that we're super excited about and passionate about is this new era of openness going to a whole other level. Obviously, open source is a tier one software development environment, and cloud computing allows for instant access to resources, almost limitless at this point, as you project it forward with Moore's Law and whatnot. But the notion that digital assets are not just content, it's data, it's people, it's the things you mentioned, creates a whole new operating environment, a new user experience and user expectations, with mobile phones and Internet of Things and Transport API, which you have: if it moves, you capture it, and you're providing value there. So a whole new economy is developing around digital capital. Share your thoughts around this, because this is an area that you're passionate about, you've just done work here. What are your thoughts on this new digital economy, digital capital, digital asset opportunity?
>> I think there's huge excitement about the digital economy, isn't there? And I think one of the things I'm concerned about is that that excitement will lead us to the same place that we are now, where we're not really thinking through what the equitable distribution in that economy looks like, because it seems to me that the spoils are going to a very tiny elite at the top. So if you look at Instagram, 13 employees when it was purchased by Facebook for a billion dollars, but that's all our stuff, so I'm not getting any shares in the billion, those 13 people are. It's fantastic that you can build a business, build it to that stage and sell, but you have to think about two things, really: what are we looking at in terms of sustainable businesses into the future that create ethical products, and also the demands from citizens to get some value for their data back, because we're becoming shadow employees, we're shadow employees of Google, so when we email, we're not just corresponding, we're creating value for that company.
>> And Facebook is a great example.
>> And Facebook. And the thing is, when we were at the beginning of that digital journey, it was quite naive.
So we were very seduced by free, and we thought, "This is great," and so we're happy with the service. And then the next stage of that, we realized, what if we're not paying for the service, we're the product?
>> John: Yeah.
>> But we were too embedded in the platform to extricate ourselves. But now, I think, when we look at the future of work and the great uncertainty that people are facing, when their labor's not going to be required to the same degree, are we going to slavishly keep producing capital and value for companies like Google, and ask for nothing more than the service in return? I don't think so.
>> And certainly, the future will be impacted, and one of the things we see now in our business of online media and online open data is that the data's very valuable. We see that, I'll say data is the new capital, the new oil, whatever phrase of the day is used, and the brand marketers are the first ones to react to it, 'cause they're very data driven. Who are you, how do I sell stuff to you? And so what we're seeing is, brand marketers are saying, "Hey, I'm going to spend money to try to reach out to people, and I'm going to activate that base and connect with, engage with them on Facebook or another platform. I'm going to add value to your Facebook or Google platform, but yet I'm parasitic to your platform for the data. Why don't I just get it directly?" So again, you're starting to see that thinking where I don't want to be a parasite, or parasitic to a network that the value's coming from. The users have not yet gotten there, and you're teasing that out. What are your thoughts there, the progression, where we're at? Have people realized this? Have you seen any movement in the industry around this topic?
>> No, I think there's a silence around it... Technology companies want to get all the data they can. They're not going to really declare as much as they should, because it bends their service model a bit. Also, the data is emergent. Zuckerberg didn't start Facebook as something that was going to be a utility for a billion people, he started it as a social network for a university. And what grew out of that, we learned as we went along. So I'm thinking, now that we have that experience, we know that happens, so let's start the thinking now. And also, this notion of just taking data because you can, almost speculatively getting data at the point of source, without even knowing what you want it for but thinking, "I'm going to monetize this in the end." Jaron Lanier in his book Who Owns The Future talks about micro-licensing back content. And I think that's what we need to do. At the very beginning, we need to start baking in two things: privacy by design, and different business models where it's not a winner takes all. It's a dialog between the user and the service, and that's iterated together.
>> This idea that it's not a zero sum game is very important, and I want to go back to your Instagram and Facebook example. At its peak, I think Eastman Kodak had hundreds of thousands of employees, maybe four or five hundred thousand, 450,000 employees, huge. Facebook has many, many more photos, but maybe a few thousand employees? Wow, so all the jobs are gone, but at the same time, we don't want to be protecting the past from the future, so how do you square that circle?
>> Correct, but I think what we know is that the rise of robotics and software is going to eat jobs, and basically, there's going to be a hollowing out of the middle class. You know, for sure, whether it's medicine, journalism, retail, exactly.
>> Dave: It's not future, it's now. (laughs)
>> Exactly. So we may come to a point where large swaths of people don't have work. Now, what do you do in a world where your labor is no longer required? Think about the public policy implications of that. Do we say, you either fit in this economy or you die? Are we going to look at ideas they are looking at in Europe, like a universal wage? And all of these things are a challenge to government, because they're going to have a citizenry who are not included in this brave new world. So some public policy thinking has to go into what happens when our kids can't get jobs, when the jobs that used to be done by people like us are done by machines. I'm not against the movement of technology. What I'm saying is there are deep societal implications that need some thinking, because if we get to a point where we suddenly realize all of these people are unemployed and can't get work, this isn't the future we envisioned, where robots would take all the crap jobs and we would go off to do wonderful things. Like, how are we going to bring the bacon home?
>> It seems like in a digital world that the gap is creativity to combine technologies and knowledge. I find that it's scary when you talk about maybe micromanaging wages and things like that; education is the answer, but then, how do you just transfer that knowledge? That's sort of the discussion that we're having in the United States anyway.
>> I think some of the issue is that with the technology, we're kind of seduced by simplicity. So we don't see the complexity underneath, and that's the ultimate aim of a technology, to make something so simple that the complexity is masked. That's what the iPhone did wonderfully. But that's actually how society is looking now. So we're seduced by this simplicity, we're not seeing the complexity underneath, and that complexity would be about, what do we do in a world where our labor is no longer required?
>> And one of the things that's interesting about the hollowing out of the middle class is the assumption that there are no replacements, so one of the things that could be counter-argued is that, okay, the digital natives, my daughter, she's a freshman in high school, my youngest son's in eighth grade, they're natives now, so they're going to commit. So what is the replacement capital and value for companies that can be sustained in the new economy, versus the decay and the darwinism of the old? So the digital darwinism aspect's interesting, that's one dilemma. The other one is business models, and I want to get your thoughts on this, 'cause this is something we were teasing out with this whole value extraction and company platform issue. A company like Twitter. Highly valuable company, it's a global network of people tweeting and sharing, but yet it's under constant pressure from Wall Street and investors saying they basically suck. And they don't, they're good, people love Twitter, so they're being forced to behave differently, against their mission, because their profit motive doesn't really match maybe something like Facebook, so therefore they're instantly devalued, yet the future value of someone connecting on Twitter is significantly high. That being said, I want to get your thoughts on that, and your advice to Twitter management, given the fact it is a global network. What should they do?
>> It's the same old capitalism, it's just digital, it's a digital company, it's a digital asset. It's the same approach, right? Twitter has been a wonderful thing.
I've been a Twitter user for years. How amazing, it's played a role in the Arab Spring, all sorts of things. So they're really good, but I think you need, as a company... So for example, in our company, in Transport API, we're not really looking to build towards this massive IPO, we're trying to build a sustainable company in a traditional way using digital. So I think if you let yourself be seduced by the idea of a phenomenal IPO, you kind of take your eye off the ball.
>> Or in this case, in case you got IPOed, now you're under pressure to produce--
>> Emer: Absolutely, yeah.
>> Which changes your behavior. But in Twitter's management's defense, they see the value of their product. Now, they got there by accident and everyone loves it, but now they're not taking the bait to try to craft a short term solution to essentially what is already a valuable product, but not on the books.
>> Yes, and also I think where the danger is, we know that there are generation shifts across channels. So teenagers probably look at Facebook as, I think one of them said, like an awkward family dinner they can't quite leave. But the next gen, they're just not going to go there, 'cause that's where your grandmother is. So the same is true of Twitter and Snapchat, these platforms come and go. It's an interesting phenomenon then to see Wall Street putting that much money into something which is essentially quite ephemeral. I'm not saying that Twitter won't be around for years, it may be, but that's the thing about digital, isn't it? Something else comes in and, well, that becomes the platform of choice.
>> Well, it's interesting, right? Everybody, us included, we criticize the... Michael Dell calls it the 90 day shock clock. But it's actually worked out pretty well, I mean, economically, for United States companies. Maybe it doesn't in the future. What are your thoughts on that, particularly from a European perspective? Where you're reporting maybe twice a year, there's not as much pressure, but yet from a technology industry standpoint, companies outside of Silicon Valley in particular seem to be less competitive. Why?
>> For example, in our company, in Transport API, we've got some pretty heavyweight clients, we have a wonderful angel investor who has given us two rounds of investment. And it isn't that kind of avaricious, build-this-to-a-huge-price approach. And that's allowed us to build from starting off with two, now to a team of 10, and we're just about coming into break even, so it's doable. But I think it's a philosophy. We didn't necessarily want to build something huge, although we want to go global, but it was, let's do this in a sustainable way with reasonable wages, and we've all put our own soul and money into it, but it's a different cultural proposition, I think.
>> Well, the valuations always drive the markets. It's interesting too, to your point about channels coming and going, it kind of reminds me, Dave and I used to joke about social networks being like nightclubs: they're hot, and then it's just too crowded and nobody goes there, as Yogi Berra would say. And then they shift and they go out of business, some don't open with fanfare, no one goes 'cause it's got different context. You have a contextual challenge in the world now. Technology can change things, so I want to ask you about identity, 'cause there was a great article posted by the founder of the company called Secret, which is one of these anonymous apps like Yik Yak and whatnot, and he shut it down.
And he wrote a post, kind of a postmortem, saying, "These things come and go, they don't work, they're not sustainable because there's no identity." So the role of identity in a social, global, virtual world, virtual being not just virtual reality, is interesting. You live in a world, and your company, Transport API, provides data which enables stuff, and there's the role of identity. So anonymity versus identity, thoughts there, and their impact on the future of work? If you know who you're dealing with, and if they're present, these are concepts that are now important: presence, identity, attention.
>> And that's the interesting thing, isn't it? Who controls that identity? Mark Zuckerberg said, "You only have one identity," which is what he said when he set up Facebook. You think, really? No, that's what a young person thinks. When we're older, we know.
>> He also said that young people are smarter than older people.
>> Yeah, right, okay. (John laughs) He could be right there, he could be right there, but we all have different identities in different parts of our lives. Who we are here at the Hadoop Summit is different from who we are at home or when we're with friends. So identity is a multifaceted thing. But also, who gets to determine your identity? So I have 16 years of my search life in Google. Now, who am I in that server, compared to who I am? I am the sum total of my searches. But I'm not just the sum total of my searches, am I? Or even that, contextualized, so I'll give you an example. A number of years ago I was searching for a large, very large waterproof plastic bag. And I typed it in, and I thought, "Oh my god, that sounds like I'm going to murder my husband and try to bury him." (John and Dave laugh) It was actually--
>> John: Into the compost.
>> Right, right. And I thought, "Oh my god, what does this look like on the other side?" Now, it was actually for my summer garden furniture. But the point is, if you looked at that in an analytic way, who would I be? And so I think identity is very, you know--
>> John: Mistaken.
>> Yeah, and also this idea of what Frank Pasquale calls the black box society. These secret algorithms that are controlling flows of money and information. How do they decide what my identity is? What are the moral decisions that they make around that? What does it say if I search for one thing over another? If I search constantly for expensive shoes, does that make me shallow? What do these things say if I search for certain things around health?
>> And there's a value judgment now associated with that, that you're talking about, that you do not control.
>> Absolutely, and which is probably linked to other things which will determine things like whether I get credit or not, but these can almost be arbitrary decisions, 'cause I have no oversight of the logic that's creating that decision-making algorithm. So I think it's not just about identity, it's about who's deciding what that identity is.
>> And it's also the reality that you're in, context, situations. Dark side, bright side of technology in this future, where there's this new digital asset economy, digital capital. There's going to be good and bad, education can be consumed non-linearly, new forms of consumption, metadata, as you're pointing out, with the algorithms. Where do you see some bright spots and where do you see the danger areas?
>> I think the great thing is, as you were saying, software is the future. It's our present, but it's going to be even more so in our future.
Some of the brightest brains in the world are involved in the creation of new technology. I just think they need to be focusing a bit more of that intellectual rigor towards the impact they're having on society and how they could do it better. 'Cause I think it's too much of a technocratic solution. Technologists say, "We can do this." The question is, should they? So I think what we need to do is to loop them back into the more social and philosophical side of the discussion. And of course it's a wonderful thing, hopefully technology is going to do amazing things around health. We can't even predict how amazing it's going to be. But all I'm saying is that if we don't ask the hard questions now about the downsides, we're going to be in a difficult societal position. But I'm hoping that we will, and I'm hoping that raising issues like techno ethics will get more of that discussion going.
>> Well, transparency and open data make a big difference.
>> Emer: Absolutely.
>> Well, and public policy, as you said earlier, can play a huge role here. I wonder if you could give us your perspective on public policy. We're in the US most of the time, but it's interesting when we talk to customers here to hear about the emphasis, obviously, on privacy, data location and so forth. So in the digital world, do you see Europe's emphasis, and I think leading, on those types of topics as an advantage in a digital world, or does it create friction from an economic standpoint?
>> Yeah, but it's not all about economics. Friction is a good thing. There are some times when friction is a good thing. Most technologists think all friction is bad.
>> Sure, and I'm not implying that it's necessarily good or bad, I'm curious though, is it potentially an economic advantage to have thought through and have policy on some of those issues?
>> Well, what we're seeing here--
>> Because I feel like the US is a ticking time bomb on a lot of these issues.
>> I was talking to VCs, some VC friends of mine here in the UK, and what they said they're seeing more and more is VCs asking what we call SMEs, small to medium enterprises, about their data policies, and SMEs not being able to answer those questions, and VCs getting nervous. So I think over time it's going to be a competitive advantage that we've done that homework, that we're basically not just rushing to get more users, but that we're looking at it across the piece. Because, fundamentally, that's more sustainable in the longer term. People will not be dumb forever. They will not, and so doing that thinking now, where we work with people as we create our technology products, I think it's more sustainable in the long term. When you look at economics, sustainability is really important.
>> I want to ask you about the Transport API business, 'cause in the US, same thing, we've seen some great openness of data and amazing innovations that have come out of nowhere. In some cases, unheard-of entrepreneurs and/or organizations that better society, for the betterment of people, from delivering healthcare to poor areas and whatnot. What has been the coolest thing, or things you've seen, come out of your enablement of the transport data? Use cases, have you seen any things that surprised you?
>> It's quite interesting, because when I worked for the mayor of London as his director of digital projects, my job was to set up the London Datastore, which was to open all of London's public sector data.
So I was kind of there from the beginning as a lobbyist, and when I was asking agencies to open up their data, they'd go, "What's the ROI?" And I'd just say, "I don't know." Because with government, I'm saying, it was a chicken and egg, you've got to put it out there. And we had a funny incident where some of the IT staff in Transport for London accidentally let out this link, which is the TrackerNet feed, and that powers the tube notice boards that say, "Your next tube is in a minute," whatever. And so the developer community went, "Ooh, this is interesting."
>> John: Candy!
>> Yeah, and of course, we had no documentation with it, because it kind of went out under the radar. And one developer called Matthew Somerville made this map which showed the tubes on a map in real time. And it was like surfacing the underground. And people just thought, "Oh my god, that is amazing."
>> John: It's illuminating.
>> Yeah. It didn't do anything, but it showed the possibility. The newspapers picked it up, it was an absolutely brilliant example, and the guy made it in half a day. And that was the first time people saw their transport system kind of differently. So that was amazing, and then we've seen hundreds of different applications that are being built all the time. And what we're also seeing is integration of transport data with other things. So one of our clients in Transport API is called Toothpick, and they're an online dental booking agency. And so you can go online, you can book your dental appointment with your NHS dentist, and then they bake in transport information to tell you how to get there. So we have pubs using them, and screens so people can order their dinner, and then they say, "You've got 10 minutes till the next bus." So all sorts of cross-platform applications.
>> That you never could've envisioned.
>> Emer: Never.
>> And it's just your point earlier about it's not a zero sum game, you're giving so many ways to create value.
>> Emer: Right, right.
>> Again, I come back to this notion of education, and creativity in the United States education system, so unattainable for so many people, and that's a real concern, and you're seeing the middle class get hollowed out. I think the stat is, the average wage in the United States was 55,000 in 1999, it's 50,000 today. The political campaigns are obviously picking at that scab. What's the climate like in Europe from that standpoint?
>> In terms of education?
>> No, just in terms of, yes, the education, the middle class getting hollowed out, the sentiment around that.
>> I don't think people are up to speed with that yet, I really don't think that they're aware of the scale. I think when they think robots or automation, they don't really think software. They think robots like there were in the movies, that would come, as I say, and do those jobs nobody wanted. But not like software. So when I say to them, look, e-discovery software, when it's applied retrospectively, what it shows is that human lawyers are only 60% accurate compared to it. Now, that's a no-brainer, right? If software is 100% accurate, I'm going to use the software. And the ratio difference is 1 to 500: where you needed 500 lawyers before, you need one. So I don't think people are across the scale of change.
>> But it's interesting, you're flying to Heathrow, you fly in and out, you're dealing with a kiosk. You drive out, the billboards are all electronic. There aren't guys doing this anymore. So it's tangible.
>> And I think, to your point about education, I'm not as familiar with the education system in the US, but I certainly think, in Europe and in the UK, the education system is not capable of dealing even with the latest digital natives. They're still structuring their classrooms in the same way. These kids, you know--
>> John: They're misaligned with the technology.
>> Absolutely.
>> So reading, writing and arithmetic, fine. And the cost of education is maybe acceptable. But they may be teaching the wrong thing.
>> Asynchronous, non-linear, is the thing.
>> There's a wonderful example of an Indian academic called Sugata Mitra, who has a fabulous project called Hole in the Wall. And he goes to non-English-speaking little Indian villages, and he builds a computer, and he puts a roof over it so only the children can use it. They don't speak English. And he leaves a little bit of stuff they have to get around before they can play a game. And he came back six months later, and he said to them, "What did you think?" And one of the children said, "We need a faster CPU and a better mouse." Now, his point is self-learning, once you have access to technology, is amazing, and I think we have to start--
>> Same thing with the non-linear consumption, asynchronous, all this, the API economy enabling new kinds of expectations and opportunities.
>> And it was interesting, because some UK schools tried to follow his example. And six months later, they rang him up and they said, "It's not working," and he said, "What did you do?" And they said, "Well, we got every kid a laptop." He said, "That's not the point." The point was putting in a scarce resource that the children had to collaborate over. So in order to get to the game, they had to figure out certain things.
>> I think you're right on some of these (mumbles) that no one's talking about. And Dave and I are very passionate on this, and we're actually investing in a whole new e-learning concept. But it's not about doing that laptop thing or putting courseware online. That's old workflow in a new model. Come on, old wine in a new bottle. So that's interesting. I want to get your thoughts, so a personal question to end this segment. What are you passionate about now, what are you working on, outside of the venture, which is exciting? You have a lot of background going back to technology entrepreneurship, public policy, and you're in the front lines now, thought leading on this whole new wide open sea of opportunity, confusion, enabling it. What are you passionate about, what are you working on? Share with the folks that are watching.
>> So one of the main things we're trying to do: I work as an associate with Ernst & Young in London, and we've been having discussions over the past couple of months around techno ethics, and I've basically said, "Look, let's see if we can get EY to build an EY good governance index." Like, what does good governance look like in this space, a massively complex area? But what I would love is if people would collaborate with us on that, if we could help to draw up an ethical framework that would convene the technology industry around some ethical good governance issues.
So that's what I'm going to be working on as hard as I can over the next while, to try and get as much collaboration from the community, because I think we'd be so much more powerful if the technology industry was to say, "Yeah, let's try and do this better rather than waiting for regulation," which will come, but will be too clunky and not fit for purpose.
>> And which new technology that's emerging do you get most excited about?
>> Hmm. Drones. (laughter)
>> How about anything with bitcoin, blockchain?
>> Absolutely, absolutely, blockchain. Yeah, blockchain, you have to say, yeah. I think, 'cause bitcoin, you know, it's worth 20p today, it's worth 200,000 tomorrow.
>> Dave: Yeah, but blockchain.
>> Right, right. I mean, that is incredible potentiality.
>> New terms like federated, that's not a new term, but federation, universal, unification. These are the themes right now.
>> Emer: Well, it's like the road's been coded, isn't it? And we don't know where it's going to go. What a time we live in, right?
>> Emer Coleman, thank you so much for spending your time and joining us on theCUBE here, we really appreciate the conversation. Thanks for sharing that great insight here on theCUBE, thank you. It's theCUBE, we are live here in Dublin, Ireland. I'm John Furrier with Dave Vellante. We'll be right back with more from SiliconANGLE's theCUBE, extracting the signal from the noise, after this short break. (bright music)
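Coleman's "you've got 10 minutes till the next bus" example earlier in the conversation comes down to comparing a live departures feed against the clock. The sketch below is purely illustrative: the feed shape, the route names, and the timings are all invented, and this is not the actual Transport API or TfL TrackerNet schema.

```python
from datetime import datetime, timedelta

def minutes_to_next_departure(departures, now):
    """Return (route, minutes) for the soonest departure still in the future."""
    upcoming = [(d["route"], d["expected"]) for d in departures if d["expected"] > now]
    if not upcoming:
        return None
    route, when = min(upcoming, key=lambda item: item[1])
    return route, int((when - now).total_seconds() // 60)

# Hypothetical live data, shaped the way a downstream screen might receive it.
now = datetime(2016, 4, 13, 18, 0)
feed = [
    {"route": "46A", "expected": now + timedelta(minutes=4)},
    {"route": "145", "expected": now + timedelta(minutes=12)},
    {"route": "46A", "expected": now - timedelta(minutes=2)},  # already departed
]

result = minutes_to_next_departure(feed, now)
if result:
    route, minutes = result
    print(f"You've got {minutes} minutes till the next {route} bus.")
```

In a real integration the departures list would come from whichever live feed the client licenses; the comparison against the clock stays the same.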