Empowerment Through Inclusion | Beyond.2020 Digital
>>Welcome back. I'm so excited to introduce our next session, Empowerment Through Inclusion: Reimagining Society and Technology. This is a topic that's personally very near and dear to my heart. Did you know that only 2% of those in technology are Latinas? As a Latina, I know that there's so much more we could do collectively to improve these gaps in diversity. At ThoughtSpot, diversity is considered a critical element across all levels of the organization. The data shows, countless times, that a diverse and inclusive workforce ultimately drives innovation, better performance, and keeps your employees happier. That's why we're passionate about contributing to this conversation and also partnering with organizations that share our mission of improving diversity across our communities. Last Beyond, we hosted this session during a breakfast, and we packed the whole room. This year, we're bringing the conversation to the forefront to emphasize the importance of diversity in data and share the positive ramifications that it has for your organization. Joining us for this session are ThoughtSpot's Chief Data Strategy Officer Cindy Howson and Ruha Benjamin, associate professor of African American Studies at Princeton University. >>Thank you, Paola. So many of you have journeyed with me for years now on our efforts to improve diversity and inclusion in the data and analytics space. And I would say over time we cautiously started commiserating, eventually sharing best practices to make ourselves and our companies better. And I do consider it a milestone, as Paola mentioned, that last year half the room was filled with our male allies. But I remember one of our panelists, Natalie Longhurst from Vodafone, suggesting that we move it from a side hallway conversation, an early morning breakfast, to the main stage. And I think it was Bill Zang from AIG in Japan who said, yes, please.
Everyone else agreed. But more than a main stage topic, I want to ask you to think about inclusion beyond your role, beyond your company, to how data and analytics can be used to impact inclusion and equity for society as a whole. Are we using data to reveal patterns, or to perpetuate problems, leading to bias at scale? You are the experts, the change agents, the leaders that can prevent this. I am thrilled to introduce you to the leading authority on this topic, Ruha Benjamin, associate professor of African American Studies at Princeton University and author of multiple books, the latest being Race After Technology. Ruha, welcome. >>Thank you. Thank you so much for having me. I'm thrilled to be in conversation with you today, and I thought I would just kick things off with some opening reflections on this really important session theme, and then we can jump into discussion. So I'd like us, as a starting point, to wrestle with these buzzwords, empowerment and inclusion, so that we can have them be more than big platitudes and really have them reflected in our workplace cultures, in the things that we design, in the technologies that we put out into the world. And to do that, I think we have to move beyond techno-determinism, and I'll explain what that means in just a minute. Techno-determinism comes in two forms. The first, on your left, is the idea that technology, automation, all of these emerging trends are going to harm us, are going to necessarily harm humanity. They're going to take all the jobs, they're going to remove human agency. This is what we might call the techno-dystopian version of the story, and this is what Hollywood loves to sell us in the form of movies like The Matrix or Terminator. The other version, on your right, is the techno-utopian story: that technologies, automation, the robots, as a shorthand, are going to save humanity. They're going to make everything more efficient, more equitable.
And in this case, on the surface, these seem like opposing narratives, right? They're telling us different stories; at least they have different endpoints. But when you pull back the screen and look a little bit more closely, you see that they share an underlying logic: that technology is in the driver's seat and that human beings, that society, can just respond to what's happening, but we don't really have a say in what technologies are designed. And so to move beyond techno-determinism, the notion that technology is in the driver's seat, we have to put the human agents and agencies back into the story as the protagonists, and think carefully about what the human desires, worldviews, values, and assumptions are that animate the production of technology. We have to put the humans behind the screen back into view. That's a very first step, and when we do that, we see, as was already mentioned, that it's a very homogeneous group right now in terms of who gets the power and the resources to produce the digital and physical infrastructure that everyone else has to live with. And so, as a first step, we need to think about how to create more participation among those who are working behind the scenes to design technology. Now, to dig a little deeper into this, I want to offer a kind of low-tech example before we get to the more high-tech ones. So what you see in front of you here is a simple park bench, a public bench. It's located in Berkeley, California, which is where I went to graduate school, and on this particular visit I was living in Boston, so I was back in California. It was February, it was freezing where I was coming from, and so I wanted to take a few minutes in between meetings to just lay out in the sun and soak in some vitamin D. And I quickly realized that actually I couldn't lay down on this bench because of the way it had been designed, with these armrests at intermittent intervals. And so here I thought:
Okay, the armrests have a functional reason why they're there. I mean, you can literally rest your elbows there, or it can create a little bit of privacy from someone sitting there that you don't know. When I was nine months pregnant, it could help me get up and down, or for the elderly, the same thing. So it has a lot of functional reasons. But I also thought about the fact that it prevents people who are homeless from sleeping on the bench. And this is the Bay Area that we're talking about, where, in fact, the tech boom has gone hand in hand with a housing crisis. Those things have grown in tandem. So innovation has grown with inequity, because we haven't thought carefully about how to address the social context in which technology grows and blossoms. And so I thought, okay, this crisis is growing in this area, and so perhaps this is a deliberate attempt to make sure that people don't sleep on the benches, by the way that they're designed and where they're implemented. And so this is what we might call structural inequity: by the way something is designed, it has certain effects that exclude or harm different people. It may not necessarily be the intent, but that's the effect. And I did a little digging, and I found that, in fact, it's a global phenomenon, this thing that architects call hostile architecture. I found single-occupancy benches in Helsinki, so only one booty at a time, no laying down there. I found caged benches in France. And in this particular town, what's interesting is that the mayor put these benches out in this little shopping plaza, and within 24 hours the people in the town rallied together and had them removed. So we see here that just because we have discriminatory design in our public space doesn't mean we have to live with it. We can actually work together to ensure that our public space reflects our better values. But I think my favorite example of all is the meter bench.
In this case, the bench is designed with spikes in it, and to get the spikes to retreat into the bench, you have to feed the meter, you have to put some coins in, and I think it buys you about 15 or 20 minutes. Then the spikes come back up. You'll be happy to know that in this case, this was designed by a German artist to get people to think critically about issues of design, not just the design of physical space but the design of all kinds of things, public policies. And so we can think about how our public life in general is metered, that it serves those who can pay the price, and others are excluded or harmed, whether we're talking about education or health care. And the meter bench also presents something interesting for those of us who care about technology: it creates a technical fix for a social problem. In fact, it started out as art, but some municipalities in different parts of the world have actually adopted this in their public spaces, in their parks, in order to deter so-called loiterers from using that space. And so, by a technical fix, we mean something that creates a short-term effect, right? It gets people who may want to sleep on it out of sight, they're unable to use it, but it doesn't address the underlying problems that create that need to sleep outside in the first place. And so, in addition to techno-determinism, we have to think critically about technical fixes that don't address the underlying issues that technology is meant to solve. And so this is part of a broader issue of discriminatory design, and we can apply the bench metaphor to all kinds of things that we work with or that we create. And the question we really have to continuously ask ourselves is: what values are we building into the physical and digital infrastructures around us? What are the spikes that we may unwittingly put into place? Or perhaps we didn't create the spikes. Perhaps we started a new job or a new position, and someone hands us something:
this is the way things have always been done. So we inherit the spiked bench. What is our responsibility when we notice that it's creating these kinds of harms or exclusions, or technical fixes that are bypassing the underlying problem? What is our responsibility? All of this came to a head in the context of financial technologies. I don't know how many of you remember these high-profile cases of tech insiders and CEOs who applied for the Apple Card. In one case, a husband and wife applied, and the husband received a much higher limit, almost 20 times the limit of his wife, even though they shared bank accounts and lived in a common-law state. And so the question there was not only the fact that the husband was receiving a much better interest rate and limit, but also that there was no mechanism for the individuals involved to dispute what was happening. They didn't even know what the factors were by which they were being judged, which was creating this form of discrimination. So in terms of financial technologies, it's not simply the outcome that's the issue, or that could be discriminatory, but the process that black-boxes all of the decision making, which makes it so that consumers and the general public have no way to question it, no way to understand how they're being judged adversely. And so it's the process, not only the product, that we have to care a lot about. The case of the Apple Card is part of a much broader phenomenon of racist and sexist robots. This is how the headlines framed it a few years ago, and I was so interested in this framing, because there was a first wave of stories that seemed to be shocked at the prospect that technology is not neutral. Then there was a second wave of stories that seemed less surprised: well, of course, technology inherits its creator's biases.
And now I think we've entered a phase of attempts to override and address the default settings of so-called racist and sexist robots, for better or worse. And here robots is just a kind of shorthand for the way people are talking about automation and emerging technologies more broadly. And so as I was encountering these headlines, I was thinking about how these are not problems simply brought on by machine learning or AI. They're not all brand new, and so I wanted to contribute to the conversation a kind of larger context and a longer history for us to think carefully about the social dimensions of technology. And so I developed a concept called the New Jim Code, which plays on the phrase Jim Crow, which is the way that the regime of white supremacy and inequality in this country was defined in a previous era. And I wanted us to think about how that legacy continues to haunt the present, how we might be coding bias into emerging technologies, the danger being that we imagine those technologies to be objective. And so this gives us a language to be able to name this phenomenon, so that we can address it and change it. Under this larger umbrella of the New Jim Code are four distinct ways that this phenomenon takes shape, starting from the more obvious, engineered inequity. Those are the kinds of tech-mediated inequalities that we can generally see coming; they're kind of obvious. But then we go down the line and we see it becomes harder to detect. It's happening in our own backyards, it's happening around us, and we don't really have a view into the black box, and so it becomes more insidious. And so in the remaining couple of minutes, I'm just going to give you a taste of the last three of these, and then move toward a conclusion so that we can start chatting. So when it comes to default discrimination:
this is the way that social inequalities become embedded in emerging technologies because the designers of these technologies aren't thinking carefully about history and sociology. A great example of this came to headlines last fall, when it was found that a widely used healthcare algorithm affecting millions of patients was discriminating against black patients. And what's especially important to note here is that this healthcare algorithm does not explicitly take note of race. That is to say, it is race-neutral. But by using cost to predict healthcare needs, this digital triaging system unwittingly reproduces health disparities, because, on average, black people have incurred fewer costs for a variety of reasons, including structural inequality. So in my review of this study by Obermeyer and colleagues, I wanted to draw attention to how indifference to social reality can be even more harmful than malicious intent. It doesn't have to be the intent of the designers to create this effect, and so we have to look carefully at how indifference is operating and how race neutrality can be a deadly force. When we move on to the next iteration of the New Jim Code, coded exposure, there's a tension, because on the one hand you see this image where the darker-skinned individual is not being detected by the facial recognition system, right, on the camera or on the computer. And so coded exposure names this tension between wanting to be seen, included, and recognized, whether it's in facial recognition or in recommendation systems or in tailored advertising, and on the other hand the tension of being over-included: when you're surveilled, when you're too centered. And so we should note that it's not simply being left out that's the problem, but being included in harmful ways. And so I want us to think carefully about the rhetoric of inclusion and understand that inclusion is not simply an endpoint.
It's a process, and it is possible to include people in harmful processes. And so we want to ensure that the process is not harmful for it to really be effective. The last iteration of the New Jim Code, perhaps the most insidious, let's say, is technologies that are touted as helping us address bias. So they're not simply including people; they're actively working to address bias. In this case, there are a lot of different companies that are using AI to create hiring software and hiring algorithms, including this one, HireVue. And the idea is that there's a lot that AI can keep track of that human beings might miss, and so the software can make data-driven talent decisions. After all, the problem of employment discrimination is widespread and well documented. So the logic goes: wouldn't this be even more reason to outsource decisions to AI? Well, let's think about this carefully. This is the logic of techno-benevolence: trying to do good without fully reckoning with how technology can reproduce inequalities. So some colleagues of mine at Princeton tested a natural language processing algorithm, looking to see whether it exhibited the same tendencies that psychologists have documented among humans. And what they found was that, in fact, the algorithm associated black names with negative words and white names with pleasant-sounding words. And so this particular audit builds on a classic study done around 2003, before all of the emerging technologies were on the scene, where two University of Chicago economists sent out thousands of resumes to employers in Boston and Chicago, and all they did was change the names on those resumes. All of the other details, the work history and education, were the same, and then they waited to see who would get called back. And the fictional applicants with white-sounding names received 50% more callbacks than the black applicants.
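(Editor's note: the kind of audit described above is often formalized as a word embedding association test. The sketch below is not code from the talk; it uses tiny invented three-dimensional "embeddings" and the classic flower/insect attribute pairs purely for illustration, whereas real audits use vectors trained on web-scale text such as GloVe or word2vec.)

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-d "embeddings", invented for illustration only.
emb = {
    "flower":     [0.9, 0.1, 0.0],
    "insect":     [0.1, 0.9, 0.0],
    "pleasant":   [0.8, 0.2, 0.1],
    "unpleasant": [0.2, 0.8, 0.1],
}

def association(word, attr_a, attr_b):
    # WEAT-style score: how much closer is `word` to attribute A than to B?
    # A positive score means the word leans toward attr_a.
    return cosine(emb[word], emb[attr_a]) - cosine(emb[word], emb[attr_b])

print(association("flower", "pleasant", "unpleasant"))  # positive: leans pleasant
print(association("insect", "pleasant", "unpleasant"))  # negative: leans unpleasant
```

In the Princeton audit, the same differential score was computed with names in place of flower/insect terms, which is how the black-name/negative-word association surfaced.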
So if you're presented with that study, you might be tempted to say, well, let's let technology handle it, since humans are so biased. But my colleagues in computer science found that this natural language processing algorithm actually reproduced those same associations with black and white names. So, too, with gender-coded words and names, as Amazon learned a couple of years ago when its own hiring algorithm was found discriminating against women. Nevertheless, it should be clear by now why technical fixes that claim to bypass human biases are so desirable. If only there were a way to slay centuries of racist and sexist demons with a social justice bot. Beyond desirable, more like magical. Magical for employers, perhaps, looking to streamline the grueling work of recruitment, but a curse for many job seekers. As one headline puts it, your next interview could be with a racist bot, bringing us back to that problem space we started with just a few minutes ago. So it's worth noting that job seekers are already developing ways to subvert the system by trading answers to employers' tests and creating fake applications as informal audits of their own. In terms of a more collective response, there's a federation of European trade unions called UNI Global that's developed a charter of digital rights for workers that touches on automated and AI-based decisions to be included in bargaining agreements. And so this is one of many efforts to change the ecosystem, to change the context in which technology is being deployed, to ensure more protections and more rights for everyday people. In the US, there's the algorithmic accountability bill that's been presented, and it's one effort to create some more protections around this ubiquity of automated decisions, and I think we should all be calling for more public accountability when it comes to the widespread use of automated decisions.
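(Editor's note: one concrete form the formal audits discussed here can take is an outcome check like the EEOC's "four-fifths rule". The sketch below is illustrative only; the callback counts are invented, not taken from the studies mentioned in the talk.)

```python
def selection_rate(selected, applicants):
    """Fraction of applicants from a group who were selected (e.g., called back)."""
    return selected / applicants

def four_fifths_check(rate_group, rate_reference):
    """Flag disparate impact when a group's selection rate falls below
    80% of the most-favored group's rate (the EEOC four-fifths rule)."""
    ratio = rate_group / rate_reference
    return ratio, ratio < 0.8

# Hypothetical callback counts from an audit of a hiring pipeline.
rate_white = selection_rate(150, 1000)   # 15% callback rate
rate_black = selection_rate(100, 1000)   # 10% callback rate

ratio, flagged = four_fifths_check(rate_black, rate_white)
print(f"impact ratio = {ratio:.2f}, disparate impact flagged: {flagged}")
# → impact ratio = 0.67, disparate impact flagged: True
```

A check like this is cheap enough to run on every model release, which is one way the "formal mechanisms built into the process" that Benjamin calls for later in the conversation could be operationalized.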
Another development that keeps me somewhat hopeful is that tech workers themselves are increasingly speaking out against the most egregious forms of corporate collusion with state-sanctioned racism. And to get a taste of that, I encourage you to check out the hashtag #TechWontBuildIt, among other statements that they have made while walking out and petitioning their companies. As one group said: as the people who build the technologies that Microsoft profits from, we refuse to be complicit. In terms of education, which is my own ground zero, it's a place where we can grow a more historically and socially literate approach to tech design. And this is just one resource that you all can download, developed by some wonderful colleagues at the Data and Society Research Institute in New York. The goal of this intervention is threefold: to develop an intellectual understanding of how structural racism operates in algorithms, social media platforms, and technologies not yet developed; an emotional intelligence concerning how to resolve racially stressful situations within organizations; and a commitment to take action to reduce harms to communities of color. And so, as a final way to think about why these things are so important, I want to offer a couple of last provocations. The first is for us to think anew about what deep learning actually is when it comes to computation. I want to suggest that computational depth, when it comes to AI systems, without historical or social depth, is actually superficial learning. And so we need a much more interdisciplinary, integrated approach to knowledge production and to observing and understanding patterns, one that doesn't simply rely on a single discipline in order to map reality. The last provocation is this: if, as I suggested at the start, inequity is woven into the very fabric of our society, it's built into the design of our policies, our physical infrastructures, and now even our digital infrastructures. That means that each twist, coil, and code is a chance for us to weave new patterns, practices, and politics. The vastness of the problems that we're up against will be their undoing once we accept that we're pattern makers. So what does that look like? It looks like refusing color blindness as an antidote to tech-mediated discrimination. Rather than refusing to see difference, let's take stock of how the training data and the models that we're creating have these built-in decisions from the past that have often been discriminatory. It means actually thinking about the underside of inclusion, which can be targeting, and asking how we create a more participatory rather than predatory form of inclusion. And ultimately, it also means owning our own power in these systems so that we can change the patterns of the past. If we inherit a spiked bench, that doesn't mean that we need to continue using it. We can work together to design more just and equitable technologies. So with that, I look forward to our conversation. >>Thank you, Ruha. That was... well, I expected it to be amazing, as I have been devouring your book in the last few weeks, so I knew it would be impactful. I know we will never think about park benches again, and how it's art. And you laid down the gauntlet. Oh, my goodness, that "tech won't build it." Well, I would say, if the ThoughtSpot team has any say in it, we absolutely will build it, and we'll continue to educate ourselves. So you made a few points: that it doesn't matter if it was intentional or not, unintentional has as big an impact. How do we address that? Does it just start with awareness building, or how do we address that? >>Yeah, so it's important. I mean, it's important to have good intentions. And so, by saying that intentions are not the end-all be-all, it doesn't mean that we're throwing intentions out.
But it is saying that there are so many things that happen in the world, that happen unwittingly, without someone sitting down to make them good or bad. And so this goes on both ends. The analogy that I often use is: if I'm parked outside and I see someone, you know, breaking into my car, I don't run out there and say, now, do you feel in your heart that you're a thief? Do you intend to be a thief? I don't go and grill their identity or their intention to harm me; I look at the effect of their actions. And so in terms of the teams that we work on, I think one of the things that we can do, again, is to have a range of perspectives around the table that can think ahead, like chess, about how things might play out. But also, once we've created something and it's entered into the world, we need to have regular audits and check-ins to see when it's going off track. Just because we intended to do good when we set it out, when it goes sideways we need mechanisms, formal mechanisms that are actually built into the process, that can get it back on track or even remove it entirely if we find a problem. And we see that with different products, right, that get recalled. And so we need that to be formalized, rather than putting the burden on the people that are using these things to have to raise the awareness, or have to come to us, like with the Apple Card, right, to say this thing is not fair. Why don't we have that built into the process to begin with? >>Yeah, so a couple of things. My dad used to say the road to hell is paved with good intentions. >>Yes. And in fact, in the book, I say the road to hell is paved with technical fixes. So me and your dad are on the same page. >>And I love your point about bringing different perspectives. And I often say this is why diversity is not just about business benefits. It's your best recipe for identifying the early biases in the data sets and in the way we build things.
And yet it's such a thorny problem to address bringing new people in from tech. So in the absence of that, what do we do? Is it the outside review boards? Or do you think regulation is the best bet as you mentioned a >>few? Yeah, yeah, we need really need a combination of things. I mean, we need So on the one hand, we need something like a do no harm, um, ethos. So with that we see in medicine so that it becomes part of the fabric and the culture of organizations that that those values, the social values, have equal or more weight than the other kinds of economic imperatives. Right. So we have toe have a reckoning in house, but we can't leave it to people who are designing and have a vested interest in getting things to market to regulate themselves. We also need independent accountability. So we need a combination of this and going back just to your point about just thinking about like, the diversity on teams. One really cautionary example comes to mind from last fall, when Google's New Pixel four phone was about to come out and it had a kind of facial recognition component to it that you could open the phone and they had been following this research that shows that facial recognition systems don't work as well on darker skin individuals, right? And so they wanted Thio get a head start. They wanted to prevent that, right? So they had good intentions. They didn't want their phone toe block out darker skin, you know, users from from using it. And so what they did was they were trying to diversify their training data so that the system would work better and they hired contract workers, and they told these contract workers to engage black people, tell them to use the phone play with, you know, some kind of app, take a selfie so that their faces would populate that the training set, But they didn't. They did not tell the people what their faces were gonna be used for, so they withheld some information. They didn't tell them. 
it was being used for the facial recognition system. And the contract workers went to the media and said, something's not right. Why are we being told to withhold information? And in fact, going back to the park bench example, they were told to give people who are homeless $5 gift cards to play with the phone and get their images into the training set. And so this all came to light, and Google withdrew this research and this process, because it was so in line with a long history of using marginalized and most vulnerable people and populations to make technologies better, when those technologies are likely going to harm them, in terms of surveillance and other things. And so I bring this up here to go back to our question of how the composition of teams might help address this. I think often about who was in that room making the decision to create this process with the contract workers and the selfies and so on. Perhaps it was a racially homogeneous group, where people weren't really sensitive to how this could be experienced or seen. But maybe it was a racially diverse group, and perhaps they didn't have the disciplinary knowledge about the history of harm when it comes to science and technology. And so it could also be a function of what people knew in the room, whether they could do that chess in their head and think: how is this going to play out? It's not going to play out very well. And the last thing is that maybe there was disciplinary diversity, maybe there was racial and ethnic diversity, but maybe the workplace culture made it so those people didn't feel like they could speak up, right? So you could have all the diversity in the world, but if you don't create a context in which people who have those insights feel like they can speak up and be respected and heard, then you're basically sitting on a reservoir of resources and you're not tapping into it to do right by your company.
And so it's one of those cautionary tales, I think, that we can all learn from, to try to create an environment where we can elicit those insights from our teams and our coworkers. >>Your point about the culture, that's really inclusion, very different from just diversity of thought. So I'd like to end on a hopeful note, a prescriptive note. We have some of the most influential data and analytics leaders and experts attending virtually here. So if you imagine the way we use data, and housing is a great example, mortgage lending has not been equitable for African Americans in particular. But if you imagine the right way to use data, what does the future hold when we've gotten better at this, more aware of this? >>Thank you for that question. And so, you know, there are a few things that come to mind for me. And I think the mortgage environment is really the perfect context in which to think through both the problem and where the solutions may lie. One of the most powerful ways I see data being used by different organizations and groups is to shine a light on past and ongoing inequities. And so oftentimes, when people see the bias, let's say when it came to the hiring algorithm or the language audit, when they see the names associated with negative or positive words, that tends to have a bigger impact, because they think, wow, the technology is reflecting these biases, it really must be true. Never mind that people might have been raising the issues in other ways before. But I think one of the most powerful ways we can use data and technology is as a mirror onto existing forms of inequality, one that then can motivate us to try to address those things. The caution is that once we come to grips with the problem, the solution is not simply going to be a technical solution. And so we have to understand both the promise of data and the limits of data.
So when it comes to, let's say, a software program, say a hiring algorithm that now is trained to look for diversity as opposed to homogeneity, say I get hired through one of those algorithms in a new workplace. I can get through the door and be hired. But if nothing else about that workplace has changed, and on a day-to-day basis I'm still experiencing microaggressions, I'm still experiencing all kinds of issues, then that technology just gave me access to a harmful environment, you see. And so this is the idea that we can't simply expect the technology to solve all of our problems. We have to do the hard work. And so I would encourage everyone listening to accept the promise of these tools, but also, really crucially, to understand that the real kinds of changes that we need to make are going to be messy. They're not going to be quick fixes. If you think about how long it took our society to create the kinds of inequities that we now live with, we should expect to do our part, do the work, and pass the baton. We're not going to magically, like fairy dust, create a wonderful algorithm that's going to help us bypass these issues. It can expose them, but then it's up to us to actually do the hard work of changing our social relations, changing the culture of not just our workplaces but our schools, our healthcare systems, our neighborhoods, so that they reflect our better values. >>Ruha, so beautifully said. I think all of us are willing to do the hard work, and I like your point about using it as a mirror. At ThoughtSpot, we like to say a fact-driven world is a better world. It can give us that transparency. So on behalf of everyone, thank you so much for your passion, for your hard work, and for talking to us. >>Thank you, Cindy. Thank you so much for inviting me. >>Paola, back to you. >>Thank you, Cindy and Ruha,
For this fascinating exploration of our society and technology. We're just about ready to move on to our final session of the day, so make sure to tune in for this customer case study session with executives from Sienna and Accenture on driving digital transformation with search and AI.
Ben Parr | SXSW 2017
>> Narrator: Live from Austin, Texas, it's The Cube covering South by Southwest 2017, brought to you by Intel. Now, here's John Furrier. >> Hey, welcome everyone back for day two of live coverage of South by Southwest. This is The Cube, our flagship program from SiliconANGLE. We go out to the events and extract the signal from the noise. We're at the Intel AI Lounge, people are rolling in, it's an amazing vibe here at South by Southwest. The themes are AI, virtual reality, augmented reality, technology. They've got great booths here, free beers, free drinks, and of course great sessions and great conversations here with the Cube. My first guest of the day here is Ben Parr, a friend of the Cube. He's been an entrepreneur, he's been a social media maven, he's been a journalist, all around great guy. Ben, thanks for joining us today. >> Thank you for having me again. >> So you're a veteran of South by Southwest, you know the social scene, you've seen the evolution from Web 2.0 all the way to today, we had Scoble on yesterday, Brian Fanzo, and really the vibe is all about that next level of social, of connecting, and you've got a startup you're working on that you founded, co-founded, called? >> Ben: Octane AI. >> Octane AI, that's in the heart of this new social fabric that's developing, where AI is starting to do stuff, keep learning, analytics, but ultimately it's just a connection. Talk about your company. What is Octane AI? Tell us a little bit about the company. >> So Octane AI is a platform that lets you build an audience on Facebook Messenger through a bot. And so what we do is allow you to create a presence on Messenger, because if I told you there was a social app that had a billion users every month, bigger than Snapchat plus Twitter plus Instagram combined, you'd want to figure out a strategy for how to engage with those people, right? And that social app is Facebook Messenger. And yet no one ever thinks, oh, could I build an audience on a messaging app?
Could I build an audience on Messenger or WeChat or any of the others? But you can, through a bot. And you can not just build an audience, but you can create really engaging content through conversation. So what we've done is we've made it really easy to make a bot on Messenger but, more importantly, a real reason for people to actually come to your bot and engage with it, and make it really easy to create content for it, in the same way you create content for a blog or create content for a YouTube channel. Maroon 5, Aerosmith, KISS, Lindsay Lohan, Thirty Seconds to Mars, Jason Derulo and a whole bunch more use us to build an audience and engage their fans on Messenger. >> So let me get your thoughts on a couple of trends around this, 'cause this is really, kind of, to me, a key part where chat bots illustrate the big trends that are going on. Chat bots were the hype. People were talking about, oh, chat bots. It's a good mental model for people to see AI, but it also has been, kind of, I won't say a pest, if you will, for users. It's been like a notification, a notification economy we're living in. Now you're taking it to the next level. This is what we're seeing. The deep learning and the analytics around turning notifications, which can be noisy after a while, into real content and connections. >> Into something useful, absolutely. Look, the last year of bots, the Facebook platform is not even a year old. We've been in that fart app stage of bots. Remember the first year of mobile apps? You had the fart app, and that made $50,000 a day, and that was annoying as hell. We're at that stage now, the experimentation stage. And we've seen different companies going in different, really cool directions.
Our direction is, how do you create compelling content so you're not spamming people, but you have content that you can share, not just in your bot but as a link on your social media, to your followers, to your fans, on Twitter, everywhere else, and have a scalable conversation about whatever you want. Maroon 5 has conversations with their audience about their upcoming tours, or they even released an exclusive preview of their new song, Cold, through our bots. You could do almost anything with our bots or with any bot. We're just learning right now, as an industry, what the best practices are. >> So where do bots go for the next level? Because you and I have known each other for almost over 10 years, we've seen the whole movement, and now we're living in a fake news era. But social media is evolving to where content now is super important, it glues people together, communities together. In a way, you're taking AI or bots, if you will, which is a first, I mean, 0.5 version of where AI is going, where content now is being blended into notifications. How important is content and community? >> Content and community are essential to any product. And I feel like when you hear the word bot, you don't think community, or that you could build a community with it, because it's a bot, it's supposed to be automated. But you actually can if you do it in the right way, and it can be a very, very powerful experience. We're building features that allow you to build more community in your bot and have people who are talking with your bot communicate with each other. There's a lot of that. What I feel like is, we're at the 0.1 or 0.2 on the long scale of AI. What we need to do right now is showcase all the use cases that really work for AI, bots, machine learning. Over time, we will be adding other great technologies from Intel and others that will make all these technologies and everything we do better, more social and, most of all, more personalized.
I think that's one of the big benefits of AI. >> Do you see bot technology, or what bots can turn into, being embedded into things like autonomous vehicles, AR? Is there a stack developing, if you will, around bots? What you're talking about is a progression of bots. What's your vision on where this goes down the road? >> I see a bunch of companies now building the technological stack for AI. I see a bunch of companies building the consumer interface; bots are one of those consumer interfaces, not just chat bots but voice bots. And then I see another layer that's more enterprise, helping make things like recruiting, all sorts of automation, or driving more efficient, that are being built as well. But you need each of those stacks to work really well to make this all work. >> So are there bots here at South by Southwest? Is there a bot explosion, are there bots that tell you where the best parties are? What's the scene here at Southby? Where are the bots, and if there were bots, what would they be doing to help people figure out what to do? >> The Southby bot is actually not a bad bot. They launched their bot just before South by Southwest. It has good party recommendations and things. But it's the standard bot. I feel like what we're seeing is the best uses, there's a lot of good bot people. What I'm seeing right now is that people are still fleshing out the best use cases for their bots. There's no bot yet that can predict all the parties you want to go to. We've got to have our expectations set. That will happen, but we're still a few years away from really deep AI bots. But there are clearly ones where you can communicate faster with your friends. There are clearly ones that help you connect with your favorite artist. There are clearly ones that help you build an audience and communicate at scale. And I feel like the next step is the usefulness.
Robert Scobel and I were talking yesterday, we have some guests coming on today that had user experience background. With AI, with virtual reality, with bots, with deep learning, all this collective intelligence going on, what's your vision of the user interface as it changes, as people's expectations? What are some of those things that you might see developing pretty quickly as deep learning, analytics, more data stats come online? What is the user interface? Cause bots will intersect with that as an assistant or a value add for the user. What's your vision on? >> I'll tell you what I see in the near term and then I'll tell you a really crazy idea of how I see the long term. In the near term, I think what you're going to see is bots have become more predictive. That, based on your conversations, are more personalized and maybe not a necessarily need as much input from you to be really intelligent. And so voice, text, standard interfaces that we're used to. I think the bigger, longer run is neurological. Is the ability to interface without having to speak. Is AI as a companion to help us in everything we do. I feel like, in 30 years, we won't even, it's, kind of like, do your remember the world when it had no internet? It's hard, it feels so much different. There will be a point in about 20 years we will not understand what the world was before AI. Before AI assistance where assisting us mentally, automatically and through every interface. And so good AI's, in the long run, don't just run on one bot or one thing, they follow you wherever you go. Right now it might be on your phone. When you get home, it may be on your home, it may be in your car but it should be the same sets of AI's that you use daily. >> Doctor Nevine Rou, yesterday, called the AI the bulldozer for data. What bulldozers where in the real world, AI's going to do that for data. Cause you want to service more data and make things more usable for users. 
>>Yes, the data really helps AI become more personalized, and that's a really big benefit to the user, to every individual. The more personalized the experience, the less you have to do. >> Alright, so what's the most amazing thing you've seen so far this year at Southby? What's going on out there that's pretty amazing, that's popping out of the woodwork? In terms of either trend, content, product, demos, what are some of the cool things you're seeing? >> So, as it is only Saturday, I feel like the coolest thing will still come to me. But outside of AI, there have been some really cool mixed reality, augmented reality demos. I can't remember the name, there's a product with butterflies flying around me. All sorts of really bleeding edge technologies that really create another new interface, honestly, where AI may interact with us through the augmented reality of our world. I mean, that's Robert Scoble's thing exactly. But there's a lot of really cool things that are being built on that front. I think those are the obvious, coolest ones. I'm curious to see which ones are going to be the big winners. >> Okay, so I want to ask you a personal question. So you were doing some venture investing around AI and some other things. What caused you to put the pause button on that mission to start the chat bot AI company? >> So I was an investor for a couple of years. I invested in uBeam, the wireless electricity company, and Shots with Justin Bieber, which is always fun. And I love investing and I love working with companies. But I got into Silicon Valley and I got into startups because I wanted to build companies. I wanted to build ideas. This happened, in part, because of my co-founders. My co-founder Matt, who was the first head of product at Ustream and twice in the Forbes 30 under 30, is one of the kingmakers of the bot industry. The opportunity to be a part of building the future of AI was irresistible to me. I needed to be a part of that.
>> Okay, can you tell any stories about Justin Bieber for us, while we're here inside the Cube? (laughs) >> I wonder how many of those I can actually tell? Okay, so look. Justin Bieber is an investor in a company I'm an investor in called Shots, which is now a super studio that represents everyone from Lele Pons to Mike Tyson online, and they're doing really, really well. One of Justin's best friends is the founder, John Shahidi. And so it's just really random, sitting with John, who I invested in, and just getting random FaceTimes, like, oh, it's Justin Bieber, say hi to Justin. As if it was nothing, as if it was a normal, it's a normal day in his life. >> Could you just have him retweet one of my tweets? He's got like a zillion followers. What's his follower count at now? >> You don't want that. He's done that to me before. When Justin retweets you, or even John retweets you, thousands if not tens of thousands of Justin Bieber fans, bots and not bots, start messaging you, asking you to follow them, talking to you all the time. I still get the tweets all the time from all the Justin fans. >> Okay, don't tweet me then. I'm nice and happy with 21,000 followers. Alright, so next level for you in terms of this venture. Obviously, they got some rock stars in there. What's the next step for you guys right now? Give us a little inside baseball on the venture status, where you guys are at. What's the next step? >> We launched the company publicly in November, we started in May. We raised 1.6 million from General Catalyst, from Sherpa Ventures, a couple of others. Then we launched our new feature, Convos, which allows you to create shareable bots, shareable conversations, the way you share blog posts. And that came out with all those launch partners I mentioned before, like Maroon 5.
We're working on perfecting the experience and, mostly, trying to make a really, really compelling experience for the user with bots, because if we can't do that, then there's no use doing anything else. >> So you provide the octane for the explosive conversations? (laughs) >> Yes, there you go, thank you, thank you. And we make it really easy. So we're just trying to make it easier to do this. This is a product that your mom could use, that an artist could use, any social media team could use. Writing a convo is like writing a blog post on Medium. >> Are moms really getting the chat bot scene? I, honestly, get the Hollywood part. I'm going to go back to Hollywood in a second, but being a general, middle America kind of tech genre, what are they like? Are they grokking the whole bot thing? What's the feedback from middle America tech? >> Think of it this way. There are a billion people on Messenger, and that's really part of the answer, they all use Facebook Messenger. And so they may be communicating with a bot without knowing it, or they might want to communicate with their fans. It's not about the technology as much as it is connecting with who you really care about. If I really care about a Maroon 5 or a Rachael Ray, I now have that option. And it doesn't really matter what the technology is, as much as it is that personal connection, that the experience is good. >> John: Is it one-on-one or group? 'Cause it sounds like it's town hall, perfect for a town hall situation. >> It's one-on-one, at scale. So you could have a conversation with a bot while each of the audience members is having a conversation one-on-one, where you can choose different options and it could be a different conversation for each person. >> Alright, so I got to ask about the Hollywood scene. You mentioned Justin Bieber. I wanted to go down that path because Hollywood really has adopted social media pretty heavily, because they can go direct to the audience. We're seeing that.
Obviously, with the election, Trump was on Twitter. He bypasses all the press, but Hollywood has done very well with social. How are they using the bots? They are a telltale sign of where it's going. Can you share some anecdotal stories or data around how Maroon 5, Justin, these guys are leveraging this, and what's some of the impact? >> Sure, so about a month and a half, two months before Maroon 5 launched their new song, new single, Cold, they came to us and wanted to build distribution. They wanted to reach their audience in a more direct, personal way. And so we helped them make a bot. It didn't take long. We helped them write convos. And so what they did was they wrote convos about things like exclusive behind-the-scenes photos from their recent tour, or their top moments of 2016, or things that their fans really care about. And they shared them. They got a URL, just like you would get a blog post URL. They shared it out with their 39 million Facebook fans, they shared it with their Twitter followers, they shared it across their social media. And tens of thousands of people started talking with their bot each time they did this. About 24 hours before their new single release, they exclusively released a 10 second clip of Cold through their bot. And when they did that, within 24 hours, the size of their bot audience doubled, because it went viral within the Maroon 5 community. There's a share function in our convos, and people shared the convo with their friends and with their friends' friends, and it kept on spreading. We saw this viral graph happen. And the next day, when they released the single, thousands of people bought the song because of the bot alone. And now the bot is a core part of their social strategy. They share a convo every single week, and it's not just them, but now Lohan and a whole bunch of others are doing the same thing. >> John: Lindsay Lohan. >> Lindsay Lohan is one of our most popular bots. Her fans are really dedicated.
And so you can almost see it almost connecting with CGI, looking at what CGI's doing in filmmaking. You could almost have a CGI component built in. So it's all this stuff coming together. >> Ben: Multimedia matters. >> So what do you think about the Intel booth here? The AI experience? They've got a kinetic photo experience, amazing non-profit activities in deep learning (mumbles), missing children, what do you think? >> These are some of the best use cases for AI, which is, people think of AI as just the direct consumer interface, which is what we do, but AI is an underlying layer to everything we do. And if it can help, even 1% or 1,000%, identify and find missing children, or increase the efficiency of our technology stacks so that we save energy, or we figure out new ways to save energy, this is where AI can really make an impact. It is just a fundamental layer of everything, in the same way the internet is just a fundamental layer of everything. So I've seen some very cool things here. >> Alright, Ben Parr, great guest, ex-venture capitalist, now founder of a great company, Octane AI. High octane, explosive conversations, looking forward to adopting it. We're going to definitely take advantage of the chat bot, and maybe we can get some backstage passes to Maroon 5. (laughs) >> (laughs) There will be some fun times in the future, I know it. >> Alright, Ben Parr. >> Ben: Justin Bieber. >> Justin Bieber inside the Cube right here, with Ben Parr. Thanks for watching. It's the Intel AI Lounge. A lot of great stuff, a lot of great people here. Thanks for joining us. Our next guest will be up after this short break. (lively music)