Alex Hanna, The DAIR Institute | WiDS 2022
(upbeat music) >> Hey everyone. Welcome to theCUBE's coverage of Women in Data Science 2022. I'm Lisa Martin, excited to be coming to you live from Stanford University at the Arrillaga Alumni Center. I'm pleased to welcome, fresh from the keynote stage, Alex Hanna, the director of research at the DAIR Institute. Alex, it's great to have you on the program. >> Yeah, lovely to be here. >> Talk to me a little bit about yourself. I know your background is in sociology. We were talking before we went live about your hobbies and roller derby, which I love. >> Yes. >> But talk to me a little bit about your background, and what the DAIR Institute, the Distributed AI Research Institute, actually is doing. >> Sure, absolutely. So happy to be here talking to the Women in Data Science community. My background's in sociology, but also in computer science and machine learning. My dissertation work was actually focused on developing machine learning and natural language processing tools for generating and analyzing protest event data, and applying it to pertinent questions within social movement scholarship. After that, I was faculty at the University of Toronto, and then a research scientist at Google on the ethical AI team, where I met Dr. Timnit Gebru, who is the founder of DAIR. DAIR is a nonprofit research institute oriented around independent, community-based AI work. A lot of discussions around AI are driven by big companies, or by companies focused on solutions that are oriented around collecting as much data as they can, without really knowing if it's going to be for community benefit. At DAIR, we want to flip that. We really want to prioritize what it would mean if communities had input into data-driven technologies, what those technologies would mean for those communities, and how we can help there. >> Double-click on just some of your research. Where do your passions lie?
>> So I'm a sociologist, and I think one of the big insights of sociology is to really highlight how society can be more just, how we can interrogate inequality, and how to reduce the distance between people who are underserved and people who are over-served, who already have quite a lot. Finding out where those disparities lie, especially in technology, that's really what I'm passionate about. So it's not just technology, which I think can be helpful, but really understanding what it means to reduce those gaps and make the world more just. >> And that's so important. I mean, as more and more data is generated, exponentially growing, so are some of the biases and the challenges that that causes. You just gave your tech vision talk, which I had a chance to see most of. And you were talking about something that's very interesting: the biases in facial recognition software. Maybe expand a little bit on what you talked about, and why that is such a challenge. And also, what are some of the steps being taken in the right direction where that's concerned? >> Yeah. So the work I was talking about in the talk was highlighting, not work I've done, but the work by doctors (indistinct) and (indistinct), focusing on the disparities that exist and the biases that exist in facial recognition as a technical system. The fact remains also that facial recognition is disproportionately deployed on marginalized populations. So in the U.S., that means Black and brown communities. That's where facial recognition is used disproportionately. And we also see this in refugee contexts, where refugees will be leaving a country and facial recognition software will be used in those contexts to surveil them. So these are people already in a really precarious place. And so, some of the movements there have been to debias some of the facial recognition tools. I actually don't think that goes far enough.
I'm fundamentally against facial recognition. I think that it shouldn't be used as a technology, because it is used so pervasively in surveillance and policing. And if we're going to approach that, we really need to rethink our models of security, our models of immigration, and whatnot. >> Right, it's such an important topic to discuss, because I think there needs to be more awareness about some of the biases, but also, to your point, about some of those vulnerable communities that are really potentially being harmed by technologies like that. We have to be, there's a fine line. Or maybe it's not so fine. >> I don't think it's that fine. I think it's used in an incredibly harsh way. For instance, so I'm a transgender woman, and there's research being done by researchers who collected data sets of videos that people had posted on YouTube documenting their transitions. And already there was a researcher collecting those data and saying, well, we could have terrorists or something take hormones and cross borders. And you talk to any trans person, and you're like, well, that's not how it works, first off. Second off, it's already viewing trans people and trans bodies as a kind of mode of deception. Whereas researchers in this space were collecting those data and saying, well, we should collect these data to help make facial recognition more fair. But that's not fair if it's going to be used on a population that's already intensely surveilled and held in suspicion. >> Right. The question of fairness is huge, absolutely. Were you always interested in tech? You talked about your background in sociology. Were you a STEM kid from the time you were little? Talk to me about your background and how you got to where you are now. >> Yeah. I've been using computers since I was four. I was taking apart my parents' Gateway computer when I was 10.
Going to computer shows, slapping hard drives into things, seeing how much we could upgrade a computer on our own, and ruining more than one computer, to my parents' chagrin. But I've always been into that. In undergrad I triple-majored in computer science, math, and sociology. Originally it was just computer science, and then I added the other two as I got interested in those things, and I became really interested in this intersection of tech and society. And I think the more I sat within the field, and went and did my graduate work in sociology and other social sciences, the more I found that there was a place to interrogate that intersection of the two. >> Exactly. What are some of the things that excite you now about where technology is going? What are some of the positives that you see? >> I talk so much about the negatives, it's really hard to... I mean, some of the things that I think are positive are really the community-driven initiatives that are saying, well, what can we do to remake this in such a way that is going to be more positive for our community? So seeing projects that try to do community control over certain kinds of AI models, or really try to tie together different kinds of fields, I mean, that's exciting. And I think right now we're seeing a lot of people who are super politically and justice literate, and they know how to work with and know what's behind all these data-driven technologies, and they can really try to flip the script, and try to understand what it would mean to turn this into something that empowers us, instead of something that is really becoming centralized in a few companies. >> Right. We need to be empowered with that, for sure. How did you get involved with WiDS?
>> So Margot, one of the co-directors, we sit on a board together, the Human Rights Data Analysis Group, and I've been a huge fan of HRDAG for a really long time, because HRDAG is probably one of the first projects I've seen that's really focused on using data for accountability and for justice. Their methodology has been called on to hold perpetrators of genocide to account, to hold perpetrators of state violence to account. And I always thought that was really admirable. And so being on their board is kind of a dream. Not that they're actually coming to me for advice. So I met Margot, and she said, come on down and let's do a thing for WiDS, and I happily obliged. >> Is this your first WiDS? >> This is my very first WiDS. >> Oh, excellent. >> Yeah. >> What's your impression so far? >> I'm having a great time. I'm learning a lot, meeting a lot of great people, and I think it's great to bring folks from all levels here. Not only people who are super senior; it's not they who are going to get the most out of it, it's going to be the high school students, the undergrads, the grad students. And you're never too old to be mentored, so, finding your own mentors too. >> You know, it's so great to see the young faces here, and the mature faces as well. But one of the things that I caught in the panel this morning was the talk about mentors versus sponsors. And that's actually a difference I didn't know until a few years ago, at another women in tech event. And I thought it was such great advice for those panelists to be talking to the audience about the importance of mentors, but also the difference between a mentor and a sponsor. Who are some of your mentors? >> Yeah, I mean, great question. It's going to sound cheesy, but my boss, (indistinct). I mean, she's been a huge mentor for me, and without her and another mentor, (indistinct) Mitchell, I wouldn't have been a research scientist.
I was the first social scientist on the research scientist ladder at Google before I left, and it wasn't just that they sponsored me; they also mentored me greatly. My PhD advisor, (indistinct), huge mentor. And then lots of peer mentors, people who are at about the same stage as me academically, but also professionally, who are mentors. So folks like Anna Lauren Hoffman, who's at UDub; she's a great inspiration, a collaborator, a co-conspirator, so yeah. >> Co-conspirator, I like that. I'm sure you have quite a few mentees as well. Talk to me a little bit about that, and what excites you about being a mentor. >> Yeah. I have a lot of mentees, either informally or formally, and I sought that out purposefully. I think one of the speakers on the panel this morning was saying, if you can mentor, do it. And that's what I did. It excites me because, I mean, I don't have all the answers; no one person does. You only get to those places if you have a large community. And I think being smart is often something that people think just comes to you, like there's kind of a smart gene or whatever, but, I'm not a biologist or a cognitive scientist or anything. What really takes cultivation is being kind, and really advocating for other people, and building solidarity. And so that's what mentorship really means to me: building that solidarity and really trying to lift other people up. I mean, I'm only here, and where I'm at in my career, because many people were mentors and sponsors to me, and it's only right to pay that forward. >> I love that, paying it forward. That's so true. There's nothing like a good community, right? I mean, there's so much opportunity that that groundswell just generates, which is what I love. Tomorrow is International Women's Day. And if we look at the numbers, women are 50% of the workforce, but hold less than a quarter of STEM positions.
What's your advice and recommendation for those young girls who might be intimidated, or might be told even to this day, no, you can't do physics, you can't do computer science. What can you tell them? >> Yeah, I mean, individual solutions to that are putting a bandaid on a very big wound. Finding other people and working to change it, building structures of solidarity and care, are really the only way we'll get out of that. >> I agree. Well, Alex, it's been great to have you on the program. Thank you for coming and sharing what you're doing at DAIR. The intersection of sociology and technology was fascinating, and your roller derby, we'll have to talk more about that. >> For sure. >> Excellent. >> Thanks for joining me. >> Yeah, thank you, Lisa. >> For Alex Hanna, I'm Lisa Martin. You're watching theCUBE's live coverage of the Women in Data Science Worldwide Conference 2022. Stick around, my next guest is coming right up. (upbeat music)