Sharon Hutchins, Intuit | WiDS 2022
>>Welcome everyone to theCUBE's coverage of the Women in Data Science conference, WiDS 2022, live from Stanford at the Arrillaga Alumni Center. I'm Lisa Martin. My next guest has joined me. Sharon Hutchins is here, the VP and Chief of AI + Data Operations at Intuit. Sharon, welcome. >>Thank you. Excited to be here. >>This is your first WiDS, your very first, but Intuit and WiDS go way back. >>That's right. Intuit's involvement goes way back. I'm relatively new to the organization, but Intuit has been a long-time sponsor of WiDS, and we love this organization. We have a great alignment with its goals, its passion and commitment to advancing women in technology and data science. And we have the same goal at Intuit. We are at 30% women in technology with the goal of hitting 37% by 2024. And I know that WiDS has a great goal of 30 by 30, so that's awesome. >>30 by 30. And here we are at, I think it's still less than 25% of STEM positions filled by women. But obviously you're ahead of that at Intuit. Congratulations. >>I think we're ahead of that. And I think part of the reason why we're ahead of that is because we've got great programs at Intuit to support women. One of our key programs is Tech Women at Intuit. It's an internal initiative where we focus on attracting, retaining, and advancing women. So it's a great way for women across technology to support one another. Sure, you've heard the term "there's power in the pack," and we believe that when we connect women, we can help elevate their voices, which elevates our business and elevates our products. >>It does. In fact, there are some stats I was looking at recently that showed if there were even 30% women at the executive level, how much more profitable organizations can be and how much higher performance they can have. So the data is there that suggests this is a really smart business decision to be making. >>Absolutely, absolutely. The data is no lie. I see it firsthand in my own business. And in fact, at Intuit, we've got a broader initiative around diversity and inclusion. It's led from the top. We have set goals across the company and we hold ourselves accountable, because we know that if there are more women at the table and more diversity at the table all around, we make better business decisions. And if you look at our product suite, which is TurboTax, QuickBooks, Mint, Credit Karma, and Mailchimp, we've got a diverse customer base of a hundred million customers. So there's a lot of diversity in our customer base, and we want a lot of diversity in the company. >>Fantastic that there's such a dedicated effort to it. You just came in here from the career panel. Talk to me about that. What were some of the key things that were discussed? >>Yeah, I have my notebook open here because there were so many great takeaways, actually just from the day in general. I'm just so amazed at the types of issues that women are tackling across different industries. They're tackling bias, and we know that bias is corrected when women are at the table. But from a career perspective, some of the things that were mentioned on the panel is the fact that women need to own their own careers and they need to actively manage their careers. There's only so much your manager can do and should do. You've got to be in the driver's seat, driving your own career. One of the things that we've done at Intuit is we've implemented sort of a self-promotion process.
So twice a year, during our promotion period, either your manager can nominate you for a promotion or you can self-promote. It's all about you creating a portfolio of all of your great work. And of course, managers are very supportive of the process and support women, and all technologists, in crafting their portfolios for a fair chance at promotion. And so we just believe that if you take bias out of career progression, you can close that fair and equitable gap that we sometimes see across industries with compensation. >>That would be great if we can ever get there. One of the things that's nice about WiDS, I think it was last year or the year before, is they opened it up to high school students. So it was so nice walking in this morning, seeing the young, fresh faces and the mature faces. But you bring up a great point, that women need to own their own careers, to create their own personal board of directors, and really be able to be at the helm of their career. Did you find that the audience was receptive to that? Do they have the confidence to be able to do that? >>Yeah, absolutely. And that was a point that was raised a couple of times this morning. There were women who talked about having great mentors, but it is more important to have your own personal board of directors than one mentor, because you've got to make sure that you tackle all aspects of your career life. And it's not all about the technology; a good portion of how you spend your time, and where you spend your time, is collaborating and negotiating and communicating across the company. And so that's very important. That was a key message that folks shared this morning. >>That's good. That's incredibly important. I wish we had more time; you've got to run to the airport. Sharon, it's been a pleasure to have you on the program. Thank you for sharing what Intuit and WiDS are doing together, your involvement, and some of the great, inspiring messages from the career panel. >>Exactly. And for all of the young, aspiring high school students, yes, we want them to check out Intuit: www.intuit.com/careers. >>Intuit.com/careers, perfect. I'm an Intuit customer, I will say. Awesome. It's been a pleasure talking to you. Thank you, Sharon. For Sharon Hutchins, I'm Lisa Martin. You're watching theCUBE's coverage of Women in Data Science 2022.
SUMMARY :
Welcome everyone to theCUBE's coverage of the Women in Data Science conference, WiDS. This is your first WiDS, very first, but Intuit and WiDS go way back. And we have the same goal at Intuit. are filled by women. You've heard of the term there's power in the pack, So the data is there that suggests and more diversity at the table, all around, we make You just came in here from the career And so we just believe that if you take bias out One of the things that's nice about WiDS, And so that was a key message that folks shared this morning. it's been a pleasure to have you on the program. And for all of the young aspiring high school students. It's been a pleasure talking to you.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Sharon | PERSON | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
Intuit | ORGANIZATION | 0.99+ |
Sharon Hutchins | PERSON | 0.99+ |
Sharon Hutchins | PERSON | 0.99+ |
last year | DATE | 0.99+ |
30% | QUANTITY | 0.99+ |
2024 | DATE | 0.99+ |
30 | QUANTITY | 0.99+ |
One | QUANTITY | 0.99+ |
37% | QUANTITY | 0.99+ |
first | QUANTITY | 0.99+ |
one mentor | QUANTITY | 0.98+ |
2022 | DATE | 0.98+ |
less than 25% | QUANTITY | 0.98+ |
www.intuit.com | OTHER | 0.97+ |
Arriaga | ORGANIZATION | 0.95+ |
this morning | DATE | 0.95+ |
twice a year | QUANTITY | 0.94+ |
Intuit.com | ORGANIZATION | 0.86+ |
a hundred million customers | QUANTITY | 0.85+ |
a hundred thousand | QUANTITY | 0.84+ |
first woods | QUANTITY | 0.83+ |
2022 | OTHER | 0.78+ |
mint | ORGANIZATION | 0.75+ |
this morning | DATE | 0.67+ |
MailChimp | ORGANIZATION | 0.62+ |
QuickBooks | TITLE | 0.6+ |
Stanford | LOCATION | 0.59+ |
science | EVENT | 0.57+ |
karma | ORGANIZATION | 0.55+ |
woods | ORGANIZATION | 0.53+ |
things | QUANTITY | 0.49+ |
credit | ORGANIZATION | 0.47+ |
Hannah Sperling, SAP | WiDS 2022
>>Hey everyone. Welcome back to theCUBE's live coverage of the Women in Data Science worldwide conference, WiDS 2022. I'm Lisa Martin, coming to you from Stanford University at the Arrillaga Alumni Center, and I'm pleased to welcome my next guest. Hannah Sperling joins me, from business process intelligence, or BPI, academic and research alliances at SAP. Hannah, welcome to the program. >>Hi, thank you so much for having me. >>So you just flew in from Germany. >>I did, last week. Yeah, a long way away. I'm very excited to be here. But before we get started, I would like to say that I feel very fortunate to be able to be here, and that my heart and best wishes still go out to people that might be in more difficult situations right now. >>I agree. It's one of my favorite things about WiDS, the community that it's grown into. There are going to be about 100,000 people involved annually in WiDS, but you walk into the Arrillaga Alumni Center and you feel this energy from all the women here, from what Margot and team started seven years ago to what it has become. I happened to be able to listen to one of the panels this morning, and they were talking about something that's just so important for everyone to hear, not just women: the importance of mentors and sponsors, and being able to build your own personal board of directors. Talk to me about some of the mentors that you've had in the past and some of the ones that you have at SAP now. >>Yeah, thank you. That's actually a great starting point, so maybe I'll talk a bit about how I got involved in tech. SAP is a global software company, but I actually studied business, and I was hired directly from university around four years ago. That was to join SAP's analytics department. And I've always had a weird thing for databases, even when I was in my undergrad. I did enjoy working with data, and so, working in analytics with those teams and with some people mentoring me, I got into database modeling and eventually ventured even further into development. I was working in analytics development for a couple of years, and I'm still with a global software provider now, which brought me to Women in Data Science, because now I'm also involved in research again; for some reason I couldn't get enough of that, and maybe I wanted to learn about the stuff that I didn't do in my undergrad. >>And post-grad now, researching at university. One big part of, at least, European data science efforts is the topic of sensitive data and data privacy considerations. And this is also a topic very close to my heart, because you can only manage what you measure, right? But if everybody is afraid to touch certain pieces of sensitive data, I think we might not get to where we want to be as fast as we possibly could. And so I've been really getting into data anonymization procedures, because I think if we could render workforce data usable, especially when it comes to increasing diversity in STEM or in technology jobs, we should really be letting the data speak. >>Letting the data speak, I like that. One of the things they were talking about this morning was the bias in data and the challenges that presents. And I've had some interesting conversations on theCUBE today about data in healthcare, data in transportation equity. If we think of International Women's Day, which is tomorrow, breaking the bias is the theme.
Where do you think we are, from your perspective, on breaking the bias across all these different data sets? >>Right. So I guess, as somebody working with data on a daily basis, I'm sometimes amazed at how many people still seem to think that data can be unbiased. And this was actually touched upon in the first keynote, which I very much enjoyed, talking about human-centered data science. People that believe that you can take the human factor out of any effort related to analysis are definitely on the wrong path. So I feel like the sooner we realize that we need to take into account certain biases that will definitely be there, because data is humanly generated, the closer we're going to get to something that represents reality better, and that might help us to change reality for the better as well, because we don't want to stick with the status quo. And any time you look at data, it's definitely going to be a backward-looking effort. So I think the first step is to be aware of that, and not to strive for complete objectivity, but to understand and come to terms with the fact, just as it was mentioned in the equity panel, that that is logically impossible, right? >>You bring up a really important point. It's important to understand that that is not possible, but what can we work with? What is possible? What can we get to? Where do you think we are on the journey of being able to get there? >>I think that initiatives like WiDS are playing an important role in making that better and increasing that awareness. There's a big trend around explainability and interpretability in AI that you see not just in Europe, but worldwide, because I think the awareness around those topics is increasing. And that will then also show you the blind spots that you may still have, no matter how much you think about the context. One thing that we still need to get a lot better at, though, is including everybody in these types of projects, because otherwise you're always going to have a certain selection in terms of the perspectives that you're getting. >>Right. That thought diversity; there's so much value in thought diversity. That's something I think I first started talking about at a WiDS conference a few years ago, and really understanding the impact that can make on every industry. >>Totally. And I love this example of, I think it was a soap dispenser. It's one of these really early examples of how technology, if you don't watch out for these human-centered considerations, can go wrong and just perpetuate bias. So, a soap dispenser that would only recognize a hand if it was of a certain light skin type that would, you know, be placed underneath it. It's simple examples like that, that I think beautifully illustrate what we need to watch out for when we design automatic decision aids, for example, because anywhere you don't have a human checking what's ultimately decided upon, you might end up with much more grave examples. >>Right, no, I agree. Cecilia Aragon gave the talk this morning on human-centered AI. I was able to interview her a couple of weeks ago for WiDS, and she's a very inspiring woman and an author herself, but she brought up a great point about the humans and the AI working together. You can't ditch the humans completely, to your point. There are things that will go wrong.
I think that sends a good message, that it's not going to be AI taking jobs, but that we have to have those two components working together better. >>Yeah. And maybe to also refer to the panel discussion we heard on equity: I very much liked Professor Bowles' point, and how she emphasized that we're never going to get to this perfectly objective state. And then also during that panel, a data scientist said that 80% of her work is still cleaning the data, most likely. Because I feel sometimes there is this almost mysticism around the role of a data scientist, which sounds really catchy and cool, but there are so many different aspects of work in data science that I feel it's hard to put it all in a nutshell, narrowed down to one role. I think in the end, if you enjoy working with data, and maybe you can even combine that with a certain domain that you're particularly interested in, be it sustainability or, you know, urban planning, whatever it is, that is the perfect match. >>It is. And having the passion that goes along with that also can be very impactful. So, you love data. You talked about that; you said you had a strange love for databases. Where do you want to go from where you are now? How much more deeply are you going to dive into the world of data? >>That's a good question, because at this point I would definitely not consider myself a data scientist, but I feel like, taking baby steps, I'm maybe on a path to becoming one in the future. And so being at university again gives me the opportunity to dive back into certain courses, and I've done, you know, smaller data science projects. I was actually amazed at, and this was touched on in a panel earlier as well, how outdated so many really frequently used data sets are in the realm of research, you know, AI and machine learning research, all these models that you feed with these super outdated data sets. And that's something I can relate to. And then when you go down that path, you come back to the sort of data engineering path that I really enjoy. So I could see myself keeping on working on that, the whole data privacy and analytics space, both topics that are very close to my heart and that I think can be combined. They're not opposites. That is something I would definitely stay true to. >>Data privacy is a really interesting topic. We're seeing so many regulations; GDPR is, what, a few years old now, and we've got other countries and states within the United States, for example, California has CCPA, which will become CPRA next year. And it's expanding the definition of what private, sensitive data is. So companies have to be sensitive to that, but it's a huge challenge to do so, because there's so much potential that can come from the data, yet we've got that personal aspect, that sensitive aspect, that they have to be aware of, otherwise there are huge fines. Where do you think we are with that, in terms of compliance? >>So I think in the past years we've seen quite a few rather shocking examples, in the United States for instance, where personal data, or proxies for it, was used in ways that led to detrimental outcomes. In Europe, thanks to the strong data regulations, I think we haven't had as many problems. But here the question remains, well, where do you draw the line?
And, you know, how do you design this trade-off between increasing efficiency, making business applications better, for example in the case of SAP, while protecting the individual privacy rights of people? So I guess in one way SAP is in an easier position, because we deal with business data. So anybody who doesn't want to care about the human element maybe would like to, you know, try building models on machine-generated data first. >>I mean, at least I would feel much more comfortable, because as soon as you look at personally identifiable data, you really need to watch out. There are, however, ways to make that happen. And I was touching upon these anonymization techniques that I think are going to be more and more important in the coming years. There is a proposal on the way from the European Commission, and I was actually impressed by the sophistication of the legislation in that area. The plan is, for the future, to tie the rules around the use of data science to the specific objectives of the project. And I think that's the only way to go, because the data's out there, it's going to be used, right? We've sort of learned that, and true anonymization might not even be possible because of the amount of data that's out there. So I think this approach of trying to limit the projects, in terms of looking at what they want to achieve, not just for an individual company but also for us as a society, needs to play a much bigger role in any data-related project. >>You said getting true anonymization isn't really feasible. Where are we, though, on the anonymization pathway, if you will? >>I mean, it's always the cost-benefit trade-off, right? Because if the question is not interesting enough, if you're not going to allocate enough resources to trying to reverse-engineer, you know, the tie to an individual, to stick with this anonymization example, nobody's going to do it, right? We live in a world where there's data everywhere, so I feel like that's not going to be our problem. And that is where this approach of trying to look at the objectives of a project comes in, because sometimes maybe we're just lucky that it's not valuable enough to figure out certain details about our personal lives, so that nobody will try. Because I am sure that if data scientists tried hard enough, I wonder if there are challenges they wouldn't be able to solve. >>And there have been companies that have, you know, put out data sets that were supposedly anonymized, and then it wasn't actually that hard to make inferences. And in the panel on equity, one last thought about that: we heard Jessica speak about construction, and how she was trying to use synthetic data because it's so hard to get the real data, and the challenge of getting the synthetic data to mimic the true data. And the question came up of sensors in the household and so on. That is obviously a huge opportunity, but for me, as somebody who's very sensitive when it comes to privacy considerations, straight away I'm like, but what if, you know, we generate all this data and then somebody uses it for the wrong reasons, which might not be better urban planning for all different communities, but simple profit maximization, right?
So this is something that's also very dear to my heart, and I'm definitely going to go down that path further. >>Well, Hannah, it's been great having you on the program. Congratulations on being a WiDS ambassador. I'm sure there are going to be a lot of great lessons and experiences that you'll take back to Germany from here. Thank you so much; we appreciate your time. For Hannah Sperling, I'm Lisa Martin. You're watching theCUBE's live coverage of the Women in Data Science conference 2022. Stick around, I'll be right back with my next guest.
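To make the anonymization discussion above a little more concrete, here is a minimal k-anonymity-style sketch in Python. The toy table, column names, value of k, and generalization rules are illustrative assumptions invented for this example; they are not anything SAP or the proposed EU rules prescribe. It simply shows the point Sperling makes: once quasi-identifiers combine, true anonymization gets hard, and coarsening the data trades away utility.

```python
# Minimal k-anonymity-style check on a toy table.
# Column names, k, and the generalization rules are illustrative assumptions only.
import pandas as pd

K = 3  # every quasi-identifier combination should describe at least K people

toy = pd.DataFrame({
    "age":    [23, 27, 31, 35, 36, 41, 44, 52],
    "zip":    ["69190", "69115", "69120", "69123", "69124", "68159", "68161", "68163"],
    "gender": ["f", "f", "m", "f", "m", "f", "m", "f"],
    "salary": [48, 52, 61, 70, 72, 80, 85, 95],  # the sensitive attribute
})

QUASI_IDS = ["age", "zip", "gender"]  # attributes an attacker could link to other data

def generalize(df: pd.DataFrame) -> pd.DataFrame:
    """Coarsen the quasi-identifiers: 10-year age bands, 3-digit ZIP prefixes."""
    out = df.copy()
    out["age"] = (out["age"] // 10 * 10).astype(str) + "s"
    out["zip"] = out["zip"].str[:3] + "**"
    return out

def is_k_anonymous(df: pd.DataFrame, k: int) -> bool:
    """True if every combination of quasi-identifier values occurs at least k times."""
    return bool(df.groupby(QUASI_IDS).size().min() >= k)

print(is_k_anonymous(toy, K))              # raw table: False, each person is unique
print(is_k_anonymous(generalize(toy), K))  # still False here: coarsening alone is not enough
```

In practice, anonymization work typically layers suppression, l-diversity, or differential-privacy-style noise on top of a basic check like this, which is exactly the utility-versus-privacy trade-off described in the interview.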
SUMMARY :
I'm Lisa Martin coming to you from Stanford But before we get started, I would like to say that I feel very fortunate to be able to and some of the ones that you have at SAP now. And that was to join SAP's analytics department. And this is also a topic very close to my heart because Where do you think we are data science people that believe that you can take the human factor out of any effort related What can we get to, where do you think we are on the journey in AI that you see, not just in Europe, but worldwide, because I think the awareness around there that that can make to every industry. hand, if it was of a certain light skin type that would, you know, be placed underneath it. I think that sends a good message that it's not going to be AI taking jobs, but we have to have those two I think in the end, if you enjoy working So you love data. data sets are in the realm of research, you know, AI machine learning, research, We're seeing so many, you know, many problems, but here the question remains, well, where do you draw the line? And the plan is for the future to tie the rules around the use of data Where are we though on the anonymization pathway, So I feel like that's not going to be our problem. And there have been companies that have, you know, put out data sets that were supposedly anonymized. Well, Hannah, it's been great having you on the program.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Hannah | PERSON | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
Cecilia Aragon | PERSON | 0.99+ |
Hannah Sperling | PERSON | 0.99+ |
Jessica | PERSON | 0.99+ |
Europe | LOCATION | 0.99+ |
Germany | LOCATION | 0.99+ |
80% | QUANTITY | 0.99+ |
United States | LOCATION | 0.99+ |
2020 | DATE | 0.99+ |
Bowles | PERSON | 0.99+ |
next year | DATE | 0.99+ |
today | DATE | 0.99+ |
seven years ago | DATE | 0.99+ |
first step | QUANTITY | 0.99+ |
one role | QUANTITY | 0.99+ |
SAP | ORGANIZATION | 0.99+ |
tomorrow | DATE | 0.99+ |
last week | DATE | 0.99+ |
first keynote | QUANTITY | 0.99+ |
European commission | ORGANIZATION | 0.98+ |
first | QUANTITY | 0.98+ |
two components | QUANTITY | 0.98+ |
One | QUANTITY | 0.97+ |
SAP HANA | TITLE | 0.97+ |
one | QUANTITY | 0.96+ |
this morning | DATE | 0.95+ |
around four years ago | DATE | 0.94+ |
both topics | QUANTITY | 0.94+ |
100,000 people | QUANTITY | 0.93+ |
four winds | QUANTITY | 0.93+ |
international women's day | EVENT | 0.91+ |
California | LOCATION | 0.9+ |
GDPR | TITLE | 0.89+ |
one way | QUANTITY | 0.88+ |
couple of weeks ago | DATE | 0.87+ |
few years ago | DATE | 0.87+ |
2022 | DATE | 0.86+ |
Stanford university | ORGANIZATION | 0.84+ |
European | OTHER | 0.82+ |
Arriaga | ORGANIZATION | 0.8+ |
CPRA | ORGANIZATION | 0.8+ |
Wood | PERSON | 0.78+ |
one thing | QUANTITY | 0.75+ |
one last | QUANTITY | 0.74+ |
one of | QUANTITY | 0.74+ |
QS | EVENT | 0.72+ |
CCPA | ORGANIZATION | 0.69+ |
years | DATE | 0.6+ |
Margo | PERSON | 0.6+ |
about | QUANTITY | 0.54+ |
years | QUANTITY | 0.52+ |
WiDS | EVENT | 0.47+ |
Wiz | ORGANIZATION | 0.39+ |
Rukmini Iyer, Microsoft | WiDS 2022
>>Live from Stanford University, I'm your host, Lisa Martin. My next guest joins me: Rukmini Iyer, Corporate Vice President at Microsoft. Rukmini, it's great to have you on the program. >>Thank you for having me. >>Tell me a little bit about your background. So you run Microsoft Advertising's engineering organizations; you also manage a multi-billion-dollar marketplace globally. >>Yes. >>Big responsibilities. Tell me a little bit about you and your role at Microsoft. >>So basically online advertising, you know, funds a lot of the consumer services like search, you know, feeds. And so I run all of the online advertising pieces. My team is a combination of machine learning scientists, software engineers, online services. So you think of what needs to happen for running an online advertising ecosystem that's billions of dollars; I have all these people on my team, and I get to work with these fantastic people. So that's my role. >>It's a really diverse team. >>Yes. My background itself is in AI. My PhD was in language modeling and natural language processing; that's how I got into the space. And then I did, you know, machine learning, then I did some auctions, and then, you know, I basically have touched almost all pieces of the puzzle. So I appreciate what's required to run a business of this size. And from that perspective, yeah, it is a lot of diverse people, but at the same time, I feel like I know what they do. >>Right, so interdisciplinary collaboration must be incredibly important and powerful. >>It is. I mean, for a machine learning engineer or machine learning scientist to be successful when you're running a production system, they have to really appreciate what constraints are required online. So you have to look at how much CPU you use, how much memory you need, how fast your model inference can run. And so they have to work very closely with the software engineering folks. But at the same time, the software engineering folks need to know that their job is not to constrain the machine learning scientists. So, you know, as the models get larger, they have to get more creative, right? And if that balance is right, then you get a really ambitious product. If that balance is not right, then you end up with a very small micro-system. And so my job is to really make sure that the team is really ambitious in their thinking, always pushing the borders of what can be done. >>I like that, pushing the borders of what can be done. You know, we often, when we talk about roles in STEM and technology, we've talked about the hard skills, but the soft skills, you've mentioned creativity. I always think creativity and curiosity are two soft skills that are really important in data science and AI. Talk to me about what your thoughts are there. >>Definitely creativity, because a lot of the problems that you face when you're in school are very theoretical problems. And when you go into the industry, you realize that you need to solve a problem using the theory you learned, and then you have to either start making different kinds of assumptions, or realize that some assumptions just can't be made, because life is messy, and online, you know, users are messy. They don't all interact with your system the same way. So you get creative in what can be solved.
And then what needs to be controlled. Folks who can't figure that piece out try to solve everything using machine learning, and they become perfectionists, but nothing ever gets done then. So you need this balance, and creativity plays a huge role in that space. And collaboration, because you're always working with a diverse group of people. So, explaining the problem space to someone who's selling your product, say, you know, you build this automated bidding engine and they have to take this whole mouthful and sell it to a customer. You've got to give them the terminology to use, and explain to them what the benefits are if somebody uses it. So I feel people who can empathize with the fact that this has to be explained do a lot better when they're working on a product, you know, bringing machine learning to a production system. >>Right. There's a lot of enablement there. >>Yes, exactly. Yeah. >>Were you always interested in STEM and engineering and AI from when you were small? >>Somewhat. I mean, by the time I got to my college degree, I was very certain I wanted to be an engineer, and my path to AI was kind of weird, because I didn't really want to do computer science. So I ended up doing electrical engineering, but in my last year I did a project on speech recognition and I got introduced to computer programming. That was my first introduction to computer programming, and at the end of it, I knew I was going to work in the space. And so I came to the U.S. with less than three or four months of a computer engineering background. You know, I barely knew how to code. I had done some statistics, but not nearly enough to be in machine learning. But I landed in a good place: I came to Boston University and I landed in a great lab, and I learned everything on my feet in that lab. I do feel like from that point onwards, I have always been interested, and I'm never satisfied with just being interested in what's hot right now. I really want to know what can be solved later, in the future. So that combination, I think, really keeps me always learning and growing, and I'm never happy with just what's being done. >>Right, yeah. We've been hearing a lot about that today at WiDS, just the tremendous opportunities that are here: the opportunities for data science for good, drones for good, data science and AI in healthcare and in public transportation, for example. You've been involved with WiDS from the beginning, so you've gotten to see this small movement grow into this global, really, kind of... >>Phenomenon. >>It is, it's a movement. Yes. Talk to me about your involvement with WiDS from the beginning and some of the things that you're helping them do now.
And I think it is, you know, Grace Hopper was the technology conference for women and, and it's, it's, it's an awesome conference. I mean, it's amazing to sit next to so many women engineers, but data science was a part of it, but not a critical part of it. And so having this conference, that's completely focused on data science and making it accessible. The talks are accessible, making it more personable to, to all the invitees here. I think it creates a great community. So for me, I think it's, I hope they can run this and grow this for >>Yeah. Over 200 online events this year in 60 countries, they're aiming to reach a hundred thousand people annually. It's, it's grown dramatically in a short time period. Yes, >>Absolutely. Yeah. It hasn't been that long. It hasn't been that long and every year they add something new to the table. So for this year, I mean last year I thought the high schoolers, they started bringing in the high schoolers and this year again, I thought the high school. >>Yeah, >>Exactly. And I think the mix of getting data science from across a diversity, because a lot of the conferences are very focused. Like, you know, they, they will be the focused on healthcare and data science or pure AI or pure machine learning. This conference has a mix of a lot of different elements. And so attendees get to see how it's something is being used in healthcare and how something is being used in recommendations. And I think that diversity is really valuable. >>Oh, it's hugely valuable that the thought diversity is this is probably the conference where I discovered what thought diversity was if only a few years ago and the power and the opportunities that it can unlock for people everywhere for businesses in any industry. Yes. >>I want to kind of play off one of the things you said before, you know, data science for good, the, the incredible part of data sciences, you can do good wherever you are with data science. So take online advertising, you know, we build products for all advertisers, but we quickly figured out that are really large advertisers. They have their own data science teams and they are optimizing and, you know, creating new ads and making sure the best ads are serving at all times. They have figured out, you know, they have machine learning pipelines, so they are really doing their best already. But then there's this whole tale of small advertisers who just don't have the wherewithal or the knowledge to do any of that. Now, can you make data, use data science and your machine learning models and make it accessible for that long table? Pretty much any product you build, you will have the symptom of heavy users and then the tail users. And can you create an experience that is as valuable for those tailored users as it is for the heavy users. So data science for good exists, whatever problem you're solving, basically, >>That's nice to hear. And so you're going to be participating in some of the closing remarks today. What are some of the pearls of wisdom that you're going to enlighten the audience with today? >>Well, I mean the first thing I, to tell this audiences that they need to participate, you know, in whatever they shaped form, they need to participate in this movement of getting more women into stem and into data science. And my reasoning is, you know, I joined the lab and my professor was a woman and she was very strong scientists, very strong engineer. And that one story was enough to convince me that I belong. 
And if you can imagine that we create thousands of these stories, this is how you create that feeling of inclusion, where people feel like they belong. Yeah, just look at those other 50 people here, those other hundred stories here. This is how you create that movement. And so the first thing I want the audience to do is participate: come back, volunteer, you know, submit papers for keynote speeches, be a part of this movement. >>So that's one. And then the second is, I want them to be ambitious. I don't want them to just read a book and apply the theory. I really want them to think about what problem they are solving, and could they have solved it at the scale at which it can be solved. So I'll give a few example problems, and I'll throw them out there as well. For instance, experimentation: one of the big breakthroughs that happened in a lot of these large companies in data science is experimentation. You can A/B experiment pretty much anything. You know, Google has this famous paper where they talk about how they experimented with thousands of different blues just to get the right blue. And so experimentation has been evolving, and data scientists are figuring out that if they can figure out interactions between experiments, you can actually run multiple experiments on the same user. So at any given time, you may be subject to four or five different experiments. Now, can we scale that to infinity, so that you can actually run as many experiments as you want? Questions like these. You shouldn't stop with just saying, oh, I know how A/B experimentation works. The question you should be asking is, how many such experiments can I run? How do I scale the system? One of the keynote speakers talked earlier about the unasked questions, and I think that's what I want to leave this audience with: don't stop at, you know, answering the questions that you're asked or solving the problems you're given. Think about the problems you haven't solved, your blind spots. Know those blind spots. I want ambitious data scientists. And so that's the message I want to give this audience.
>>I can feel your energy when you say that. And you're involved with a Stanford program for middle school and high school girls. If we look at the data, we see there are still only about a quarter of STEM positions filled by women. What do you see? Do you see, in those middle school and high school girls, an inspiring group of young women, so that we're on trend to start increasing that percentage? >>So I had a high schooler who, you know, she's at UCLA now, shout out to her, but she just went through high school. And what I realized is it's the same problem of not having enough stories around you, not having enough people around you that are all echoing the sentiment of, hey, I love math. A lot of girls just don't talk about it. And so I think the reason I want to start in middle school and high school is that the momentum needs to start there. Because then they get to college, and actually, you heard my story: I didn't know any programming until I came here, and I had already finished my four years of college, and I still figured it out, right? But a lot of women lose confidence to change fields after four years of college. >>Yes. >>And so if you don't catch them early and you're catching them late, then you need to give them this boost of confidence, or give them that ramp-up time to learn and figure it out. Like, I have a few people who are joining me from pure math nowadays, and these kids come in and within six months they're off and running. So, you know, in the interview phase, people might say, oh, they don't have any coding skills. Six months later, if you interview them, they've picked up coding skills. And so if you can get them started early on, I think, you know, they don't have this crisis of confidence about moving, changing fields. That's why I feel... and I don't think we are there yet, to be honest. I don't think so yet. >>You still think there are plenty of girls being told, no, you can't do computer science; no, you can't do physics; no, you can't do math. >>Actually, they are denying it to themselves in many cases, because they say, hey, I go to physics class and there are, you know, two girls out of 50 boys. And you get the stereotype that maybe girls are not interested in physics. And it's not about, hey, as a girl, I'm doing really well in physics, maybe I should take this as my career. So I do feel we need to create more resounding stories in the area, and then I think we'll drum up that momentum. >>That's a great point. More stories, more names of success here, so that she can be what she can see. Exactly. Rukmini, it's been great having you on the program. Thank you for joining me and sharing your background and some of the pearls of wisdom that you're going to be dropping on the audience shortly today. We appreciate your insights. >>Thank you. My pleasure. >>For Rukmini Iyer, I'm Lisa Martin. You're watching theCUBE's coverage of WiDS 2022. We'll be right back after a short break.
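As a short aside on the experimentation point Iyer raises: one common way to let a single user sit in several experiments at once is layered, hash-based assignment, sketched minimally below in Python. The layer names, bucket counts, and experiment ranges are invented for the illustration and are not a description of Microsoft's system.

```python
# Layered, hash-based experiment assignment: experiments in different layers can
# overlap on the same user, while experiments within a layer never do.
# Layer names, bucket counts, and experiment ranges are illustrative assumptions only.
import hashlib

NUM_BUCKETS = 1000

def bucket(user_id: str, layer: str) -> int:
    """Deterministically map a user into one of NUM_BUCKETS buckets per layer.
    Salting the hash with the layer name keeps assignments in different layers
    statistically independent of one another."""
    digest = hashlib.sha256(f"{layer}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % NUM_BUCKETS

# Each layer owns its own bucket space; experiments in a layer claim disjoint ranges,
# and any unclaimed bucket falls through to control.
LAYERS = {
    "ranking": [("ranker_v2", range(0, 500)), ("ranker_v3", range(500, 1000))],
    "ui":      [("blue_cta", range(0, 300)), ("green_cta", range(300, 600))],
    "bidding": [("auto_bid_tuning", range(0, 100))],
}

def assignments(user_id: str) -> dict:
    """Return the treatment (or 'control') this user sees in every layer."""
    result = {}
    for layer, experiments in LAYERS.items():
        b = bucket(user_id, layer)
        result[layer] = next((name for name, buckets in experiments if b in buckets), "control")
    return result

print(assignments("user_42"))  # one user, several concurrent experiments, one per layer
```

Scaling from a handful of layers toward "as many experiments as you want," as she puts it, then becomes mostly a question of bookkeeping, interaction monitoring, and statistical power rather than of the assignment mechanism itself.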
SUMMARY :
It's great to have you on the program. So basically online advertising, you know, funds a lot of the consumer services like search, It's a really diverse team. And from that perspective, you know, yeah, it is a lot of diverse people, And so they have to work I like that, pushing the borders of what can be done. And when you go into the industry and you realize There's a lot of enablement And so I came to the opportunities that are here, the opportunities for data science, It is, And now, And so I said, you know, how can I help? Yes, So for this year, I mean last year I taught the high schoolers, And so attendees get to see how something is being used in healthcare and how the power and the opportunities that it can unlock for people everywhere I want to kind of play off one of the things you said before, you know, data science for good, And so you're going to be participating in some of the closing remarks today. And if you can imagine that we create thousands of these stories, this is how you create out that if they can figure out interactions between experiments, you can actually run multiple experiments You know, if you think about the problems you haven't solved, your blind spots, what do you see? So I had a high schooler who, you know, she's at UCLA now, shout out to her and And so if you can get them started early on, No, you can't do physics. you know, you get the stereotype that maybe girls are not interested in physics. Rukmini, it's been great having you on the program.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Lisa Martin | PERSON | 0.99+ |
Karen | PERSON | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
Woods | PERSON | 0.99+ |
Rick Minnie | PERSON | 0.99+ |
Rukmini Iyer | PERSON | 0.99+ |
four | QUANTITY | 0.99+ |
last year | DATE | 0.99+ |
two girls | QUANTITY | 0.99+ |
four years | QUANTITY | 0.99+ |
ORGANIZATION | 0.99+ | |
two boys | QUANTITY | 0.99+ |
50 people | QUANTITY | 0.99+ |
less than three | QUANTITY | 0.99+ |
one story | QUANTITY | 0.99+ |
60 countries | QUANTITY | 0.99+ |
UCLA | ORGANIZATION | 0.99+ |
Six months later | DATE | 0.99+ |
Rick | PERSON | 0.98+ |
second | QUANTITY | 0.98+ |
five different experiments | QUANTITY | 0.98+ |
today | DATE | 0.98+ |
one | QUANTITY | 0.98+ |
Over 200 online events | QUANTITY | 0.98+ |
ICME | ORGANIZATION | 0.97+ |
billions of dollars | QUANTITY | 0.97+ |
50 boys | QUANTITY | 0.96+ |
Minnie | PERSON | 0.96+ |
six months | QUANTITY | 0.95+ |
Stanford | ORGANIZATION | 0.95+ |
first | QUANTITY | 0.95+ |
this year | DATE | 0.95+ |
few years ago | DATE | 0.94+ |
thousands of different blues | QUANTITY | 0.93+ |
first introduction | QUANTITY | 0.9+ |
hundred stories | QUANTITY | 0.89+ |
Boston | LOCATION | 0.89+ |
two soft skills | QUANTITY | 0.89+ |
first thing | QUANTITY | 0.86+ |
multi-billion dollar | QUANTITY | 0.85+ |
a hundred thousand people | QUANTITY | 0.85+ |
Pam | PERSON | 0.84+ |
four months | QUANTITY | 0.78+ |
Stanford university | ORGANIZATION | 0.77+ |
2022 | DATE | 0.7+ |
U S | ORGANIZATION | 0.7+ |
thousands of these stories | QUANTITY | 0.69+ |
woods | ORGANIZATION | 0.67+ |
annually | QUANTITY | 0.65+ |
Grace Hopper | EVENT | 0.57+ |
2022 | OTHER | 0.41+ |
weds | DATE | 0.39+ |
university | ORGANIZATION | 0.35+ |
Nandi Leslie, Raytheon | WiDS 2022
(upbeat music) >> Hey everyone. Welcome back to theCUBE's live coverage of Women in Data Science, WiDS 2022, coming to you live from Stanford University. I'm Lisa Martin. My next guest is here. Nandi Leslie, Doctor Nandi Leslie, Senior Engineering Fellow at Raytheon Technologies. Nandi, it's great to have you on the program. >> Oh it's my pleasure, thank you. >> This is your first WiDS you were saying before we went live. >> That's right. >> What's your take so far? >> I'm absolutely loving it. I love the camaraderie and the community of women in data science. You know, what more can you say? It's amazing. >> It is. It's amazing what they've built since 2015, that this is now reaching 100,000 people, with 200 online events. It's a hybrid event. Of course, here we are in person, and the online event going on, but it's always an inspiring, energy-filled experience in my experience of WiDS. >> I'm thoroughly impressed at what the organizers have been able to accomplish. And it's amazing that, you know, you've been involved from the beginning. >> Yeah, yeah. Talk to me, so you're Senior Engineering Fellow at Raytheon. Talk to me a little bit about your role there and what you're doing. >> Well, my role is really to think about our customer's most challenging problems, primarily at the intersection of data science, and you know, the intersectional fields of applied mathematics, machine learning, cybersecurity. And then we have a plethora of government clients and commercial clients. And so what their needs are beyond those sub-fields as well, I address. >> And your background is mathematics. >> Yes. >> Have you always been a math fan? >> I have, I actually have loved math for many, many years. My dad is a mathematician, and he introduced me to, you know, mathematical research and the sciences at a very early age. And so, yeah, I went on, I studied for a math degree at Howard undergrad, and then I went on to do my PhD at Princeton in applied math. And later did a postdoc in the math department at University of Maryland. >> And how long have you been with Raytheon? >> I've been with Raytheon about six years. Yeah, and before Raytheon, I worked at a small to midsize defense contracting company in the DC area, Systems Planning and Analysis. And then prior to that, I taught in a math department where I also did my postdoc, at University of Maryland College Park. >> You have a really interesting background. I was doing some reading on you, and you have worked with the Navy. You've worked with very interesting organizations. Talk to the audience a little bit about your diverse background. >> Awesome, yeah. I've worked with the Navy on submarine force security, and submarine tracking, and localization, sensor performance. Also with the Army and the Army Research Laboratory, doing research at the intersection of machine learning and cyber security. Also looking at game theoretic and graph theoretic approaches to understand network resilience and robustness. I've also supported the Department of Homeland Security, and other government agencies, other governments, NATO. Yeah, so I've really been excited by the diverse problems that our various customers have, you know, brought to us. >> Well, you get such great experience when you are able to work in different industries and different fields. And that really just probably helps you have such diversity of thought with what you're doing even now with Raytheon.
>> Yeah, it definitely does help me build like a portfolio of topics that I can address. And then when new problems emerge, then I can pull from a toolbox of capabilities. And, you know, the solutions that have previously been developed to address those wide array of problems, but then also innovate new solutions based on those experiences. So I've been really blessed to have those experiences. >> Talk to me about one of the things I heard this morning in the session I was able to attend before we came to set was about mentors and sponsors. And, you know, I actually didn't know the difference between that until a few years ago. But it's so important. Talk to me about some of the mentors you've had along the way that really helped you find your voice in research and development. >> Definitely, I mean, beyond just the mentorship of my my family and my parents, I've had amazing opportunities to meet with wonderful people, who've helped me navigate my career. One in particular, I can think of as and I'll name a number of folks, but Dr. Carlos Castillo-Chavez was one of my earlier mentors. I was an undergrad at Howard University. He encouraged me to apply to his summer research program in mathematical and theoretical biology, which was then at Cornell. And, you know, he just really developed an enthusiasm with me for applied mathematics. And for how it can be, mathematics that is, can be applied to epidemiological and theoretical immunological problems. And then I had an amazing mentor in my PhD advisor, Dr. Simon Levin at Princeton, who just continued to inspire me, in how to leverage mathematical approaches and computational thinking for ecological conservation problems. And then since then, I've had amazing mentors, you know through just a variety of people that I've met, through customers, who've inspired me to write these papers that you mentioned in the beginning. >> Yeah, you've written 55 different publications so far. 55 and counting I'm sure, right? >> Well, I hope so. I hope to continue to contribute to the conversation and the community, you know, within research, and specifically research that is computationally driven. That really is applicable to problems that we face, whether it's cyber security, or machine learning problems, or others in data science. >> What are some of the things, you're giving a a tech vision talk this afternoon. Talk to me a little bit about that, and maybe the top three takeaways you want the audience to leave with. >> Yeah, so my talk is entitled "Unsupervised Learning for Network Security, or Network Intrusion Detection" I believe. And essentially three key areas I want to convey are the following. That unsupervised learning, that is the mathematical and statistical approach, which tries to derive patterns from unlabeled data is a powerful one. And one can still innovate new algorithms in this area. Secondly, that network security, and specifically, anomaly detection, and anomaly-based methods can be really useful to discerning and ensuring, that there is information confidentiality, availability, and integrity in our data >> A CIA triad. >> There you go, you know. And so in addition to that, you know there is this wealth of data that's out there. It's coming at us quickly. You know, there are millions of packets to represent communications. And that data has, it's mixed, in terms of there's categorical or qualitative data, text data, along with numerical data. And it is streaming, right. 
And so we need methods that are efficient, and that are capable of being deployed real time, in order to detect these anomalies, which we hope are representative of malicious activities, and so that we can therefore alert on them and thwart them. >> It's so interesting that, you know, the amount of data that's being generated and collected is growing exponentially. There's also, you know, some concerning challenges, not just with respect to data that's reinforcing social biases, but also with cyber warfare. I mean, that's a huge challenge right now. We've seen from a cybersecurity perspective in the last couple of years during the pandemic, a massive explosion in anomalies, and in social engineering. And companies in every industry have to be super vigilant, and help the people understand how to interact with it, right. There's a human component. >> Oh, for sure. There's a huge human component. You know, there are these phishing attacks that are really a huge source of the vulnerability that corporations, governments, and universities face. And so to be able to close that gap and the understanding that each individual plays in the vulnerability of a network is key. And then also seeing the link between the network activities or the cyber realm, and physical systems, right. And so, you know, especially in cyber warfare as a remote cyber attack, unauthorized network activities can have real implications for physical systems. They can, you know, stop a vehicle from running properly in an autonomous vehicle. They can impact a SCADA system that's, you know there to provide HVAC for example. And much more grievous implications. And so, you know, definitely there's the human component. >> Yes, and humans being so vulnerable to those social engineering that goes on in those phishing attacks. And we've seen them get more and more personal, which is challenging. You talking about, you know, sensitive data, personally identifiable data, using that against someone in cyber warfare is a huge challenge. >> Oh yeah, certainly. And it's one that computational thinking and mathematics can be leveraged to better understand and to predict those patterns. And that's a very rich area for innovation. >> What would you say is the power of computational thinking in the industry? >> In industry at-large? >> At large. >> Yes, I think that it is such a benefit to, you know, a burgeoning scientist, if they want to get into industry. There's so many opportunities, because computational thinking is needed. We need to be more objective, and it provides that objectivity, and it's so needed right now. Especially with the emergence of data, and you know, across industries. So there are so many opportunities for data scientists, whether it's in aerospace and defense, like Raytheon or in the health industry. And we saw with the pandemic, the utility of mathematical modeling. There are just so many opportunities. >> Yeah, there's a lot of opportunities, and that's one of the themes I think, of WiDS, is just the opportunities, not just in data science, and for women. And there's obviously even high school girls that are here, which is so nice to see those young, fresh faces, but opportunities to build your own network and your own personal board of directors, your mentors, your sponsors. There's tremendous opportunity in data science, and it's really all encompassing, at least from my seat. >> Oh yeah, no I completely agree with that. 
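To ground the anomaly-based approach Leslie describes, here is a minimal Python sketch: mixed categorical and numerical flow records are encoded and passed to an Isolation Forest, an unsupervised detector that flags records that look unlike the rest without needing attack labels. The feature names, toy values, contamination rate, and model choice are illustrative assumptions, not the algorithms from her talk.

```python
# Minimal unsupervised anomaly detection on mixed (categorical + numerical) flow records.
# Feature names, toy values, contamination rate, and model choice are illustrative assumptions.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import IsolationForest
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

flows = pd.DataFrame({
    "protocol":   ["tcp", "tcp", "udp", "tcp", "icmp", "tcp", "udp", "tcp"],
    "dst_port":   [443, 443, 53, 22, 0, 443, 53, 4444],
    "bytes_out":  [1200, 900, 180, 400, 64, 1100, 200, 9_800_000],
    "duration_s": [0.8, 0.6, 0.1, 12.0, 0.05, 0.7, 0.1, 340.0],
})

categorical = ["protocol"]
numerical = ["dst_port", "bytes_out", "duration_s"]

detector = Pipeline([
    ("encode", ColumnTransformer([
        ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
        ("num", StandardScaler(), numerical),
    ])),
    # Isolation Forest learns what "normal" looks like from unlabeled data and
    # isolates points that differ from the rest; no attack labels are required.
    ("iforest", IsolationForest(contamination=0.1, random_state=0)),
])

detector.fit(flows)                         # unsupervised: no labels passed in
flows["anomaly"] = detector.predict(flows)  # -1 = flagged as anomalous, 1 = looks normal
print(flows[flows["anomaly"] == -1])        # the oversized, long-lived flow should stand out
```

In the streaming setting she describes, the same idea would be fit on a rolling window of recent traffic, or replaced with an online detector, so that scoring keeps pace with the packet rate; the one-shot fit here is only for brevity.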
>> What are some of the things that you've heard at this WiDS event that inspire you going, we're going in the right direction. If we think about International Women's Day tomorrow, "Breaking the Bias" is the theme, do you think we're on our way to breaking that bias? >> Definitely, you know, there was a panel today talking about the bias in data, and in a variety of fields, and how we are, you know discovering that bias, and creating solutions to address it. So there was that panel. There was another talk by a speaker from Pinterest, who had presented some solutions that her, and her team had derived to address bias there, in you know, image recognition and search. And so I think that we've realized this bias, and, you know, in AI ethics, not only in these topics that I've mentioned, but also in the implications for like getting a loan, so economic implications, as well. And so we're realizing those issues and bias now in AI, and we're addressing them. So I definitely am optimistic. I feel encouraged by the talks today at WiDS that you know, not only are we recognizing the issues, but we're creating solutions >> Right taking steps to remediate those, so that ultimately going forward. You know, we know it's not possible to have unbiased data. That's not humanly possible, or probably mathematically possible. But the steps that they're taking, they're going in the right direction. And a lot of it starts with awareness. >> Exactly. >> Of understanding there is bias in this data, regardless. All the people that are interacting with it, and touching it, and transforming it, and cleaning it, for example, that's all influencing the veracity of it. >> Oh, for sure. Exactly, you know, and I think that there are for sure solutions are being discussed here, papers written by some of the speakers here, that are driving the solutions to the mitigation of this bias and data problem. So I agree a hundred percent with you, that awareness is you know, half the battle, if not more. And then, you know, that drives creation of solutions >> And that's what we need the creation of solutions. Nandi, thank you so much for joining me today. It was a pleasure talking with you about what you're doing with Raytheon, what you've done and your path with mathematics, and what excites you about data science going forward. We appreciate your insights. >> Thank you so much. It was my pleasure. >> Good, for Nandi Leslie, I'm Lisa Martin. You're watching theCUBE's coverage of Women in Data Science 2022. Stick around, I'll be right back with my next guest. (upbeat flowing music)
Maggie Wang, Skydio | WiDS 2022
(upbeat music) >> Hey, everyone. Welcome back to theCUBE's live coverage of the Women in Data Science Worldwide Conference, WiDS 2022, live from Stanford University. I'm Lisa Martin. I have my next guest here with me. Maggie Wang is here, Autonomy Engineer at Skydio. Maggie, welcome to the program. >> Thanks so much. I'm so happy to be here. >> Excited to talk to you. You are one of the event speakers, but this is your first WiDS. What's your take so far? >> I'm really excited that there's a conference dedicated to getting more women in STEM. I think it's extremely important, and I'm so happy to be here. >> Were you always interested in STEM subjects when you were growing up? >> I think I've always been drawn to STEM, but not only STEM; I've always been interested in the arts and humanities, and I'm getting more interested in the sciences as well. And I think STEM, robotics, was really my way to express myself and make things move in the real world. >> Nice. So you've got interests, I was reading about you, interests in motion planning, control theory, computer vision, and deep learning. Talk to me about those interests. It sounds very fascinating. >> Yeah. So I think what really drew me into robotics was just how interdisciplinary the subject is. So I think a lot goes into creating a robot. So not only is it about actually understanding where you are in the world, it's also about seeing where you are in the world. And it's so interesting, because I feel like humans, you know, we take this all for granted, but it's actually so difficult to do that in an actual robot. So I'm excited about the possibilities of robotics now and in the future. >> Lots of possibilities. And you only graduated from Harvard last May, with a Bachelor's and a Master's? >> Yeah. >> Tell me a little bit about what you studied at Harvard. >> Yeah, so I studied Physics as my undergrad degree. And that was really interesting, because I've always been interested in science. And, actually, part of what got me interested in STEM was just learning about the universe and astrophysics. And that's what gets me excited. And I think I also wanted to supplement that with computer science and building things in the real world. And so that's why I got my Master's in that. And I always knew that I wanted to kind of blend a lot of different disciplines and study them. >> There's so much benefit from blending disciplines, in terms of even the thought diversity alone, which opens up opportunities that are almost endless. So you graduated in May. You're now at Skydio. Autonomy Engineer. Talk to me a little bit, first of all, tell me a bit about Skydio as a company, the products, what differentiates it, and then talk to me about what you're doing there specifically. >> So Skydio is a really amazing company. I'm super-fortunate to work there. So what they do is create autonomous drones, and what differentiates them is the autonomy. So in typical drones, it's very difficult to actually make sure that they have a full understanding of the environment and obstacle avoidance. So what happens is we fly these drones manually, but we aren't able to harness the full potential of these drones because of the lack of autonomy. So what we do is really push into this autonomous sphere, and make sure that we're able to understand the environment. We have deep learning algorithms on the drone, and we have really good planning and controls on the drone as well. So yeah, our company basically makes the most autonomous drones on the market. >> Nice.
And tell me about your role specifically. >> Yeah. So as an autonomy engineer, I write algorithms that run on the drone, which is super-exciting. I can create some algorithms and design it, and then also fly it in simulation, and then fly it in the real world. So it's just really amazing to see the things I work with actually come to life. >> And talk to me about how you got involved in WiDS. You were saying it was your first WiDS, and Margot Gerritsen found you on LinkedIn, but what are some of the things that you've heard so far? I mean, I was in one of the panels this morning before we came out to the set, and I loved how they were talking about the importance of mentors and sponsors. Talk to me about some of your mentors along the way. >> Yeah, I had so many great mentors along the way. I definitely would not be here had it not been for them. Starting from my parents, they're immigrants from China, and they inspire me in so many ways. They're very hard-working, and they always encourage me to fail, and just be courageous, and, you know, follow my passions. And I think beyond that, like in high school, I had great mentors. One was an astrophysics professor. >> Wow. >> Yeah. So it was very amazing that I was able to have these opportunities at a young age. And even in high school, I was involved in all girls robotics team. And that really opened my eyes to how technology can be used and why more women should be in STEM. And that, you know, STEM should not be only for males. And it's really important for everyone to be involved. >> It is, for so many reasons. If we look at the data, and the workforce is about 50-50, but the number of women in STEM positions is less than 25%. It's something that's new to the tech industry. What are some of the things that... Do you see that, do you feel that, or are you just really excited to be able to focus on doing the autonomous engineering that you're doing? >> Well, I think that it's kind of easy to try to separate yourself and your identity from your work, but I don't necessarily agree with that. I think you need to, as best as possible, bring yourself to the table and bring your whole identity. And I think part of growing up for me was trying to understand who I was as a woman and also as an Asian American, and try to combine all of my identities into how I bring myself to the workplace. And I think as we become more vulnerable and try to understand ourselves and express ourselves to others, we're able to build more inclusive communities, in STEM and beyond. >> I agree. Very wise words. So you're going to be talking on the career panel today. What are some of the parts of wisdom are you going to leave the audience with this afternoon? >> Well, wisdom. I think everyone should be able to know, and have intuitive understanding of what they actually bring to the table. I think so many times women shy away from bringing themselves and showing up as themselves. And I think it's really important for a woman to understand that they hold a lot of power, that they have a voice that need to be heard. And I think I just want to encourage everyone to be passionate and show up. >> Be passionate and show up. That's great advice. One of the things that was talked about this morning, and we talk about this a lot when we talk about data or data science, is the inherent bias in data. Talk to me about the importance of data in robotics. Is there bias there? How do you navigate around that? >> Yeah, there's definitely bias in robotics. 
There's definitely a lot of data involved in robotics. So in many cases right now in robotics, we work in specialized fields, so you can see picking robots that will pick in specific factory locations. But if you bring them to other locations, like in your garage or something, and ask them to clean up, it's really difficult for them to do so. So I think having a lot of different streams of data, and having very diverse sets of data, is very important. And being able to run these in the real world, I think, is also super-important, and something that Skydio addresses a lot. >> So you talked about Skydio, what you guys do there, and some of the differentiators. What are some of the technical challenges that you face in trying to do what you're doing? >> Well, first of all, Skydio's trying to run everything on board the drone. So already there are a lot of technical challenges that go into putting everything in a small form factor and making sure that we trade off between compute and all of these different resources. And yeah, making sure that we utilize all of our resources in the best possible way. So that's definitely one challenge. And making sure that we have these trade-offs, and understand the trade-offs that we make. >> That's a good point. Talk to me about robotics researchers and industry practitioners. What should be some of the key things that they're focusing on? >> Yeah, so I think right now, as I said, a lot of robotics is in very specialized environments, and what we're trying to do in robotics is expand to more complex real-world applications. And I think Skydio's at the forefront of this. And trying to get these drones into all different types of locations is very difficult, because you might not have good priors, you might not have good information on your data sources. So I think, yeah, getting good, diverse data and making sure that these robots can work in multiple environments can hopefully help us in the future when we use robots. >> Right. There have got to be so many real-world applications of that. >> Yeah, for sure. >> I imagine. Definitely. So talk to me about being a female in the drone industry. What's that like? Why do you think it's important to have the female voice in mind in the drone industry? >> Well, first of all, I think it's kind of sad to see so few women in the drone space, because I think there's a lot of potential for drones to be used for good in all the different areas that women care about. For instance, with climate change, there are a lot of ways that drones can help in reducing waste. Search and rescue, for instance. Those are huge issues, and potential solutions from drones. And I think that if women understand these solutions and understand how drones can be used for good, I think we could get more women in and excited about this.
But we need to push it into the right direction. >> Do you feel it's going in the right direction? >> Yes, I think with more conferences like this, like WiDS, I think we're going in the right direction. >> Yeah, this is a great conference. It's one of my favorite shows to host. And you know, it only started back in 2015 as a one-day technical conference. And look at it now. It's a global movement. They found you. You're now part of the community. But there's hundreds of events going on in 60 countries. You have the opportunity there to really grow your network, but also reach a much bigger audience, just based on something like what Margot Gerritsen and the team have done with WiDS. What does that mean to you? >> It means a lot. I think it's so amazing that we're able to spread the word of how technology can be used in many different fields, not just robotics, but in healthcare, in search and rescue, in environmental protection. So just seeing the power that technology can bring, and spreading that to underserved communities, not just in the United States, I love how WiDS is a global community and there's regional chapters everywhere. And I think there should be more of this global collaboration in technology. >> I agree. You know, every company these days is a technology company, or a data company, or both. You think of even your local retailer or grocery store that has to be a technology company. So for women to get involved in technology, there's so many different applications of that. It doesn't have to be just coding, for example. You're doing work with drones. There's so much potential there. I think the more that we can do events like this, and leverage platforms like theCUBE, the more we can get that word out there. >> I agree. >> So you have the career panel. And then you're also doing a tech vision talk. >> Yeah, a tech talk. >> What are some of the things you're going to talk about there? >> Yeah, so I'm going to talk about... So at the career panel, just advice in general to young people who may be as confused and starting off their career, just like I am. And at the tech talk, I'll be talking about some different aspects of Skydio, and a specific use case, which is 3D scanning any physical object and putting that into a digital model. >> Ooh, wow. Tell me a little bit more about that. >> Yeah, so 3D scan is one of our products, and it allows for us to take pictures of anything in the physical world and make sure that we can put it into a digital form. So we can create digital twins into digital form, which is very cool. >> Very cool. So we're talking any type of physical object. >> Mm hm. So if you want to inspect a building, or any crumbling infrastructure, a lot of the times right now we use helicopters, or big snooper trucks, or just things that could be expensive or potentially dangerous. Instead, we can use a drone. So this is just one example of how drones can be used to help save lives, potentially. >> Tremendous amount of opportunity that drones provide. It's very exciting. What are some of the things that you're looking forward to this year? We are very early in calendar year 2022, but what are you excited about as the year progresses? >> Hmm. What am I excited about? I think there's a lot of really interesting drone-related companies, and also a lot of robotics companies in general, a lot of startups, and there's a lot of excitement there. And I think as the robotics community grows and grows, we'll be seeing more robots in real life. 
And I think that's just extremely exciting to me. >> It is. And you're at the forefront of that. Maggie, it's great to have you on the program. Thank you for sharing what you're doing at Skydio, your history, your past, and what you're going to be encouraging the audience to be able to go and achieve. We appreciate your time. >> Thanks so much. >> All right. From Maggie Wang. I'm Lisa Martin. You're watching theCUBE's coverage of Women in Data Science Worldwide Conference, WiDS 2022. Stick around. I'll be right back with my next guest. (upbeat music)
Tina Hernandez-Boussard, Stanford University | WiDS 2022
(upbeat music) >> Hey, everyone, welcome to theCUBE's live coverage of the Women in Data Science Worldwide Conference 2022. I'm your host, Lisa Martin, coming to you live from Stanford University. I'm pleased to welcome, fresh from the main stage, Tina Hernandez-Boussard, Associate Professor of Medicine here at Stanford. Tina, it's great to have you on the program. >> Thank you so much for this opportunity. I love being here, and I've been coming to WiDS for many years, so it's exciting to be part of this and participate. >> It is exciting; it's one of my favorite events, as I was telling you before we went live. And if you think about it, they only started back in 2015. It was a one-day technical conference, and now it's this worldwide- >> It's amazing. >> Phenomenon in 60 countries, and 200 different local events. Talk to me about, I caught part of the panel that you were on this morning. And one of the things that I loved that you were talking about was mentors and sponsors. Talk to me about what you were discussing on the panel overall, and who some of your mentors were as you came up in your career. >> Yeah, so mentorship is so important, and it really makes a difference in people's careers. So I come from a first-generation family. No one in my family has had any higher education. So having a mentor, an academic mentor, just made all the difference in the world for me. So I started undergraduate and I was immediately paired with somebody, a mentor, because I was first generation. And this person, he's no longer with us today, but he believed in me and he opened doors for me. And he opened my eyes to all of these different opportunities. And having somebody who believes in you and really can help you pursue these other ideas, it's so important. And so on the panel, we talked about the importance of having a mentor, but we also talked about the importance of being a mentor. And, you know, helping people and students coming into the field find their place and develop the confidence that this is for everybody. There's something for everybody here. And you've got to try, you've got to put your name out there. And having support is really important. >> Oh, it's critical. Even in some interviews I was doing last week for International Women's Day, which is tomorrow, I was surprised at the number of women I talked to who said, well, I was told no, no, you can't study computer science. No, you can't study physics, and talking- >> This is a really difficult field, are you sure you want to pursue this? Well, yeah. You know, yeah. >> So having those mentors and that encouragement to help build that confidence from within is a game changer. >> Absolutely. Absolutely. >> Tell me a little bit about your research. >> Yeah, so we use electronic health data, all types of different health data, to really define, predict, and prognose healthcare outcomes. We develop AI algorithms and tools to analyze the data, and we really try to, one, incorporate the patient's voice in the tools we develop, and two, get those back to the point of care. So I think a lot of emphasis has been on model development and model performance, and we really focus on, okay, that's great, but it's only useful if we can get it back into the hands of the clinician, back to the patients, to really improve health outcomes. And so that's a big piece of what we do.
And as part of that, understanding patient values and patient preferences is really a big, important aspect of developing optimal treatments and optimal models. >> That's good, involving them in the process. How can data science promote health equity? First of all, what is health equity, and how can data science help drive it? >> So health equity is really an important topic, and there are a lot of different definitions of what health equity is. With health equity, what we want is equal outcomes, and that's not the same as equal resources. And so a lot of times there's this contention about, you know, are we trying to have equal resources for every patient, or is it equal outcomes? And so we really focus on equal outcomes. We want everyone to have the opportunity to have the same outcomes. So health equity requires that we really think about different populations and their different needs, different preferences, et cetera. So that's what we focus on. And so, to your question about how data science can promote health equity, one of the things we've been working on is really thinking about the gold standard of clinical care, which is the clinical trial, right? So a clinical trial is our gold standard for what treatments work, in what situation, and for what populations. However, a lot of clinical trials are developed in a non-representative population, right? So we have to have patients who can come into the care setting multiple times during a period, and they can't have particular diseases. For example, in one particular trial we were looking at, they had to have a specific BMI, they couldn't have diabetes, they couldn't have all of these other healthcare conditions, which at the end of the day doesn't really represent the community at risk. And so when we develop these models using clinical trial data, a lot of times it's not generalizable to routine care. And so AI can really help there. AI can help us understand how we can better identify patients to include in trials, and which patients are going to be more likely to complete the trials. And so there's a lot of opportunity there to think about how we diversify clinical trials, and also how we can start simulating and doing pragmatic clinical trials if we don't have enough data to represent certain populations. >> Right, one of the exciting things about data science is all the things that it's informing. There are also pros and cons. >> Absolutely, right. >> But when we talk about AI, we always talk about ethics. How is it being used in healthcare? How are you seeing it being used ethically and effectively in healthcare, to really turn the tables on some of those biases... >> Exactly. >> That, to your point, weren't representing some of the most vulnerable parts of the community. >> Right. And I think we've been taking this holistic approach to the AI life cycle. So not just focusing on the data we capture, but the whole life cycle. So what does that mean? That means, you know, where's the data coming from, who's capturing it, and in what setting, right. If we're only looking at the healthcare setting, well, we're missing another large population. If it's only collected via a mobile device, there's another population we're missing. So thinking about where the data's coming from, and then thinking about who it represents and who's missing from that. The next step is really thinking about the questions we're asking. I'll give you a good example. We can ask, you know, can I use AI to predict a no-show appointment?
Or can I use AI to identify barriers for this patient to access care? So really even thinking about how we can flip that question to make it more equitable, make it more diverse is really important. And then there's been a lot of work in model development and algorithm fairness, and so there's a lot of research on that. But then there's another piece that we don't really see a lot on, and that's model deployment. So what are the biases when you introduce this into the healthcare system? The clinicians, how do they understand the data we're giving them, the tools. How do they use that to make clinical decisions? And then also, what systems can actually deploy these AI algorithms because they're very resource intensive. So we think about the AI and equity along all of those aspects of the AI life cycle. And it really helps us get a more holistic view because each of these components intersects. >> They do, you're right. Tell me, I'm curious a little bit about your background. You are associate professor of medicine here at Stanford. Give the audience an overview of the path that you took to get where you are. >> Yeah, so not a straight path, which is often typical that we're hearing today. So I started getting a Master's in Epidemiology and Public Health. And from there, I was like, you know, I wasn't really sure what I wanted to do. I applied to medical school, but I'm like, I'm not sure that's what I want to do. So I went and got a PhD in Computational Biology. I'm always data savvy. And so thinking about how I could use data and I was interested in healthcare. And so I got my PhD in Computational Biology. And from there, I was thinking about, well, I was really interested in the application of data science to the healthcare field. So then I got a Master's in Health Services Research. So it's the combination of all these different degrees that make me really have, I think, a diverse view. I really understand the need for multidisciplinary teams and how we need opinions and viewpoints from so many different disciplines to really create something that's equitable and fair and something that is feasible and usable. >> Thought diversity is so important. >> Oh, it is. >> In every aspect of life, whether we're talking about business life, personal life and without it, there's bias. >> Absolutely. Absolutely. And so we see this, we'll have a clinician maybe come to us with a question and then we'll have, you know the health economists think about it. We'll have the other people think about it. And we kind of work it and we massage it to get to something that's meaningful and something that we can really use that's going to change care for patients or particular patients. And so it really important to have that diversity. >> Absolutely, talk to me about your team that you're working with. >> Yeah, so my team is very diverse and I'm very proud of that. We have diversity across every aspect. So we have racial-ethnic diversity. We're probably about 80% female on my team. And so interestingly, one of my members was like, wow, I didn't realize I'm such a minority on your team. It was a male. And so I'm like, and I'm very proud of that. But we also have very diverse disciplines. So we have a lot of medical students, medical faculty we have computer scientists, engineers, epidemiologists, health policy experts. And so it's very, very diverse. And what I like to do is I like to pair people up in teams. So I might put a health economist with a computer scientist and watch them go. 
And it's just amazing how they can learn from each other and the directions they go in. It's just, it's really incredible. >> Well, and the opportunities that that interdisciplinary relationship builds I mean, opportunities and possibilities must be endless. >> Yeah, and it also allows students to understand how to speak to different groups because we don't speak the same language, we really don't. And equity is a good example. So equity to me might have a certain meaning, but equity to the health policy expert might have a different meaning. And so even understanding how we speak to other groups is so important and being able to translate something in a simple language that other people get is really key. >> Absolutely. >> Yeah, now here we are, tomorrow was International Women's Day- >> Exciting. >> It is exciting and Women's History Month, we get a whole month to ourselves, which is fantastic. But one of the things, you know, when we look at the at the data, the workforce 50- 50 males to females, but the STEM positions are still so low, right? Below 25%. Are you seeing, obviously WIDS is a positive step in that direction to start shifting that. But what do you tell the younger set in terms of- >> Yeah, it is a challenge. It is a challenge to really, and this is the example I always give. As a woman, we've all walked into these rooms that are all male. >> Lisa: Oh, yes. >> We've all walked into these rooms where you're sitting at the table. Oh, can you take notes? And it's hard, it's really hard. But you know what, it takes courage. So again, that mentorship being able to speak up, being able to set your place at that table I think is really important. And we're doing better. We're doing better. But it really is through consistent mentorship, consistent confidence building, et cetera. >> It is. >> Yeah. Yeah. >> And this, this event is fantastic for that. It's going to reach about 100,000 people annually. >> Amazing. >> Yeah. Men and women of all, of all ages, of all different career backgrounds, which is fantastic. But the International Women's Day theme tomorrow, is "Breaking the Bias." Hashtag breaking the bias. Where do you think we are on that? >> I think we have a lot of ways to go. And so there's bias in our teams, there's bias in the way we think, there's bias in our data. And there's been a lot of publicity and hype about the bias in the data. And it is so true and certainly in healthcare systems. And, you know, it's important to understand that when we're developing AI and all these machine learning or data-driven models, they learn from the data we give it. So if we're giving it a biased data sample, or an unrepresented data sample, it's going to learnt those characteristics. And so I think it's important that we think about, you know how do we do a better job at capturing data, diversity, voices from different populations? And it's not just using the same tools and technology we have today and going to another community and saying, okay, here's what I have. You know, it's not working that way. So I think we need to think outside of the box. To think how do these people want to communicate with us? How do they want to share our data? It's about trust too, because trust is a big issue with that. So I think there's a lot of opportunities there to just further develop that. Do you think there really is going to be such a thing one of these days of an a non-biased unbiased set of data? >> I don't think so. 
I don't think so, because the more we dig, the more biases we find, and while we're making great strides in race and ethnicity diversity in our data sets, there are other biases. You know, male and female, age biases, disease biases, et cetera. So the more we dig into this, the more we identify. But it's great, because when we find these gaps in our data, we take steps to address that and to mitigate those biases. So we're moving in the right direction, for sure; putting the spotlight on it and being transparent about it, I think, is key to moving forward. >> I agree that transparency is critical. >> Yes, absolutely. >> And, you know, we often say she can't be what she can't see. Right, and so from a transparency perspective in data, and also in the visibility of the leaders and the mentors and the sponsors, that transparency is table stakes. >> Absolutely. Absolutely. >> What are some of the things that you're looking forward to as we hopefully move out of the pandemic? I can imagine, with the Master's in Public Health you have, an MPH, yes, your perspective must be so interesting, living through it during the pandemic. >> It is, and it's interesting, because we've gone through the pandemic and now it's turning into this endemic, right? And so how do we deal with that? And one of the things I think is really important is that we find ways to still meet and collaborate face to face and share ideas. This conference is amazing, where we can share ideas, we can meet new people, we can learn new perspectives, and being able to continue to do that is so important. I think that during the pandemic, we really took a big hit in the transfer of learning in our labs and in our teams. And now it's funny, because my team, they're like, let's go to lunch, let's do a happy hour. You know, they just want that social interaction. And it's more to better understand the perspectives of where they're coming from with their questions, to better understand the skills they bring to the table. But it's just this wonderful opportunity to think about how we move forward now in our new world, right? >> Yes. We're getting there slowly, but surely. Well, Tina, thank you for joining me, talking about your role, what you're doing, the importance of mentors and sponsors, and the opportunity for data science in healthcare. We appreciate your insights. >> Absolutely. Thank you for having me. It's my pleasure. >> You're welcome. >> Excellent. >> Thank you. >> For Tina Hernandez-Boussard, I'm Lisa Martin. You're watching theCUBE's live coverage of the Women In Data Science Worldwide Conference 2022. Stick around, my next guest will join me shortly. (gentle music)
Tierra Bills, UCLA | WiDS 2022
>> Welcome everyone to theCUBE's coverage of the Women in Data Science Worldwide Conference 2022. I'm Lisa Martin, coming to you live from Stanford University at the Arrillaga Alumni Center. It's great to be back at WiDS in person, and I'm pleased to welcome, fresh from the main stage, Tierra Bills, assistant professor at UCLA. Tierra, welcome to the program. >> I'm glad to be here. Thank you for having me. >> Tell me a little bit about your background. You're a civil engineer, and I was telling you, so was my dad, so I'm partial to civil engineers. But give our audience an overview of your background, what you studied, and all that. >> Yeah. So I'm a civil engineer, specifically a transportation engineer, at UCLA. I also have an appointment in the public policy department, so I'm split between the two. My work focuses on travel demand modeling and how to use these tools to better inform and learn more about transportation equity, and how to advance transportation equity. And what that means is that we are prioritizing the needs of vulnerable communities, in terms of the data that we're using, the models that we're using to guide decision-making, in terms of the very projects that we evaluate, and ultimately the decisions that we make to invest in certain transportation improvements. >> How did you get interested in transportation equity? >> Yeah, so I think it stems from growing up in Detroit. I'm a Detroit native, born and raised, and it stems from growing up in an environment where it was very clear that space matters, that where you live, the modes that you have access to, whether you have a car or not, whether you have flexibility in your travel, it all matters. And it all governs the opportunities that you have access to. So it was very clear to me when I would realize that certain kids didn't really leave their neighborhood, you know, they didn't travel about the city, let alone outside of the city and abroad. And there are also other examples, case after case, where it's clear that communities are being exposed to a high level of emissions, for example, that might result from transportation, but they're not positioned to benefit in the same ways as the people who own the infrastructure or the freight, or what have you. So these are all very real experiences that have motivated my interest in transportation equity.
And then they also, once they have an understanding of what the need is, because for example, they expect traffic congestion to improve, or sorry to increase over time. Um, there needs to be a means of assessing alternatives for mitigating those issues. And so they use the same types of models to understand if we expand highway capacity, if we, uh, build a new form of transit, is that going to mitigate, uh, the challenges that we're going to face in the future >>And travel demand, modeling and equity? What's the connection there? I imagine there's a pretty good >>Deep connection, right? So the connection is that. So we're using these tools to decide on the future of transportation investments and because of a history of understanding that we have around how ignoring the conditions for vulnerable communities, ignoring how, um, uh, transportation decisions might differentially impact different, different groups, different segments. Um, if we ignore that, then it can lead to devastating outcomes. And so I'm citing, um, examples of the construction of the Eisenhower interstate system back in the fifties and sixties, where, uh, we know today that there were millions of black and minority communities that were, uh, displace. Um, they weren't fairly compensated all because of lack of consideration for, for outcomes to these communities and the planning process. And so we are aware that these kinds of things can happen. Um, and because of that, we now have federal regulations that require, uh, equity analysis to occur for any project that's going to leverage federal funding. And so it's, it's tied to our understanding of what can happen when we don't focus on equity is also tied to what the current regulations are, but challenge is that we need better guidance on how to do this, how to perform the equity analysis. What types of improvements are actually going to move the needle and advance us toward a state where we can prioritize the needs of the vulnerable travelers and residents? What >>Excites you about the work that you're doing? >>You know, I, I have a vested interest in seeing conditions improve for, um, for the underdog, if you will, for folks who, um, they, they work hard, but they still struggle, um, for folks who experience discrimination in different forms. Um, and so I have a vested interest in seeing conditions improve for them. And so I'm really excited about, uh, the time that we're in, I'm excited that equity is now at the height of many discussions, um, because it's opening up resources, right? To have, uh, more folks paying attention, more folks, researching more folks, developing methods and processes that will actually help to advance equity, >>Advancing equity. We definitely need that. And you're right. There's, there's good V visibility on it right now. And let's take advantage of that for the good things that can come out of it. Talk to me a little bit about what you talked about in your talk earlier today here at widths. >>Right? So today I got a chance to elaborate on how travel demand models can end up, um, uh, with, with issues of bias and under-representation, and it's tied to a number of things, but one of them is the data that reusing, because these are, uh, empirically estimated tools. They take their form, they take their, uh, significance. Everything about them is shaped by the data that we use. Um, and at the same time, we are aware that vulnerable communities are more prone to issues that contribute to data bias. 
And under-representation so issues, for example, like non-response, um, issues like coverage bias with means that, um, certain groups are for whatever reason, not in the sample link frame. Um, and so, because we know that these types of errors are more prevalent for vulnerable communities, it brings, uh, it raises questions about, um, the quality of the decisions that come out of these models that we estimate based on these data. >>And so I'm interested in weaving these parts together. Um, and part of it has to do with understanding the conditions that, um, that underlie the data. So what do I mean by conditions? I gave an example of, uh, cases where there is discrimination and as evidenced by the data that we have available as evidenced, uh, for example, by examining, um, the quality of service across racial groups, um, using Uber and Lyft, right? So we have information that, that, that presents this to us, but that information is still outside of what we typically use to estimate travel, demand models. That information is not being used to understand the context under which people are making decisions. It's not being used to better understand the constraints that people are facing when they're making, uh, decisions. And so what is the connection that means that we are using data, um, that does not will capture the target group. >>People who are low income, elderly, um, transit dependent, uh, we're not capturing these groups very well because of the prevalence of, of various types of survey bias. Um, and it is shaping our models in unknown ways. And so my group is really trying to make that connection between, okay, how do we collect Bader, better data, first of all, but second, what does that mean? What are the ramifications for prediction, accuracy for VR, for various groups, and then beyond that, what are the policy implications? Right. Um, I think that the risk is that we might be making wrong decisions, right? We might be assuming that, uh, certain types of improvements are actually going to improve quality of service for vulnerable communities when they actually don't. Right. Um, and so that's the worry and that's part of the unknown, and that's why I'm working in this >>Part of the anonymity. Also, I'm sure part of your passion and your interest international women's day is tomorrow. And the theme this year is break the bias of breaking the bias with >>Mercy back >>To travel equity. Where do you think we are on, on being able to start mitigating some of the biases that you've talked about? >>I think that it's all about phasing. I think that there are things that we can do now, right? And so, um, at the point of making decisions, um, we can view the results that we have through this lens, that it might be an incomplete picture. We can view it through a historical lens. We can also view it, um, using emerging data that allows for us to explore some of these constraints that, you know, might be exogenous to the models or X, not in, not included in how we estimate the models. Um, and so that's one thing that we can do in practice is okay. We already know that there are some challenges let's view this from a different lens, as opposed to assuming that it's giving us the complete picture. Right. 
Um, and that's kind of been my theme, uh, today is that, you know, as decision-makers, as analysts, as data scientists, as researchers, we do have tendency of assuming that the data that we have, the results that we have is giving us the complete picture when we know, but it's not, we know that we act as if it is, but we know that it's not right. >>So, you know, we need to, there's a lot of learning and changing of behaviors, um, that that has to happen. >>Changing behaviors is challenging. >>It is behavior changes is tough, but it's necessary, but it's necessary. It's necessary. And it's urgent. And it's critical, especially if you're going to, uh, improve conditions for vulnerable community. >>What are some of the things that excite you, that looking at where we are now, we've got a nice visibility on equity. There, there's the conscious understanding of the bias and data and the work to help to mitigate that. What are some of the things that excite you about what you're doing and maybe even some of the policies that you think should be enacted as a result of more encompassing datasets? >>It's a good question. Um, one thing I will say is what excites me is it's also tied to the emerging data that we have available. So I'm trying to go back to an example that I gave about measuring constraints. Think that we can now do that in interesting ways, because we're collecting data about everything we're collecting data about, um, not just about where we travel, but how we travel, why we travel. Um, you know, we, we collect information on who we're traveling with, you know, so there's a lot more information that we can make use of, um, in particular to understand constraints. So it's, it's really exciting to me. And when I say that again, um, talking about, um, how would we make a choice to take a certain mode of transportation or to leave our house at a certain time in the morning to, to get to work. >>Um, we're making that under some conditions, right? Right. And those conditions aren't always observed and traditional data sets. I think now we're at a time where emerging data sources can start to capture some of that. And so we can ask questions that we weren't able to, or answer questions that we weren't able to answer before. And the reason why it's important in the modeling is because in the models, you have this sort of choice driven side and you have the alternatives. So you're making a choice amongst some set of alternatives. We model the choices and we spend a lot of time and pay a lot of attention to the decision process. And what factors goes into making the choice, assuming that everyone really has the same set of universal choices. Right. I think that we need to take a little, pay a little more intention, um, to understanding the constraints that people have, um, and how that guides the overall outcomes. Right? So, so that's what I'm excited about. I mean, it's basically leveraging the new data in new ways that we weren't able to before >>Leveraging the data in new ways. Love it. Tierra, thank you for joining me, talking about transportation equity, what you're doing there, the opportunities and kind of where we are on that road. If you will. Thank you so much for having me, my pleasure. I'm Lisa Martin. You're watching the cubes coverage of women in data science conference, 2022. We'll be right back with our next guest.
Alex Hanna, The DAIR Institute | WiDS 2022
(upbeat music) >> Hey everyone. Welcome to theCUBE's coverage of Women in Data Science, 2022. I'm Lisa Martin, excited to be coming to you live from Stanford University at the Arrillaga Alumni Center. I'm pleased to welcome, fresh from the keynote stage, Alex Hanna, the director of research at the DAIR Institute. Alex, it's great to have you on the program. >> Yeah, lovely to be here. >> Talk to me a little bit about yourself. I know your background is in sociology. We were talking before we went live about your hobbies and roller derby, which I love. >> Yes. >> But talk to me a little bit about your background, and what the DAIR Institute, the Distributed AI Research Institute, is and what it's actually doing. >> Sure, absolutely. So happy to be here talking to the women in data science community. So my background's in sociology, but also in computer science and machine learning. So my dissertation work was actually focused on developing some machine learning and natural language processing tools for analyzing protest event data, generating that data, and applying it to pertinent questions within social movement scholarship. After that, I was faculty at the University of Toronto and then a research scientist at Google on the ethical AI team, where I met Dr. Timnit Gebru, who is the founder of DAIR. And so DAIR is a nonprofit research institute oriented around independent, community-based AI work. A lot of discussions around AI are driven by big companies, or by companies focused on solutions that are very much oriented around collecting as much data as they can, not really knowing if it's going to be for community benefit. At DAIR, we want to flip that; we really want to prioritize what it would mean if communities had input into data-driven technologies, what it would mean for those communities, and how we can help there. >> Double click on just some of your research, where do your passions lie? >> So I'm a sociologist, and I think one of the big insights of sociology is to really highlight how society can be more just, how we can interrogate inequality, and understand how to reduce the distances between people who are underserved and people who are over-served, who already have quite a lot, how we can reduce the disparities. So finding out where that lies, especially in technology, that's really what I'm passionate about. So it's not just technology, which I think can be helpful, but it's really understanding what it means to reduce those gaps and make the world more just. >> And that's so important. I mean, as more and more data is generated, exponentially growing, so are some of the biases and the challenges that that causes. You just gave your tech vision talk, which I had a chance to see most of. And you were talking about something that's very interesting, and that is the biases in facial recognition software. Maybe expand a little bit on what you talked about and why that is such a challenge. And also, what are some of the steps being made in the right direction where that's concerned? >> Yeah. So the work I was talking about in the talk was highlighting, not work I've done, but the work by doctors (indistinct) and (indistinct), focusing on the disparities and the biases that exist in facial recognition as a technical system. The fact remains also that facial recognition is used and is disproportionately deployed on marginalized populations. So in the U.S., that means black and brown communities. That's where facial recognition is used disproportionately.
And we also see this in refugee contexts, where refugees will be leaving their country, and facial recognition software will be used in those contexts, surveilling them. So these are people already in a really precarious place. And so, some of the movements there have been to debias some of the facial recognition tools. I actually don't think that goes far enough. I'm fundamentally against facial recognition. I think that it shouldn't be used as a technology, because it is used so pervasively in surveillance and policing. And if we're going to approach that, we really need to rethink our models of security, models of immigration and whatnot. >> Right, it's such an important topic to discuss, because I think it needs more awareness about some of the biases, but also, to your point, about some of those vulnerable communities that are really potentially being harmed by technologies like that. We have to be, there's a fine line. Or maybe it's not so fine. >> I don't think it's that fine. So like, I think it's used in an incredibly harsh way. And for instance, there's research being done in which, so I'm a transgender woman, and there's research being done by researchers who collected data sets that people had on YouTube documenting their transitions. And already there was a researcher collecting those data and saying, well, we could have terrorists or something take hormones and cross borders. And you talk to any trans person, you're like, well, that's not how it works, first off. Second off, it's already viewing trans people and a trans body as kind of a mode of deception. And so, whereas researchers in this space were collecting those data and saying, well, we should collect these data to help make these facial recognition systems more fair, that's not fair if it's going to be used on a population that's already intensely surveilled and held in suspicion. >> Right. The question of fairness is huge, absolutely. Were you always interested in tech? You talked about your background in sociology. Was it something that you always, were you a STEM kid from the time you were little? Talk to me about your background and how you got to where you are now. >> Yeah. I've been using computers since I was four. I was taking apart my parents' Gateway computer, yeah, when I was 10. Going to computer shows, slapping hard drives into things, seeing how much we could upgrade a computer on our own, and ruining more than one computer, to my parents' chagrin, but I've always been that way. I went to undergrad and triple majored in computer science, math and sociology, originally just in computer science, and then added the other two as I got interested in things, and was really interested in this intersection of tech and society. And I think the more and more I sat within the field, and went and did my graduate work in sociology and other social sciences, I really found that there was a place to interrogate that intersection of the two. >> Exactly. What are some of the things that excite you now about where technology is going? What are some of the positives that you see? >> I talk so much about the negatives. It's really hard to, I mean, I think some of the things that are positive are really the community-driven initiatives that are saying, well, what can we do to remake this in such a way that is going to be more positive for our community?
And so seeing projects that try to do community control over certain kinds of AI models, or really try to tie together different kinds of fields, I mean, that's exciting. And I think right now we're seeing a lot of people that are super politically and justice literate, and they know how to work and they know what's behind all these data-driven technologies, and they can really try to flip the script and try to understand what it would mean to kind of turn this into something that empowers us, instead of being something that is really becoming centralized in a few companies. >> Right. We need to be empowered with that, for sure. How did you get involved with WiDS? >> So Margo, one of the co-directors, we sit on a board together, the Human Rights Data Analysis Group, and I've been a huge fan of HRDAG for a really long time, because HRDAG is probably one of the first projects I've seen that's really focused on using data for accountability, for justice. Their methodology has been called on to hold perpetrators of genocide to account, to hold perpetrators of state violence to account. And I always thought that was really admirable. And so being on their board is sort of, kind of a dream. Not that they're actually coming to me for advice. So I met Margo and she said, come on down and let's do a thing for WiDS, and I happily obliged. >> Is this your first WiDS? >> This is my very first WiDS. >> Oh, excellent. >> Yeah. >> What's your interpretation so far? >> I'm having a great time. I'm learning a lot, meeting a lot of great people, and I think it's great to bring folks from all levels here. Not only people who are super senior, who aren't going to get the most out of it; it's going to be the high school students, the undergrads, grad students, folks who, and you're never too old to be mentored, so, finding your own mentors too. >> You know, it's so great to see the young faces here and the mature faces as well. But one of the things that I caught in the panel this morning was the talk about mentors versus sponsors. And that's actually, I didn't know the difference until a few years ago at another women in tech event. And I thought it was such great advice for those panelists to be talking to the audience about the importance of mentors, but also the difference between a mentor and a sponsor. Who are some of your mentors? >> Yeah, I mean, great question. It's going to sound cheesy, but my boss (indistinct), I mean, she's been a huge mentor for me, and with her and another mentor, (indistinct) Mitchell, I wouldn't have been a research scientist. I was the first social scientist on the research scientist ladder at Google before I left, and if it wasn't for them, they did sponsor me, but then they also mentored me greatly. My PhD advisor, (indistinct), a huge mentor. And I mean, lots of, primarily, and then peer mentors, people that are kind of at the same stage as me academically but also professionally, but are mentors. So folks like Anna Lauren Hoffman, who's at the UDub, she's a great inspiration in collaborating, a co-conspirator, so yeah. >> Co-conspirator, I like that. I'm sure you have quite a few mentees as well. Talk to me a little bit about that and what excites you about being a mentor. >> Yeah. I have a lot of mentees, either informally or formally. And I sought that out purposefully. I think one of the speakers this morning on the panel was saying, if you can mentor, do it.
And that's what I did, I sought that out. I mean, it excites me because, folks, I don't have all the answers, no one person does. You only get to those places if you have a large community. And I think being smart is often something that people think comes, like, there's kind of a smart gene or whatever, but, like, there probably is, I'm not a biologist or a cognitive scientist or anything, but what really takes cultivation is being kind and really advocating for other people and building solidarity. And so that's what mentorship really means to me, is building that solidarity and really trying to lift other people up. I mean, I'm only here, and where I'm at in my career, because many people were mentors and sponsors to me, and it's only right to pay that forward. >> I love that, paying that forward. That's so true. There's nothing like a good community, right? I mean, there's so much opportunity that that groundswell just generates, which is what I love. Tomorrow is International Women's Day. And if we look at the numbers, women are 50% of the workforce, but fill less than a quarter of STEM positions. What's your advice and recommendation for those young girls who might be intimidated, or might be being told even to this day, no, you can't do physics, you can't do computer science. What can you tell them? >> Yeah, I mean, individual solutions to that are putting a bandaid on a very big wound. And I mean, I think finding other people and working to change it, I think building structures of solidarity and care are really the only way we'll get out of that. >> I agree. Well, Alex, it's been great to have you on the program. Thank you for coming and sharing what you're doing at DAIR. The intersection of sociology and technology was fascinating, and your roller derby, we'll have to talk more about that. >> For sure. >> Excellent. >> Thanks for joining me. >> Yeah, thank you, Lisa. >> For Alex Hanna, I'm Lisa Martin. You're watching theCUBE's live coverage of the Women in Data Science Worldwide Conference, 2022. Stick around, my next guest is coming right up. (upbeat music)
Vidya Setlur, Tableau | WiDS 2022
(bright music) >> Hi, everyone. Welcome to theCUBE's coverage of WiDS 2022. I'm Lisa Martin, very happy to be covering this conference. I've got Vidya Setlur here with me, the director of Tableau Research. Vidya, welcome to the program. >> Thanks, Lisa. It's great to be here. >> So this is one of my favorite events. You're a keynote speaker this year. You're going to be talking about what makes intelligent visual analytics tools really intelligent. Talk to me a little bit about some of the key takeaways that the audience is going to glean from your conversation. >> Yeah, definitely. I think we've reached a point where everybody understands that data is important, and trying to understand that data is equally important. And we're also getting to that point where technology and AI is really picking up. Algorithms are getting better, computers are getting faster. And so there's a lot of dialogue and conversation around how AI can help with visual analysis to make our jobs easier, help us glean insights. So I thought it was a really timely point where we can actually talk about it, and distill into the specifics of how these tools can actually be intelligent, beyond just the general buzz of AI. >> And that's a great point that you bring up. There's been a lot of buzz around AI for a long time. Organizations talk about it, software vendors talk about it being integrated into their technologies, but how can AI really help to make visual analytics interpretable in a way that makes sense for the data enthusiast and the business? >> Yeah, so to me, my point of view, which tends to be the general agreement among the research community, is that AI is getting better. And there are certain types of algorithms, especially for these repetitive tasks. We see this with even Instagram, right? You put a picture on Instagram, there are filters that can maybe make the image look better, some fun backgrounds. And those, generally speaking, are AI algorithms at work. So there are these simple, either fun ways or tasks that reduce friction where AI can play a role, and they tend to be really good with these repetitive tasks, right? If I had to upload a picture and constantly edit the background manually, that's a pain. So AI algorithms are really good at figuring out where people tend to do a particular task often, and that's a good place for these algorithms to come into play. But that being said, I think fundamentally speaking, there are going to be tasks where AI can't simply replace a human. Humans have a really strong visual system. We have a very highly cognitive system where we can glean insights and takeaways beyond just the pixels, or just the text. And so how do we actually design systems where algorithms augment a human, where a human can stay in the driver's seat, stay creative, but defer all these mundane or repetitive tasks that simply add friction to the computer? And that's what the keynote is about. >> And talk to me about, when you're talking with organizations, where are they in terms of appetite to understand the benefits that natural language processing, AI and humans together can have on visual analytics, and being able to interpret that data? >> Yeah. So I would say it's really moving fast. So three years ago, organizations were like, AI, it's a great buzzword, we're wary, because when the rubber hits the road, it's really hard to take that into action. But now we're slowly seeing places where it can actually work.
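One way to picture the division of labor Setlur describes, the computer absorbing a repetitive choice while the analyst stays in the driver's seat, is a default-chart heuristic. The sketch below is purely hypothetical, it is not Tableau's logic or any product's actual behavior: it suggests a chart type from the column types and leaves the final decision to the person.

```python
# A hypothetical sketch of "automate the repetitive step, keep the human in charge":
# suggest a default chart type from column types, as a suggestion the analyst can override.
# This is illustrative only; it is not how Tableau or any specific tool decides.
import pandas as pd
from pandas.api.types import is_datetime64_any_dtype, is_numeric_dtype

def suggest_chart(df: pd.DataFrame, x: str, y: str) -> str:
    """Return a default chart suggestion for plotting column y against column x."""
    if is_datetime64_any_dtype(df[x]) and is_numeric_dtype(df[y]):
        return "line chart"    # a measure over time
    if is_numeric_dtype(df[x]) and is_numeric_dtype(df[y]):
        return "scatter plot"  # two measures against each other
    if is_numeric_dtype(df[y]):
        return "bar chart"     # a measure broken down by a category
    return "table"             # fall back to showing the raw rows

sales = pd.DataFrame({
    "month": pd.to_datetime(["2022-01-01", "2022-02-01", "2022-03-01"]),
    "region": ["West", "East", "West"],
    "revenue": [120.0, 95.5, 134.2],
})

print(suggest_chart(sales, "month", "revenue"))   # -> line chart (a default, not a mandate)
print(suggest_chart(sales, "region", "revenue"))  # -> bar chart; the analyst can still override
```

The friction the algorithm removes is the repeated, predictable choice; the judgment about whether the default actually serves the question stays with the human.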
So organizations are really thirsty to figure out, how do we actually add customer value? How do we actually build products where AI can move from a simple, cute proof of concept working in a lab to actual production? And that is where organizations are right now. And we've already seen that with various types of examples, like machine translation. You open up a Google page in Spanish, and you can hit auto translate and it will convert it into English. Now, is it perfect? No, but is it good enough? Yes. And I think that's where AI algorithms are heading, and organizations are really trying to figure out what's in it for us, and what's in it for our customers. >> What are some of the cultural, anytime we talk about AI, we always talk about ethics. But what are some of the cultural, or the language-specific challenges with respect to natural language techniques that organizations need to be aware of? >> Yeah, that's a great question, and it's a common question, and really important. So as I've said, these AI algorithms are only as good as the data that they're trained on. And so it's really important, in addition to incorporating those cultural aspects into the techniques, to really figure out what sorts of biases come into play, right? So a simple example is there's sarcasm in language, and different cultures have different ways of interpreting it. There are subtleties in language, jokes. My kids have a certain type of language when they're talking with each other that I may not understand. So there's a whole complexity around culture and generations, where language constantly evolves, as well as biases. For example, we've had conversations in the news where AI algorithms are trained on a particular data set for detecting crime. And there are hidden biases that come into play with that sort of data. So it's really important to acknowledge where the data is from, and what sorts of cultural biases come into play. But translation, simple language translation, is already more or less a solved problem. Beyond simple language translation, though, we also have to account for language subtleties as well. >> Right, and the subtleties can be very dramatic. When you're talking with organizations that are really looking to become data driven, everybody talks about being data driven, and we hear it on the news all the time, it's mainstream. But what that actually means, and how an organization actually delivers on that, are two different things. When you're talking with customers that are, okay, we've got to talk about ethics. We know that there's biases in data. How do you help them get around that so that they can actually adopt that technology, and make it useful and impactful to the business? >> Yeah. So just as important as figuring out how AI algorithms can help an organization's business, it's equally important for an organization to be more data literate about the data that feeds into these algorithms. So making data a first class citizen, and figuring out, are there hidden biases? Is the data comprehensive enough? Acknowledging where there are limitations in the data and being completely transparent about that, and sharing that with customers, I think, is really key. And coming back to humans being in the driver's seat, if these experiences are designed where humans are, in fact, in the driver's seat, then as a human, they can intervene and correct and repair the system if they do see certain types of oddities that come into play with these algorithms.
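Setlur's "good enough" translation point is easy to try with an off-the-shelf model. The sketch below assumes the Hugging Face transformers library and the publicly available Helsinki-NLP/opus-mt-es-en Spanish-to-English checkpoint; neither is mentioned in the interview, they are just convenient stand-ins. Plain text tends to survive the trip; idioms and sarcasm, the subtleties she warns about, often do not.

```python
# A sketch of "not perfect, but good enough" machine translation. It assumes the
# Hugging Face transformers library and the Helsinki-NLP/opus-mt-es-en checkpoint,
# neither of which is named in the interview; they are stand-ins for illustration.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-es-en")

sentences = [
    "El informe trimestral está listo para su revisión.",  # plain, literal business text
    "Me costó un ojo de la cara.",                         # idiom: roughly "it cost me an arm and a leg"
]

for text in sentences:
    translation = translator(text, max_length=60)[0]["translation_text"]
    print(f"{text} -> {translation}")

# The literal sentence usually comes out cleanly; the idiom may be rendered
# word for word ("an eye of the face"), which is the kind of cultural and
# linguistic subtlety the interview says these systems still miss.
```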
>> I'm going to ask you, in our final few minutes here, I know that you have a PhD in computer graphics from Northwestern, is it? >> Yep. >> Northwestern. >> Go Wildcats, yep. >> Were you always interested in STEM and data? Talk to me a little bit about your background. >> Yeah. I grew up in a family full of academics, and female academics. And now, yes, I have boys, including my dog. Everybody's male, but I have a really strong vested interest in supporting women in STEM. And I actually would go further and say STEAM. I think arts and science are both equally important. In fact, I would say that on our research team, there's a good representation of minorities and women. And data analysis and visual analysis, in particular, is a field that is very conducive for women, because women tend to be naturally meticulous. They're very good at distilling what they're seeing. So I would argue that there are a host of disciplines in this space that make it equally exciting and conducive for women to jump in. >> I'm glad that you said that. That's actually quite exciting, and that's a real positive thing that's going on in the industry, and what you're seeing. So I'm looking forward to your keynote, and I'm sure the audience is as well. Vidya, it was a pleasure to have you on the program, talking about intelligent visual analytics tools and the opportunities that they bring to organizations. Thanks for your time. >> Thanks, Lisa. >> For Vidya Setlur, I'm Lisa Martin. You're watching theCUBE's coverage of WiDS conference 2022. Stick around, more great content coming up next. (bright music)
Cecilia Aragon, University of Washington | WiDS Worldwide Conference 2022
>>Hey, everyone. Welcome to theCUBE's coverage of women in data science, 2022. I'm Lisa Martin. And I'm here with one of the key featured keynotes for this year's event, Cecilia Aragon, professor in the department of Human Centered Design and Engineering at the University of Washington. Cecilia, it's a pleasure to have you on theCUBE. >>Thank you so much, Lisa, it's a pleasure to be here as well. >>You've got an amazing background that I want to share with the audience. You are a professor, you are a data scientist, an aerobatic pilot, and an author with expertise in human centered data science, visual analytics, aviation safety, and analysis of extremely large and complex data sets. That's quite the background. >>Well, thank you so much. It's all very interesting and fun. >>And as a professor, you study how people make sense of vast data sets, including a combination of computer science and art, which I love. And as an author, you write about interesting things. You write about how to overcome fear, which is something that everybody can benefit from, and how to expand your life until it becomes amazing. I need to take a page out of your book. You were also honored by President Obama a few years back. My goodness. >>Thank you so much. Yes, I've had quite a journey to come here, but I feel really fortunate to be here today. >>Talk about that journey. I'd love to understand if you were always interested in STEM, if it was something that you got into later. I know that you are the co-founder of Latinas in Computing, a passionate advocate for girls and women in STEM. Were you always interested in STEM, or was it something that you got into in a kind of a non-linear path? >>I was always interested in it when I was a young girl. I grew up in a small Midwestern town, and my parents are both immigrants, and I was one of the few Latinas in a mostly white community. And I loved math, but I also wanted to be an astronaut. And I remember, when we were asked, I think it was in second grade, what would you like to be when you grow up? I said, oh, I want to be an astronaut. And my teacher said, oh, you can't do that. You're a girl, pick something else. And um, so I picked math and she was like, okay. >>Um, so I always wanted to, well, maybe it would be better to say I never really quite lost my love of being up in the air and potentially space. But I ended up working in math and science, and I loved it, because one of the great advantages of math is that it's kind of like a magic trick for young people, especially if you're a girl or if you are from an underrepresented group, because if you get the answers right on a math test, no one can mark you wrong. It doesn't matter what the color of your skin is or what your gender is. Math is powerful that way. And I will say there's nothing like standing in front of a room of people who think little of you and silencing them with your love of numbers. >>I love that. I never thought about math as power before, but it clearly is. But also, you know, I wish we had more time, because I would love to get into how you overcame that fear. And you write books about that. But being told you can't be an astronaut, you're a girl, and maybe laughing at you because you liked math, how did you overcome that? And say, nevermind, I'm doing it anyway. >>Well, that's a, it's a, okay. The short answer is I had incredible imposter syndrome.
I didn't believe that I was smart enough to get a PhD in math and computer science. But what enabled me to do that was becoming a pilot. I learned how to fly small airplanes. I learned how to fly them upside down and pointing straight at the ground. And I know this might sound kind of extreme, so this is not what I recommend to everybody. But if you are brought up in a way where everybody thinks little of you, one of the best things you can possibly do is take on a challenge that's scary. I was afraid of everything, but by learning to fly, and especially learning to fly loops and rolls, it gave me the confidence to do everything else, because I thought, I pointed the airplane at the ground at 250 miles an hour, and, wait, why am I afraid to get a PhD in computer science? >>Wow. How empowering is that? >>Yeah, it really was. So that's really how I overcame the fear. And I will say that, you know, I encountered situations getting my PhD in computer science where I didn't believe that I was good enough to finish the degree. I didn't believe that I was smart enough. And what I've learned later on is that was just my own emotional, you know, residue from my childhood, and from people telling me that I couldn't achieve. >>And look what you've achieved so far. It's amazing. And we're going to be talking about some of the books that you've written, but I want to get into data science and AI and get your thoughts on this. Why is it necessary to think about human issues in data science, and what are your thoughts there? >>So there's been a lot of work in data science recently looking at societal impacts. And if you just address data science as a purely technical field, and you don't think about unintended consequences, you can end up with tremendous injustices and societal harms, and harms to individuals. And I think any of us who has dealt with an inflexible algorithm, even if you just call up, you know, customer service and you get told, press five for this, press four for that, and you say, well, I don't fit into any of those categories, or have the system hang up on you after an hour, I think you'll understand that any type of algorithmic approach, especially on very large data sets, has the risk of impacting people, particularly from low income or marginalized groups, but really any of us can be impacted in a negative way. >>And so, as a developer of algorithms that work over very large data sets, I've always found it really important to consider the humans on the other end of the algorithm. And that's why I believe that all data science is truly human centered, or should be human centered, and also involves both technical issues as well as social issues. Absolutely correct. So one example is that, um, many of us who started working in data science, including, I have to admit, me when I started out, assumed that data is unbiased. It's scrubbed of human influence. It is pure in some way. However, that's really not true, as I've learned working with datasets. And this is generally known in the field, that data sets are touched by humans everywhere. As a matter of fact, in the recent book that we're coming out with, Human Centered Data Science, we talk about five important points where humans touch data, no matter how scrubbed of human influence it's supposed to be. >>Um, so the first one is discovery. So when a human encounters a data set and starts to use it, it's a human decision.
And then there's capture, which is the process of searching for a data set. So any data set has to be selected and chosen by an individual. Um, then once that data set is brought in, there's curation: a human will have to select various data sets. They'll have to decide what is the proper set to use, and they'll be making judgments on this all the time. And perhaps one of the most important ways the data is changed and touched by humans is what we call the design of data. And what that means is, whenever you bring in a data set, you have to categorize it. So, for example, let's suppose you are, um, a geologist and you are classifying soil data. >>Well, you don't just take whatever the description of the soil data is. You actually may put it into a previously established taxonomy, and you're making human judgments on that. So even though you think, oh, geology data, that's just rocks, you know, that's soil, it has nothing to do with people, but it really does. Um, and finally, uh, people will label the data that they have. And this is especially critical when humans are making subjective judgments, such as what race is the person in this dataset. And they may judge it based on looking at the individual's skin color. They may try to apply an algorithm to it, but you know what? We all have very different skin colors; categorizing us into race boxes really diminishes us and makes us less than we truly are. So it's very important to realize that humans touch the data, we interpret the data. It is not scrubbed of bias. And when we make algorithmic decisions, even the very fact of having an algorithm that makes a judgment, say, on whether a prisoner is likely to offend again, the judge, just by having an algorithm, even if the algorithm only makes a recommendation, is impacted by that algorithm's recommendation. And that has obviously an impact on that human's life. So we consider all of this. >>So you just gave us five solid reasons why data science and AI are inevitably human centric, or should be. But in the past, what's led to the separation between data science and humans? >>Well, I think a lot of it simply has to do with incorrect mental models. So many of us grew up thinking that, oh, humans have biases, but computers don't. And so if we just take decision-making out of people's hands and put it into the hands of an algorithm, we will have less biased results. However, recent work in the field of data science and artificial intelligence has shown that that's simply not true, that algorithms reinforce human biases. They amplify them. So algorithmic biases can be much worse than human biases and can have greater impact. >>So how do we pull ethics into all of this data science and AI, that ethical component, which seems like it needs to be foundational? >>It absolutely has to be foundational. And this is what we believe, and what we teach at the University of Washington in our data science courses, is that ethical and human centered approaches and ideas have to be brought in at the very beginning of the algorithm. It's not something you slap on at the end, or say, well, I'll wait for the ethicists to weigh in on this. Now, we are all human. We can all make human decisions. We can all think about the unintended consequences of our algorithms as we develop them. And we should do that at the very beginning. And all algorithm designers really need to spend some time thinking about the impact that their algorithm may have. >>Right.
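The "design of data" and labeling steps Aragon walks through are easy to see in code: the moment free-form answers are folded into a fixed taxonomy, someone's choice of boxes determines what the dataset can ever say. The snippet below is a made-up illustration of that single step, not an example from her book.

```python
# A made-up illustration of the "design of data" / labeling step described above:
# collapsing free-form self-descriptions into fixed boxes is a human design choice,
# and every downstream statistic inherits that choice.
import pandas as pd

responses = pd.Series([
    "Black and Filipina", "Mexican American", "white", "Korean", "Black",
])

# One possible coding scheme, chosen by whoever curates the dataset.
taxonomy = {
    "black": "Black",
    "white": "White",
    "korean": "Asian",
    "filipina": "Asian",
    "mexican": "Hispanic/Latino",
}

def code_response(text: str) -> str:
    """Map a free-form response onto the first matching box, else 'Other'."""
    for keyword, box in taxonomy.items():
        if keyword in text.lower():
            return box  # a multiracial answer gets flattened into a single box
    return "Other"

coded = responses.map(code_response)
print(coded.value_counts())

# "Black and Filipina" is now simply "Black": the taxonomy, the keyword order,
# and the one-label-per-person assumption were all human judgments, which is
# exactly the kind of bias that never shows up as a missing value or an error.
```

Nothing in the code raises an exception; the narrowing happens silently, which is why the interview argues these choices have to be examined at the start rather than audited after the fact.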
Do you find that people are still in need of convincing of that, or is it generally moving in that direction of understanding that we need to bring ethics in from the beginning? >>It's moving in that direction, but there are still people who haven't modified their mental models yet. So we're working on it. And we hope that with the publication of our book, it will be used as a supplemental textbook in many data science courses that are focused exclusively on the algorithms, and that it can open up the idea that by considering human centered approaches at the beginning of learning about algorithms and data science and the mathematical and statistical techniques, the next generation of data scientists and artificial intelligence developers will be able to mitigate some of the potentially harmful effects. And we're very excited about this. This is why I'm a professor, because I want to teach the next generation of data scientists and artificial intelligence experts how to make sure that their work really achieves what they intended it to, which is to make the world a better place, not a worse place, but to enable humans to do better and to mitigate biases, and really to lead us into this century in a positive way. >>So the book, Human Centered Data Science, you can see it there over Cecilia's right shoulder. When does this come out, and how can folks get a copy of it? >>So it came out March 1st, and it's available in bookstores everywhere. It was published by MIT Press, and you can go online, or you can go to your local independent bookstore, or you can order it from your university bookstore as well. >>Excellent. I've got to get my hands on a copy and dig into that, 'cause it sounds so interesting, but also so thoughtful and, um, clear in the way that you described that. And also all the opportunities that AI, data science and humans are going to unlock for the world, and jobs, and great things like that. So I'm sure there's lots of great information there. Last question: I mentioned you are keynoting at this year's conference. Talk to me about, like, the top three takeaways that the audience is going to get from your keynote. >>So I'm very excited to have been invited to WiDS this year, which of course is a wonderful conference to support women in data science. And I've been a big fan of the conference since it was first developed here, uh, here at Stanford. Um, the three top takeaways, I would say, the first is to really consider that data science can be rigorous and mathematical and human centered and ethical. It's not a trade-off, it's both at the same time. And that's really the number one takeaway that I'm hoping the keynote will bring to the entire audience. And secondly, I hope that it will encourage women, or people who've been told that maybe you're not a science person, or this isn't for you, or you're not good at math. I hope it will encourage them to disbelieve those views. And to realize that if you, as a member of any type of underrepresented group, have ever felt, oh, I'm not good enough for this,
And it's really, it's really something. That's what I tell young people today is if you are struggling in a class, don't think it's because you're not good enough. It might just be that the teacher is not presenting it in a way that is best for someone with your particular background. So it doesn't mean they're a bad teacher. It doesn't mean you're unintelligent. It just means the, maybe you need to find someone else that can explain it to you in a simple and clear way, or maybe you need to get some scaffolding that is Tate, learn extra, take extra classes that will help you. Not necessarily remedial classes. I believe very strongly as a teacher in giving students very challenging classes, but then giving them the scaffolding so that they can learn that difficult material. And I have longer stories on that, but I think I've already talked a bit too long. >>I love that. The scaffolding, I th I think the, the one, one of the high level takeaways that we're all going to get from your keynote is inspiration. Thank you so much for sharing your path to stem, how you got here, why humans, data science and AI are, have to be foundationally human centered, looking forward to the keynote. And again, Cecilia, Aragon. Thank you so much for spending time with me today. >>Thank you so much, Lisa. It's been a pleasure, >>Likewise versus silly Aragon. I'm Lisa Martin. You're watching the cubes coverage of women in data science, 2022.