Cecilia Aragon, University of Washington | WiDS Worldwide Conference 2022
>> Hey, everyone. Welcome to theCUBE's coverage of Women in Data Science 2022. I'm Lisa Martin, and I'm here with one of this year's featured keynote speakers, Cecilia Aragon, professor in the Department of Human Centered Design and Engineering at the University of Washington. Cecilia, it's a pleasure to have you on theCUBE.

>> Thank you so much, Lisa. It's a pleasure to be here as well.

>> You have an amazing background that I want to share with the audience. You are a professor, a data scientist, an aerobatic pilot, and an author, with expertise in human-centered data science, visual analytics, aviation safety, and the analysis of extremely large and complex data sets. That's quite the background.

>> Well, thank you so much. It's all very interesting and fun.

>> And as a professor, you study how people make sense of vast data sets, using a combination of computer science and art, which I love. And as an author, you write about interesting things: how to overcome fear, which is something that everybody can benefit from, and how to expand your life until it becomes amazing. I need to take a page out of your book. You were also honored by President Obama a few years back. My goodness.

>> Thank you so much. Yes, I've had quite a journey to come here, but I feel really fortunate to be here today.

>> Talk about that journey. I'd love to understand if you were always interested in STEM, or if it was something that you got into later. I know that you are the co-founder of Latinas in Computing and a passionate advocate for girls and women in STEM. Were you always interested in STEM, or was it something that you got into on a kind of non-linear path?

>> I was always interested in it. When I was a young girl, I grew up in a small Midwestern town. My parents are both immigrants, and I was one of the few Latinas in a mostly white community. I loved math, but I also wanted to be an astronaut. And I remember, when we were asked, I think in second grade, what would you like to be when you grow up, I said, oh, I want to be an astronaut. And my teacher said, oh, you can't do that, you're a girl. Pick something else. So I picked math, and she was like, okay.

So I always wanted to, well, maybe it would be better to say I never really quite lost my love of being up in the air, and potentially space. But I ended up working in math and science, and I loved it, because one of the great advantages of math is that it's kind of like a magic trick for young people, especially if you're a girl or you're from an underrepresented group: if you get the answers right on a math test, no one can mark you wrong. It doesn't matter what the color of your skin is or what your gender is. Math is powerful that way. And I will say there's nothing like standing in front of a room of people who think little of you and silencing them with your love of numbers.

>> I love that. I never thought about math as power before, but it clearly is. And I wish we had more time, because I would love to get into how you overcame that fear, and you write books about that. But being told you can't be an astronaut because you're a girl, and maybe being laughed at because you liked math, how did you overcome that and say, never mind, I'm doing it anyway?

>> Well, okay, the short answer is I had incredible imposter syndrome.
I didn't believe that I was smart enough to get a PhD in math and computer science. But what enabled me to do that was becoming a pilot. I learned how to fly small airplanes, and I learned how to fly them upside down and pointed straight at the ground. I know this might sound kind of extreme, so it's not what I recommend to everybody. But if you are brought up in a way where everybody thinks little of you, one of the best things you can possibly do is take on a challenge that's scary. I was afraid of everything, but learning to fly, and especially learning to fly loops and rolls, gave me the confidence to do everything else, because I thought, I've pointed an airplane at the ground at 250 miles an hour, so why am I afraid to get a PhD in computer science?

>> Wow. How empowering is that?

>> Yeah, it really was. So that's really how I overcame the fear. And I will say that I encountered situations getting my PhD in computer science where I didn't believe that I was good enough to finish the degree, where I didn't believe that I was smart enough. What I've learned later on is that was just my own emotional residue from my childhood and from people telling me that I couldn't achieve.

>> Look what you've achieved so far. It's amazing. And we're going to be talking about some of the books that you've written, but I want to get into data science and AI and get your thoughts on this. Why is it necessary to think about human issues in data science, and what are your thoughts there?

>> So there's been a lot of work in data science recently looking at societal impacts. If you just address data science as a purely technical field and you don't think about unintended consequences, you can end up with tremendous injustices: societal harms and harms to individuals. Any of us who has dealt with an inflexible algorithm, even if you just call up customer service and get told, press five for this, press four for that, and you say, well, I don't fit into any of those categories, or have the system hang up on you after an hour, will understand that any type of algorithmic approach, especially over very large data sets, has the risk of impacting people, particularly from low-income or marginalized groups. But really, any of us can be impacted in a negative way.

And so, as a developer of algorithms that work over very large data sets, I've always found it really important to consider the humans on the other end of the algorithm. That's why I believe that all data science is truly human-centered, or should be human-centered, and involves both technical and social issues. One example: many of us who started working in data science, including, I have to admit, me when I started out, assume that data is unbiased, scrubbed of human influence, pure in some way. However, that's really not true, as I've learned working with data sets, and this is generally known in the field: data sets are touched by humans everywhere. As a matter of fact, in the recent book that we're coming out with, Human Centered Data Science, we talk about five important points where humans touch data, no matter how scrubbed of human influence it's supposed to be.

The first one is discovery: when a human encounters a data set and starts to use it, that's a human decision.
Then there's capture, the process of searching for a data set: any data has to be selected and chosen by an individual. Once that data set is brought in, there's curation: a human has to select among various data sets and decide which is the proper one to use, and they're making judgments on this all the time. And perhaps one of the most important ways data is changed and touched by humans is what we call the design of data, which means that whenever you bring in a data set, you have to categorize it. For example, let's suppose you are a geologist and you are classifying soil data.

Well, you don't just take whatever the description of the soil data is. You may actually put it into a previously established taxonomy, and you're making human judgments on that. So even though you think, oh, geology data, that's just rocks, that's soil, it has nothing to do with people, it really does. And finally, people label the data that they have. This is especially critical when humans are making subjective judgments, such as what race the person in this data set is. They may judge it by looking at the individual's skin color, or they may try to apply an algorithm to it. But you know what? We all have very different skin colors, and categorizing us into race boxes really diminishes us and makes us less than we truly are. So it's very important to realize that humans touch the data, we interpret the data, and it is not scrubbed of bias. And when we make algorithmic decisions, the very fact of having an algorithm that makes a judgment, say, on whether a prisoner is likely to reoffend, affects the judge: even if the algorithm only makes a recommendation, the judge is impacted by that recommendation, and that obviously has an impact on that human's life. So we consider all of this.

>> So you've just given five solid reasons why data science and AI are inevitably human-centered, or should be. But in the past, what's led to the separation between data science and humans?

>> Well, I think a lot of it simply has to do with incorrect mental models. Many of us grew up thinking that, oh, humans have biases, but computers don't, and so if we just take decision-making out of people's hands and put it into the hands of an algorithm, we will get less biased results. However, recent work in data science and artificial intelligence has shown that that's simply not true: algorithms reinforce human biases. They amplify them. So algorithmic biases can be much worse than human biases and can have far greater impact.

>> So how do we pull ethics into all of this, data science and AI and that ethical component, which seems like it needs to be foundational?

>> It absolutely has to be foundational. This is why we believe, and what we teach at the University of Washington in our data science courses, that ethical and human-centered approaches and ideas have to be brought in at the very beginning of the algorithm. It's not something you slap on at the end, or say, well, I'll wait for the ethicists to weigh in on this. We are all human. We can all make human decisions, and we can all think about the unintended consequences of our algorithms as we develop them. And we should do that at the very beginning. All algorithm designers really need to spend some time thinking about the impact that their algorithm may have.

>> Right.
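To make that amplification mechanism concrete, here is a minimal, hypothetical Python sketch. It is not from Aragon's book, and the lending scenario, group names, and every number in it are invented for illustration. It shows one well-documented feedback loop, sometimes called the selective-labels problem: a model inherits a biased human estimate for one group, and because it only ever observes outcomes for the applicants it approves, it never collects the data that would correct that estimate.

```python
# Hypothetical toy simulation (invented scenario and numbers, not from the
# book). Ground truth: groups A and B repay at the same rate. The model's
# starting estimates come from biased historical labels that undervalue B.
import random

random.seed(0)

TRUE_REPAY = {"A": 0.80, "B": 0.80}   # reality: the groups are identical
estimate = {"A": 0.75, "B": 0.65}     # inherited, biased starting estimates
THRESHOLD = 0.70                      # approve if estimated repayment >= 70%

for year in range(1, 6):
    outcomes = {"A": [], "B": []}
    for _ in range(10_000):
        group = random.choice(["A", "B"])
        if estimate[group] >= THRESHOLD:      # the algorithmic decision
            outcomes[group].append(random.random() < TRUE_REPAY[group])
    for g in ("A", "B"):
        if outcomes[g]:                       # can only learn from approvals
            estimate[g] = sum(outcomes[g]) / len(outcomes[g])
    print(year, {g: round(v, 3) for g, v in estimate.items()})

# Group A's estimate converges to the true 0.8; group B's never updates,
# because the model's own decisions prevent it from gathering group B data.
```

Run it and group A's estimate settles at the true 0.8 while group B's stays frozen at the inherited 0.65: the groups are identical, but the bias becomes self-sealing, which is one concrete sense in which an algorithmic decision loop can end up worse than the human bias it started from.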
Do you find that people are still in need of convincing of that, or is it generally moving in that direction of understanding that we need to bring ethics in from the beginning?

>> It's moving in that direction, but there are still people who haven't modified their mental models yet, so we're working on it. We hope that with its publication, our book will be used as a supplemental textbook in the many data science courses that are focused exclusively on the algorithms, and that it can open up the idea of considering human-centered approaches at the beginning of learning about algorithms, data science, and the mathematical and statistical techniques, so that the next generation of data scientists and artificial intelligence developers will be able to mitigate some of the potentially harmful effects. We're very excited about this. This is why I'm a professor: I want to teach the next generation of data scientists and artificial intelligence experts how to make sure that their work really achieves what they intend it to, which is to make the world a better place, not a worse one; to enable humans to do better, to mitigate biases, and really to lead us into this century in a positive way.

>> So the book, Human Centered Data Science, you can see it there over Cecilia's right shoulder. When does it come out, and how can folks get a copy?

>> It came out March 1st, and it's available in bookstores everywhere. It was published by MIT Press, and you can order it online, from your local independent bookstore, or from your university bookstore as well.

>> Excellent. I've got to get my hands on a copy and dig into it, because it sounds so interesting, but also so thoughtful and clear in the way that you described it, along with all the opportunities that AI, data science, and humans are going to unlock for the world, for jobs, and great things like that. So I'm sure there's lots of great information there. Last question: I mentioned you are keynoting at this year's conference. Talk to me about the top three takeaways the audience is going to get from your keynote.

>> I'm very excited to have been invited to WiDS this year, which of course is a wonderful conference to support women in data science, and I've been a big fan of the conference since it was first developed here at Stanford. The three top takeaways, I would say: first, really consider the data. Data science can be rigorous and mathematical and human-centered and ethical. It's not a trade-off; it's both at the same time. That's the number one idea I'm hoping the keynote will bring to the entire audience. And secondly, I hope it will encourage women, or people who've been told that maybe you're not a science person, this isn't for you, you're not good at math, to disbelieve those views, and to realize that if you, as a member of any type of underrepresented group, have ever felt, oh, I'm not good enough for this, I'm not smart enough, it's not for me, you will reconsider, because I firmly believe that everyone can be good at math. It's a matter of having the information presented to you in a way that honors the background you had. When I started out, my high school didn't have AP classes, and I needed to learn in a somewhat different way than the people around me.
And that's what I tell young people today: if you are struggling in a class, don't think it's because you're not good enough. It might just be that the teacher is not presenting it in a way that is best for someone with your particular background. It doesn't mean they're a bad teacher, and it doesn't mean you're unintelligent. It just means that maybe you need to find someone else who can explain it to you in a simple and clear way, or maybe you need some scaffolding, that is, extra classes that will help you. Not necessarily remedial classes; I believe very strongly, as a teacher, in giving students very challenging classes, but then giving them the scaffolding so that they can learn that difficult material. I have longer stories on that, but I think I've already talked a bit too long.

>> I love that, the scaffolding. I think one of the high-level takeaways that we're all going to get from your keynote is inspiration. Thank you so much for sharing your path to STEM, how you got here, and why data science and AI have to be foundationally human-centered. I'm looking forward to the keynote. Again, Cecilia Aragon, thank you so much for spending time with me today.

>> Thank you so much, Lisa. It's been a pleasure.

>> Likewise. For Cecilia Aragon, I'm Lisa Martin. You're watching theCUBE's coverage of Women in Data Science 2022.
Hannah Sperling, SAP | WiDS 2022
>> Hey everyone. Welcome back to theCUBE's live coverage of the Women in Data Science Worldwide Conference, WiDS 2022. I'm Lisa Martin, coming to you from Stanford University at the Arrillaga Alumni Center, and I'm pleased to welcome my next guest. Hannah Sperling joins me; she works in business process intelligence, or BPI, academic and research alliances at SAP. Hannah, welcome to the program.

>> Hi, thank you so much for having me.

>> So you just flew in from Germany.

>> I did, last week. Yeah, a long way away. I'm very excited to be here. But before we get started, I would like to say that I feel very fortunate to be able to be here, and that my heart and wishes still go out to people who might be in more difficult situations right now.

>> I agree. One of my favorite things about WiDS is the community that it's grown into. There are going to be about 100,000 people involved annually in WiDS, but you walk into the Arrillaga Alumni Center and you feel this energy from all the women here, from what Margot and team started seven years ago to what it has become. I happened to be listening to one of the panels this morning, and they were talking about something that's just so important for everyone to hear, not just women: the importance of mentors and sponsors, and being able to build your own personal board of directors. Talk to me about some of the mentors that you've had in the past and some of the ones that you have at SAP now.

>> Yeah, thank you. That's actually a great starting point, so maybe I'll talk a bit about how I got involved in tech. SAP is a global software company, but I actually studied business, and I was hired directly from university around four years ago to join SAP's analytics department. I've always had a weird thing for databases; even in my undergrad I enjoyed working with data. So, working in analytics with those teams, with some people mentoring me, I got into database modeling and eventually ventured further into development, working in analytics development for a couple of years. I'm still with a global software provider now, which brought me to women in data science, because now I'm also involved in research again; for some reason I couldn't get enough of it, and maybe wanted to learn about the things I didn't cover in my undergrad and postgrad.

Now, researching at university, one big part of data science efforts, at least in Europe, is the topic of sensitive data and data privacy considerations. This is a topic very close to my heart, because you can only manage what you measure, right? But if everybody is afraid to touch certain pieces of sensitive data, I think we might not get to where we want to be as fast as we possibly could. So I've been really getting into data anonymization procedures, because I think if we could render anonymized workforce data usable, especially when it comes to increasing diversity in STEM or in technology jobs, we should really be letting the data speak.

>> Letting the data speak, I like that. One of the things they were talking about this morning was the bias in data and the challenges that presents. I've had some interesting conversations on theCUBE today about data in healthcare and data in transportation equity. If we think of International Women's Day, which is tomorrow, the theme is breaking the bias.
Where do you think we are, from your perspective, on breaking the bias across all these different data sets?

>> Right. So I guess, as somebody working with data on a daily basis, I'm sometimes amazed at how many people still seem to think that data can be unbiased. This was actually touched upon in the first keynote, which I very much enjoyed, on human-centered data science. People who believe that you can take the human factor out of any effort related to analysis are definitely on the wrong path. So I feel like the sooner we realize that we need to take into account certain biases that will definitely be there, because data is humanly generated, the closer we'll get to something that represents reality better, and that might help us change reality for the better as well, because we don't want to stick with the status quo. Any time you look at data, it's definitely going to be a backward-looking effort. So I think the first step is to be aware of that and not to strive for complete objectivity, but to understand and come to terms with the fact, just as was mentioned in the equity panel, that that is logically impossible.

>> You bring up a really important point. It's important to understand that that is not possible, but what can we work with? What is possible? Where do you think we are on the journey of being able to get there?

>> I think initiatives like WiDS are playing an important role in making that better and increasing awareness. There's a big trend around explainability and interpretability in AI that you see not just in Europe but worldwide, because the awareness around those topics is increasing. That will also show you the blind spots you may still have, no matter how much you think about the context. One thing we still need to get a lot better at, though, is including everybody in these types of projects, because otherwise you're always going to have a certain selection in terms of the perspectives that you're getting.

>> Right, thought diversity. There's so much value in thought diversity. I think I first started talking about thought diversity at a WiDS conference a few years ago, and about really understanding the impact it can make on every industry.

>> Totally. And I love the example of, I think it was a soap dispenser, one of these really early examples of how technology, if you don't watch out for these human-centered considerations, can go wrong and just perpetuate bias: a soap dispenser that would only recognize a hand with a certain light skin type placed underneath it. Simple examples like that, I think, beautifully illustrate what we need to watch out for when we design automatic decision aids, because anywhere you don't have a human checking what's ultimately decided, you might end up with much graver examples.

>> Right, I agree. Cecilia Aragon gave the talk this morning on human-centered AI. I was able to interview her a couple of weeks ago for WiDS; she's a very inspiring woman and an author herself. She brought up a great point about the humans and the AI working together. You can't ditch the humans completely; to your point, there are things that will go wrong.
I think that sends a good message, that it's not going to be AI taking jobs, but that we have to have those two components working together.

>> Yeah. And maybe to also refer to the panel discussion we heard on equity: I very much liked Professor Bowles's point, how she emphasized that we're never going to get to this perfectly objective state. Also during that panel, a data scientist said that 80% of her work is still cleaning the data, most likely because, I feel, there is sometimes an almost mysticism around the role of a data scientist. It sounds really catchy and cool, but there are so many different aspects of work in data science that I feel it's hard to narrow it all down to one role. I think in the end, if you enjoy working with data, and maybe you can even combine that with a certain domain you're particularly interested in, be it sustainability or urban planning or whatever, that is the perfect match.

>> It is. And having the passion that goes along with that can also be very impactful. So, you love data; you said you had a strange love for databases. Where do you want to go from where you are now? How much more deeply are you going to dive into the world of data?

>> That's a good question, because at this point I would definitely not consider myself a data scientist, but I feel like, taking baby steps, I'm maybe on a path to becoming one in the future. Being at university again gives me the opportunity to dive back into certain courses, and I've done smaller data science projects. I was actually amazed, and this was touched on in a panel earlier as well, at how outdated so many frequently used data sets are in the realm of research: AI and machine learning research, all these models being fed with these super outdated data sets. That's something I can relate to. And when you go down that path, you come back to the sort of data engineering work that I really enjoy. So I could see myself continuing to work on that: data privacy and analytics, both topics that are very close to my heart and that I think can be combined. They're not opposites. That is something I would definitely stay true to.

>> Data privacy is a really interesting topic. GDPR is a few years old now, and we've got other countries, and states within the United States; for example, California has CCPA, which will become CPRA next year, and it's expanding the definition of what private, sensitive data is. Companies have to be sensitive to that, but it's a huge challenge, because there's so much potential that can come from the data, and yet there's that personal, sensitive aspect that has to be accounted for, otherwise there are huge fines. Where do you think we are with that, in terms of compliance?

>> I think in the past years we've seen quite a few rather shocking examples, in the United States for instance, where personal data, or proxies for it, was used in ways that led to detrimental outcomes. In Europe, thanks to the strong data regulations, I think we haven't had as many problems, but the question remains: where do you draw the line?
And how do you design the trade-off between increasing efficiency, making business applications better, in the case of SAP for example, while protecting the individual privacy rights of people? I guess in one way SAP is in an easier position, because we deal with business data. So anybody who doesn't want to care about the human element might like to try building models on machine-generated data first.

I mean, at least I would feel much more comfortable that way, because as soon as you look at personally identifiable data, you really need to watch out. There are, however, ways to make that happen, and I was touching on the anonymization techniques that I think are going to become more and more important in the coming years. There is proposed legislation on the way from the European Commission, and I was actually impressed by the sophistication of legislation in that area. The plan for the future is to tie the rules around the use of data science to the specific objectives of the project. I think that's the only way to go, because if the data is out there, it's going to be used; we've sort of learned that, and true anonymization might not even be possible because of the amount of data that's out there. So this approach of trying to limit projects in terms of what they want to achieve, not just for an individual company but also for us as a society, I think needs to play a much bigger role in any data-related project.

>> You said getting true anonymization isn't really feasible. Where are we, though, on the anonymization pathway, if you will?

>> It's always a cost-benefit trade-off, right? If the question is not interesting enough, so that nobody is going to allocate enough resources to try to reverse-engineer the tie to an individual, to stick with this anonymization example, then nobody's going to do it. We live in a world where there's data everywhere, so I feel like that's not going to be our problem. And that is where this approach of looking at the objectives of a project comes in, because sometimes maybe we're just lucky that it's not valuable enough to figure out certain details about our personal lives, so that nobody will try. Because I am sure that if data scientists tried hard enough, I wonder if there are challenges they wouldn't be able to solve.

>> And there have been companies that put out data sets that were supposedly anonymized, and then it wasn't actually that hard to make inferences from them. On the equity panel, one last thought about that: we heard Jessica speak about construction, and how she was trying to use synthetic data because it's so hard to get the real data, and about the challenge of getting the synthetic data to mimic the true data. The question came up of sensors in the household and so on. That is obviously a huge opportunity, but as somebody who's very sensitive when it comes to privacy considerations, straight away I'm like, what if we generate all this data and then somebody uses it for the wrong reasons, which might not be better urban planning for all different communities, but simple profit maximization, right?
So this is something that's also very dear to my heart, and I'm definitely going to go down that path further.

>> Well, Hannah, it's been great having you on the program. Congratulations on being a WiDS ambassador. I'm sure there are going to be a lot of great lessons and experiences that you'll take back to Germany from here. Thank you so much; we appreciate your time. For Hannah Sperling, I'm Lisa Martin. You're watching theCUBE's live coverage of the Women in Data Science Conference 2022. Stick around; I'll be right back with my next guest.
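As a closing technical note on the anonymization techniques discussed above: one standard formal notion in that space is k-anonymity, where a table counts as k-anonymous if every combination of quasi-identifiers (attributes like ZIP code, age band, or role that are not names but can still single someone out) is shared by at least k records. The sketch below is a hypothetical illustration, not SAP code; the workforce records and the choice of quasi-identifiers are invented.

```python
# Hypothetical sketch of a k-anonymity check (invented records, not SAP code).
from collections import Counter

def k_anonymity(records: list[dict], quasi_identifiers: list[str]) -> int:
    """Return the k of the table: the size of the smallest group of records
    that share the same quasi-identifier values (0 for an empty table)."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values(), default=0)

# Invented example records: direct identifiers already removed, values
# generalized (masked ZIP, age bands) as a typical anonymization step.
workforce = [
    {"zip": "981**", "age_band": "30-39", "role": "engineer", "salary": 98},
    {"zip": "981**", "age_band": "30-39", "role": "engineer", "salary": 91},
    {"zip": "981**", "age_band": "40-49", "role": "manager",  "salary": 120},
]

# Prints 1: the lone 40-49 manager is unique on the quasi-identifiers,
# so this release is not even 2-anonymous despite containing no names.
print(k_anonymity(workforce, ["zip", "age_band", "role"]))
```

The table here has k = 1: the single manager in the 40-49 band is unique on the quasi-identifiers alone, so no name is needed to re-identify them, which is exactly the kind of inference that has made "supposedly anonymized" data releases reversible.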