Jay Carney, AWS | AWS Public Sector Summit 2019
>> Narrator: Live from Washington D.C., it's theCUBE. Covering AWS Public Sector Summit. Brought to you by Amazon Web Services. >> Welcome back, everyone, to Washington D.C. and theCUBE's live coverage of AWS Public Sector Summit. I'm your host, Rebecca Knight, alongside John Furrier. We are joined by Jay Carney. He is the senior vice president of global corporate affairs at Amazon and AWS. Thank you so much for coming on theCUBE. >> Thank you so much for having me. It's great to be here. >> You are just coming from a panel with Senator Mark Warner of Virginia, where the topic was regulation and tech. I want to hear what was talked about and what your thoughts were there. >> Sure, there were a lot of topics, including HQ2, which, as you know, we're locating in northern Virginia. Senator Warner has a very specific interest in that, and we talked about that a lot. One thing that he's involved in, he's the vice chairman of the Senate Intelligence Committee, the leading Democrat on the committee, and he takes these issues very seriously. He's very focused on, especially social media, but tech in general and national security concerns, as well as issues around deepfakes and fake news and the like. Now, a lot of that isn't our territory as a business, but we think that where we do fall into scrutiny for regulation, we welcome the scrutiny. We're a big company, obviously, and we're very focused on serving our customers. Part of delivering for our customers means ensuring that we work with elected officials and regulators and pass that scrutiny well. We'll see what the future brings in different spaces. Our concern, or our hope in general, whether it's around privacy or other areas of tech regulation, is that uniformity is obviously preferable to having, say, 50 state laws, whether it's around facial recognition technology or broader privacy initiatives. Senator Warner's supportive of federal legislation, as a lot of folks are on both sides of the aisle.
>> Jay, one of the things that you guys live every day at Amazon, and following you guys for the past nine, ten years now for theCUBE, is you're willing to be misunderstood as a company to continue the long game. Jeff Bezos talked about the long game all the time. Doesn't look at stock prices, all those kinds of quips, but the innovation engine has been very strong, and with digital transformation now at an all-time high, new value is being created in new ways that some people don't understand. You guys are on a constant mission to educate. Here in D.C., what's clear to me is this awakening of this value proposition, and in some cases, it's not very good, the value. Weaponizing is a word we've heard. Big tech is kind of under a lot of conversations, but there are a lot of good things happening. You guys create a lot of value as a company-- >> Sure, and I think the industry at large creates a lot of value. I think we need to ensure, we, the American people, the American citizenry, and on our behalf, those elected officials who ultimately make the decisions, that as we scrutinize and explore regulating some of these arenas, that we do it in a way that creates public benefit, that prevents, wherever possible, misuse of technology, but that continues to allow the kind of innovation that's made the United States the center of technological innovation over the last 30 or 40 years. That's not an easy job, but I think that folks in tech need to work with and collaborate with regulators and lawmakers to talk about how to do that. Because, you wouldn't want... I mean, a good example, I think, is that technological innovation is value-neutral, usually. It's a new service or a new product that can do something. It itself is just a product, so it doesn't have a conscience. It's amoral. How you use it is really what determines whether it's something that's good or bad. Many technologies can be used for good or for ill. We have a service at AWS, a facial recognition service.
We're certainly not the only company that provides that service to customers. Thus far, since Amazon Rekognition has been around, we've had reports of thousands of positive uses, finding missing children, breaking up human sex trafficking, human trafficking rings, assisting law enforcement in positive ways. We haven't yet heard of any cases of abuses by law enforcement, but we certainly understand that that potential exists, and we encourage regulators and lawmakers to look closely at that. We've publicly put forth guidelines that we think would be useful as they build a legislative, a regulatory framework. >> (mumbles) asking last night even was saying you guys are very open. He wasn't hiding behind any kind of stories. How do we talk to regulators? We want to embrace those conversations. He wasn't saying, "We want to be regulated." He didn't say that, but he wasn't hiding from the fact that we need to have these conversations. >> I think we understand that the potential misuse of some technology is real. We've seen it in other countries, for example, in ways that violate civil liberties. We want to make sure that in this democracy, we have an infrastructure in place, a regulatory infrastructure, that continues to allow innovation to blossom but protects the civil liberties of people in the United States. We're a global company, but we started off, and we are, an American company, and we care deeply about those issues as a company. >> I think that that's really the big question, is how would this regulatory process work? You're talking about having these conversations, particularly around unintended consequences of these new technologies and services. How would it work? Particularly, someone like you who was in government, now in the private sector, at what point are these conversations taking place, and how might it work? At the innovation stage? At the creation, you know what I mean? Just now that we're really getting into it.
>> In some cases, there's real progress being made. On privacy, for example, as all of your viewers know, GDPR in Europe was the first multinational comprehensive privacy regulation that's been implemented. In the United States, we don't have a federal law yet. California's taken steps, has passed a bill, and other states are looking at it. We think for U.S. competitiveness, one law is better than 50 laws. We think that we're fully compliant with GDPR, and it actually was not as complicated for us to meet the compliance requirements as it might've been for other tech companies because of the nature of our business in the European Union. There are aspects of GDPR that I think are unnecessarily bureaucratic or clunky, so there are ways to take that as a base and improve it so that the privacy concerns are rightfully addressed, but innovation continues at pace. >> How about antitrust? We had a conversation a couple years ago at re:Invent around antitrust. You made a comment to me: we ship faster, we lower prices, so how are people harmed? There's been a lot of young academics who are challenging the old antitrust definition. Does digital recast itself in antitrust? This is a conversation that think tanks are starting to have now around what does that mean for the modern era, or modernizing government, including laws of regulation? Your thoughts on that. >> I'm not a lawyer. I'm careful not to speak authoritatively where I don't know all the details. Consumer harm is the standard. For all the reasons that you described, our mission as a company is to reward the customer with more convenience, more selection, and lower prices. Certainly, we fulfill that mission and don't meet that standard in any way you might look at that competitively. Even more broadly, there's a misconception about Amazon.
Because we're a consumer-facing business primarily, and because we are involved in a lot of different things, some more successfully than others, we're perceived as bigger than we are. The fact is retail, our original business, our core business, is the biggest marketplace there is. In the United States, we're less than 4% of retail, and we're not even the biggest retailer in the United States. Cloud, AWS, we're here at the Public Sector Summit. >> You've got competition-- >> We have intense, high-quality competition, and deep-pocketed competition. As you know, and your viewers know this, the cloud revolution is in its early stages. The opportunity there is enormous, and we're just getting started. There'll be plenty of winners in this space, so again, I don't see any way that you might look at it that there would be competitive issues. Also, there's a perception that Amazon itself is singular, so that if you buy from Amazon, you're therefore not buying from somebody else, but in fact, when we opened Marketplace, I think in 2001, we opened the website to other sellers. What used to be 100% Amazon product and inventory for sale on amazon.com has now, in 2019, risen to over 55% not being Amazon. Third-party sellers, small and medium-sized businesses, more than a million of them in the United States, sell in our store and get access to all the customers we have through our store. That side of our business is growing much faster than the Amazon retail business, and I think it demonstrates the value proposition for all of the small and medium-sized businesses. >> Yeah, we've got time for one more question, for Rebecca and I, one, you might have one. As Steve Jobs once said, technology and liberal arts, you've got the nice street signs kind of intersecting. I think that plays now more than ever. Societal impact has become a huge part of the conversation around tech, tech impact. You're a policy expert. You've been studying it. You're living in D.C.
The policy game seems to be more important now than ever before around tech, and the participation of technology companies in policy, not just hiring a policy firm or a team to do it, but actively engaging, as an ingredient of the company. Are there enough people (laughs) who can actually do that, one, and what are some of the key policy opportunities out there for either young individuals, like my daughter, or other young people coming out of college? Because it seems to me the game is shaping into a new direction. >> The space is fascinating because these issues really are front and center right now, around questions around technology and how to ensure that as it continues to evolve, it does so in a way that allows for innovation but also protects privacy, civil liberties, and the like. You can't be in a more exciting space if you're going to be in the private sector engaging in policy. Even if you're in government, if you're on that side, it's a very interesting space to be in. All of it, tech has grown up, the internet has grown up, and there's no question that with that, more attention is being paid. That's fine and appropriate. >> More responsibility and accountability. >> More responsibility, sure. >> I just have one more final thing on this. Because of your vantage point as someone who was in a famously tech-savvy administration, the Obama Administration, and then we also see lawmakers questioning Mark Zuckerberg, seemingly not understanding how Facebook makes money, do lawmakers get it? >> I think a lot of lawmakers do. I was just with one, Mark Warner, from Virginia, U.S. senator, former telecom executive and investor. He very much gets it. The caricature is, I think, exaggerated, but look, that's our job. It's our job, it's the press's, it's everybody's... One thing we do here with the team we have in D.C. is be a resource of information, try to explain, here's what's happening. Here's how our model works. Here's how the technology works.
I think that can only help as regulators and lawmakers decide how they want to approach these problems. >> A lot of innovation opportunities. Just the CIA deal alone has set off, from a gestation period, growth around cloud acceleration. >> I think it demonstrates in a way that we're very customer-focused, and that is especially true when it comes to our national security agencies and defense agencies, but also that security's our first concern at AWS, as well as at broader Amazon. We're glad to have those customers. >> Thanks for coming by. >> Yup, thanks a lot. >> Yes, excellent. Thanks so much, Jay. >> Thank you. >> I'm Rebecca Knight for John Furrier. Please stay tuned for more of theCUBE's coverage of the AWS Public Sector Summit. We will have Theresa Carlson coming up next. (upbeat music)
Natalie Evans Harris, BrightHive | WiDS 2019
>> Live from Stanford University. It's the Cube covering global Women in Data Science conference brought to you by Silicon Angle media. >> Welcome back to the Cubes. Continuing coverage of the fourth annual Women and Data Science Conference with Hashtag with twenty nineteen to join the conversation. Lisa Martin joined by one of the speakers on the career panel today at Stanford. Natalie Evans Harris, the cofounder and head of strategic initiatives at right hive. Natalie. It's a pleasure to have you on the program so excited to be here. Thank you. So you have, which I can't believe twenty years experience advancing the public sectors. Strategic use of data. Nearly twenty. I got more. Is your career at the National Security Agency in eighteen months with the Obama administration? You clearly were a child prodigy, of course. Of course, I was born in nineteen ninety two s. So tell me a little bit about how you got involved with was. This is such an interesting movement because that's exactly what it is in such a short time period. They of a mask. You know, they're expecting about twenty thousand people watching the live stream today here from Stanford. But there's also fifty plus countries participating with one hundred fifty plus a regional events. You're here on the career panel. Tell me a little bit about what attracted you to wits and some of the advice and learnings that you're going to deliver this afternoon. Sure, >> absolutely So Wits and the Women and Data Science Program and Conference on what it's evolved to are the exact type of community collective impact initiatives we want to say. When we think about where we want data science to grow, we need to have diversity in the space. There's already been studies that have come out to talk about the majority of innovations and products that come out are built by white men and built by white men. And from that lens you often lose out on the African American experience or divers racial or demographic experiences. 
So you want communities like Women in Data Science to come together and show we are a part of this community. We do have a voice and a seat at the table, and we can be a part of the conversation and innovation, and that's what we want, right? So to come together and see thousands of people talking, and walking into a room of diverse age and diverse experience, it feels good, and it makes me hopeful about the future, because people are going to be the greatest challenge to data science in the future. >> Let's talk about that, because a lot of the topics around data science relate to data privacy and ethics, cybersecurity. But if we look at the amount of data that's generated every day, two point five quintillion pieces of data, there's a tremendous amount of impact for the good. You think of cancer research and machine learning in cancer research. But we also think, wow, we're at this data revolution. I read this blog that you co-authored about a year ago, called It's Time to Talk About Data Ethics, and I found it so interesting, because how do we get control around this when we all know that, yes, there are so many great applications for data that we benefit from every day, but there's also been a lack of transparency on a growing scale? In your perspective, what's the human capital element, and how does that become influenced to really manage data in a responsible way? >> I think that we're recognizing that data can solve all of these really hard problems, and we're collecting these quintillion bytes of data on a daily basis. So there's acknowledgment that there are things that humans just can't do, so AI and machine learning are great ways to increase access to that data, so we can use it to start to solve problems. But what we also need to recognize is that no matter how good AI gets, there are still humans that need to be a part of that context, because the algorithms are only as strong as the people that have developed them.
So we need data scientists. We need women with diverse experiences. We need people with diverse thoughts, because they're the ones who are going to create those algorithms that make the machine learning and the algorithms and the technology more powerful, more diverse, and more equal. So we need to see more growth in experiences and people and learning. The things that I talk about, when others ask me, and what I'll mention on the career panel, is when you think about data science, it's not just about teaching the technical skills. There's this empathy that needs to be a part of it. There's this skill of being able to ask questions in really interesting ways of the data. When I worked at the National Security Agency and helped build the data science program there, every data scientist that came into the building, we of course taught them about working in our environments. But we also made every single one of them take a class on asking questions, the same class that we had our intelligence analysts take. So the same way the history and the foreign language experts needed to learn how to ask questions of data, we needed our data scientists to learn that as well. That's how you start to look beyond just the ones and zeros and start to really think about not just data, but the people that are impacted by the use of the data. >> Well, it's really one of the things I find interesting about data science, is how diverse, and I use that word specifically because we talked about thought diversity, but it's not just the technical skills, as you mentioned. It's empathy. It's communication. It's collaboration, and those are... so it's such a, like I said, diverse opportunity. One of the things I think I read about in your blog: if we look at, okay, we need to not just train the people on how to analyze the data, but how to be confident enough to raise their hand and ask questions, how do you also train the people to handle data responsibly?
You kind of mentioned there's this notion of sort of like a Hippocratic oath, like medical doctors take, for data scientists. And I thought that was really intriguing. Tell me a little bit more about that. And how do you think that data scientists in training, and those that are working now, can be trained, yeah, influenced, to actually take something like that, in terms of really individualizing that responsibility for ethical treatment of data? >> So, towards the end of my time at the White House, it was myself, DJ Patil, and a number of experts and thought leaders in the space of news and ethics and data science who came together and had this conversation about the future of data ethics. And what does it look like, especially with the rise of fake news and misinformation and all of these things? And born out of that conversation was just this realization that, if you believe that inherently people want to do the good thing, want to do the right thing, how do they do that? What does that look like? So I worked with Data for Democracy and Bloomberg to issue a study and just say, look, data scientists, what keeps you up at night? What are the things that, as you build these algorithms and you're doing this data sharing, keep you up at night? And the things that came out of those conversations and the working groups and the community of practice now are just what you're talking about. How do we communicate responsibly around this? What does it look like to know that we've done enough to protect the data, to secure the data, to use the data in the most appropriate ways? And when we see a problem, what do we do to communicate that problem and address it? Out of that community of practice and those principles really came the start of what an ethics oath, the Hippocratic oath, could look like. It's a set of principles. It's not the answer, but it's a framework to help guide you down.
Your own definition of what ethical behavior looks like when you use data. Also, it became a starting point for many companies to create their own manifestos and their own goals, to say as a company, these are the values that we're going to hold true to as we use data. And then they can create the environments that allow for data scientists to be able to communicate how they feel about what is happening around them and effect change. It's a form of empowerment. >> Amazing. I love that. In the last thirty seconds, I just want to get your perspective on: here we are, spring of twenty nineteen, where are we as a society on data equaling trust? >> Oh, I love that we're having the conversation. And so we're at that point of just recognizing that data's more than ones and zeroes, and it's become such an integral part of who people are. And so we need some rules to this game. We need to recognize that privacy is more than just virus protection, that there is a trust that needs to be built between the individuals, the communities, and the companies that are using this data. What the answers are is what we're still figuring out. I argue that a large part of it is just human capital. It's just making sure that you have a diverse set of voices, almost a brain trust, as a part of the conversation, so you're not just going to the same three people and saying, what should we do? But you're growing, and each one teach one, and building this community around collectively solving these problems. >> Well, Natalie, it's been such a pleasure talking with you today. Thank you so much for spending some time and joining us on theCUBE. Have a great time in the career panel this afternoon at WiDS. >> Thank you so much. This is a lot of fun. >> Good. My pleasure. We want to thank you. You're watching theCUBE from the fourth annual Women in Data Science Conference, live from Stanford University. I'm Lisa Martin. I'll be back with my next guest after a short break
Day One Kickoff | Grace Hopper 2017
>> Announcer: Live from Orlando, Florida, it's theCUBE, covering Grace Hopper's Celebration of Women in Computing, brought to you by SiliconANGLE Media. >> Welcome to day one of the Grace Hopper Conference here in Orlando, Florida. Welcome back to theCUBE, I should say. I'm your host, Rebecca Knight, along with my co-host, Jeff Frick. We have just seen some really great keynote addresses. We had Fei-Fei Li from Stanford University. Melinda Gates, obviously the co-founder of the Bill and Melinda Gates Foundation. We also had Diane Greene, the founder of VMware. Jeff, what are your first impressions? >> You know, I love comin' to this show. It's great to be workin' with you again, Rebecca. I thought the keynotes were really good. I've seen Diane Greene speak a lot and she's a super smart lady, super qualified, changed the world of VMware. She's not always the greatest public speaker, but she was so comfortable up there. She felt so in her element. It was actually the best I'd ever seen. For me, I'm not a woman, but I'm a dad of two daughters. It was really fun to hear the lessons that some of these ladies learned from their father that they took forward. So, I was really hap-- I admit, I'm feelin' the pressure to make sure I do a good job on my daughters. >> Make sure those formative experiences are the right ones, yes. >> It's just interesting though how people's early foundation sets the stage for where they go. I thought Dr. Sue Black, who talked about the morning she woke up and her husband threatened to kill her. So, she just got out of the house with her two kids and started her journey then. Not in her teens, not in her twenties, not in college. Obviously well after that, to get into computer science and to start her tech journey and become what she's done now. Now she's saving the estate where the codebreakers were in World War II, so phenomenal story. Melinda Gates, I've never seen her speak. Then Megan Smith, always just a ton of energy.
Before that, she was the CTO for the United States; that was with the Obama administration. I don't think she hung around as part of the Trump Administration. She brings such energy, and now she's kind of released from the shackles of her public service and doing her own thing. Great to see her up there. It's just a terrific event. The energy that comes from, I think, a third of the people here being young women. Really young, either still in college or just out of college. Really makes for an atmosphere that I think is unique in all the tech shows that we cover. >> I completely agree. I think the energy really is what sets the Grace Hopper Celebration of Women in Computing apart from all the other conferences. First of all, there are just many more women who come to this. The age, as you noted, is a lot lower than your typical tech conference. But, I also just think what is so exciting about this conference is that it is this incredible mix of positivity: let's get more women in here, let's figure out ways to get more women interested in computer science and really working on their journey as tech leaders. But, also really understanding what we're up against in this industry. Understanding the brogrammer culture, the biases that are really creating barriers for women to get ahead, and actually to even enter into the industry itself. Then, also there's the tech itself, so we have these women who are talking about these cool products that they're making and different pathways into artificial intelligence and machine learning, and what they're doing. So, it's a really incredible conference that has a lot of different layers to it. >> It's interesting, Dr. Fei-Fei Li was talking a lot about artificial intelligence, and the programming that goes into artificial intelligence, and kind of the classic Google story where you use crowdsourcing and run a bunch of photographs through an algorithm to teach it.
But she made a really interesting point in all this discussion: is it the dark future of AI, where the machines take over the world and kill us all? Or is it a positive future, where AI frees us up to do more important things and more enlightened things? She really made a good point that it comes down to how you write the algorithms. How are we training the computers to do what we do? Women bring a different perspective. Diversity brings a different perspective. Baking that into the algorithms up front is so, so important to shape the way AI shapes the evolution of our world. So, I found that to be a really interesting point that she brought up that I don't think is talked about enough. People have to write the algorithms. People have to write the stuff that trains the machines, so it's really important to have a broad perspective. >> You are absolutely right, and I think she actually made the point even more broadly, in the sense that if AI is going to shape our life and our economy going forward-- >> Which it will, right? >> Which it will. Then the fact that there are so few women in technology is a crisis. Because if the people who are the end-users, and who are going to either benefit or be disadvantaged by AI, aren't showing up and aren't helping create it, then yes, it is a crisis. >> Right. And I think the other point that came up was to bake more computer science into other fields, whether it's biology, whether it's law or education. The application of AI, the application of computer science in all those fields, is much more powerful than just computing for the sake of computing. I think that's another way, hopefully, to keep more women engaged. 'Cause a big part of the issue is not only the pipeline coming in, but there's a lot of droppage as they go through the process. So, how do you keep more of 'em involved? Obviously, if you open it up across a broader set of academic disciplines, by rule you should get more retention.
The other thing that's interesting here, Rebecca: this is our fourth year. theCUBE's been at Grace Hopper since way back in Phoenix in 2014, ironically, when there was also a big Microsoft moment at that show that we won't delve back into. But it's a time of change. We have Brenda Darden Wilkerson, the brand new president of the Anita Borg organization. Telle Whitney's stepping down and she's passing the baton. We'll have them both on. So, again, Telle's done a great job. Look at what she's created in the team. But it's always fun to have fresh blood. Always fun to bring in new energy, a new point of view, and I'm really excited to meet Brenda. She's done some amazing things in the Chicago Public School System, and if you've ever worked in a public school district, it's not a really easy place to innovate and bring change. >> Right, no, of course. Yeah, so our lineup of guests is incredible this week. We've got Sarah Clatterbuck, who is a CUBE alum. We have a woman who is the founder of Roar, which makes self-defense wearable technology. We're going to be looking at a broad array of the women technologists who are leading change in the industry, but then also leading it from a recruitment and retention point of-- >> So, it should be a great three days, looking forward to it. >> I am as well. Excellent. Okay, so please keep joining us. Keep your channel tuned in here to theCUBE's coverage of the Grace Hopper Conference here in Orlando, Florida. I'm your host, Rebecca Knight, along with my co-host, Jeff Frick. We will see you back here shortly. (light, electronic music)
Dr. Dawn Nafus | SXSW 2017
>> Announcer: Live from Austin, Texas, it's theCUBE, covering South by Southwest 2017. Brought to you by Intel. Now here's John Furrier. >> Okay, we're back live here at the South by Southwest Intel AI Lounge. This is theCUBE's special coverage of South by Southwest with Intel, #IntelAI, where amazing starts with Intel. Our next guest is Dr. Dawn Nafus, who's with Intel, and you are a senior research scientist. Welcome to theCUBE. >> Thank you. >> So you've got a panel coming up, and you also have a book, AI For Everything. And looking at a democratization of AI, we had a quote yesterday that "AI is the bulldozer for data." What bulldozers were in the real world, AI will be that bulldozer for data, surfacing new experiences. >> Right. >> This is the subject of your book, kind of. What's your take on this, and what's your premise? >> Right, well, the book actually takes a step way back. It's actually called Self Tracking; the panel is AI For Everyone. But the book is on self tracking. And it's really about actually getting some meaning out of data before we start talking about bulldozers. So right now we've got this situation where there's a lot of talk about how AI is going to sort of solve all of our problems in health, and there's a lot that can get accomplished. But the fact of the matter is that people are still struggling with, jeez, like, "What does my Fitbit actually mean, right?" So there's a real big gap. And I think probably part of what the industry has to do is not just build new great technologies, which we've got to do, but also start to fill that gap in sort of data education, data literacy, all that sort of stuff. >> So we're kind of in this first generation of AI data; you mentioned wearables, Fitbits. >> Dawn: Yup. >> So people are now getting used to this, so this integration into lifestyle becomes kind of a dynamic. >> Yeah. >> John: Why are people grappling with this? What does your research say about that?
Well, right now with wearables, frankly, we're in the classic trough of disillusionment. (laughs) You know, for those of you listening, I don't know if you have sort of wearables in drawers right now, right? But a lot of people do. And it turns out that folks tend to use it, you know, maybe about three or four weeks, and either they've learned something really interesting and helpful or they haven't. And so there are actually a lot of people who do really interesting stuff to kind of combine it with symptoms tracking, location, right, other sorts of things, to actually really reveal the sorts of triggers for medical issues that you can't find in a clinical setting. It's all about being out in the real world and figuring out what's going on with you. Right, so then when we start to think about adding more complexity into that, which is the thing that AI's good at, we've got this problem that there are only so many data sets that AI's actually any good at handling. And so I think there's going to have to be a moment where sort of people themselves actually start to say, "Okay, you know what? This is how I define my problem. This is what I'm going to choose to keep track of." And some of that's going to be on a sensor, and some of it isn't. Right, and sort of really intervening a little bit more strongly in what this stuff's actually doing. >> You mentioned the Fitbit, and we're seeing a lot of disruption in these areas, innovation and disruption, same thing, good and bad potentially. I'll say autonomous vehicles is pretty clear, and we know what Tesla's tracking with their hot trend. But you mentioned Fitbit; that's a healthcare kind of thing. AI might seem to be a perfect fit for healthcare, because there's always alarms going off and all this data flying around. Is healthcare a low hanging fruit for AI? >> Well, I don't know if there's any such thing as low hanging fruit (John laughs) in this space.
(laughs) But certainly if you're talking about, like, actual human benefit, right? That absolutely comes to the top of the list. And we can see that both in formal healthcare in clinical settings and in sort of imaging for diagnosis. Again, I think there are areas to be cautious about, right? You know, making sure that there's an appropriate human check, and that there are mechanisms for transparency, right? So that when there is a discrepancy between what the doctor believes and what the machine says, doctors can actually go back and figure out what's actually going on. The other thing I'm particularly excited about, and this is why I'm so interested in democratization, is that health is not just about, you know, what goes on in clinical care. There are right now environmental health groups who are looking at a slew of air quality data that they don't know what to do with, right? And a certain amount of machine assistance to sort of figure out, you know, signatures of point-source polluters, for example, is a really great use of AI. It's not going to make anybody any money anytime soon, but that's the kind of society that we want to live in, right? >> You are the social good angle for sure, but I'd like to get your thoughts, 'cause you mentioned democratization, and it's kind of a nuance depending upon what you're looking at. Democratization with news and media is what we saw with social media; now you've got healthcare. So how do you define democratization in your context, and what are you excited about? Is it more freedom of information and data? Is it getting around gatekeepers and siloed stacks? I mean, how do you look at democratization? >> All of the above. (laughs) (John laughs) I'd say there are two real elements to that.
The first is making sure that, you know, people who are going to use this for more than just business have the ability to actually do it and have access to the right sorts of infrastructure, whether it's the environmental health case, or the artists now who use natural language processing to create artwork. And people ask them, "Why are you using deep learning?" and I said, "Well, there's a real access issue, frankly." It's also, on the side of folks who aren't the ones directly using the data, a kind of a sense of, you know... Democratization to me means being able to ask questions of how this stuff is actually behaving. So that means building in mechanisms for transparency, building in mechanisms to allow journalists to do the work that they do. >> Sharing potentially? >> I'm sorry? >> And sharing as well, more data? >> Very, very good. Right, absolutely. I mean, frankly, we still have a problem right now in the wearable space of people even getting access to their own data. There's a guy I work with named Hugo Campos who has a cardiac defibrillator, and he's still fighting to get access to the very data that's coming out of his heart. Right? (laughs) >> Is it on an SSD, in the cloud? I mean, where is it? >> It is in the cloud. It's going back to the manufacturer. And there are very robust conversations about where it should be. >> That's super sad. So this brings up the whole thing that we were talking about yesterday, when we had a mini segment on theCUBE: there are all these new societal use cases that are just springing up that we've never seen before. Self-driving cars with transportation, healthcare access to data, all these things. What are some of the things that you see emerging, the tools or approaches that could help scientists or practitioners or citizens deal with this new critical problem solving that needs technology applied to it?
I was talking just last week at Stanford with folks that are looking at gender bias in algorithms. >> Right, uh-huh, it's real. >> Something I would never have thought of; that's an outlier. Like, hey, what? >> Oh no, it's happened. >> But it's one of those things where, okay, let's put that on the table. There's all this new stuff coming onto the table. >> Yeah, yeah, absolutely. >> What do you see? >> So they're-- >> How do we solve that, >> John: what approaches? >> Yeah, there are a couple of mechanisms, and I would encourage listeners and folks in the audience to have a look at a really great report that just came out from the Obama Administration and NYU School of Law. It's called AI Now, and they actually propose a couple of pathways to sort of making sure we get this right. So, you know, a couple of things. One is, frankly, making sure that women and people of color are in the room when the stuff's getting built, right? That helps. You know, as I said earlier, making sure... you know, things will go awry. They just will; we can't predict how these things are going to work, and catching it after the fact, and building in mechanisms to be able to do that, really matters. So there was a great effort by ProPublica to look at a system that was predicting criminal recidivism. And what they did was they said, "Look, you know, it is true that the thing has the same failure rate for both blacks and whites." But some hefty data journalism and data scraping and all the rest of it actually revealed that it was producing false positives for blacks and false negatives for whites. Meaning that black people were predicted to commit more crime than white people, right? So, you know, we can catch that, right? And when we build in more systems and people who have the skills to do it, then we can build stuff that we can live with. >> This is exactly your point about democratization, and I think that's what fascinates me, that I get so excited about.
It's almost intoxicating when you think about it, technically and also societally, that there are all these new things emerging and the community has to work together. Because it's one of those things where there's no... I mean, there may be a board of governors out there. Who is the board of governors for this stuff? It really has to be community driven. >> Yeah, yeah. >> And NYU's got one; any other examples of communities that are out there that people can participate in? >> Yup, absolutely. So I think that, you know, there's certainly collaborating on projects that you actually care about, and sort of asking good questions about, is this appropriate for AI or not, right? That's a great place to start, reaching out to people who have those technical skills. There's also the engineering professional association that actually just came out a couple months ago with a set of guidelines for developers on the kinds of things you have to think about if you're going to build an ethical AI system. So they came out with some very high level principles. Operationalizing those principles is going to be a real tough job, and we're all going to have to pitch in. And I'm certainly involved in that. But yeah, there are actually systems of governance that are cohering, but it's early days. >> It's a great way to get involved. So I've got to ask you the personal question. In your efforts with the research and the book and all of your travels, what are some of the most amazing things that you've seen with AI that are out there, that people may know about or may not know about but should? >> Oh gosh. I'm going to reserve judgment; I don't know yet. I think we're too early on the curve to be able to talk about, you know, sort of the magic of it.
What I can say is that there is real power when ordinary people who have no coding skills whatsoever, and frankly don't even know what the heck machine learning is, get their heads around data that is collected about them personally. That opens things up. You can teach five-year-olds statistical concepts that are normally learned in college, with a wearable, because the data applies to them. So they know how it's been collected. >> It's personal. >> Yeah, they know what it is already. You don't have to tell them what an outlier effect is, because they know; they were that outlier. You know what I mean? >> They're immersed in the data. >> Absolutely, and I think that's where the real social change is going to come from. >> I love immersion as a great way to teach kids. But the data's key. So I've got to ask you, with the big pillars of change going on, at Mobile World Congress I saw Intel in particular talking about autonomous vehicles heavily, smart cities, media and entertainment, and the smart home. I'm just trying to get a peg, a comparable, of how big this shift will be. I mean, the '60s revolution when chips started coming out, the PC revolution, the server revolution, and now we're kind of in this new wave. How big is it? I mean, in order of magnitude, is it super huge, bigger than all of the other shifts combined? Are we going to see radical >> I don't know. >> configuration changes? >> You know. You know, I'm an anthropologist, right? (John laughs) You know, everything changes and nothing changes at the same time, right? We're still going to wake up, we're still going to put on our shoes in the morning, right? We're still going to have a lot of the same values and social structures and all the rest of it that we've always had, right? So I don't think in terms of, plonk, here's a bunch of technology, now that's a revolution. There's like a dialogue. And we are just at the very, very baby steps of having that dialogue.
But when we do, people in my field call it domestication, right? These become tame, they become part of our lives, we shape them and they shape us. And that's not radical change; that's the change we always have. >> That's evolution. So I've got to ask you a question, because I have four kids, and I have this conversation with my wife and friends all the time because we have kids, digital natives, growing up. And we see a lot of workplace domestication too, people kind of getting domesticated with the new technologies. What's your advice, whether it's parents to their kids, kids growing up in this world, or education? How should people approach the technology that's coming at them so heavily? In the age of social media, where all our voices are equal right now and more filters are coming out. It's pretty intense. >> Yeah, yeah. I think it's an occasion where people have to think a lot more deliberately than they ever have about the sources of information that they want exposure to. The kinds of interaction, the mechanisms that actually do and don't matter. And thinking very clearly about what's noise and what's not is a fine thing to do. (laughs) (John laughs) So yeah, probably the filtering mechanisms have to get a bit stronger. I would say, too, there's a whole set of practices; there are ways that you can scrutinize new devices for, you know, where the data goes. And often, kind of the higher bar companies will give you access back, right? So if you can't get your data out again, I would start asking questions. >> All right, final two questions for you. What's your experience been like so far at South by Southwest? >> Yup. >> And where is the world going to take you next, in terms of your research and your focus? >> Well, this is my second year at South by Southwest. It's hugely fun. I am so pleased to see just a rip-roaring crowd here at the Intel facility, which is just amazing. I think this is our first time in Dell proper. I'm having a really good time.
The Self Tracking book is on the bookshelf over in the convention center, if you're interested. And what's next is we are going to get real about how to make these ethical principles actually work at an engineering level. >> Computer science meets social science, happening right now. >> Absolutely. >> Intel powering amazing here at South by Southwest. I'm John Furrier; you're watching theCUBE. We've got a great set of people here on theCUBE. Also a great AI Lounge experience, great demos, great technologists, all about AI for social change, with Dr. Dawn Nafus of Intel. We'll be right back with more coverage after this short break. (upbeat digital beats)