Ann Cavoukian and Michelle Dennedy | CUBE Conversation, August 2020
(upbeat music)
>> Announcer: From theCUBE studios in Palo Alto and Boston, connecting with thought leaders all around the world, this is theCUBE Conversation.
>> Hey, welcome back everybody, Jeff Frick with theCUBE. We are getting through the COVID crisis. It continues and is impacting the summer. I can't believe the summer's almost over, but there's a whole lot going on in terms of privacy and contact tracing, and this feeling that there's a conflict between personal identification and your personal privacy versus the public good around things like contact tracing. I was in a session last week with two really fantastic experts, and I wanted to bring them on the show. We're really excited to have back, for I don't even know how many times, Michelle Dennedy. She is the former chief privacy officer at Cisco, and now she's the CEO of Identity. Michelle, great to see you.
>> Good to see you as always, Jeff.
>> Yeah, and for the first time, Dr. Ann Cavoukian. She is the executive director of the Global Privacy & Security by Design Centre, joining us from Toronto. She's worked with the government and is not short on opinions about privacy. (laughing) Ann, good to see you.
>> Hi Jeff, thank you.
>> Yes, so let's jump into it, because I think one of the fundamental issues we keep hearing is this zero-sum game. And I know it's a big topic for you, that there seems to be this trade-off, this either-or. Specifically, let's just go to contact tracing, because that's a hot topic right now with COVID. I hear that it's like you're telling everybody where I'm going and you're sharing that with all these other people. How is this even a conversation, and where do I get to choose whether I want to participate or not?
>> You can't have people traced and tracked and surveilled. You simply can't have it, and it can't be an either-or, win-lose model. You have to get rid of that model.
A zero-sum game is where only one person can win, the other one loses, and it sums to a total of zero. Get rid of that, that's so yesterday. You have to have both groups winning: positive-sum. Meaning yes, you need public health and public safety, and you need privacy. It's not one versus the other. We can do both, and that's what we insist upon. So the contact tracing app that was developed in Canada was based on the Apple-Google framework, which is actually called exposure notification. It's totally privacy-protective. Individuals choose to voluntarily download this app, and no personal information is collected whatsoever. No names, no geolocation data, nothing. It simply notifies you if you've been exposed to someone who is COVID-19 positive, and then you can decide what action you wish to take. Do you want to go get tested? Do you want to go to your family doctor? Whatever the decision, it lies with you. You have total control, and that's what privacy is all about.
>> Jeff: But what about the person who was sick, who's feeding into that process? The sick person whose contacts you're notifying, obviously their personal information is part of that transaction.
>> With the COVID Alert app that we developed based on the Apple-Google framework, it builds on manual contact tracing, which also takes place; the two complement each other. So manual contact tracing is when individuals go get tested and test positive. Healthcare nurses will speak to that individual and say, please tell us who you've been in contact with recently: family, friends, et cetera. So the two work together, and by working together, we will combat this in a much more effective manner.
>> Jeff: So shifting over to you, Michelle, there's PII, and there are a lot of conversations all the time about personally identifiable information, right? But then medical has this whole other class of privacy restrictions and level of care.
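The exposure notification flow Ann describes, rotating random identifiers, on-device matching, and no central collection of names or locations, can be sketched roughly like this. This is a simplified illustration only: the actual Apple-Google protocol derives identifiers with HKDF and AES, not the SHA-256 stand-in used here, and the function names are illustrative.

```python
import hashlib
import os


def daily_key() -> bytes:
    # Each device generates a fresh random key per day; it never
    # contains any personal information.
    return os.urandom(16)


def rolling_ids(tek: bytes, intervals: int = 144) -> list:
    # Short-lived identifiers derived from the daily key are broadcast
    # over Bluetooth (one per ~10-minute interval); observers cannot
    # link them back to a person or place.
    return [hashlib.sha256(tek + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]


def check_exposure(heard: set, positive_keys: list) -> bool:
    # Matching happens entirely on the user's own phone: re-derive the
    # identifiers from the published keys of users who tested positive
    # and compare them against identifiers this phone actually heard.
    for tek in positive_keys:
        if heard & set(rolling_ids(tek)):
            return True
    return False
```

A phone that heard one of a positive user's broadcast identifiers learns only that an exposure occurred; the published keys reveal nothing about who the contact was or where it happened, which is why the user keeps total control over what to do next.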
And I find it really interesting that on one hand, you know, we're trying to do the contact tracing, and on the other hand, you know, my wife works in a public school. If they find out that one of the kids in her class has been exposed to COVID, somehow they can't necessarily tell the teacher because of HIPAA restrictions. So I wonder if you could share your thoughts on this crossover between privacy and health information when it gets into this kind of public crisis, and this inherent conflict between the public's right to know and whether the teacher should be able to be told. It's not a really clean line with a simple answer, I don't think.
>> No, and Jeff, when you're talking about student data, you're also layering in another layer of legal restriction. And I think what you're putting your thumb on is something that's really critical. When you talk about privacy engineering, privacy by design, and ethics engineering, you can't simply start with the legal premise. So, is it lawful to share HIPAA-covered data? A child telling mommy "I don't feel well": not HIPAA-covered. A child seeing a doctor for medical services and finding some sort of infection or illness: covered, right? So figuring out the origin of the exact same zero or one, am I ill or not, all depends on context. So you have to first figure out, first of all, let's tackle the moral issues. Have we decided that it is a moral imperative to expose certain types of data? And I separate that from ethics intentionally, and with apologies to true ethicists. The moral imperative covers the things we find are so wrong. We don't want a list of kids who are sick, or conversely, once the tipping point goes, a list of kids who are well, so that they are called out. That's the moral choice. The ethical choice is: just because you can, should you? And that's a much longer conversation. Then you get to the legal imperative: are you allowed to, based on the past mistakes that we made?
That's what every piece of litigation or legislation is, particularly in a common-law construct like the US. It's very important to understand that civil-law countries, like the European theater, try to prospectively legislate for things that might go wrong. The construct is thinner in a common-law economy, where you use test cases in the courts of law. That's why we are such a litigious society; it has its own baggage. But you have to now look at: is that legal structure attempting to cover past harms that are so bad that we've decided as a society to punish them, or is this a preventative law? And then you finally get to what I say is stage four for every evaluation: is it viable? Are the protections that you have to put on top of these restrictions so dire that they either cannot be maintained, because of culture, process, or cash, or it just doesn't make sense anymore? So is it better to just feel someone's forehead for illness rather than giving a blood assay, having it sent away for three weeks, and then maybe blah, blah, blah.
>> Right.
>> You have to look at this as a system problem-solving issue.
>> So I want to look at it in the context of, again, this increased level of politicization and, you know, exposure outside of what's pretty closed. And I want to bring up AIDS and the porn industry, very frankly, right? Where people, in the behavior of the business, risk a life-threatening disease. So, you know, suddenly we can track for that, and that's okay to track for that, and there's a legitimate reason to, versus all of the other potential medical conditions that I may or may not have that are not necessarily brought to bear in coming to work. And we might be seeing this very soon, as you said, if people are taking our temperatures as we come in the door to check for symptoms. How does that play with privacy and healthcare?
It still fascinates me that certain things kind of pop out into their own little bucket of regulation. I'm wondering if you could share your thoughts on that, Ann.
>> You know, whenever you make it privacy versus fill-in-the-blank, especially in the context of healthcare, you end up turning it into a lose-lose, as opposed to even a win-lose. Because you will have fewer people wanting to allow themselves to be tested, to be brought forward, for fear of where that information may land. If it lands in the hands of your employer, for example, or whoever owns your house if you're renting, et cetera, it creates enormous problems. So regardless of what you may think of the benefits of that model, history has shown that it doesn't work well, that people end up shying away from being tested or seeking treatment or any of those things. Even now with the contact tracing apps that have been developed, if you look globally at the contact tracing apps for COVID-19, the ones that identify individuals have failed: in the UK, in Australia, in Western Canada, that's how it started out. And they've completely dropped them because they don't work. People shy away from them. They don't use them. So they've gotten rid of that and replaced it with an app based on the Apple-Google framework, which is the one that protects privacy and will encourage people to come forward and seek to be tested if there's a problem. Look at Germany. Germany is one of the strongest privacy and data protection countries in the world. Their privacy people are highly trusted in Germany. Germany based their app on the Apple-Google framework. About a month ago they released it, and within 24 hours they had 6.5 million people download the app.
>> Right.
>> Because there is such trust there, unlike the rest of the world, where there's very little trust, and we have to be very careful of the trust deficit.
Because we want to encourage people to seek out these apps so they can go get tested if there's a problem, but they're not going to use them, they're just going to shy away from them, if there is such a trust problem. In fact, I'll never forget, I did an interview about three weeks ago in the US on a major, major radio station that has like 54 million followers. And I was telling them about COVID Alert, the Canadian contact tracing app, actually it's called an exposure notification app, which was built on the Apple-Google framework. And people in hordes said they wouldn't trust anyone with it in the US. They just wouldn't trust it. So you see, there's such a trust deficit. That's what we have to be careful to avoid.
>> So I want to hold on the trust for just a second, but I want to go back to you, Michelle, and talk about the lessons that we can learn post-9/11. The other thing, right, and I keep going back to this over and over: it's not a zero-sum game, and yet that's the way it's often positioned as a way to break down existing barriers. So if you go back to 9/11, probably the highest-profile thing being the Patriot Act, you know, laws were put in place to protect us from terrorism that were going to do things that were not normally allowed to be done. I bet, without checking real exhaustively, that most of those things are still in place. You know, because a lot of times laws are written and they don't go away for a long time. What can we learn from what happened after 9/11 and the Patriot Act, and what should we be really scared of, or careful of, or wary of, using that as a framework for what's happening now around COVID and privacy?
>> It's a perfect, it's not even an analogy, because we're feeling the shadows of the Patriot Act even now today. We had an agreement between the United States and the European community, until recently, called the Privacy Shield.
And it was basically: if companies and organizations fell under the Federal Trade Commission's jurisdiction, there's a bit of layering of legal process here, but if they did, and they agreed to supply enough protection to data about people present in the European Union, to the same or better level than the Europeans would, then that information could pass through this Privacy Shield unencumbered, to and from the United States. That was challenged and taken down. I don't know if it was a month ago or if it's still March, it's COVID time, but very recently, on the basis that the US government can overtly, and some would say improperly, look at European data based on some of these Patriot Act, FISA court, and other intrusive mechanisms that absolutely do apply if you are under the jurisdiction of the United States. So now companies and private actors are in the position of having to somehow prove that they will mechanize their systems and their processes to be immune from their own government's intrusion before they can do digital trade with other parts of the world. We haven't yet seen the commercial disruption that will take place. So the unintended consequence: rather than owning the answers, the observations, and the intelligence that we got out of the actual 9/11 report, which said we had the information we needed, we did not share enough between the agencies, and we didn't have the decision-making activity and will to take action in that particular instance. Rather than sticking to that knowledge, instead we stuck to the Patriot Act, which passed with all but, I believe, two Congresspeople. I mean, you see the hot mess that is the US right now. When everyone but two people in the room votes for something on the quick, there's probably some sort of a psychological gun to your head. That's probably not a well-thought-out thing. We fight each other. That's part of being an American, dammit.
So I think having these laws that say you've got to have this one solution because the boogeyman is coming, or COVID is coming, or terrorists or child pornographers are coming... there's not one solution. So you really have to break this down into an engineering problem, and I don't mean technology when I say engineering. I mean looking at the culture. How much trust do you have? Who is the trusted entity? Do we trust Microsoft more than we trust the US government right now? Maybe, and that might be your context. How are you going to build people, process, and technology not to avoid a bad thing, but to achieve a positive objective? Because if you're not achieving that positive objective, of understanding that it's safe to move about without masks on, for example, stop, just stop.
>> Right, right.
>> My favorite analogy, Jeff, and I think I've said this to you in the past, is we don't sit around and debate the merits of the viscosity of water to fill concrete holes. We have to make sure that when you lead them to the concrete hole, there's enough water in the hole. No: you're building a swimming pool. What kind of a swimming pool do you want? Is it commercial? Is it for toddlers? Is it (indistinct)? Then you build in chlorination, protection, and da da da da. But if you start looking at every problem as how to avoid hitting a concrete hole, you're really going to miss the opportunity to build and solve the problem that you want and avoid the risk that you do not want.
>> Right, right. And I want to go back to you on the trust thing. You made an interesting comment on that other show, talking about working for the government, and not working directly for the people who are voted into power, but for the larger bureaucracy and agency. I mean, the Edelman Trust Barometer is really interesting. They come out every year, I think it's their 20th year, and they break down media, government, and business: who do you trust and who do you not trust?
What's so fascinating about the time we're in today is that even within the government, the direction that's coming out is often completely diametrically opposed between the federal, the state, and the local levels. So what does this kind of breakdown of trust, when you're getting two different opinions from the same basic kind of authority, do to people's ability or desire to participate and actually share the stuff that may or may not get reshared?
>> It leaves you with no confidence. Basically, you can't take confidence in any of this. When I was privacy commissioner, I served for three terms, and each term there was a different government, a different political party in power. And before they became the government, they were all for privacy and data protection, believed in it and all that. And then once they became the government, all that changed, and all of a sudden they wanted to control everyone's information and they wanted to be in power. No, I don't trust government. You know, people often point to the private sector as being the group you should distrust in terms of privacy. I say no, not at all. To me, far worse is actually the government, because everyone thinks they're there to do a good job, and trusts them. You can't just trust; you have to always look under the hood. I always say trust but verify. So unfortunately, we have to be vigilant in terms of the protections we seek for privacy, both with the private sector and with the government, especially with the government, and different levels of government. We need to ensure that people's privacy remains intact, that it's preserved now and well into the future. You can't give up on it because there's some emergency: a pandemic, a terrorist incident, whatever. Of course we have to address those issues, but you have to insist upon people's privacy being preserved. Privacy forms the foundation of our freedom. You cannot have free and open societies without a solid foundation of privacy. So I'm just encouraging everyone:
Don't take anything at face value. Just because the government tells you something, it doesn't mean it's so. Always look under the hood, and let us ensure that privacy is strongly protected. You see, emergencies come and go. The pandemic will end. What cannot end is our privacy and our freedom.
>> So it got a little dark in here, but we're going to lighten it up a little bit, because, as Michelle said, if you think about building a pool versus just filling a hole, you can take proactive steps. And there's a lot of conversation about proactive steps, and I pulled up, Ann, your thing: Privacy by Design, The 7 Foundational Principles. I had the guys pull up a slide. I think what's really interesting here is that you're very, very specific: prescriptive, proactive, right? Proactive, not reactive. Privacy as the default setting; you don't have to read the EULAs. I'm not going to read all the words, we'll share it, people can find it. But what I wanted to focus on is that there is an opportunity to get ahead of the curve; you just have to be a little bit more thoughtful.
>> That's right, and Privacy by Design is a model of prevention, much like a medical model of prevention, where you try to prevent the harms from arising, not just deal with them after the fact through regulatory compliance. Of course we have privacy laws, and that's very important, but they usually kick in after there's been a data breach or privacy infraction. So when I was privacy commissioner, obviously those laws were intact and we had to follow them, but I wanted something better. I wanted to prevent the privacy harms from arising, just like a medical model of prevention. So that's what Privacy by Design is intended to do: instantiate and embed much-needed privacy-protective measures into your policies, into your procedures, bake it into the code, so that it has a constant presence and can prevent the harms from arising.
>> Jeff: Right, right.
One of the things I know you love to talk about, Michelle, is compliance, right? And is compliance enough? I know you like to talk about the law, and I think one of the topics that came up in your guys' prior conversation is, you know, will there be a national law, right? GDPR went through on the European side, and then last year the California privacy act. A lot of people think that might become the model for more of a national type of rule. But I tell you, when you watch some of the hearings in DC, you know, I'm sure 90% of these people still print their emails and have their staff hand them to them. I mean, it's really scary. That said, you know, regulation always does kind of lag when it needs to be put in place, because people abuse things or go places they shouldn't go. So I wonder if you could share your thoughts on where you think legislation is going to go, and how people should see that playing out over the next several years, I guess.
>> Yeah, it's such a good question, Jeff. And it's like, you know, I think even the guys in Vegas are having trouble setting the odds on this. Cameron said, in I think it was December of 2019, which feels like 15 years ago now, that in the first quarter of 2020 we would see a federal law. And I participated in a hearing at the Senate banking committee, again, November, October, in the before times, talking about the same thing, and here we are. Will we have a comprehensive, reasonable privacy law in the United States before the end of this president's term? No, we will not. I can say that with just such faith and fidelity. (laughing) But what does that mean? And I think Katie Porter, who I'm starting to just love, she's the Congresswoman who's famous for pulling out her whiteboard and just saying, stop fudging the numbers, let's talk about the numbers. There's what she calls the 20% legislative flip-phone caucus.
So there are 20% or more, on both sides of the aisle, of the people in the US who are in the position of writing our laws who are still on flip phones and aren't using smartphones and other kinds of technologies. There's a generation gap. And as much as I can kind of chuckle at that a little bit, and wink wink, nudge nudge, isn't that cute... because, you know, my dad, as you know, is very, very technical, and he's a senior citizen. This is hard. I hope he doesn't see that, but... (laughing) But then it's not old versus young. It's not "let's get a whole new group and crop and start over again." What it is instead, and this is, you know, my constant theme, sort of anti-compliance... I'm not anti-compliance. You've got to put your underwear on before your pants, or it's just really hard. (laughing) And I would love to see anyone who is capable of putting their underwear on afterwards, after you've made the decision of following the process. That is so basic. It comes down to: do you want the data that describes, or is donated, or is observed about human beings? Whether it's the performance of your employees, people you would love to entice onto your show to be a guest, people you'd like to listen to and consume your content, people you want to meet, people you want to marry. Private data, as Ann says, forms the foundation of our freedom, but it also forms the foundation of our commerce. So with compliance, if you have stacked the deck proactively with an ethics that people can understand and agree with and have a choice about, and feel like it has some integrity, then you will start to see the acceleration factor of privacy being something that belongs on your balance sheet: what kind of data is high-quality, high-nutrition, in the right context. And once you've got that, you're in good shape.
>> I'm laughing at privacy on the balance sheet. We just had a big conversation about data on balance sheets. That's a whole other topic. So we could go for days.
I have pages and pages of notes here, but unfortunately I know we've got some time restrictions. And so, Ann, I want to give you the last word as you look forward. You've been in this for a while. You've been in it from the private side as well as the government side, and you mentioned lots of other scary things kind of on the horizon, like the kind of surveillance creep, where there's all kinds of interesting stuff. You know, what advice do you give to citizens? What advice do you give to leaders in the public sector about framing the privacy conversation?
>> I always want to start by telling them: don't frame privacy as a negative. It's not a negative. It's something that can build so much. If you're a business, you can gain a competitive advantage by strongly protecting your customers' privacy, because then it will build such loyalty, and you'll gain a competitive advantage. You make it work for you. As a government, you want your citizens to have faith in the government. You want to encourage them to understand that as a government you respect their privacy. Privacy is highly contextual. It's only the individual who can make determinations relating to the disclosure of his or her personal information. So make sure you build that trust, both as a government and as a business, a private-sector entity, and gain from that. It's not a negative at all. Make it work for you, make it work for your citizens, for your customers. Make it a plus, a win-win that will give you the best returns.
>> Isn't it nice when doing the right thing actually provides better business outcomes too? It's like diversity of opinion and women on boards and the kinds of things...
>> I love that.
>> ...we cover these days. Well, ladies, thank you very, very much for your time. I know you've got a hard stop, so I'm going to cut you loose, or else we would go for probably another hour and a half. But thank you so much for your time.
Thank you for continuing to beat the drum out there, and I look forward to our next conversation, hopefully in the not-too-distant future.
>> My pleasure, Jeff. Thank you so much.
>> Thank you.
>> Thank you too.
>> All right, she's Michelle, she's Ann, I'm Jeff. You're watching theCUBE. Thanks for watching. We'll see you next time. (upbeat music)