Lisa Ho | Data Privacy Day 2017
>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco at Twitter headquarters for the Data Privacy Day event. It's a full-day event with a lot of seminars and presentations, really talking about data privacy, something that's getting increasingly important every day, especially as we know RSA's coming up in a couple of weeks, with a lot of talk about phishing, the increased attack surface, et cetera. So privacy is really important, and we're excited to have Lisa Ho, Campus Privacy Officer at UC Berkeley. Welcome, Lisa.

>> Thank you, glad to be here.

>> So what does the Campus Privacy Officer do?

>> Well, really anything that has to do with privacy that comes across. So making sure that we're in compliance, or doing what I can to help the campus keep in compliance with privacy laws. But beyond that, also making sure that we stay aligned with our privacy values. And when I say that, I mean privacy is really important. It's critical for creativity and for intellectual freedom. So at the university, we need to make sure we hold on to those when we're dealing with the new ideas and new scenarios that are going to come up. We have to balance privacy with all the other priorities and obligations we have.

>> Yeah, I don't know if Berkeley and Stanford get enough credit as really being two of the real big drivers of Silicon Valley. They attract a lot of smart people. They come, they learn, and then more importantly, they stay. So you've got a lot of cutting-edge innovation, you've got a ton of open source technologies coming out of Berkeley over the years, Spark, et cetera. So you guys are really at the leading edge, but at the same time, you're an old, established academic institution. So what role do you have formally, as an institution of higher education, to help set some of these standards and norms as the world changes around it so very, very quickly?

>> Yeah, well, as I say, the environment needs to be set for creativity and for allowing that intellectual freedom. So when we think about the university, the things that we do there are pretty much what we want to have in the community as a whole, in our culture and environment. First, if you think about school, you think about grades, or the written evaluations that you get. Learning, when you come down to it, is a personal endeavor; it's a transformation that's internal. And so the feedback you get, the critical evaluation, those need to happen in a space where you have the privacy not to have a reputation to either live up to or live down. Those are things that you keep private, and that's why we've agreed as a society that school information and student data need to stay private. So that's one area: learning is personal. That's why the university is so important in that discussion. And secondly, as we talked about, creativity requires time to develop, and it requires freedom for taking risks. Whether you're working on a book, a piece of art, or, if you're a scientist, a formula, an algorithm, a theory, those are things you need time to set aside and be in your own head with, away from the eyes of others, without judgment, until they're ready for release.
Those are the kinds of things you want to have space for, so that you can move beyond the status quo and take those risks to go to the next space and beyond.

>> Jeff: Right.

>> And lastly, I'd say, and this is not specific to the university, but something we hold on to particularly at Berkeley, privacy is one of our fundamental rights. As Ed Snowden said so famously, saying "I don't care about privacy because I have nothing to hide" is like saying "I don't care about freedom of speech because I have nothing to say." Just because you may not have something to say doesn't mean you can take away the rights of someone else, and you may find that you need those rights at some point in your life, and no one has to justify why they need a fundamental right. So those things that are essential in our university environment are applicable beyond just the learning space of the university, to the kind of society that we want to build. That's why the university is in a position to lead in these areas.

>> Right, 'cause Berkeley's got a long history of activism, and this goes back decades. Is privacy starting to get elevated to the level where you're going to see more active, vocal points of view and statements, and I don't want to say marches, but potentially marches, in terms of making sure this is taken care of? Because unfortunately, I think most privacy applications, at least historically, maybe it's changing, are really opt-out, not opt-in. So do you see this? Is it becoming a more important policy area versus just an execution detail in an application?

>> Yeah, we have a lot of really great professors working on these ideas around privacy and cybersecurity. Those who are working on security often have privacy in their background and are advocating in that area as well. As far as marches, you pretty much rely on the students for that, and you can't dictate what the students are going to find important. But there's definitely a cadre of students that care and are interested in these topics, and when you tie them together with fundamental rights like free speech, academic freedom, and creativity, that's where it becomes important and people get interested.

>> Right. One of the real sticky areas this bounces into is security. Unfortunately, there have been way too many instances at campuses over the last several years of someone grabbing a gun and shooting people, which hopefully won't happen today. And that's really where the privacy and security thing runs up against: should we have known? Should we have seen this person coming? If we'd had access to whatever they were doing, maybe we would have known and been able to prevent it. So when you look at, I don't want to say the balance, but really the conflict between security and privacy, what are some of the rules coming out? How do you execute that to both provide a safe environment for people to study and learn and grow, as you mentioned, but at the same time keep an eye out for the bad characters that unfortunately are in the world?

>> Right, yeah, well, I don't want to say that there's a dichotomy.
I don't want to create a false dichotomy of "it's either privacy or it's security"; that's not the frame of mind we want to be in. Both are important. Security is clearly important, and preventing unauthorized access to your personal information is clearly a part of privacy, so security is necessary for privacy. Two-factor authentication, antivirus, network segmentation, those are all important parts of protecting privacy as well. So it's not a dichotomy of one or the other. But there are things you do for security purposes, whether cybersecurity or personal security, that may in a given case have a different purpose than what you would do for privacy, and monitoring is one of those areas specifically. Now we have continuous monitoring for any kind of attack, and we use that monitoring data as a forensic resource to look for information after the fact. That lies in contrast with the privacy ideal of least perusal: not looking for information until you need it, keeping that distance, not having surveillance.

So the University of California has outlined a privacy balancing analysis for these kinds of scenarios that are new and untested, where we don't have laws around them, to balance the many priorities and obligations. You look at what the monitoring provides, the benefits together with the risks, and you do that balancing. You go through a series of questions. What is the utility you're really getting out of that monitoring, not just in the normal scenario where you expect to use it, but in the use cases you didn't intend but can anticipate it will be wanted for? What about when we're required to turn it over for a subpoena or another kind of legal request? What are the use cases there? What are the privacy impacts in those cases? What are the privacy impacts if the data is hacked, or abused by an employee, or shared with partners? You need to weigh the utility against those impacts. Then look at the scope: if you change the scope of what you're monitoring, does it change the privacy impact? Does it change the utility? You keep all those factors in view, not just the utility of what you're trying to do. And then, what are the alternatives? Could you do the same thing another way, and would it give you the same kind of value that the proposed monitoring provides? Finally, once you've done that analysis, it really comes down to transparency and accountability: making sure you've worked through those questions, that you're doing the least perusal necessary to achieve the goals of the monitoring, and that your decisions are made available to the community that's being monitored.
>> Wow. Well, one, you've got job security for life, I guess, because that was amazing. Two, as you're talking, "balance" is the word I was looking for before, so that is the right word. But you're balancing on so many axes, and even once you get through that list you just went through, which is phenomenal, you still need to look at the alternatives and do the same kind of analysis for each. So really, that was a great explanation. I want to shift gears a little bit and talk about wearables. You're going to give a talk later today about wearables. Wearables are a whole new, interesting factor that provide a whole bunch more data, really the cutting edge of the internet of things with sensor data. People are things too, we like to say on theCUBE. So as you look at wearables and their impact on this whole privacy conversation, what are some of the big gotcha issues that are starting to surface as these things get more popular?

>> Yeah, I think a lot of the same questions apply: what kind of monitoring you're doing, what's the utility, what's the privacy impact, and how do you balance those in the various use cases that come up. Really the same kinds of questions apply to wearables as they do to cybersecurity monitoring. In college athletics, university-sponsored use of wearable technology is really just in its infancy right now. It's not a big thing we're working on yet, but it very much parallels the other questions we've been talking about around learning data. How you jump, or how your body functions, is very private, very intimate, and how you think and how you learn are right up there on that privacy and intimacy scale. So we've been talking quite a bit in the university space about learning data and how we protect it. Some of the questions are: who owns that data? It's about me, the student, for example; should I have control over how that information is used? With learning data, for the average student, there may not be outside parties interested in that information. But when you're talking about student athletes, potentially going pro, that's very valuable data that people may want, that people may want to pay for, so maybe the student should have some say in the use of that data, in monetizing that data. Who owns it? Is it the student, is it the university, is it the company we work with to provide the monitoring and the analytics?

>> Jeff: Right, right.

>> Right now, if it's through the university, we'd hopefully have a contract that makes really clear who has ownership, where the uses lie, what kind of things we can do with it. But as we move into the consumer space, where you're just clicking a box, students may be asked, oh, use this technology, it's free and we'll be able to handle it, because of course, how much it costs is important in the university space.

>> Give you free slices at the pizza store.
>> Right. Well, once we get into that consumer realm, where you don't even have to click the box, the box is already clicked, can you really say okay? That's where students may be giving away data for uses they didn't intend, that they're not getting any compensation for, and in particular cases, when you talk about student athletes, that could be something very meaningful for their career and beyond.

>> Yeah, or is it the guy who's come up with the unique and innovative training methodology that they're testing? Is it Berkeley's information, to see how people are learning so you can incorporate that into your lesson plans and the way you teach? There are so many angles, but it always comes back, as you said, to the context: what's the context for the specific application, and should you or should you not have rights in that context? It's a really interesting space, a lot of interesting challenges, and like I said, job security for you for the foreseeable future.

>> Yeah, we're not going to run out of new and exciting applications and things to be thinking about in terms of privacy. It's just non-stop.

>> Right, 'cause these are not technology questions, right? These are policy questions and rules questions. We heard a thing last night with the center, and one of the topics was that we need a lot more rules around these types of things, because the technology's outpacing the governance rules and really the thought processes, the ways that these things can all be used.

>> It's a culture question, really. It's more than just what you allow or not; it's how we feel about it. The idea that privacy is dead is only true if we don't care about it anymore. So if we care about it and we pay attention to it, then privacy is not dead.

>> Alright, well Lisa, we'll leave it there. Lisa Ho from UC Berkeley, fantastic. Thank you for stopping by, and good luck at your wearables panel later this afternoon.

>> Thank you.

>> Alright, I'm Jeff Frick. You're watching theCUBE, thanks for watching. (upbeat music)