a16z Podcast - a16z Podcast: What to Know about GDPR

Episode Date: April 12, 2018

with Lisa Hawke (@ldhawke) and Steven Sinofsky (@stevesi) Given concern around data breaches, the EU Parliament finally passed GDPR (General Data Protection Regulation) after four years of preparation... and debate; it goes into enforcement on May 25, 2018. Though it originated in Europe, GDPR is a form of long-arm jurisdiction that affects many U.S. companies -- including most software startups, because data collection and user privacy touch so much of what they do. With EU regulators focusing most on transparency, GDPR affects everything from user interface design to engineering to legal contracts and more. That's why it's really about "privacy by design", argues former environmental scientist and lawyer Lisa Hawke, who spent most of her career in regulatory compliance in the oil industry and is now Vice President of Security and Compliance at a16z portfolio company Everlaw (she also serves as Vice Chair for Women in Security and Privacy). And it's also why, observes a16z board partner Steven Sinofsky, everyone -- from founders to product managers to engineers and others -- should think about privacy and data regulations (like GDPR, HIPAA, etc.) as a culture... not just as "compliance". The two break down the basics of GDPR in this episode of the a16z Podcast -- the why, the what, the how, the who -- including the easy things startups can immediately do on their own. In fact, GDPR may give startups an edge over bigger companies and open up opportunities, argue Hawke and Sinofsky; even with fewer resources, startups have more organizational flexibility, if they're willing to put in the work. For links mentioned in this episode (and other resources), please go to: https://a16z.com/2018/04/12/gdpr-why-what-how-for-startups/

Transcript
Starting point is 00:00:00 The content here is for informational purposes only, should not be taken as legal, business, tax, or investment advice, or be used to evaluate any investment or security, and is not directed at any investors or potential investors in any a16z fund. For more details, please see a16z.com slash disclosures. Hi, everyone. Welcome to the a16z Podcast. I'm Sonal. Today's topic is something that's top of mind for so many: GDPR, or the General Data Protection Regulation by the EU Parliament, which goes into effect very soon. Since this affects so many startups and actually companies of all kinds, we thought we'd share a sort of primer by podcast. But be sure to also check out the show notes for links to some other resources mentioned in this episode. Our special guest is Lisa Hawke, who is VP of Security and Compliance at Everlaw, an a16z portfolio company. She started as an environmental scientist and lawyer, but spent most of her career in regulatory compliance. And joining her to host this conversation is a16z board partner,
Starting point is 00:01:01 Steven Sinofsky. So I'm super excited to have a chance to talk about a really complicated topic with a great friend and guest to join the podcast. Thanks, Steven. It's really, really exciting to be here. And especially to talk about this great topic, it's on everybody's mind lately. So we're going to just dive right in to GDPR, the General Data Protection Regulation of the European Union. So first off, like just who is going to be regulated by it? Like a bunch of people like our product managers and compliance people and engineering and ops are wondering whose job it is and pointing at each other. So like who is regulated by it? Well, in legal terms, there's a thing called a long-arm jurisdiction. And this is probably one of the longest of the long-arm
Starting point is 00:01:46 jurisdictions. And what I mean by that is a situation where a local court can actually assert jurisdiction over someone in another state, another county, and in this case, from the European Union to other countries and companies who process personal data of EU data subjects. So in a nutshell, it applies to anyone, so any company that processes personal data of EU data subjects. Which is going to sort of end up being everybody who's listening in one way or another. They just haven't realized that yet. So put a little scope on this. The actual GDPR is is 260 pages long, and my favorite is that it has 99 sections, and the problem ain't one of them, and 173 different sort of recitals, which when you read them, they kind of look
Starting point is 00:02:36 like the EU's tweets about what it should be. So it's a pretty big document. It is. And the preamble, the recitals are really interesting. U.S. law doesn't necessarily do the same thing, where they explain via the recital, sort of the intent behind the regulations. So the thing that is interesting for me is that the European Union, by virtue of the long arm, which it's hard to get the song out of my head right now, but I won't sing, I swear. One thing that's interesting is that they've sort of taken the global lead on privacy. And that's interesting because it puts the U.S. companies oddly under this new regulatory oversight, even though they didn't elect the people who sort of are passing it.
Starting point is 00:03:21 So does it sort of make it weird that Europe is almost harder to do business in in the U.S.? Yeah, and I think whether or not it's harder to do business there depends a little bit on your perspective. Certainly the Europeans view it as a way to make it easier to do business in the digital single market, merging all the laws of the 28 member states into sort of one data protection regime, so to speak. I think for U.S. companies that are working towards compliance, there are two different ways to look at it. You can either adopt the stricter standard and look at it as an opportunity to do business across Europe, or you can look at it as a barrier. I think that's really interesting because in a sense, you know, this actually can be a differentiating opportunity for a lot of companies. Yeah, I think so. And especially for startups. First of all, you know, it is a big deal. I'm not saying it's easy to do all the work.
Starting point is 00:04:16 that you need to do to get into compliance. But startups tend to be more nimble. You may have fewer resources, but it's also easier to make changes to your infrastructure, your org structure. And if you're willing to put the work in and you can do it, I think it could open up a ton of opportunities. Yeah, I really agree with that because when you're at a big company and you're hit with GDPR, I know exactly how this is going to work.
Starting point is 00:04:40 Like you've got all the subsidiaries and you know, too. You're just paralyzed. Well, you're paralyzed because you have to, you do, the EU just spent, four years pulling together 28 member states, but you're at a big company, you're going to spend two years pulling together the 400 different groups, each with their own data sets and their own vendors and their own policies and their own Ula's. Project managers. Right, right.
Starting point is 00:04:59 And so startups do have a real advantage. Okay, so let's dive into the main, the body of the GDPR. So the first thing is it comes out and it just defines like two kinds of main roles in the company. And they're awesomely named that are seen roughly synonymous. controllers and processors. And I guess this is important because you sort of want to know which one am I or what do they mean? Because they define everything else relative to being a controller or a processor.
Starting point is 00:05:25 And it is really important to figure out where you fall in that. And if you're processing personal data, there's also a really solid chance that you're both. So the terms controller and processor actually aren't any different from the 1995 directive, but the obligations of a controller and a processor have changed under GDPR. So there are definitely some things to be aware of in terms of what the obligations are. So what's a controller? So the data controller is the company that decides how and why the personal data is going to be processed. And the processor is processing that data on behalf of the controller.
Starting point is 00:06:06 And the reason I say that there is a good chance that you will actually fall into both of those categories is because, let's say, you're a company with a website in Europe and you have a contact form on that website. So folks are inputting their name, email, and phone number to go into a database at your company that you might use for marketing later. Then you may also have customers. So if you're collecting that data through the website, you're a controller of that data. And then you may have a product where you're processing other personal data on behalf of your clients or your customers. You may be a processor for that data, but a controller for the other information you collected. And so just to be clear, inserting a vendor or a third party in there doesn't change any of
Starting point is 00:06:51 this. No. And that's something that sometimes in U.S. law, you think, oh, there's a third party and liability is insulating, but you can't just hire a contractor and GDPR goes away. No. In fact, if you do that, they are probably the third term, which is the subprocessor. So then you've brought a subprocessor into the mix, which is just another processor, but you'll have to ensure that they are meeting their requirements and sort of the chain of
Starting point is 00:07:16 obligations and responsibilities. Yeah, and I think that's sort of a key theme about GDPR is difficult and as big as it all seems. They've done a lot of work to sort of make it hard to find loopholes or excuses, and there's not like an easy out, so to speak. No, certainly not. And with the data protection principles under a GDPR, the one that I've heard, the regulator is focusing on the most is transparency. So they are putting a huge amount of work into making sure individual citizens in Europe understand that they have the right to know what the data controllers and processors are doing with their information and who they're giving it to. That's probably important because ultimately I think what's going to happen is that
Starting point is 00:07:59 when it comes time for there to be complaints or problems, what I think the EU regulators are sort of counting on is that there'll be a bottom-up view. And like people will sort of police this on behalf of them because there's not a giant enforcement arm for these. And so it's likely that you as a company will see the first complaints coming from individuals who can simultaneously raise this to the regulators. And that's just another reason to go back to what is your role here. Are you a controller? Are you a processor?
Starting point is 00:08:29 Because when it comes to responding to data subject rights, there are some differences. And the controllers are likely going to be the entity. that are receiving these complaints. So it is important to know where you fall in that spectrum and what your obligations are in terms of responding to those kind of complaints. So when it comes to these parties, what is the data that we're talking about? Like how narrow or how broad is personal information or the privacy requirement? What does it cover? Well, it is very broad. And there's a bit of a history lesson as to why it's so broad, which is that the European Convention on Human Rights talks about respect,
Starting point is 00:09:08 for private and family life as a human right. So the definition is broad, and the definition really hasn't changed that much from the 1995 Directive. There's a few important updates. One of those important updates is the inclusion of an identifier, including online identifiers like location data. So the original definition says that personal data
Starting point is 00:09:32 is any information that relates to an identified or an identifiable living individual. So add on to that things like the online identifier and genetic information, those are the two key updates to the definition. But it was meant to be very broad and it was meant to apply to a large swath of information. And I think that the key thing is that they're very aware of taking one piece of data like your genetic information, like the equivalent of a social security number, a driver's license number, tax number, and then a whole bunch of other. data that might be innocuous, but triangulating it into one giant thing. And so the list of things that are other types of data that matter is very long. It just doesn't stop, as far as I can tell. Yeah, and I'm glad you pointed that out, because in addition to the things that are listed
Starting point is 00:10:24 as examples, there is a statement in there that says different pieces of information, which collected together can lead to the identification of a particular person counts. So certainly the triangulation of different information, which you might not think actually would identify a person, but taken together or taken in parts and pieces, can, that certainly would qualify. You know, people in startups, they want to look for like, okay, is there a scalable technology solution such that it can, like, reduce my overall sort of surface area that I have to worry? And so one question is, like, what if a startup from the very beginning has a clear, anonymized identifier mechanism and then encrypts and anonymizes somehow all of this other
Starting point is 00:11:09 kinds of data. Where does encryption and anonymization fit into all of this? It's really important because there is one escape route from GDPR, which is the provision in the regulation, which is also the same as the 95 directive, which says that data that are fully anonymized, meaning that no individuals can be identified are outside the scope of GDPR. So if you, if a company, a startup can truly anonymize data, then that data wouldn't be subject to the regulation. But I have to say, I think that the concept of anonymization will be tested, sort of what actually qualifies. And I know there's a lot of security enthusiasts and mathematics enthusiasts out there thinking about, you know, what this is going to look like when applied. Well, it is a big research topic over whether or not you
Starting point is 00:12:00 can truly anonymize something because the ability for machine learning to triangulate and find patterns that you can't readily see is so extreme. And even when Apple announced that that's the kind of thing that they do, there was a lot of pushback saying, well, it's not really proven and six degrees of separation, whatever. In fact, that goes way, way back to a very famous case in the U.S. over a product called Lotus Marketplace, which came out in the early 1990s, and it was anonymized census data. But the problem was it was so granular. at the city block and building level, but basically you knew how much a person made in salary just because they lived in a certain house.
Starting point is 00:12:38 I think there's a stat out there that says that over 80 or 85% of the U.S. population can be identified with three pieces of data. So it's certainly a challenge. The regulation also introduced another concept, pseudonymization. So pseudonymization is when information is obscured, but there is a key that can tie it back to individuals. And the regulation talks about pseudonymize data as a way to meet data protection by default and by design. But I think your non-legal advice is anonymization is good, but we should probably be aware that someone is going to find themselves having that tested in front of the
Starting point is 00:13:18 regulators or in court as to whether or not it went far enough. And the state of the art isn't even clear yet if it's far enough. And I think the even more practical advice is just know what personal data you have, why you have it and what you're doing with it. Okay, so this is the European Union. So what does this have to do with the U.S.? Well, you're probably using personal data from EU data subjects. They could live in the U.S. I'm married to one. So if- Okay, this is personal information, so let's be careful. Well, I was being a little bit facetious, but, you know, if you have a software product, you're probably, you're probably marketing to people in the EU, just because you have a U.S. website, if you offer it to the folks in the EU,
Starting point is 00:14:03 there's a good chance that you're collecting personal data. It's a big place. So the key is that these regulations cover European Union citizens, no matter where they happen to be at a given time, regardless of where the product resides that they're covered by. That's true. And so if you look at the scope, there's two different areas of coverage. One is offering goods and services to the EU, and there are things like, you know, even if you have a U.S. website, do you offer a translation? Do you have a contact number in the EU? You know, do you have employees in the EU? And then there is another aspect, which is a little bit less clear, but around, you know, monitoring and profiling. So targeted ads, are you collecting fitness
Starting point is 00:14:48 data from people in the EU on a wearable device? So it's very, very broad. And also, can your customers, even if they're in the U.S., can they bring EU people in in a viral networked kind of way? Yeah, maybe you have a company that has an office in California as your client, but they also have an office in London, and some of their London folks want to use your product. Guess what? Yeah. The U.S. just chose not to make laws and regulations that are nearly as all-encompassing as these EU ones. But often what happens is it's just better to just pick that as the standard by which do because it's it's the higher bar to the higher threshold. Yeah, that's true. I mean, right now in the U.S. there are some verticals that have their own privacy regulations. Typically, we think of
Starting point is 00:15:35 protected health information in HIPAA. So certainly there, in my view, there's a benefit to just adopting privacy by design and adopting the stricter standard because by doing so, you will sensibly be able to comply with more regulations and more jurisdictions. And so once you and your company have this personal identity and I've identified it, one of the things that's that I found the most interesting in the GDPR is that, and it's in the very beginning in the recitals, is that citizens in the EU, they have like very clear rights about their data. Like your product just has to do these things.
Starting point is 00:16:15 So why don't we just walk through like what these rights are? Sure. And like I said before, it definitely matters what role your company is as to how you respond to these rights, which is why it's really important that you start there and understand where you fit into GDPR, what information you have. Because like you pointed out, if you have to comply with these data subject rights, you have to be able to do it. And you may need to build that into the product that you're building. Well, let's go through them all. So first, you have to get access to your personal data. What does that mean? Yes. So you have the right to request basic information about the nature of the processing, about, you know, what the company is actually doing with your data. You have that right to information. You also have the right to access it. So then all of this actually, to me, sounds a lot like if you have ever, like, gotten your own credit report. Yeah. In the U.S. law, they've actually done a very similar thing, but for this very narrow case for credit reports. And honestly, it's kind of adversarial. and it's not really designed for consumers. And I think the European Union has learned from that because they've been iterating since 1995 on this.
Starting point is 00:17:23 So I think they're going to look at the implementation of these as well as whether or not you have them. Yes. And in fact, one of the top regulators from Europe was in the Bay Area just a month ago, Helen Dixon, who is the Irish Data Protection Authority. And she actually said that transparency and how companies respond, to the data subject access rights when asked is going to be a big focus. So certainly how companies operationalize this will be in the...
Starting point is 00:17:54 Okay, so you have access to the data. You can fix it. What else can, do you have to be able to do? You have to be able to object to the use of your information for direct marketing. You have to be able to request that your data be erased when it's no longer needed. You can also request the portable version of your data, request that decisions based on algorithms made by humans. I think, oh, the one you're talking about is this really interesting one where it's that if they make a decision based on your data in software, like you have, this is crazy.
Starting point is 00:18:32 You actually have the right to make humans do the same, look at your data and make the same choice if I read it correctly. Is that what they meant? Yes, now you reminded me. And there's actually a really good, there's a few. sort of dueling journal articles around the right to be forgotten and around the automated data processing. So what you're talking about algorithms, how GDPR will affect data science and the types of automated decisions, you know, such as machine learning, that affect the outcome as it relates to an individual. So like a credit report, a loan, and so forth, and the right to have a human
Starting point is 00:19:05 actually look at the output of that algorithm and say and explain how the decision was made. so. Wow. So the GDPR offers up a whole bunch of things to worry about. Like you have to support all these features in your product. You have to do all these things. But it doesn't really tell you what to do. One area that it does is that I think is super interesting is that it tells you about privacy by design. That sounds familiar to me having lived through the security world of secure by design. But privacy by design, what does that really mean? Dr. Anne Kavukian wrote a paper back in the 90s called Privacy by Design. So the concept has been around for a while. The GDPR discusses it in the context of requiring data controllers to meet the principles of
Starting point is 00:19:54 data protection by design and data protection by default, which is very similar to the privacy by design. And it essentially just boils down to privacy being taken into account throughout the whole engineering development process. And it sounds kind of complicated, but it boils down to some really straightforward concepts. Like, for example, privacy is a default setting, privacy embedded into the design. So if you have a choice to design a function where you can make it easier to delete data later down the road or harder to delete data down the road, you want to go down the road where it will make it easier for you to delete that personal data. Visibility and transparency and keeping it user-centric.
Starting point is 00:20:39 around privacy. It's not a user design thing necessarily, or that it obviously has implications. But this feels much different in terms of the overall engineering design process. So you've integrated that yourself into Everlaw. Yeah, we've been talking about security for a long time. And certainly the concept of privacy, once you start talking about how it actually applies in practice with the engineers, it makes a lot of sense to them. And when we were talking about this, you know, the example I gave just a second ago, look,
Starting point is 00:21:09 We're going to design this thing, and we can either do it so that it's easier to delete data later or it's harder. The privacy by design way is to make it easier, and the light bulb goes off and they just get it. So certainly if you're working on implementing this with your engineering team, it may require a little bit of explanation up front, but it's common sense. Just so people who don't know whatever law does, it happens to be software for lawyers, but that doesn't, the domain does. doesn't really change anything. They have a massive amount of very, very sensitive information and also identifying information about attorneys and what they're working on. It's no different than any other product for collaboration. And so to me, it's been very interesting to watch this notion of GDPR get baked into the engineering cycle. But one area that is just fascinating
Starting point is 00:22:01 to me is that as vague as the GDPR might appear to an engineer in some places, it got very specific very quickly on penalties. I think that is just the huge focus because the numbers are so large. And frankly, that's what the reporters are writing about. They're doing the whole thing. Look at these giant fines, you know, 4% of global turnover. Oh, turnover. That's my favorite. I love reading European things because they talk about turnover. And I don't know what that means, but it's just revenue. Like top line revenue. Yeah. And I think, I think that scares a lot of people and it scares smaller companies into thinking that it's just, you know, too much of a hot potato. If you actually read Article 83, which is the part in GDPR where they're talking about the fines, okay, no one has read that except you. So tell us what it means.
Starting point is 00:22:44 Well, I have to, I have to say, I do recommend that folks take a look at it because working backwards from there, they actually tell you what they care about when they're going to potentially assess a fine, which is, you know, is their negligence, was it intentional? So if you're worried about mitigating this kind of risk, it helps to see what is. is listed in there as the factors around the penalties. And also keeping in mind that the fine is a bit of the last resort, and they certainly have the authority to impose a fine, but they also have a whole section in there on corrective actions. So there's a lot of things that will happen, you know, in terms of an investigation. You're going to hear from them way before a fine is coming your way.
Starting point is 00:23:30 They don't just show up one day. Now, the reason that this all got started was the problem of brief. And that what was happening was these giant data sets were being collected, and then they were leaking. So what does GDPR say about breaches? Well, there is some specificity there around timeframes for response and notification. So people kind of gravitate to the 72-hour language, which talks about the obligation of data controllers, notifying the supervisory authorities, so the regulators, within 72 hours of becoming aware of breach. And a lot of the discussion has been around companies thinking, okay, 72 hours, not a lot of
Starting point is 00:24:13 time. How do we respond in that time frame? So I think that, you know, for any startup, a key thing is there's probably a good chance that the company has a process around the service going down. And the reality is they need the same kind of checklist, process, call list, pagers, alerts, in case of being notified by a breach, which might actually not be a system's notification. It might actually just be, you know, it might, it actually might show up on some Reddit forum somewhere, or it might show up by people sending threatening mail that they have the data, but they need to have like an action plan. Yeah, absolutely. I mean, and it could be something as simple as an employee hitting send on an email containing some
Starting point is 00:25:00 personal data. So it doesn't have to necessarily be a hack. But absolutely, you need to have a process in place with the knowledge that nothing is ever going to go as planned, but you'd still need to have laid out some plans, even if it's very basic. And then I also recommend testing those plans because testing them and doing scenario, planning, and actually running scenarios is the best way to find out, okay, actually, I don't have Joe's phone number. And how do I get it? without my computer. Well, and I also think that part of this gets also to the kind of company culture that gets created around the information that the company has, like how many people have a password
Starting point is 00:25:43 that enables them to see it and just even these little things. Right, exactly. You know, culture is so important for a lot of reasons, but around a speak-up culture for everyone in the company feeling like, hey, something over here looks weird and having the ability to raise that and knowing that they will be supported when the issue is raised and that it will be responded to and that folks on the team take it seriously. I think from a compliance, security, privacy perspective, having a culture where it is common and is accepted and is encouraged to bring up issues is where you want to be because if something goes wrong and you
Starting point is 00:26:23 don't have that, you're already going to be on the back foot. I've been on the receiving end of a deferred prosecution agreement, largely relating to a cultural issue around not reporting things and sort of being scared to report things. And I can absolutely say, you know, without reservation, that that is not where you want to be. And anything you can do to encourage your team to raise issues and be supportive of them when they raise them is super important. I think that's one of the neat things about because Everlaw is in the regulatory space and the legal space, I can see it when I visit, that they view compliance with things like GDPR, not like a weird training exercise glued on the side that you have to worry about on one day a year. But it is a thing that gets baked
Starting point is 00:27:09 in sort of how the company functions. And then when new wires show up, they don't see it as weird or the weird training thing they have to go to. They just see it as, oh, this company cares a lot about this topic. Yeah. And that was how security was. You know, when I started it in the software industry, nobody talked about security. It was sort of funny that they were viruses, not like, oh, the world is going to end. And then one day, the world is going to end. And then all of a sudden, all the new hires learn, by the way, what you work on can cause the world to end if you don't do a good job. And they're like, cool, so how do I fix that? I don't want to be the person who makes the world end. Yeah. I would just encourage other people
Starting point is 00:27:42 to think about it as a culture, not as a, not as compliance, as sort of feature design. So we talked about that this applies to sort of everybody. Is there anything unique or special about being cloud or being on-prem? Because sometimes people think that, like, if you're on-prem, then you sort of escape all of the stuff because it's all just stuck on a server somewhere. But I'm not sure where does that fit in with these regulations. Well, of course, I think cloud is special.
Starting point is 00:28:09 But when it comes to GDPR, I don't think there's a huge difference because I just can't imagine how any company can run a business without collecting personal data. I mean, whether you are a B2B SaaS company, whether you're a social media company, whether you're selling on-premise software, you're still billing people, you're collecting names,
Starting point is 00:28:31 you're collecting emails, phone numbers, probably. You have user accounts, clients. The apps have telemetry in them, going back, all this stuff. Yeah, so I just, I'm not sure there's a huge difference. So a good rule of thumb is, if you can sign on to your product,
Starting point is 00:28:46 then you're collecting private information because you have that key, and then everything associated with it is private. So let's assume you want to be GDR, compliant. You know, is it like getting like FedRamp certified or FISMA or HIPAA or any of these other acronyms that not everybody is clear on? There is an alphabet soup of potential certifications around privacy and security, certainly. GDPR is a regulation and right now there is no certification for it. The regulation does have language around certification in it, but
Starting point is 00:29:20 nothing's actually been developed. It's referenced, but it's not developed. So there are a lot of advisors out there, consultants, that may, you know, try to sell you on some kind of certification. But at the end of the day, it's a law. So it's your job as a company to figure out: how does this law apply to me? How can I take a risk-based approach to meeting my obligations? And then how can I document my rationale and what I've done to comply if I'm ever asked, if my door is ever the one that gets knocked on? So you were being super polite. I think what you're really saying is make sure you don't go pay a consultant a bunch of money to become GDPR compliant. Well, if you have money burning a hole in your wallet, I'm not going to tell you
Starting point is 00:30:01 not to do that. But I take a little bit more of a practical approach, in that I think startups can do it themselves. I think they have a lot of smart people. There are tools out there that you can use. Eventually you will need a lawyer to draft some contracts. But I do think there are practical things you can do without engaging a really expensive consultant. When you say you have to have a lawyer to draft some language, that's because eventually GDPR is going to be either in your sales contracts or in your EULA in some form or another. There are controller and processor obligations which need to be clearly laid out in a contract. And if you transfer data, you will probably have to have, at least right now, a lot of companies
Starting point is 00:30:46 are using the standard contractual clauses. So in a nutshell, yes. So you're now a product manager on the team trying to figure out what to do. Like one of the things you did was pull together a tool. It's just a spreadsheet. Not "just"; it is a spreadsheet. But it's the way that we started in security too. It's a checklist of what you have to do for a bunch of stuff. There are a lot of free resources out there, but I couldn't find anything that was free
Starting point is 00:31:08 and that put all of the information where I wanted it in one place. And yes, it's a spreadsheet. It's actually a Google Doc. It's really easy for a lot of people to be in there working on it at once. And what it does is it lays out all of the things that you need to document to do your own risk assessment in terms of: what data do you have? Where is it? What are you doing with it? Why do you have it? Do you really need it? How are you securing it? So it's a one-stop shop where a startup can just document what they're doing with personal data, which will then allow them
Starting point is 00:31:42 to assess their risk and decide what they need to do. We have to go back and say, okay, where do we have a username? Like where do we have an address? Where do we have an account history? Where do we have other? And you sort of just systematically have to go through and do that. Yeah. When it comes to privacy and personal data, we're just used to using information, putting it into productivity tools and doing our jobs, right? We're not used to thinking of how do I rely on personal data to actually perform my business function. So if you take this sheet and you sit down with one of your marketing folks and you say, you know, what personal data do you actually use to do your job? The next minute they'll be talking about, okay, well, inbound lead generation. I need contact
Starting point is 00:32:24 information for that. And then the next thing, I'll say, well, what do you do with that? And then I guarantee you there's at least two software programs that they put lead contact information into to actually do their jobs to generate emails and do the things they do. And then your passoffs and all sorts of stuff. Exactly. You're just sitting down with your colleagues and you're saying, you know, what do you do with personal data and going through and it will help you get all the information you need to figure out, you know, how much exposure you have? If there's an action item for a podcast, this is really going to be it. And I think everybody's going to be shocked at just how much potential there is for risk
Starting point is 00:32:59 that they weren't really thinking about. So just two things to wrap up. One, you obviously came with all of this experience and that focus when you join the company. but like an existing company might not have the opportunity to hire somebody like you or the budget or that's the priority. But who is the right person to focus on this in a company? Where do you, when you talk to your peers that are doing this, where are they in the company? Are they in product and ops and marketing? Who is doing this work?
Starting point is 00:33:26 Well, at Everlaw, it's a bit of all hands on deck, because on each of the teams, everyone does something with personal data. And so it's important for all the teams to be involved. But if you're at a smaller company and you need somebody to lead your GDPR compliance project, and you don't have a person in charge of compliance, then my advice is to look for what I think of as your risk sentinels. My life before the startup life was actually in the oil industry for about nine years. And I've had a bunch of different roles, worn different hats, around regulatory compliance, from an energy trading perspective, and then after that, responding to Deepwater Horizon and leading environmental restoration for the company. So in my career in
Starting point is 00:34:10 compliance, I've had colleagues that came from trading. They came from engineering. They came from risk. Not everybody in compliance is a lawyer. Some come from audit. But you want the people who are thinking five steps ahead. You want the people who can triage issues, who can spot issues, who can think ahead to what the challenges might be and how you will solve them. I don't think it necessarily matters what function, but look for the people who have that detail-oriented nature, the ones that are just always thinking ahead. Awesome. So if you could give people like one reference or one document that they should go read
Starting point is 00:34:49 that isn't the GDPR regs itself, what would you recommend? I would recommend certainly Privacy by Design: The Foundational Principles. It's a document. You can Google it. It's by Dr. Ann Cavoukian, the former Information and Privacy Commissioner from Canada. And it talks about, in a very digestible format, the things you can do to incorporate privacy by design into your engineering and design process, certainly.
Starting point is 00:35:14 Well, thanks a lot. I just want to wrap up and remind everybody of sort of these core GDPR principles that I'm just going to read so that we end with these. But everything must be based on consent. You can only collect what's adequate, necessary, and not excessive in relation to a specific service. The right to transparency, as Lisa was saying, that seems to be where the main emphasis is. You have the right to be forgotten.
Starting point is 00:35:36 And, you know, IP addresses, email addresses, genetic information, all of those are personally identifiable information. Thanks a lot, Lisa Hawke from Everlaw, for educating us about GDPR. Thank you.
