Embedded - 427: No Fisticuffs or Casting of Spells
Episode Date: September 15, 2022

Elizabeth Wharton spoke to us about laws, computers, cybersecurity, and funding education in rural communities. She is a strong proponent of privacy by design and de-identification by default. Liz (@LawyerLiz) is the VP of Operations at Scythe.io (@scythe_io), a company that works in cybersecurity. She won the Cybersecurity or Privacy Woman Law Professional of the Year for 2022 at DEF CON. Liz is on the advisory board of the Rural Tech Fund (@ruraltechfund), which strives to reduce the digital divide between rural and urban areas. We mentioned disclose.io and the Computer Fraud and Abuse Act (CFAA, wiki).

Transcript
Welcome to Embedded. I am Elecia White, here with Christopher White. Our guest is Liz Wharton,
and we're going to talk about technology, law, public policy, and cybersecurity.
Hi, Liz. Welcome.
Hi. Thank you. Thank you so much for having me. Could you tell us about yourself as if we met at a random diner at DEF CON?
So I wish I had a much better superhero origin story. But yeah, so if we were sitting together at DEF CON, I would say I try my best to keep folks out of trouble and enable the doers to do what they do and do that from a legal as well as a policy and just a business operation standpoint. Okay. Now, if we met at the boardroom at Scythe,
how would you introduce yourself? There you go. Sorry. As I am the adult in the room and do my,
again, do my best, but I have almost 20 years of working with the hackers, the makers, and the breakers, and really just
enabling. And by that, I mean, I have done everything throughout my legal career from,
I started off as a business lawyer, and I worked in banking and real estate and just helping companies build and grow.
And this is what happens when you have friends that like to ask why and like to take things
apart, that they would start to take things, do things, for example, doing research with
unmanned systems and asking me to help them legally do that or draft the contract.
And it went downhill from there. All right. And that downhill is a large part of what we
want to talk about. Right. It's the fun part. The lawyer stuff tends to be the boring part.
Well, I mean, as you said, keeping us out of trouble is pretty important.
It is. And also helping different, everything from other members of the board and other C-suite
executives down to the legislators and the regulators and being that transition, that bridge
and translator between, hey, this is what people are doing.
And this is where you would like to see things go. Here's how we can do that. And it may not
be how you think. We want to do lightning round where we ask you short questions and we want
short answers. And if we're behaving ourselves, we won't ask how and why and who's your favorite
superhero? No, that one we will ask.
Let's start with that.
Who's your favorite superhero?
I like Black Widow, probably because of the red hair.
But she, hell, because a lawyer.
Sorry.
She is.
But I...
I'm breaking the rule already.
But it looks like it requires a lot of gym time to stay quite that fit. And shopping for clothes seems to be. Admittedly, I still have to catch up
on both reading the novel as well as watching the show. So no spoiler alert.
Do you have a favorite video game?
Oh, gosh. See, this is where my downfall is, and it's embarrassing. I like the Mario Kart, the Pokemon Go, the things where basically you can pick it up and it's fairly intuitive and you can drop it.
Do you have a favorite historical case file?
Ooh, I really don't have one. One of the fun ones to look up is the State of Maine sued
the Declaration of Independence. I cannot make this up. It is a fascinating story. When you get
down into it, it's really about lawyers getting creative on how to establish who
owned or the proper chain of title for a document. But just the fact that the state of Maine had to
style a case that they were suing the Declaration of Independence. I just love that.
I always like those federal cases where it's like the United States versus $5,263.
Because they're trying to reclaim money from someone.
Or like 20 John Does or like, yes, any inanimate object.
But I'm like, it's the Declaration of Independence.
How can you sue this, Maine?
What is wrong with you?
Spoiler alert, they lost.
Do you have a favorite law?
Ooh, that's a good one because it depends. Is it the favorite law that I like to make fun of,
which is the CFAA? I think it's terrible. I think it just like, let's start from scratch almost.
But do I have a favorite one that I think is fantastic? There's too many to choose from.
And do you have a tip everyone should know?
Oh, gosh. First of all, break it down to basics and with everything. Don't assume we're all
speaking the same acronyms, the same language. Break it down.
Okay. Well, then I'm going to go back to a previous question. And you said the CFAA,
the Computer Fraud and Abuse Act. Yes. Thank you. I'm so glad you circled back to that because as soon as I said, I was like,
oh gosh, not everyone may know. Yes. The CFAA, Computer Fraud and Abuse Act is really,
we're seeing it get reined in a little bit, but not truly. And we're seeing some states try to get creative, but that's frequently a law that's used to tamp down
research. And even companies will try to use it sometimes to say, ah, employees like stole our
stuff. They shouldn't have been doing it. It's like, no, stop it. Be adults.
Isn't that what law really comes down to? Someone saying, stop it, be adults.
Right?
And sit in that corner and think about what you just did
because that is not what you were supposed to do.
Yeah, frequently.
And that was a law that was used against Aaron Swartz, right?
And that was, I think, where it came on a lot of people's radars
as what's going on here. Yeah. And not only that, you had, for example, the state of Georgia
had a hot topic is always election security. State of Georgia had their voter databases,
had some security flaws within them. Researchers alerted the Secretary of State
and the proper folks about it. And it put a little bit of egg on the state's face that in this case,
it was just stuff was not protected as it should have been. And so of course, the natural reaction to that was,
you know what we should do? Instead of spending time, effort, energy, hardening our security
around the voter rolls and these election systems, we should pass a law that makes it,
that extends the Computer Fraud and Abuse Act to cover this. And it's just like, luckily,
the EFF helped bring people together to write to the governor, and the
governor at the time did veto it. But again, it's you just want to look at like, be adults.
Like, this is not what you should be doing. And just as with Aaron,
it's like, this is not what this was intended to do. You should be sparking creativity and the,
you know, free discussion of knowledge rather than just crushing it. And it does seem like the response to being informed of security holes or such things, privacy issues, is not to say, thank you, I'll fix that.
But here's a lawsuit. You can't tell anyone, and I'm suing you for criminal acts. Right? It's absolutely fascinating that when you have researchers who are going about it the right way, and by that I mean they're being responsible, they're going to the company or the maker of the product and saying, hey, here is this flaw.
I've done this research for you and help,
like I'm trying to make it better. And instead they just, you know,
it ends up being like, no, we're going to sue you.
The CFAA was passed in, I think, 1986.
Wait, wait, wait, I have information on this because I pulled up the wiki page.
Oh, okay.
The original 1984 bill was enacted in response to concern that computer-related crimes might go unpunished, inspired by, drumroll please, the 1983 techno-thriller film WarGames.
Oh, that is... what?
Absolutely correct.
What? Yes, WarGames inspired this law. Yes. And 1986 is when it went into effect. But yes, this is the WarGames law.
I didn't know that. That's so cool.
Yes. Who knew the power Matthew Broderick had? He inspired it.
A realistic representation of automatic dialing and access capabilities of the personal computer.
Yeah.
Very relevant to today's world.
You can't make this stuff up.
Right.
That was going to be my question, though: how can a law written, oh god, almost 40 years ago... Wow, it's been that long since I saw WarGames in the theater. That just hurt. Depressing.
how can a law that old, when computers were basically what our toasters do now,
and internet connectivity was limited to three computers at Stanford,
How can that apply?
That can't even be relevant.
Well, so this is where frequently you have to go to legislators and regulators and help them shift the focus of stop focusing on the shiny object aspects of something. Stop trying to
be reactive and instead focus on what is the actual harm you were trying to prevent? Not the
how, but the what. And break it down like that, so people stop panicking. Because that was one of the things about five years ago, when it was drone fever. Like, oh my gosh, they're going to be taking over the skies. They're going to be peeping Toms, spying, all these things. And I had a lot of fun at the state and local level, going, hey, quit panicking. We've had unmanned systems in operation since before World War II. Marilyn Monroe worked at a factory that built radio-controlled aircraft. So stop trying to focus on the drone aspect.
Instead, what are you trying to get at? Are you trying to, okay, we're afraid they're going to do this, this, or this.
Well, okay, that we can work into legislation. But if you're worried about this, give researchers 10 minutes
and they have found a different way to do that. So your actual legislation is now obsolete.
And you didn't, you know, all you did was add one more law to the books that really doesn't work.
There are a lot of laws on the books.
Indeed. Indeed.
Is anyone ever going to say, you know what we need to do? We need to just
get as many of these off the books as possible. Do you know how hard it is to pass a law?
Imagine how hard it is to get rid of one. I think they should all just expire
over time. Invisible ink.
That could be fun.
Okay, so that's unrealistic.
Then you have to pass ones to replace them, even good ones that you want to keep.
Exactly.
If you've ever had to go through, like, one of the things, because I work with a lot of startups now, and shameless plug, Scythe, if I'm permitted to
give a plug, like Scythe has grown. We've become an adult company in that we actually have to have
employee handbooks and all these policies. And when I was with the city of Atlanta,
we were going through and updating our policies. As a microcosm and example of that, I think we at one point had maybe three different policies that dealt with, like, beepers and pagers, but we didn't have something that dealt with, you know, laptops when you travel. I was like, huh, all right.
This is like, we can, let's get a home run.
Let's get that.
So we did.
But when you take that on a broader scale,
that's a lot harder to do.
So Scythe, what do they do?
We're an adversary emulation platform.
So we allow and enable that big picture. What's working, what's not, what's the latest and greatest when threats are coming out. We are post-breach, but we allow companies to see what's
going on in their systems, what's working, what's not. And I like that, having survived
being with the city of Atlanta when we had our ransomware attack,
and we're having to rebuild all of our systems and our processes, and yeah, I'd like to know,
like, okay, did we configure this correctly? Is it going to catch all these other things? What is it
catching? What is it not? And that's where platforms like Scythe come in and really just are
a force multiplier. So you're a red team. Think bigger. So what happens when the red team comes
in and says it's broken and gets to walk away? No, you want the blue team aspect. I mean,
it kind of is. It is like the, like you get to be like chaos bomb.
Exactly.
Now you get to deal with it.
Everyone wants to be the first scene in sneakers.
Right. And unfortunately I have been the one sitting in the boardroom going, well, great. Now what?
So what Scythe does is we help the red team and the blue team work together better.
And can we define those for people who don't know?
So you think of the red team as the ones that are going in and doing the testing. They're poking and finding the holes
and finding what's working, what isn't.
And either a long-term, like one shot or just over time, this is what we see that we were able to get in this.
We were able to do this, this and this.
Blue teams are, I say the poor folks because they have the harder part of they're actually
defending against.
They're the ones maintaining the systems and really running the SOC and keeping the
lights on. And then you have the concept of the purple team, which is what happens if the red team
and the blue team are sitting together in the same room and working together and in real time
getting, or as close to real time as you could, you know, getting that feedback. And instead of having to wait six months for the report, like, okay, great.
So you're telling me for the past six months, somebody has been sitting in my network? Like, ugh.
From a lawyer's summary, I hope that made sense.
Oh, it totally did.
But these are very different skills.
I mean, one is breaking things and one is building and, more importantly, maintaining them well.
And monitoring, too.
And monitoring.
Yeah.
Right?
The blue team has a much larger to-do list.
They also have a much larger liability.
And that's where folks like me can stay employed.
Because, yes, there are risks,
there are liabilities. And part of that is bridging, you know, going back to breaking it down, bridging that gap of, okay, how do you translate? Like, I, again, I cut my teeth on security working with the red team type folks doing the offensive
security. And, you know, again, that's, that is the fun. It's how did you figure out how to jail
break an iPhone, put an extra battery pack on it, add in some software so that it wouldn't go into sleep mode and absolutely do wifi sniffing
in the mailrooms of companies and find out just how unsecure their networks are.
That's fun. But when you go and deliver that report, how do you treat it realistically? If we're going to prioritize, from a company going, well, okay, I have a million-dollar budget. Wouldn't that be nice? But I have a million-dollar budget. How should I spend it? Should I then, you know, obviously go through, add passwords to stuff, lock down the networks, have the network segmentation so that if they do get in one way, they are not getting all of that. But beyond that, how do you prioritize? Like, okay, that's a show hack. How do we deal with that?
And that's where, you know, that dialogue needs to happen where you translate and you break it down and you help people
prioritize. That does make sense. But then when I think about some of the hacking that happens,
the being able to open car doors and enter houses that are secured with cyber locks. Right. All the cybers.
And okay, so the first person, the researcher, I totally think they should be protected.
That makes sense. We need people who are trying to break these things to be the red team,
to be the penetration tester, so we know where the errors are and so we can fix them. As an engineer,
I want to fix those bugs. But where does the responsibility lie when these hacks are published
and other people use them for nefarious purposes? Such a multifaceted question. And now my brain is
like, oh, let me talk about this. And let me talk about
that. So one aspect being with the researchers, you know, you're coming in and you're presenting
the findings and saying, Hey, this is how I did it. This is how you could replicate it
and doing it in such a manner that almost like with a level of respect. And we see that you have
organizations like Disclose.io that is a nonprofit that really helps bridge that conversation.
And if you were turning this episode into a drinking game, I feel bad that I have said bridge so many times, I feel like, but you have, you know, you, you want
to bring that conversation in a productive manner so that it's not the, haha, your engineers are
idiots. Look what they didn't know how to do. And you're thinking, oh, you mean our team of five
people that are dealing with like insane amounts of tech debt and working
with like, you know, we're having to basically recreate everything each time we do a new feature
or iteration of the product. Like, oh yeah, no, they're idiots. Huh? No, that's not how this works.
But also helping, you know, providing that information, but running against a clock of
if one person's found it, there's a good likelihood it's already been found by somebody
else.
They just happen to be the first person to talk about it.
I mean, you think of how many people have pockets of research they just keep to themselves. All right, this is cool. I don't know what I'm going to do with this yet. I don't feel like dealing with responsible disclosure, but it enables me to do this, this, or this. And there are a lot of people who
haven't disclosed. So it's reminding companies, look, someone came and just did your research
for you.
They just did this and they're willing to work with you.
And like, that's a good thing.
That just meant now your engineering team doesn't have to find it themselves.
Somebody else did.
And they're helping. Like, we're working together collaboratively.
I mean, in a shiny, happy world, that's what should happen.
Oh, yeah. But this is not shiny and this is not happy. And there is that, like you
look at it from the perspective of you just walked into a room and said, oh, hey, and especially if you look at what you have coming out of not only in the US,
but other countries as well, saying, oh, wow, now that you know about this vulnerability,
and in some cases, there aren't even the caveats of whether it is significant. You have to patch now. And if you don't patch now, or within 48 hours, 72 hours, et cetera, and if you don't publicly disclose that you had a breach or vulnerability,
et cetera, then you better open that wallet because you're about to get hammered.
Like, wow, there's a lot that goes into it and having to explain, well, sure, we can patch, we can fix, but in some cases, we're a hospital
and some of our critical systems are running on operating systems that haven't been supported in years, that were introduced pre-WarGames. These were introduced before
some of our security team was even born.
And by patching it, we lose use of this machine.
And so, okay, now what do you want us to do, genius?
There's a lot of Windows in medical devices.
It's sort of depressing.
Oh, gosh. I've, I represented a hospital that was the only medical, like, I forget which level trauma center within like a hundred mile radius of a very economically depressed area. And I think they had
one and a half CAT scan machines because the half was broken and they actually just used it to pull parts off of it. And they really didn't. But
if you took them offline, you took away that first line of medical care for an area where
some people did not have the luxury of being able to travel to a different hospital.
Are bug bounties something that's useful and that actually accomplishes the goal? I know certain companies resisted them for a while. I think Apple, it's only been recently they started paying them. Is there an argument that they do encourage the kind of disclosure, or the kind of testing and disclosure, that we want? Or is it that people can't compete with the black hats?
Well, bug bounties done well, I think, are great. And by that, I mean,
going into it with the right spirit. And I'm going to kind of use that loosely in the sense of
some of, and I have seen researchers who have shared with me a, you know, like, Hey, they want me to,
I disclosed this and they sent me this NDA. Can I sign it? And it looks, I was like, huh? No,
like, I'm sorry. This is a conversation you would need to have with your partner because yes,
if you signed this, they would certainly give you, I think it was like 30,
30 grand in cash.
I was like, that's a, that's a sum that I, I should not be the one making the decision
for you.
But on the flip side, if you signed this NDA, it means you have to walk, you're agreeing
to walk away from an entire like line of research that let's be honest, that's your passion.
And that's your, like and that's your day job.
So I would tell the company what they could do with this NDA, but I can say that with the luxury
of I'm not going to collect on this. So yeah, I am armchair quarterbacking this like nobody's business.
So you do you, but I wouldn't suggest you sign it.
I'm happy to, you know, and that's where you have to have that dialogue and understandably.
Yeah, some companies are just going to be absolute because they panic.
They don't know what to do.
And that's where, again, going back to, you have companies like Luta Security, Katie Moussouris,
who essentially created some of these concepts of hack the Pentagon.
Yeah, there is a way to do it.
And there is a way.
You talk about DEF CON, you go to DEF CON and, hello, you can Hack-A-Sat.
You can hack a satellite because we've created an environment where everyone understands.
We've put the parameters in place.
We've opened up systems that ordinarily you or I wouldn't be able to have access to.
Well, maybe not.
It depends on your security clearance.
But you have this, you have the car hacking village.
You have the vote hacking, the biohacking villages.
Those are great examples of how you can do bug bounties or even just that collaborative
environment.
But it took years and lots of yelling and lots of threatened lawsuits and intimidation of researchers.
And some researchers having to learn like, hey, maybe not presenting it that way. Let's present
it somewhere different or in a different fashion so that you can still get heard.
They will still respond, but let's build that level of trust.
There is something about saying your engineers are stupid that makes people not want to listen to you.
Oh, absolutely.
And also it's wrong.
Like, so if you're starting from that premise, it's like, well, why am I going to listen to
anything that you've like what comes next?
Because your initial statement is just wrong.
So it's reminding people to give
each other grace. Yes. And empathy. And to understand that many big companies have small
teams that do this, that do the blue team research and maintenance. Oh, yeah. And they're against however many people want to be against them. I mean,
some companies definitely draw more ire and therefore probably more penetration testing,
but yeah. And you have some companies where, you know, at first blush, and I haven't even had a chance to look into it, you have, like, Patreon, who let go their entire security team.
Yeah, what was with that?
I don't know.
And it's like, wow, you are not going to draw any sympathy for anything that follows. And you have, again, talking about pulling from my experience,
where you have city and state governments, you look at the city of Atlanta, their blue teams
are defending not only a municipality, so you have the city government, but they were also defending
the networks for the department of public works that are providing water to, and sewer treatment
facilities for over 6 million people and businesses. And then they also were defending the networks for the world's busiest airport.
So if you think it's bad when Southwest goes down, it was a couple of years ago, or somebody had,
was it Delta? Someone had a glitch in their system that caused everyone trying to get home from Black Hat and DEF CON. If they were flying that airline, there was an issue. That's not the world's
busiest airport. You have the world's busiest airport and the ripple effect. And it's like,
but they have a handful of people getting paid city government salaries that aren't being
necessarily offered the latest and greatest. They're not getting, you know, you have some of the training courses
are not priced such that they can easily access and everything's stacked against them.
And yet they keep it running on a day-to-day basis. Sorry, I'll get off my soapbox.
No, no, it's a good soapbox because it's such an impossible problem
And yeah, I want to tell those people, you're doing a good job. But then I remember being in college. I remember how fun it was trying to break somebody else's system, just because. I mean, come on, their password file was just open. Oh my God.
Right? It was the 90s. Things were different.
And I mean, they didn't expect anybody to be messing around.
They'd just started salting and hashing passwords back then.
Right?
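The salting and hashing they're joking about can be sketched in a few lines of Python using only the standard library. This is an illustration, not a recommendation: a real system would use a vetted password-hashing library such as bcrypt or argon2, and the iteration count here is arbitrary.

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor, not a security recommendation

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest). A fresh random salt per password means two
    users with the same password get different digests, defeating
    precomputed (rainbow table) attacks."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids leaking match position through timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
assert verify_password("hunter2", salt, digest)       # correct password verifies
assert not verify_password("hunter3", salt, digest)   # wrong password fails
```

The point of the joke in the exchange above: a 90s-era plaintext or unsalted password file gives an attacker every account at once, while salted, slow hashes force a separate brute-force effort per user.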
It is bananas to think of that.
One of the areas and ways that I try to make a difference is, and it was something that hopefully I left kind of an impression with my colleagues at the Department of Law, but also building up
that level of trust within the, I'd say the IT, we called it AIM, but the technology department of making a point to go down and say, Hey,
like, help me understand, help bridge that gap.
Because I'm sitting there in the C-suite with the chief operating officer, with the CIO,
with the mayor's office helping to do this stuff. So rather than me
sitting in my ivory tower, tell me what you actually need. Tell me where your pain points
are. Do you need, like, what can I, when I'm looking at these contracts and when I'm helping
prioritize stuff, you know, do you need more training? And if you need more training, what can I bring in? Do we need to create a partnership with someone like Cybrary
where we can get you hands-on stuff? Do you need access to your peers? Do you need data sharing?
How about with CISA and the alerts that they'll put out?
Like, what do you need as the blue team?
And how can I like, where are the sticking points for you?
And so if you get more dialogue between the C-suite, the legal department coming in and
telling the engineering teams, Hey, tell us, like break it down, break it down
into digestible building blocks. What's going to make your job easier? What can we do to empower
you? And I think that's a part of the conversation that it's a two-way street, but we need to be
having more of. But how do we make sure those people, the people, the blue team folks aren't
being held responsible or liable for the attacks that happen? You mentioned ransomware. And okay,
so was it negligence? Maybe not in that case, but there are times where it is pretty negligent to
leave some holes in your firewall or in your operating system and then ransomware happens.
And then, you know, the company has to pay a lot and then does somebody lose their job or is this
a look-I-couldn't-have-done-what-you-wanted-me-to-do sort of event?
Well, as long as we're not blaming the interns. You know, if you have your, what is it, ransomware or data breach bingo card, always have a square for blaming the interns, because somebody will. And it's making sure that the right people are being blamed.
Is it the decision makers?
And what is the point of the blame?
Is it to change a behavior?
Or is it to craft like, well, we need somebody to scapegoat, drag in front of.
Drawn and quartered.
Justice, justice.
Right, right.
Shame, shame.
Or is it because we need to identify, kind of, look at how you see this with airplanes and the NTSB and FAA, the post-accident conversations. So if they're determining it was pilot error, or if it was aircraft malfunction, or was it a systems issue, like Boeing. What is the goal? What is the end game for why we're doing that?
And is it because we want to learn from the mistakes? And you are seeing a bigger push
where between the SEC, the Securities and Exchange Commission, the Federal Trade Commission,
and all these different government agencies,
at least on the US side, that are coming in and saying, all right, is the blame because somebody
in the boardroom knew about this and did not properly prioritize it? So they knew that,
hey, there might be gaps in our firewall, or we're not spending enough
money, or we're not properly protecting different data.
And we made a decision, but instead, we're going to spend the money elsewhere, that they
are going to be held accountable.
And so it's not the person on the front line.
It's not the blue team.
It's not the person sitting in the SOC or the NOC that are sitting there going,
oh, like where they're going to get blamed.
But instead it's, well, who was in the position to make that decision?
And then we need to hit them where it hurts.
You mentioned opening wallets
if things don't get patched
in a reasonable amount of time.
Mm-hmm.
Is it who, I mean, I have had my data stolen
and I've never gotten any money.
I've gotten some free quasi-
Free credit locking.
Free credit locking from the people who lost my data.
Like that was useful.
Hey,
at least you got that.
You know,
I almost wish you would go back to,
was it you open a checking account and you get a free toaster or a set of steak knives.
I would,
I would not have to,
my kitchen would be all set.
Exactly. So, you know,
it's who are they paying? It's not me. And I'm the victim here. Right. And that's part of the problem. And where we are trying to shift some of the dialogue of like, oh, boo-hoo. Illinois is a great example because
a lot of the focus tends to go on California and even Massachusetts for some of their data
protection and their privacy laws. But Illinois has BIPA and now, oh gosh, biometric. Okay. We're just going to go ahead and admit,
I am just having a brain freeze on BIPA. I have given so many talks and doing so many research,
but under that, if you are sharing biometric or other, for example, think facial recognition. And Instagram just got hammered, hammered
by Illinois because they said, oh, hey, you were scanning all these pictures and tracking the
facial recognition and the biometric information from all these people without telling them.
And that violates the BIPA statute in Illinois. And I can't remember, I want to say it was like 65 million or something ridiculous. That money will go back into the state.
And I don't know how they're going to distribute it per se, because it also,
keep in mind, they smack companies with it. Facebook has also run afoul of BIPA.
But the money gets collected, you know. But they have lawyers, they'll appeal.
If we're going to bash on different companies, whether it's deserved or not, you have Twitter, which has actually been under a consent agreement with the Federal Trade Commission since, what, 2011. And they paid a fine and stuff, and they all negotiated it down. But that's part of the problem. For them, it's the cost of doing business.
Yeah, cost of doing business. Insure it. We'll make it up here, we'll make it up there.
And you think of, anytime there are these class action lawsuits, the lawyers get money, and somebody gets a little bit, and the rest of us get like a 50-cent check. I forget when I received one, I'm like, let me just get this straight. Y'all paid more to print out and mail this than I am going to receive. Like, okay. Okay.
This is interesting.
And broken.
Yeah, no, exactly.
Well, everything's awful.
Everything's on fire.
And yeah.
Cybersecurity topics, and I mean, DEF CON to some extent, are viewed from the outside as confrontational and ego-driven.
Is that a reasonable assessment?
See, I would have to, well, prefacing that with the fact that I've been going to DEF CON for 15, 16 years now, and I'm on the CFP review board and one of the goons. So I don't think it's confrontational; that hasn't been my experience.
Do I think there is some of that? Especially when you look at the early days, again, we're going back to: is this being presented in a productive manner? Is this a ha-ha-y'all-are-idiots? And how is it being received? Like, okay, yeah, you have the Spot the Fed games. And now, my joke is, and I've helped work on it, you have, it's not really a track per se, but a policy department. It's not a village; we're actually a part of the main DEF CON. But our focus is to bridge and encourage those conversations between regulators and
policymakers and researchers and put them in the same room. And believe it or not,
nobody came to blows this year or last year. I think we've been doing it for three or four years now. There have been no fisticuffs, no casting of spells, at least not out loud that we have seen, no voodoo dolls, nothing burned in effigy, at least within the official DEF CON space, which I think is great. But you think of what it took to get the car hacking village and the voting village off the ground
and to where we are now. I mean, with biohacking, you had researchers that were hacking their own insulin pumps
and then having to get in arguments with the manufacturers,
saying, I'm not researching this to be snarky, to be rude, to be mean.
This is the insulin pump that's attached to my body.
Can you please secure it so I don't die?
That seems like a reasonable request. I mean, it really does. You would think, you would think, and it's just, again, it's
bananas, but I'm proud of how far we've come.
You won an award this year at DEF CON. Could you talk about that?
Yes. Oh, thank you. Yes. So there are awards that go to women professionals in cybersecurity, Cybersecurity Woman of the Year, and they break it down into different topics. My focus is on both the law and the public policy surrounding privacy in particular. And so I was recognized as the woman of the year in law and cybersecurity policy. So privacy. So yeah, I even got a trophy and everything, a little statue. It's like, look, mom, I'm not a complete, like, raving lunatic. Occasionally
people care about these topics. Part of the award was about outreach and being a good example for
others. Is that important to you? Oh, absolutely. And thank you. I love, like, thank you so much
for the softball that's going to let me really share about some of my passions, which of course
I love Scythe and I love the work I'm doing there, but I have the privilege of being on the board of
advisors for the Rural Tech Fund, which Chris Sanders started on his own.
I forget how many years it's been around, but what RTF is doing is connecting students and teachers to different resources, really bridging that access gap. And they have provided grants and support to students and
teachers in all 50 states. And one of the things that I got to do with them was go talk to an
elementary school class of students in North Georgia and share with them about the research I was doing and what I was
doing to help integrate drones into the airfield at Hartsfield-Jackson, at the airport. And the Rural Tech Fund had also donated a 3D printer and training and materials to the same class. So it's really like, what do y'all need? What can we do to really help
students? They worked with a Native tribe up in Alaska, where the teacher reached out and said, from elementary school to high school, we have the same building, like one trailer. And they said, what do you need? Do you need access to online labs? Do you need software? Do you need training materials? And got them up and running. And that kind of thing of how can we inspire the next generation and really spark
that curiosity that got us here. I mean, you think of old school researchers, they didn't have a
guide. They didn't have, you know, someone to show them how; they just had to figure it out.
And how do we take that curiosity and grow it and encourage it?
So I work with the Rural Tech Fund, and I also try to do my best with Black Girls in Cyber, serving as a mentor there as well. And this was the first year DEF CON had a village for really encouraging both women and girls, but also women of color, to engage more and learn more and grow in cybersecurity.
Okay, so there's so many things I want to unpack there.
But let's start with Rural Tech Fund, because we have an organization, a nonprofit near us called Digital Nest, that we've talked about before. It's close to the Valley in terms of miles, but so far in terms of everything else.
Are you, is it about showing people what's possible?
Is it about giving them the tools?
Is it just about telling them that technology exists?
What are the main goals when
you talk about Rural Tech Fund? So, all of the above. It's not only showing them the world of possibilities, but enabling students that wouldn't ordinarily have that access. For one, they may not even know that it's possible. And now they, or their teachers, know it's possible. Sometimes it's providing the training tools to the teachers and connecting them with the experts: hey, now that you know it's out there, here's how you do it.
And everything from assistive technology in classrooms, providing that.
And really where Rural Tech Fund, to me, strikes a chord is it's not dictating,
this is a program, this is what you need. It's more about going to the teachers and the schools and the students and saying: you tell us, what do you need? What can we do
to help you and your students? Is it connectivity? Is it just access to this? Is it training on this? Or is it even, as you said,
like in my case, I took a bunch of micro drones down to the school. It was, I think, a third grade class, and of the 30 students in the class, only about six of them had ever been to an airport. And so educating them, I had a map,
like a topographical map, an aerial map of the airport. I said, this is what we're talking about,
guys. This is, you know, every minute a plane is taking off and a plane is landing. And what do
you think are some of the things we
need to consider? And oh my word, those children have the most graphic descriptions. They're like, well, what happens if it explodes? And there's fire, fireballs everywhere. And I was like, well, since you mentioned it, kids, we have special foam that is loaded into our fire engines that is designed just to fight jet fuel fires. But wouldn't it be great if we could use drones to get aerial imagery of the fire
and fight it better? But they were just like, limbs flying. I'm like, what have y'all been
watching or reading? Like, wow, you're not wrong. We do have to think of all of these things, but
wow. Okay. Please don't tell your parents I'm the one
that told you all about that stuff. Yes. Yes. I remember one of the people who worked for me when
I was a manager went to go talk to a classroom and came back and said, well, that didn't go as
I expected because halfway through they asked me to define what homicide meant.
Whoops. Right? Like, oh, okay. Okay. Yeah. But it's where it starts and I don't want to be the one like, yes, children, like get creative. You are only limited by your imagination and let's do it. Like,
do you just need a check or do you need security experts, the researchers who are hacking into the
cars or like, all right, y'all have seen this like autonomous vehicles. You know, you've seen
the Jetsons. I hope, is that even, can people still watch the Jetsons?
I don't know.
George was born recently.
I know.
But like, here are all the things.
And I love knowing that there are more and more of these organizations out there doing this.
You personally, you mentioned some time donated, but do you usually give money or time? What is it like being on an advisory board of a nonprofit?
Well, I started off, I did put my money where my mouth is. When I'm making charitable donations, I try to find organizations where I feel like the money's going to go to something I'm passionate about. So it might be a dog rescue, or in this case, the Rural Tech Fund.
I got introduced to them in part through B-Sides Augusta.
And with being on the advisory board, it's not a requirement. I just happened to like to send money every year, and I had long told Chris, whatever you need, just point me in the direction. I have passion and energy and resources available. I just need you to channel this energy for good. Chris just started the advisory board for Rural Tech Fund. And with that, the ask depends on the organization. Sometimes it's an ask of just ideas and time. In my case, I've worked with different nonprofits and been part of their fundraising.
But I also, through my work with Scythe on the business side, I have access to other
VC funds and other companies. And it's like, hey, we want to be able to tap into your network and your ideas: how do we structure, how do we grow these programs? And Senior Online Safety is another nonprofit whose board I'm on. With that one, occasionally it's, hey, we're going to do this. Liz, you've helped from the legal side set up nonprofits,
point us in the right direction. So it just depends on skillset. If someone wants to get involved, I highly encourage everyone to look around. There are organizations that are constantly in need of volunteers, advisors, supporters.
And people say, well, I don't have money.
I don't really have time.
But do you have social media?
Do you have a way, a network that you can connect them to?
Or are you a creative?
Like maybe you can help them grow their next project because they can pick your brain.
Or that $20 that you donate, which is also a tax write-off for you, can grow and support things; you'd be surprised. Or even just, do you have two hours of time on a Saturday, because they're running a booth at a local cybersecurity conference and you can help them do that? There are many ways to get involved, and it doesn't always require one thing or the other.
You mentioned Chris, could you say his name again?
Oh, certainly Chris Sanders.
He started Rural Tech Fund, which is at ruraltechfund.org, or if you're on the Twitters, @ruraltechfund. Check it out.
And he started this as a passion project, being from a rural area and
growing up without perhaps all the access that you might have if you were in a major city.
I just wanted to make sure that we identified the person because usually when there is a Chris
mentioned, everyone believes it's my Chris.
No, they believe it's some other person.
And my Chris does not work for iRobot and does not run the Rural Tech Fund.
No, I'm far too lazy to do all those things.
I have a couple of listener questions.
Absolutely.
One of them's from Nick.
Are there any ways that privacy legislation
can get ahead of advances in technology and its uses in new spaces, rather than constantly trying to play catch-up?
For example, limits on facial recognition tech by law enforcement. Unfortunately, part of that is already out of the, like, the cat's out of the bag, the horse is out of the barn, whatever the sayings are. Really, it's raising awareness at this point. You do have some states that are doing a great job. Do we need national legislation? Yes. But the other downside is it's not just,
you think of TikTok. That's not a US company. And we can put all the restrictions we want,
and they may or may not follow. And when it comes to local law enforcement, part of it is education
and helping them understand. And this goes back to
some of the engineering stuff of like, just because you can build it, do you really need
to build it? That was one of the things we worked on and really tried to focus in on with the smart city stuff. And when I would sit in the meetings, we would get the
request of, Hey, we want to do this. I'm like, all right. All right. Well, do you know, do you
know what else is going to happen with that? And do you know, like, okay, now we're going to have
all this data. We're going to become a target, even more of a target rich environment. And they're going to come after it and the opportunities for misuse.
And are you ready?
And are you willing to have all this?
And do you understand?
And it's like, this company, it was like, oh, no, I just wanted to make this part of my job easier. And it was like, well, okay, let's find a different way then.
But having those conversations and helping, again, bridging that
of, do you understand how this technology is going to be used? And is this what you really want?
And is this what we really need to get the job done?
One more question from Nick. What should I look out for in the services terms and conditions if
I want to protect my privacy and data?
And I'm going to add on that. How much do I believe those terms and conditions?
Well, how often do you actually read all of them? And how many times do you have that thing of, are they accurate? How much time and thought has been put into them? And when you're talking about these big corporations, that's what we see with
Instagram, where they just got hammered. Hey, you know, the terms and conditions, I mean, you don't get to negotiate them. They are what they are. And are you just not going to use it? And I do know some people who don't use certain devices or certain services, because they're like, yeah, wow, I've just signed away everything.
This is a crock. No. But when you're looking at that, it really boils down to some of the states and some of the policy decisions that are getting made actually holding them accountable. But again, when we're talking about billion-dollar companies, how much does a $65 million judgment really do? Other than raise awareness and remind everyone: hey, these companies, their terms and conditions say, or in some cases don't say, what they're actually doing.
Exactly.
And it goes back to I don't really believe them.
And every single one of them says we may change this at any time.
And I don't know how many times I've given away my firstborn child.
There are days that most people I know would gladly give away their firstborn child.
I don't have any, so it's an easy one. That's why.
Exactly.
They've all been given away.
Yeah. Well, and you go through, and some of it is not necessarily evil intent. It's just the byproduct of, hey, we didn't realize that this would tie into this, into this, into this, in the design, or that just wasn't the focus. That's why, really, the conversation,
and I think where change is going to come from is building in that privacy by design.
So privacy is no longer an afterthought; we've moved it ahead on the prioritization list. And the default isn't, oh, we're going to put this little tracker here and this little cookie here and this little thing there. It's, no, let's work on de-identifying that information by design and, you know, by default. That is where we go first, so that if we are tracking people across apps, across devices, across different things, that is a creation that was an intentional choice by the company, by the engineers, by everyone involved, instead of you having to not only opt out, but jump through so many hoops. And at that point, is it really effective?
So it's kind of flipping the script.
You said privacy by design and de-identification by default?
By default.
So, again, it's: let's build the privacy into the initial designs. When we design how a system is going to work or what the software is going to do, privacy isn't an afterthought. Hey, how are we protecting this information? How are we doing this? And instead of the default being, oh, well, let's use these cookies, trackers, different identifiers, let's build it so that the default is that it doesn't track or identify. Instead of having to opt out, you have to opt in.
I mean, I wish everything was like that.
Yes. That would be nice. Yes. And really, starting to shout it from the rooftops and raising awareness of that, I think, is the first step. And then you have people like the researchers who were over at Amazon and Google, who were raising these concerns and raising the alarm about this. So again, it is a utopian idea, but one where, if we shout loudly enough and start implementing it, I think we will see some movement of the needle.
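The opt-in default Liz describes can be sketched in a few lines of code. This is a hypothetical illustration only, not anything from Scythe or a specific product; the class and field names are invented for the example:

```python
# A sketch of "privacy by design, de-identification by default":
# an event record carries no user identifier unless the user has
# explicitly opted in. All names here are hypothetical.
from dataclasses import dataclass


@dataclass
class AnalyticsEvent:
    name: str
    user_id: str
    opted_in: bool = False  # tracking is opt-in, never opt-out

    def to_record(self) -> dict:
        """Build the record that would actually be stored or transmitted."""
        record = {"event": self.name}
        if self.opted_in:
            # Identifier attached only after an explicit, affirmative choice.
            record["user"] = self.user_id
        return record


# Default behavior: the stored record is de-identified.
anonymous = AnalyticsEvent("page_view", "liz@example.com").to_record()
opted = AnalyticsEvent("page_view", "liz@example.com", opted_in=True).to_record()
print(anonymous)  # {'event': 'page_view'}
print(opted)      # {'event': 'page_view', 'user': 'liz@example.com'}
```

The point of the design is that the engineer must write extra code to attach an identifier, rather than extra code to strip one, which is the "flipping the script" Liz describes next.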
Do you have any questions, Christopher?
I did have one, and I like to ask it of people whose careers seem cool.
Do you have advice for people who want to get into the infosec world and maybe cross that with
the legal world, who are in college or thinking about their careers early on?
Yeah. And granted, I fell backwards into it. If you had told me 20 years ago that I would be sitting here talking about this stuff with y'all, working at a startup, my second cybersecurity startup, I would have been like, you're obviously day drinking. And whatever it is, it's amazing; share it with the rest of us.
But it's being willing to ask, follow the curiosity, persist and pivot. If you see something that strikes your fancy and you want to do it, go for it and keep going for it. And take the interesting opportunity when it comes along. I mean, I worked in capital markets, commercial mortgage-backed securities,
right before the crash. And the comment from my father was, after we crashed world economies
with CMBS, he's like, well, I finally understand what you do. Because the Wall Street Journal says
y'all are now responsible for this. I was like, oh. But that was the moment I was able to take the side stuff that I was doing and say, well, my main area of practice just crashed. Maybe I start doing the stuff that I think is fun. So don't be afraid to follow that and do the fun, I say fun, but do the stuff that you find curiosity and sparks of
joy in. Cool. Liz, it's been good to talk to you. Do you have any thoughts you'd like to leave us
with? Thank you so much. Once again, I appreciate and have enjoyed the conversation. First of all, I'd be remiss if I didn't say thank you, Scythe, thank you to all of our customers, et cetera, et cetera, and Rural Tech Fund. But I cannot shout it from the rooftop enough, especially with an audience of engineers: privacy by design, de-identification by default, in what you're doing at work as well as what you're doing as a consumer and a user of different technologies.
Our guest has been Liz Wharton, Vice President of Operations at Scythe.
She's also on the advisory board of the Rural Tech Fund.
And of course, links will be in the show notes.
Thanks, Liz.
Thank you.
Thank you to Christopher for producing and co-hosting. Thank you to our Patreon listeners
Slack group for questions. And of course, thank you for listening. You can always contact us
at show at embedded.fm or hit the contact link on embedded.fm. And now a quote to leave you with.
I was going to do WarGames, but I think Calvin and Hobbes is the way to go. This is from Bill Watterson.
Mrs. Wormwood says, Calvin, can you tell us what Lewis and Clark did?
Calvin says, no, but I can recite the secret superhero origin of each member of Captain Napalm's Thermonuclear League of Liberty.
Mrs. Wormwood says, see me after class, Calvin.
And Calvin says,
I'm not dumb.
I just have a command of thoroughly useless information.