The Changelog: Software Development, Open Source - Schneier on security for tomorrow’s software (Interview)
Episode Date: May 20, 2022
This week we're talking with Bruce Schneier — cryptographer, computer security professional, privacy specialist, and writer (of many books). He calls himself a "public-interest technologist", a term he coined himself, and works at the intersection of security, technology, and people. Bruce has been writing about security issues on his blog since 2004, his monthly newsletter has been going since 1998, he's a fellow and lecturer at Harvard's Kennedy School, a board member of the EFF, and the Chief of Security Architecture at Inrupt. Long story short, Bruce has credentials to back up his opinions, and on today's show we dig into the state of cybersecurity, security and privacy best practices, his thoughts on Bitcoin (and other cryptocurrencies), Tim Berners-Lee's Solid project, and of course we asked Bruce to share his advice for today's developers building the software systems of tomorrow.
Transcript
Discussion (0)
What's up? Welcome back. This is the Changelog. Thank you for tuning in. My name is Adam Stachowiak. I'm the editor-in-chief here at Changelog. If you're new to the pod, head to changelog.fm for all the ways to subscribe. If you're a long-time listener, thanks for coming back. Thank you for tuning in. If you haven't yet, check out Changelog++. That's our membership. It's for diehard listeners who want to directly support us, want to drop the ads, and want to get a little closer to the metal with bonus content and more. On today's show, we're talking with Bruce Schneier.
Bruce is a cryptographer, computer security professional, privacy specialist, and writer
of many books. He calls himself a public interest technologist, a term he coined himself,
and works at the intersection of security,
technology, and people. He's been writing about security issues on his blog since 2004.
His monthly newsletter has been going since 1998. He's a fellow and lecturer at Harvard's
Kennedy School, a board member of the EFF, and the chief of security architecture at
Inrupt. Long story short, Bruce has credentials to back up his opinions. And on today's show,
we dig into the state of cybersecurity, security and privacy best practices, his thoughts on Bitcoin and other cryptocurrencies, and Tim Berners-Lee's Solid project. And of course, we asked Bruce to share his advice for today's devs who are building the software systems of tomorrow.
A massive thank you to our friends and our partners at Fastly for having our back. Our CDN back, that is. Our pods, our assets, our everything is fast globally because Fastly is fast globally.
Check them out at Fastly.com.
This episode is brought to you by our friends at InfluxData, the makers of InfluxDB.
In addition to their belief in building their business around permissive license open source
and meeting developers where they are, they believe easy things should be easy.
And that extends to how you add monitoring to your application.
I'm here with Wojciech Kocjan, the lead maintainer of Telegraf Operator at InfluxData.
Wojciech, help me understand what you mean by making monitoring applications easy.
Our goal at InfluxData is to make it easy to gather data and metrics around your application.
Specifically for Kubernetes workloads, where the standard is Prometheus,
we've created Telegraf Operator, which is an open source project around Telegraf,
which is another open source project that makes it easy to gather both Prometheus metrics
as well as other metrics such as Redis,
PostgreSQL, MySQL, any other commonly used applications
and send it wherever you want.
So it could be, obviously, InfluxDB Cloud,
which we would be happy to handle for you,
but it could be sent to any other location
like Prometheus server, Kafka,
any other of the supported plugins that we have.
And Telegraf itself provides around 300 different plugins. So there's a lot of different inputs that we can handle, data that we could scrape out of the box; different outputs, meaning that you can send it to multiple different tools. There are also processing plugins, such as aggregating data on the edge so you don't send as much data. There's a lot of possibilities where Telegraf Operator could be used to get your data where you are today. So we mentioned Prometheus metrics, but you can also use it for different types of data. You can also do more processing at the edge, and you can send your data wherever you want.
Wojciech, I love it. Thank you so much. Easy things should be easy.
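For readers who want to picture what this looks like in practice, here is a minimal sketch of a Telegraf configuration along the lines described above, scraping a Prometheus endpoint and a local Redis instance and forwarding to InfluxDB Cloud. The URLs, token, organization, and bucket are placeholders, and exact plugin options should be checked against the Telegraf documentation:

```toml
# Illustrative telegraf.conf: scrape Prometheus metrics and Redis stats,
# then forward everything to InfluxDB Cloud. All endpoint values and the
# token below are placeholders, not real credentials.
[[inputs.prometheus]]
  urls = ["http://localhost:9090/metrics"]

[[inputs.redis]]
  servers = ["tcp://localhost:6379"]

[[outputs.influxdb_v2]]
  urls = ["https://cloud2.influxdata.com"]
  token = "$INFLUX_TOKEN"
  organization = "example-org"
  bucket = "example-bucket"
```

Swapping the output for Kafka or a Prometheus remote-write endpoint is a matter of changing the `[[outputs.*]]` section, which is the flexibility being described here.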
Listeners, InfluxData is the time-series data platform where you can build IoT, analytics, and cloud applications.
Anything you want on top of open source.
They're built on open source.
They love us.
You should check them out.
Check them out at InfluxData.com slash changelog.
Again, InfluxData.com slash changelog.
So we're here with Bruce Schneier, who's a cryptographer, computer security professional.
If you haven't heard of Bruce, you need to.
He's been around a long time, been keeping a lot of us up to date with what's going on in cybersecurity, infosecurity, etc.
Bruce, thanks for joining us.
Thanks for having me.
Happy to have you.
So first of all, you call yourself a public interest technologist.
This is a term that many of us probably haven't heard.
What does that mean?
Yeah, and I want you all to have heard of it because I think this is an important term and way of thinking.
Public interest tech is kind of an umbrella term for people who somehow marry tech and policy.
And traditionally, these are very different worlds. Technologists deal
in computers and numbers and algorithms and true, false, yes, no. And policy people deal in
consensus and just different ways of thinking and problem solving. And we know when it goes
badly wrong, if you watch like, you know, the tech hearings in Congress or attempts to reform various tech laws.
And I try to occupy this space between tech and policy.
So I am teaching cybersecurity policy at the Harvard University Kennedy School.
I'm teaching cybersecurity to students who deliberately never took math in college.
Right. So imagine a technologist working on a lawmaker's staff, at a federal agency, I don't know, in the military, for an NGO,
trying to figure out the policy of tech, the tech of policy,
all the ways that tech and policy have to work together
and can't be at cross-purposes.
I think this is important. I'm here at Harvard. So there's a field called public interest law.
20% of the Harvard Law School graduates go into public interest law. They don't work for a big
law firm. They don't work for a corporation. They work on immigration law and housing law and discrimination law.
And all of these things that don't pay very well but make the world better.
The number of computer scientists who do that kind of thing is, like, zero.
Right.
They all go work for the big tech. But we need this career path of people who want to do good with a tech degree or go to law school after a tech degree or, you know, they have a law degree and learn how to program.
And so all of these kind of bridging ways of thinking, I think, are really important.
The fundamental problems of our society in this century are tech.
And if you don't understand tech,
how could you deal with, I don't know, the future of employment, let alone algorithmic discrimination?
Right. Yeah.
So that's what I'm really trying to preach and model and push for.
Yeah.
And it's not just me. Ford Foundation is trying to fund public interest tech programs
at universities around the country. And they invented public interest law in the mid-70s,
so it's kind of good for them. And this notion that we need to train people to bridge that gap,
to have one foot in both camps. Yeah. What do you think is a more viable path?
Is it taking lawyer types and teaching them computer science and security or taking computer
science types and teaching them law and getting them interested in foregoing what are lucrative
salaries, right? Very relaxed work environments in many of these big tech companies. What's the
way to get it done?
You need both. You need all the ways to get it done, right? You know, so the ACLU, right, pays what, one third to one tenth of the money you can make as a lawyer at, you know, a big corporate law firm. And they put out an opening for an attorney and they get a hundred resumes. I mean, so there are lots of people who are not just pursuing money.
If there's a viable career path, you work for the ACLU as an attorney, you feel good about your life when you come home.
You're not working for some horrible corporate interest.
You're not doing something you don't believe in.
And I don't think the problem is going to be supply.
I think a problem is going to be demand and path.
And I want both. I want there to be a path for an attorney or a policy student to learn enough tech to bridge the gap.
I want a path for a CS student to learn enough law or policy to bridge that gap.
I'm teaching cybersecurity policy, and I will get CS students, and that's fantastic.
I'll get business school students.
I want that mix.
What was your path then?
So if path is important, what was your path to – you kind of mentioned the why this is important, but how did you get there?
What was your path to make this a thing for you or even care so much?
My path was becoming more general. And path stories are interesting, but every one of us is an exception and has an exceptional and unique path.
Right.
So it's not something that's replicable, because here I am in the early 90s doing tech.
I get fired from AT&T, get laid off. And I write a book about cryptography,
which becomes a bestseller because no one knew about cryptography, the internet's taking off.
And really, suddenly, I'm in the middle of this tech renaissance. I'm good at explaining,
I'm good at making tech accessible, and I naturally get drawn into
policy. And as I start generalizing, I write about the mathematics of security. Then I write about
computer security, network security, security policy, the economics, the psychology of security.
And then my latest books are about the public policy of security. So I'm coming at it from tech, but I'm making it up as I go along.
Being a bestselling author is not a viable career path.
We can't all do that.
It is a fun thing to do, and I recommend it.
But if that's the only way, we're not getting anywhere.
So you mentioned your books, and I do have to thank you.
So when I was back in college, I was studying computer science and information security.
And I was knee-deep in Diffie-Hellman key exchanges and the one-way hashing algorithms
and really staring right at the trees.
And I was assigned to read Secrets and Lies, of which you wrote the second edition.
I think you wrote the original one pre-9/11.
This was the post-9/11 update. And in that
book, you really made it clear to me how tangible and applicable these technical nuances and details
that I was studying actually affect the real world. And it was very useful. And so I appreciate
you writing that one. Of course, you've written many books since then. It's interesting.
Your story's got a few holes in it.
So Secrets and Lies is my second book, came out in, I think, 2000.
I can actually pull it off the shelf and check.
I never updated it.
In 2003, I wrote a book called Beyond Fear.
And that's where I actually talk about the terrorist attacks of 9/11.
You're holding up the paperback, which might have been issued post-9/11, even though it was published earlier.
So I have chapter one, the introduction was copyright 2004.
Oh, so in the paperback, I wrote a new intro. Yeah, they make you do that.
Okay. So that's the one that I got.
Right. Because people think it's a new book, but you just wrote like four new pages. So fooled you.
Good play. Good play.
Yeah. Well, it was interesting. The reason I was kind of flabbergasted when you said that is because in the intro, you do say in that one that when you were making this update, one thing that surprised you is how little had changed between the two, from 2000 to 2004.
Interesting. And that's interesting because now we're, what, like 20 years from there, and I wonder, would you still say that, or has so much changed since then in the world of security specifically?
You know, it's interesting.
I mean, a lot has changed and not a lot has changed.
I mean, people still read Secrets and Lies and get a lot out of it
because a lot of it is still true.
But the threat landscape is now weirdly different.
You know, we worry about nation states in ways we weren't.
Ransomware didn't exist back then.
Business email compromise wasn't anything I wrote about.
I mean, a lot of the business of cybercrime, and almost like the business of cyber espionage, has become institutionalized in a way that we didn't really think about 20 years ago. But it's surprising how much is the same. The stuff on passwords is the same. The stuff on firewalls and IDSs and network security is the same. So both the same and different, which I think is interesting.
Yeah, it's almost like the foundations are still there, only everything's kind of just escalated, gotten more mature, more, like you said, it's been businessified.
I remember, like, the early worms and stuff. People would do them as jokes or by accident and cause major harm.
And then at a certain point, it seemed like people realized, well, if I have this virus
or if I have this attack, actually, if I keep it secret and
don't let anybody know about it, I can actually do a lot better for myself and make a lot more
money. And now you've got the Russians who do this for espionage purposes. I mean, SolarWinds
was, you know, not a worm in the same way, but you think about WannaCry, NotPetya, and a lot of
these nation-state attacks. I don't know if we really thought about the Internet as a battlefield in the same way.
We were still believing John Perry Barlow's Declaration of the Independence of Cyberspace, that nations couldn't touch us there.
What's the most sophisticated or impressive current hack or technique that you've seen, you know, in the modern era? What's really impressed you? Sometimes these things are so clever and interesting, the way that people actually go about them.
You know, SolarWinds was pretty clever, right? Subverting the update mechanism of a random piece of network management software you didn't even know you had, in a way to subvert 14,000 networks worldwide, and then pick and choose who you want to actually attack, and go in and lay whatever groundwork you need so they can't possibly ever kick you out, unless they burn their network to the ground, which nobody ever does. That was pretty impressive. What's interesting, I think, is to think back at the NSA documents that we saw because of Snowden.
This is 2013.
So it's almost a decade old now.
And a lot of that was really impressive.
They had exploits that would survive reinstalling the operating system,
like wiping the computer and rebuilding it from scratch.
And that was 10 years ago.
Yeah, right.
Hasn't advanced since then, surely.
And it's not like they've done nothing in the past 10 years.
So I think the impressive exploits are the ones we don't see.
And you never use them when they can be exposed.
If you are an intelligence organization,
Russians, Chinese, Americans, Brits, whoever,
you never use a more sophisticated attack
than you absolutely have to.
You hold on to the best stuff for later.
You hold on to the best stuff until you really need it.
If you've got a 10 and a 3 will get you in, you're going to use a 3, or maybe use a 4 to make sure. You save the 10 for when you need a 10. You don't waste it. So you almost don't really need the sophistication. The fact that there are now business models for ransomware, that's organizational sophistication as opposed to technical sophistication.
What's the actual business model of ransomware these days?
What is the business model?
The business model is to ransom and to get money.
Okay, that's easy.
But there are organizations that do this.
Ransomware is a service.
You can rent ransomware capability.
There are criminal organizations that specialize in getting in.
One specializes in getting the money.
Another specializes in turning the Bitcoin
into actual cash you can spend.
There's a whole supply chain,
international criminal supply chain.
That's incredibly sophisticated.
That all is in the service of ransomware.
One thing I heard you say about ransomware,
which is interesting to me,
I would love for you to elaborate even more on it, is that it takes advantage of the fact that most people's data actually isn't all that interesting to anybody except for themselves.
And this is, I think, the fundamental insight.
There are really two, and that's the first one.
If I steal your data, what do I do with it?
I can sell it.
The only freaking person who wants to buy it is you.
Nobody else wants your photos.
Nobody else wants your email.
If you are an important celebrity, then yes, I can sell your stuff to somebody else.
But for the average person, the average company, no one else cares.
So that's insight one.
Not to steal your data, but to block you from having it and then sell you your access
back. I mean, I think that is an enormous insight. And whoever thought of that was being incredibly
creative. The second thing that makes ransomware possible is Bitcoin. Criminals can't use the banking system. So, two problems, right? Criminals are prohibited from using the real banking system, and suitcases full of hundred dollar bills are really heavy. The only way for me to pay a ransom is through a cryptocurrency. And I'm not making this up. Go to your bank and try to wire fifty thousand dollars to a Russian account. I mean, just try. You can't.
Not like it's hard.
It's impossible.
What if you say, but they kidnapped my daughter.
I have to do this.
You can't do it.
You will not be able to do it. The banking system will not let you wire money that way.
It's not going to a reputable business, so the money can't move.
There are a lot of banking regs to stop you from doing that.
So Bitcoin makes ransomware work.
How do you feel about that?
Does that make you negative Bitcoin?
How does that make you feel about Bitcoin?
It does not make me negative Bitcoin.
Bitcoin is completely stupid and useless for all sorts of other reasons.
This is just an ancillary bad thing.
No, I mean if ransomware didn't exist,
Bitcoin would still be stupid and useless and idiotic, and we hope it dies in a fire as soon as possible.
Who's we?
Everybody who does security, basically.
Okay.
Why is that?
Why do they have that feeling?
Because it doesn't solve any actual problems anybody has.
It isn't decentralized.
It isn't secure.
It isn't anything.
It causes all sorts of other problems and has absolutely no value at all. Plus, it's a speculative bubble; people are losing lots of money.
Gotcha. So what about censorship-resistant money exchange, like the concept of Bitcoin with the peer-to-peer exchange of money? Do you think there's value in that concept of not having an intermediary between the two of us?
No, intermediaries provide value. I mean, there's no value in a system where if you're exchanging
money with somebody and they're a millisecond slower than you, you lose all your money.
And there's no recourse. That is not valuable. Sure. There's no value in a system where if you forget your password, you lose your life savings.
That's just dumb.
Intermediaries have value.
That's why they exist.
They're not just there because they hate us.
You want to exchange money?
Use Venmo.
It works great.
Why don't you like it?
I like Venmo.
Yeah.
I use Venmo.
Right.
We all like it.
And most people who actually think they have Bitcoin don't actually have Bitcoin.
The blockchain does seven transactions per second.
Most people on Coinbase, they don't actually own their Bitcoin.
Coinbase has a database, just like Venmo, that allocates ownership.
The whole blockchain is largely a myth for most users.
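To make the point concrete, a custodial exchange's "Bitcoin" can be modeled as rows in an ordinary database, with the blockchain involved only when funds enter or leave the exchange. This is a hedged sketch: the class, names, and numbers are invented for illustration, not how Coinbase actually implements its ledger.

```python
# Sketch: a custodial exchange tracks customer "Bitcoin" in an ordinary
# ledger, much like Venmo tracks dollars. Trades between customers never
# touch the blockchain, so the network's ~7 transactions/second limit is
# never a bottleneck for them. All names and numbers are illustrative.

class CustodialLedger:
    def __init__(self):
        self.balances = {}  # customer id -> satoshis credited off-chain

    def deposit(self, user, sats):
        # In reality this would be triggered by an on-chain confirmation.
        self.balances[user] = self.balances.get(user, 0) + sats

    def trade(self, seller, buyer, sats):
        # An internal database update: no blockchain transaction occurs.
        if self.balances.get(seller, 0) < sats:
            raise ValueError("insufficient balance")
        self.balances[seller] -= sats
        self.balances[buyer] = self.balances.get(buyer, 0) + sats

ledger = CustodialLedger()
ledger.deposit("alice", 100_000)
ledger.trade("alice", "bob", 40_000)
print(ledger.balances)  # {'alice': 60000, 'bob': 40000}
```

The customer sees a balance, but what they own is a database row and a claim against the exchange, which is exactly the "largely a myth" point being made.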
So one aspect of the idea of using cash,
I'm talking about actual physical cash,
is privacy concerns, right?
So you talk a lot about government espionage,
spying, et cetera.
And of course, digital currencies are easy to spy on
what your citizens are doing, et cetera.
Now, Bitcoin, public ledger, of course, that's easy to spy on and track as well.
Right. Easy to spy on. So the notion of that it's private isn't true.
I mean, Bitcoin's built on a whole lot of lies.
Right. Well, I was wondering what you think of if you look at other privacy coins, people doing things like with Monero and Zcash, if you think there's any value in those.
Not really. I mean, yes, it facilitates ransomware,
facilitates a whole lot of crime.
Know-your-customer rules,
anti-money laundering, anti-terrorist financing.
I think these are things that are valuable for society.
I wouldn't toss them out the window.
You talk to people who deal in child exploitation,
and the fact that you can move that money around
without a government stopping you, it's not good. It harms people.
This might be a little left field, but Tinder Swindler on Netflix, have you seen that?
I don't even know what this is. It doesn't sound good.
Well, yeah. I'll give you the TLDR: somebody who was just able to con people out of large amounts of money, various bank accounts.
He would con one person to con another person, credit cards, bank accounts that didn't match
his name. So when it comes to your customer and attaching a bank account to a cash app account
or to a Venmo account that doesn't match your name, that's anti-money laundering, right? That's
what you're speaking of. And so I mentioned that because this person never really used a bank account or a credit card that was in his name.
It was always somebody else's.
You know what I mean?
So when it comes to these intermediaries, I don't know how he's able to bypass this stuff, but that's what they do.
They say, okay, your account is in this name, Adam Stachowiak.
Does your bank account that you're attaching, money in or money going out match that same name?
If not, we're going to flag it for anti-money laundering.
And you have to prove you own it by way of W-2 or some sort of tax form or something.
W-2 wouldn't fit there, but some sort of tax form that you filled out that says this account is yours or whatever.
Something, right?
So that's where that comes into play, this intermediary benefit.
Yeah, I mean, I think there's real value in governance. We need governance. And you saw this in a lot of the hacks, right? The notion that blockchain money is secure: they are hacked all the time. The math is never hacked. The exchanges are hacked. The wallets are hacked. Everything else is hacked all the time.
And there's nothing you can do about it.
Or it's a complete con.
Or it's a complete con.
In a lot of ways, all of this blockchain-based finance is speed running 500 years of financial fraud.
You've got wildcat banks.
You've got Ponzi schemes.
You've got unregulated securities.
You've got pump and dump. You've got front running.
I mean, it's all there.
And it's all illegal in the normal banking world because it's bad and should be illegal.
But because nobody's regulating these blockchain-based systems yet, a lot of people are losing money.
I mean, another fun experiment, go to
Twitter and type, I'm having trouble with my Bitcoin wallet. Can anybody help me? No, thanks.
You will get a lot of responses from people who will help you. And if you follow their advice,
you will unwittingly give them control of your account. Wow. That is the way that fraud works.
And it's Bitcoin. There is nothing you can do about it.
Period. Done.
This is not great.
This episode is brought to you by Sentry.
Build better software faster.
Diagnose, fix, and optimize the performance of your code.
More than a million developers in 68,000 organizations
already use Sentry, and that includes us.
Here's the easiest way to try Sentry.
Head to sentry.io slash demo slash sandbox.
That is a fully functional version of Sentry that you can poke at.
And best of all, our listeners get the team plan for free for three months.
Head to Sentry.io and use the code changelog when you sign up.
Again, Sentry.io and use the code changelog.
So fraud leads us to one of the problems we have in computer security, which is social engineering, right? Fraud is just sophisticated social engineering.
You're tricking somebody into doing something that benefits you and doesn't benefit them.
It seems like those kind of things, like education, is really the only solution to that particular problem.
Is that what you think?
You know, not really.
Education is, to me, a lot of victim blaming. If you were smarter, you wouldn't have fallen for that. I think that's a convenient crutch to hide bad design. So think about some of the security advice that we're given. Don't click on a random URL. It's a URL; what am I supposed to do with it? Don't stick a USB stick into your computer. Wait, wait, what kind of dumb advice is that? I mean, it's a USB stick. The real problem to me is, how can we design systems so that clicking on a URL isn't dangerous? That's a design problem. Anytime you see "the user did something wrong and a bad thing happened" or "educate the user", go a little deeper and look at the design.
What is it about the design that forces us to throw this on the user? We don't talk about, oh, I don't know, salmonella in chickens or something and say, well, the user has to check. No, we have health codes, right? You got sick at a restaurant? You should have gone into the kitchen and done an inspection. Why didn't you? We don't do that.
I'm doing that next time.
I'm just kidding.
I mean, I think we need to design systems so that naive and uneducated users can be safe.
Right.
I'm flying tomorrow, first time in a while, kind of exciting.
I'm going to get on the airplane, and I'm not going to inspect the engine.
I'm not going to look at the flight logs.
I'm not going to check the pilot's training record, or did he have a mandatory rest period.
I'm not going to do any of that.
I'm going to get on the plane and not even think about it.
I don't even have to know what the safety looks like.
It's naturally done for me by a government.
We need computers to be more like
that. It can't be that you need to be an intelligent user to safely use the internet.
That's not going to fly, so to speak. So then the next question logically is,
well, how do we get there? And it sounds like the answer is policy.
And to me, it comes back to policy. I mean, because the companies want to blame the user.
You know, the companies love that we blame the user for security issues because they don't have to fix anything.
You know, so I think it's all there.
The answers are in regulation and liability.
The markets don't reward safety and security pretty much ever.
And if you want to get that right, if you want restaurants that won't poison you, or drugs that are safe, or cars that won't blow up on impact, that is always government intervention. That's the way we do it. Pajamas don't catch on fire, you know, whatever it is.
Like blankets, yeah. Gosh. Well, what's crazy is how data breaches are becoming normalized.
And they are normal.
Right, right. But the question is, whose fault is it?
So we talk about the Harvard Law School.
They deal a lot in partial liability.
There's a car crash.
It's this driver.
It's that driver.
It's this car.
It's that car.
It's the road conditions.
It's the signs and the way that the road is designed, the weather.
And they figure out who's at fault and
by how much. We don't do that in the computer world. We don't really have that notion of
liability. But some of it's going to be the fault of the vendor. SolarWinds, why did they have a
faulty product that allowed the Russians to break into their update system and send a hacked backdoored update to 14,000 customers?
You'd think they'd have some liability here.
I mean, it wasn't my fault.
Yeah.
I just had this conversation, actually, for an upcoming episode of another show we have called Founders Talk.
You have more than one episode?
More than one show? Is that allowed?
Yeah, we have six different shows and maybe more in the future.
There's no regulations, so we just do what we want.
It keeps you busy, I guess.
The conversation was really around incident management, but the opposite of that, which is reliability.
And this idea that as part of incident management and this pursuit
of reliable software, part of a good design and hierarchy of an organization is this idea of
service ownership. So when you speak to like SolarWinds and who's at fault, some organizational
things can happen to sort of showcase service ownership. So if you have unreliable software
and you get called for pager duty,
that's one way to say who's not so much at fault,
but who sort of owns it.
Could that kind of stuff begin,
like maturing engineering departments essentially,
could that begin to help more information,
more evidence to showcase who's at fault
and to how much when it comes to these kinds of hacks?
I think it makes some sense.
We're going to need to figure that out. The way to really think about liability as an improvement tool is to look at who can fix the problem. In general, in society, you want whoever has the ability to fix the problem to be in charge of the problem. So, you know, credit cards is probably a decent example. In the old days, in the early 70s, you were liable for credit card fraud on your card, right? Someone stole
your card, charged a bunch of stuff, you were liable. Now, you couldn't fix the problem.
Congress passes the Fair Credit Reporting Act 1978, and now the maximum liability for the
customer is $50. So now the credit card companies are suddenly losing money
due to fraud. So they do all sorts of things, right? They're fixing the problem, right?
They fix it, right? They start doing a real-time verification of card number
with these terminals. They start doing better anti-counterfeit protection, holograms and
micro-printing on the cards. They have the card and the pin. They mail you the card and the activation separately.
You know, all of these things.
And the biggest thing is they have these giant expert systems in the back end
looking at your spending patterns for patterns of fraud.
None of that the customer was able to do.
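As a toy illustration of the kind of back-end pattern screening he's describing, here is a sketch of a rules-based fraud check. The thresholds, rules, and data are invented for illustration; real issuers use far richer statistical models than this:

```python
# Toy rules-based fraud screen, in the spirit of the back-end expert
# systems described above. Rules and thresholds are made up; real
# systems combine many more signals and statistical models.
from dataclasses import dataclass

@dataclass
class Charge:
    amount: float
    country: str
    merchant: str

def fraud_flags(history, charge, home_country="US"):
    """Return a list of reasons this charge looks suspicious."""
    flags = []
    avg = sum(c.amount for c in history) / len(history) if history else 0.0
    # Rule 1: the charge is wildly larger than the customer's average spend.
    if avg and charge.amount > 10 * avg:
        flags.append("amount far above customer's average")
    # Rule 2: first-ever charge from outside the customer's home country.
    if charge.country != home_country and all(
        c.country == home_country for c in history
    ):
        flags.append("first foreign charge")
    return flags

history = [Charge(25.0, "US", "grocer"), Charge(40.0, "US", "gas")]
suspect = Charge(2400.0, "RU", "electronics")
print(fraud_flags(history, suspect))  # trips both rules
```

The point of the example is structural: only the issuer sees the whole spending history, so only the issuer can run checks like this, which is why the liability shift pushed the problem to the party able to fix it.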
So pushing the liability onto the companies was for society better because society
could fix it. So if you think about SolarWinds, if I'm a SolarWinds customer, I get an update,
I install it. You want me to do that, right? We want people to install updates. If we want the
update to be safe, that has to be SolarWinds' problem. No one else can fix that.
So from a societal perspective,
I want them liable for defects in the update because only they can improve the process.
The customer can't.
And then it becomes a thing they can leverage
in terms of competitions.
Like, well, who's better at keeping their software safe?
Who's better at keeping their software more reliable?
Well, this company, so I give them my business.
It becomes a competitive advantage.
Yeah, somewhat.
That tends not to work.
It tends not to be a market driver.
Think about it.
No airline advertises themselves as,
we have fewer crashes than the other guys.
Sure.
Nobody.
They don't want you to think about crash.
And they're like, don't mention the word crash.
Don't say bomb.
Right.
Cars don't. The exception was Saab; like in the 80s, they would advertise "we're a safer car," but pretty much nobody does.
Yeah, right.
Restaurants, supermarkets, they do not compete on these. "No salmonella here."
Right, right. "No salmonella here." Big sign. You never see that. And you're right.
They don't want you to think about salmonella when you're buying your chickens.
Truth.
Right.
So this isn't something the market can solve.
It is rare that you see market solutions for safety and security because they tend not to be things that are salient when someone makes a purchasing decision.
It's price and features.
Yeah, and convenience.
We've seen it over and over again.
Convenience is a feature, yeah.
Yeah, it is, but we'll trade our security,
our privacy for convenience.
All the time.
We do it all the time.
And it makes perfect sense.
Yeah, on the margins for sure.
So have any of these big breaches or cases been litigated in a way that has brought
the liability back to the vendors, or is it just not the case?
Not liability. I mean, there has been litigation. I'm not up on the current state of litigation, but there are class action
lawsuits. There are some regulatory fines. They tend to be rounding errors. The exception is going to be Europe and GDPR and privacy violations.
Europe is the regulatory superpower on the planet.
They do issue fines that companies notice and don't say, oh, yeah, that was cheaper than the attorney fees.
We'll take it, which the U.S. tends to do.
But not enough. You know, one of the problems with litigation as a driver of social change is that almost all cases never get to court where a judge decides.
They're almost always settled in private with nobody admitting any wrongdoing.
Even ones that the FTC brings to bear in companies.
So they tend not to be good models for others going forward.
I'm not sure how to fix that, but that seems to be a problem we're having.
What's your take on GDPR?
Are you happy with it?
Do you think it's worked out the way that they wanted it to?
It seems to me, in a practical sense,
there's just a whole bunch of cookie banners now that weren't there before. Was that the intent?
Yeah, I mean, in a sense GDPR was medication to stop the pain rather than medication to fix the illness.
It was a good start. It probably did what the people who wrote it thought it would, but there are too many loopholes, too many ways to get around it, too many things it doesn't do. So, you know, we can't stop there. But it is doing some good; it is not completely useless.
This only puts more pressure on your point, which is policy, this idea of tech and policy, right? Like, we need more people to have an understanding of technology to be involved in policymaking. This is an iteration. Like you said, it's a beginning. I think if we're in software, we have to believe in iteration. So we have to believe, I would imagine, in iteration at the policy level as well.
So while GDPR may be a start, it's got to be something that begins an evolution. And that begins with more and more people,
as you had said, filling this vacuum that's there, this demand for people involved in tech and policy.
I think that's right. And these are not easy problems we're talking about.
Now, the public policy of tech, I mean, look at the current battles on Section 230 and
free speech on Twitter and sort of all of these, these are not tech problems.
These are policy problems.
These are human value problems.
These are what kind of society you want to live in problems.
They're informed by tech.
You have to understand tech to think about the problems,
but you're not going to solve them with tech.
Tech is going to be a part of the solution.
So, yes, very much so.
One thing that has come up to me, though, with policy and I think even not so much to
go back to Bitcoin, but more so this idea that I think people believe in or want to
believe in this idea of a decentralized currency and crypto and Bitcoin is this lack of trust
in government.
I don't know.
I mean, I think if you don't trust government, you've got way bigger problems.
Well, isn't that what they do?
They're trying to hedge their bets against fiat currency that's controlled by government.
Yeah, you know, no, it's just a bunch of libertarian crypto bros.
It's not actually a legit sensical philosophy.
Sure.
You know, I don't buy it for a second.
Well, there's some out there that believe that,
even if it's not the majority, right?
I mean, a lot of people believe it.
It doesn't even make sense.
The point I'm trying to make to or to get to is less that,
but more so this idea that if we want to believe in policy change
and policy updates, which we do want,
I think we have to begin to trust our government more.
Or the people that trust it
less, they need to have that, that faith in it. And you mentioned Snowden and spying on folks like
that kind of stuff doesn't make you trust your government more. It makes you trust them less.
So what are your thoughts on like government trust? Yeah, it doesn't. And you know,
also the, the far right paranoia on government can't do good. There's a lot of anti-government
fear being stoked by people
who have ulterior motives. You know, I mean, the people who want you to mistrust government are
the people who want to poison your water supply and don't want anybody to stop them from doing it.
So, you know, I mean, yes, I did a book on trust. You have no choice but to trust your government.
And government actually does a lot of good in our world. But yeah, I think you are right that mistrusted government is a problem here
and a bigger problem than this, and it is one that we do have to solve
to figure out how to get back to the notion of good government doing good things.
Right. Well, it doesn't help when, as technologists,
we see these congresspeople questioning, talking about technologies and they're completely, you know, out of their depth.
They have no idea what they're talking about. It's hard to trust that person. Yeah.
All right. Remember who asked Mark Zuckerberg, how does Facebook make money? A legit question asked at a Senate hearing.
Like you people are trying to govern this and you have no idea that Facebook makes money by selling ads.
Right.
We sell ads.
Which is why I think skepticism of government regulation in that circumstance is, I think, well-founded.
Having said that, you're trying to change that.
But no government regulation is worse.
That's the problem.
Sure.
I guess my point I'm trying to drive at is you're trying to change that by having a more well-informed policymaking body, right?
Like you're trying to instruct.
I'm sure – do you advise policymakers as an expert?
I have.
It is not – I mean I know people who do that sort of full-time who work on congressional staffs and committee staffs, and they do really good work.
I mean I do some of it,
but that is not the one thing I do. But, you know, I'm trying to teach here a generation of people going into public policy, teach them how to, you know, listen to technologists,
figure out what they're saying. I'm really trying here.
Yeah. So you're talking to an audience of software developers and technologists,
you know, what would you teach us or instruct us? What can we do, you know, in our little part of
the world, whether we're an independent contributor on a large code base, or maybe we're starting a
new business in a software as a service, you know, we're building these things of the future.
What are the kinds of things that we can be doing now to push things in the right direction versus
the wrong?
You know, I want us to think about the policy implications of what we do.
So this is actually interesting.
A few years ago, Google invented a new job title.
And I think it's called "product counsel."
And so here's the idea.
That in the old way of doing things is the engineers would build the thing. And at the end, they'd show it to the attorneys and say, well, we go to jail if we do this.
Is this good? Is this bad? And the attorneys would give an opinion.
And Google realized it's way better to embed the attorneys into the design team from the beginning where the changes are cheaper.
Right. Where the attorney can say, you know, if you did it this way and not that way, it's better.
And that's what Google does.
It's a great idea.
I think we need staff policy people, right?
I want a policy person on the design teams of these systems from the beginning to do
the same thing, to say, you know, if you did it this way, your thing won't be racist. Isn't that better? Yeah. Instead of at the end, when it's too late, and suddenly your system is racist and everyone hates you. So, you know, I want us as developers, as techies, to be more open to non-tech input into our designs and development from the beginning.
I think that is incredibly valuable.
And if we can take into account human flourishing, the environment, lots of policy things. I think that that would be better.
What's the path then to get, so if this is something you think could be on the up and coming
SaaS, for example, or the up and coming next thing happening that is, you know, maybe
a well-funded company, $50 million series A, you know, half a billion
dollar valuation, which is pretty common for a SaaS business. How do they find that kind of
person? Are they going through your course? Like where is the... Yeah. So this is the hard part,
right? This is, we started with this, right? What's the career path? Right. We're back to
the beginning. Yeah. And these jobs are out there. You know, my students are getting hired by tech companies to do tech policy, but there's
no good job board.
There's no way I can say here, you want to do this?
Here's where you go.
We're working on it.
The Ford Foundation is trying to build these paths, these systems, but it's not yet there. So I don't have a good answer, and that's bad, right?
I mean, I want to have a good answer, an easy answer to your question, right? You want to do this? Go do this thing.
Yeah, right. And there's a career path for you.
You could just go on Twitter and say, "I have some policy help needed," and hope you don't get hacked or swindled or whatever it might be, right? Maybe that could be one path.
That's getting right to your "Bitcoin wallet address here, for policy."
Yeah. You know, everyone I know who finds these jobs, they're all exceptions. And I do try to pay attention, because I get a lot of students asking me, you know, "I'm looking for a job, what do I do?" Yeah. But, you know, certainly Facebook and Google and the big guys hire them. But a lot of my students don't want to work for them, because they're, like, evil. They want to work for some smaller, more interesting company that's doing some social good.
So you mentioned this good idea inside of Google. You mentioned that we should have, you know, policy decisions coming in at the beginning, when we're starting software projects. And it's making me think of sharing ideas, like idea sharing, like, "this is a good policy."
beginning when we're starting software projects and And it's making me think of sharing ideas, like idea sharing, like this is a good policy.
It makes me think of open source.
And we talked about how cybersecurity has kind of grown up over the last 20 years, gotten
way more serious, ratcheted up the stakes.
Open source has also matured during that time and gotten corporate and everything.
Both good and bad, yeah.
Yeah.
And I'm just curious, how does open source weave into this
story, if at all, and what do you think
is good and bad about it? I don't think it weaves
into the story. Open source is a
thing. There's a myth
that open source is more secure than
closed source. That's not true.
Software that's more secure
is software that's been looked at.
There's two ways to have that happen.
One is you can be Microsoft and hire people to look at your software. And two, you could be Linux and
put it out there and lots of people look at it. But you could also be a company like most software
companies that doesn't hire anybody to look at their software. You can be like most open source
projects and nobody looks at it anyway. So open source is another path, but it is not sort of a magic elixir. So I don't think open source versus closed source really matters here in any important way.
Right. I was thinking more like open source ideas applied to policies, right? Applied...
So now here we're getting interesting. And it's open source ideas. It's agile computing ideas.
Right.
How do we make policy at the speed of tech?
That's actually hard.
The story I'll tell in class is about drones.
And if you remember the history of drones,
drones start appearing on the consumer market,
and everyone says you can't regulate drones.
It is too early.
"You will destroy the nascent industry." And then one year everyone gets one for Christmas, and then you can't regulate drones. It's too late. Everybody has them; we're already flying them. Right? There was never a moment when it was right to regulate drones.
Now, this is, I think, a microcosm of the problem we have. In the beginning, you don't know what to do; it's too early to do it. And at the end, there are too many, I don't know, rich lobbyists preventing you from doing anything. So how do we navigate that? This is actually, I think, a very big problem of regulation in the 21st century, and way bigger than security or anything we're talking about. And, you know, it's something that we really need to think about. Now, can we use the ideas of open source, or agile, right, agile software development, and apply it to legislation, apply it to policy? I think the answer is yes. I don't know how, but we need to figure it out. Yeah.
What about the flip side of that on open source in terms of an attack vector?
What are your thoughts as a security person?
You know, again, open source and closed source both have attack vectors.
We have seen open source attacked.
A lot of open source projects are very poorly maintained, or maintained by a hobbyist who doesn't have a lot of security. We've seen open source projects being taken over by malicious actors and being subverted. But, you know, you see a lot of this with proprietary software as well.
I'm not sure it's a difference that makes a difference.
It kind of does some interesting things to open source too,
because you,
to make open source more secure in some ways, you have to put money involved.
You have to put organization involved, potentially more people involved, eyeballs or just more watchers, which essentially turns it into like a mini organization, which isn't necessarily proprietary software.
It's still open.
It's still open source.
It's still a permissive license, all that good stuff, which is the virtues of open source. But it does create a lot of complexity around the idea of open source.
And there's also a tragedy of the commons, right? If everyone's using this open source project in their software, everyone assumes somebody else is evaluating it, and then nobody evaluates it. And we see this a lot. Log4j was an example of that, right? Everyone thought someone else was paying attention. Log4j was actually just this guy, and suddenly there's this huge vulnerability. So there is a fix happening now. I think it's the Open Source Security Foundation that has set up a program, and they're getting the big tech companies to put in money. I think Google and Microsoft each put in $5 million.
We're all going to evaluate these open source projects.
So this third party is going to do the work.
The big companies that benefit are going to put in money, and everyone benefits.
And it's called, I think, the Alpha Omega Project.
The idea is they're going to look at the most popular and critical open source projects really carefully, which is the alpha,
and then like run automatic vulnerability scanning tools against the top 10,000 libraries.
That's the omega.
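The "omega" side, running automated checks across thousands of libraries, boils down to matching each dependency's pinned version against known-vulnerable ranges. A minimal sketch of that idea follows; the advisory entries here are illustrative made-up data, and real scanners pull from databases like OSV.dev rather than a hardcoded table:

```python
# Toy dependency audit: flag pinned packages whose version falls inside a
# known-vulnerable range. Advisory data is illustrative only.

def parse(v: str) -> tuple:
    """Turn '2.14.1' into (2, 14, 1) for ordered comparison."""
    return tuple(int(x) for x in v.split("."))

# (package, vulnerable_from, fixed_in) -- hypothetical, simplified entries
ADVISORIES = [
    ("log4j-core", "2.0.0", "2.17.1"),   # Log4Shell-era range, simplified
    ("examplelib", "1.0.0", "1.4.2"),
]

def audit(pins: dict) -> list:
    """Return names of pinned packages matching an advisory range."""
    flagged = []
    for name, lo, fixed in ADVISORIES:
        v = pins.get(name)
        if v and parse(lo) <= parse(v) < parse(fixed):
            flagged.append(name)
    return flagged

print(audit({"log4j-core": "2.14.1", "examplelib": "1.4.2"}))  # → ['log4j-core']
```

The hard part in practice isn't this comparison; it's maintaining the advisory database, which is exactly the shared cost the tragedy of the commons leaves unpaid.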
And, you know, can we sort of bypass the tragedy of the commons
and then get some real evaluation of these things
that it turns out we're relying on,
even though we don't realize it.

This episode is brought to you by our friends at FireHydrant.
Fire Hydrant is the reliability platform for every developer.
Incidents, they impact everyone, not just SREs.
They give teams the tools to maintain service catalogs, respond to incidents, communicate through status pages, and learn with retrospectives.
What would normally be manual, error-prone tasks across the entire spectrum of responding to an incident, they can all be automated in every way with FireHydrant. They have incident tooling to manage incidents of any type with any severity with consistency,
declare and mitigate incidents all from inside Slack.
Service catalogs allow service owners to improve operational maturity and document all your
deploys in your service catalog.
Incident analytics allow you to extract meaningful insights about your reliability over any facet
of your incident or the people who respond to them.
And at the heart of it all, incident run books.
They let you create custom automation rules to convert manual tasks into automated,
reliable, repeatable sequences that run when you want.
You can create Slack channels, Jira tickets, Zoom bridges instantly after declaring an incident.
Now your processes can be consistent and automatic.
The next step is to try it free.
Small teams up to 10 people can get started
for free with all FireHydrant features included. No credit card is required. Get started at firehydrant.io. Again, firehydrant.io.

And by our friends at Sourcegraph. They recently launched
Code Insights. Now you can track what really matters to you and your team in your code base.
transform your code into a queryable database to create customizable visual dashboards in seconds.
Here's how engineering teams are using Code Insights.
They can track migrations, adoption,
and deprecation across the code base.
They can detect and track versions of languages or packages.
They can ensure the removal of security vulnerabilities
like log4j.
They can understand code by team,
track code smells and health and
visualize configurations and services. Here's what the engineering manager at Presley has to say about this new feature, quote: "As we've grown, so has the need to better track and communicate our progress and our goals across the engineering team and the broader company. With Code Insights, our data and migration tracking is accurate across our entire code base, and our engineers and our managers can shift out of manual spreadsheets and spend more time working on code," end quote.
The next step is to see how other teams are using this awesome feature.
Head to about.sourcegraph.com slash code dash insights.
This link will be in the show notes. Again, about.sourcegraph.com slash code dash insights.
One thing that's amazing to me about you, Bruce,
is just how long you've been going at it.
Stop telling me I'm old.
The second time.
First time was about the book you had when you were in college.
I'm getting tired of this.
Longevity.
I'm speaking to your longevity, not your age.
So you've been doing this monthly newsletter, The Cryptogram,
and I think I did subscribe to it after reading the book,
and I just subscribed to it pretty much my whole adult life now.
And since 1998.
That's three.
Sorry.
My question is, what drives you?
Like, how do you stay so on this?
Like every month, this thing.
And what I find about it is a lot of times I just read the headlines because there's so much in there.
I mean, you're writing a lot.
You're logging a lot.
How do you keep it going, man?
So here's the story. So Cryptogram is from, like, 1998. I started a monthly newsletter.
So that was back when email newsletters were cool the first time before they got uncool and now they're cool again.
Well, you're cool again now.
And then I turned that into a blog in 2004.
And that's like the second wave of blogs when blogs were cool before they were uncool.
And now I guess something else is cool.
I don't know what's cool now. So the monthly Cryptogram is now just a compilation of the daily blog, right? So, I don't know, you see it in the email; some people see it on my website. And I have been doing it pretty much every day, weekdays, since 2004.
That's a long time. That's impressive.
And a lot of it is, it forces me to stay current, right? It forces me to start reading around and, you know, seeing what's happening, seeing what's being talked about. And that's good for me. I get a lot of my entries and news items from readers.
I get a lot of email, which is really useful to me.
So some people will send me links all the time.
And that is something I use to stay current.
So that I really appreciate that.
Any listeners who send me emails when they see a good crypto story.
Thank you. Keep doing that.
And then, you know, to me, writing is how I understand things.
So it's how I process the news.
It's how I process all of this.
So you're seeing my process of processing.
I just do it a little bit in public.
Yeah. Super cool. What would you say over the last, since you said you've been doing it daily
since 2004, but it's been longer than that. Maybe give us a trip down memory lane. What are some of
the biggest, most surprising things you saw in security?
Oh man, I don't even know. I'm terrible at memory lane. I really am.
Let's say the last five years, last couple of years, what are some of the biggest deals?
Well, I mean, I remember writing about September 11th terrorist attacks. And I mean, it was the first time I ever did an issue
out of sequence. And I wrote a bunch of articles, I think, and I go back and read it. And this is,
you know, September 30th, 2001. I'm writing about, I think a lot of things that were,
became part of the debate years later. I thought was really kind of interesting.
Didn't you coin the term security theater?
Security theater, right? I invented that term.
I think that's my contribution to popular culture, if that's what you want to call it.
Yeah, yeah.
It's the notion of security theater.
The other thing I was going to call it was Potemkin security. But it turns out that
surprisingly few people younger than me recognize the term Potemkin village. Yeah. Right. It is a
Cold War term that people don't know anymore. Did that term come out of the post 9-11 Patriot Act?
No, Potemkin village is from communist Russia.
No, I mean security theater. Like, when were you thinking about it?
Security theater? Yes.
Yeah, I mean, I coined the phrase soon after 9-11.
I mean, Wikipedia has the actual origin of where.
What does it mean?
What does it mean?
It means people are acting like it's secure, but it's just for show.
Yeah, security theater is, so the example I would use right after 9-11, I don't know if you remember, there were National Guard troops stationed at airports.
They were just inside security, off to the side in uniform, holding a big gun.
Those guns had no bullets.
Because, my God, a 22-year-old with a gun in an airport, what could possibly go wrong?
You do not want to give him ammunition.
But it was there to make people feel better.
It actually was theater to make people feel safer flying.
You got any modern examples of security theater things that are going on today, maybe?
You know, there's a lot of COVID theater.
Yeah.
Right?
There's a lot of health measures that make no sense.
Remember people wiping down their mail?
It's amazing what FUD will do to you, you know?
It is amazing. Wear the mask on the way into the restaurant, but once you sit down, you're safe, right? And you take it off. What is it? What are we doing here?
Yeah. But, you know, some of that is valuable, because if people are more afraid than they should be, then a little theater is good. But some of it just makes no sense.
Yeah, it's perception too. It's like a perceived threat.
It's all about perception because,
because fear is a perception.
Yeah.
Even to yourself.
Security is a feeling and a reality.
It's both.
And they are different.
Yeah.
You can feel secure when you're not,
and you can be secure and not feel it.
Yeah.
There's an old saying,
just because you're not paranoid doesn't mean someone's not out to get you.
And just because you are paranoid doesn't mean people are out to get you.
Exactly.
You can say it either way.
It's both true.
Yeah, that's why I like it.
Yeah.
That's funny.
Jared asked the question before about developers and what they could do, building systems tomorrow. And you kind of mentioned some of the things they could do, which was essentially find somebody in policy and hire them,
though the supply of them is challenging to find because the path is challenging.
What else would you share with today's technologists
that they need to know?
Things that you're preaching that software devs, engineers,
leaders of engineering departments, people building products
should know about the state of security today
that they don't know. The state of security, the state of engineering departments, people building products, should know about the state of security today that they don't know?
The state of security, the state of the world,
that we are used to thinking that what we do ends at the keyboard and screen.
And it turns out it's not true, that the stuff we write affects the world,
affects society, affects people, affects human flourishing. And we need to think that way. We really do. We need to think that, you know, when we build a software product, we're building a world.
And, you know, it's an old story, but I think it's a good story. Do you remember Friendster? Friendster was a social network before MySpace, right? Really old. They had something called the top eight. You could have
as many friends as you want, like any social
network, but the top eight
would appear on your home screen.
It was eight. It was not six. It was not ten.
It was eight. Whatever reason, some programmer decided
eight. Multiple of two. Power of two.
We're good. In high schools
all across the country,
who your top eight friends suddenly mattered.
Now, the engineer just picked a number.
But wouldn't it be great if there was like a teen psychologist who said,
no, no, no, if you make eight, it's going to be a disaster.
Make it 12.
You must make it 12.
Engineer would say, okay, it's 12.
Unintended consequences.
Right.
And it used to be the unintended consequences didn't matter.
Nobody cared how Usenet worked because Usenet wasn't used for anything important ever.
No one kind of cared in the beginning how email worked.
But now it matters.
Now the unintended consequences can affect democracy.
And maybe we should pay a little more attention to that.
Yeah. So my advice is that your tech system is not only a tech system; it is a human system, fundamentally. And you need people who understand that on your design and development teams.
So you're saying, don't move fast and break things.
Move deliberately and fix things.
There you go.
There you go.
Bruce, one thing I've noticed,
and you confessed it here today,
that you live in your email.
I guess, in a sense,
it's kind of your primary social network, right?
That's how you communicate.
Yeah, and I'm not on any social network.
I come from the generation where email is my life. I mean, email is my life. Yeah.
And, of course, kids these days don't use email.
They use text.
And they send me a text.
What are you doing?
Just send me an email.
I was sending you a calendar invite.
You're like, please don't.
Just email me.
Don't send me a calendar.
Just send me an email.
Stop it.
Keep it simple.
It made me wonder about some of your personal practices, whether it's privacy or security best practices. What do you do in your life of technology that may be different or unique, or at least is notable for people who want to be like you?
Yeah, you know, I wouldn't recommend what I do to anybody, because a lot of the stuff I do that's unique is, like, not using normal technology, like calendar invites.
Okay, I mean, I don't use the cloud for anything. That makes me a weirdo, right? I don't
keep my email in the cloud.
Is that because you know better?
No, because I've always done it my way, and that means I can do my stuff when I don't have an internet connection. Okay.
Man, I hate Google Docs. I mean, I don't, and it does make me a freak and hard to get along with. So I'm hard pressed to, like, give you my advice as something to follow. I think I'm a cautionary tale, something to avoid.
Okay, but yet you do it.
Yet I do it, right? You know, and I can get away with it, because I can be ornery and you still want me on your show. But if I was someone, you know, less important, you'd say, "Who is this idiot? We're not going to interview him. He doesn't even use a calendar invite."
Well said. Well said. You don't use the cloud. I think, are you involved, at least the company,
there's a tie to the solid project. I wanted to ask you about that project, Tim Berners-Lee,
solid. We talked about decentralized networks with cryptocurrencies, but here's one that's
decentralized storage. It's got, of course, Tim Berners-Lee attached to it. So it sounds
like it's interesting.
Are you attached to that somehow?
Are you working on that?
So I am.
So I'm a big fan of decentralization, which is you don't use a blockchain.
Okay.
Email is decentralized.
I can send an email to anybody regardless of what they're using, which is different than like a Facebook message.
SMS is decentralized.
Webpages are decentralized.
Decentralization is great, and Solid is a vision of decentralized data. The idea being, right now your data is siloed, right? Fitbit has your health data, and your phone has your location data, and someone else has your photographs, and on and on and on. Wouldn't it be great if all your data was in one place and you got to decide? And then you can do things with your data that you couldn't do otherwise.
You know, I don't know, my airline has a lot of my data in this frequent flyer program. So does the hotel program I like. They don't actually want my data; they just want access to it. If my data was in my, and the term Solid uses is "pod," I have control over it. I can see who accesses it. I can give permissions.
If my address changes, I just change it in my pod and it propagates everywhere.
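That access model, one authoritative copy, per-consumer permissions, and a visible access log, can be sketched as a toy. The class and field names below are made up for illustration; the real Solid specification is far richer than this:

```python
# Toy model of the "pod" idea: one copy of your data, per-consumer
# permissions, and an access log. Illustrative only.

class Pod:
    def __init__(self):
        self._data = {}          # field -> value, single source of truth
        self._grants = {}        # consumer -> set of readable fields
        self.access_log = []     # (consumer, field) tuples

    def set(self, field, value):
        self._data[field] = value            # update once, seen by all readers

    def grant(self, consumer, *fields):
        self._grants.setdefault(consumer, set()).update(fields)

    def read(self, consumer, field):
        if field not in self._grants.get(consumer, set()):
            raise PermissionError(f"{consumer} may not read {field}")
        self.access_log.append((consumer, field))  # owner can see who looked
        return self._data[field]

pod = Pod()
pod.set("address", "1 Old St")
pod.grant("airline", "address")
print(pod.read("airline", "address"))   # → 1 Old St
pod.set("address", "2 New St")          # change once; no stale copies to chase
print(pod.read("airline", "address"))   # → 2 New St
```

The design point is that consumers hold a grant, not a copy, so an address change is a single write, and every read leaves a trace the owner can inspect.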
You know, I had to fill out, I mean, I had to download an app because I'm going to Spain to type in my health
information to get a QR code. So when I land in Spain, I can show it and get into the country.
So I'm entering my data again and again and again, right? That doesn't make sense.
And once this app has it... I don't even know what they're going to do with it. I have no idea what Spain's going to do with my data. For all I know, they're going to sell it to Cambridge Analytica. They could, who knows? And this is a way of thinking about data that puts
people in control. It's actually a way that actually solves the problems that GDPR tried to
solve. So yeah, I'm involved. I think it's a big deal. He's really important. And I think it's
valuable. That guy is Tim Berners-Lee. It could conceivably change the world; he has a track record for doing that. So yeah, I'm super excited about it.
What's the status? Is it usable? Is it private?
It's a couple of things. It is a W3C standard, so it's a web standard. There's also a company. So I'm actually involved in a company called Inrupt, which is basically the Red Hat of Solid, right? They are making a commercial server and system for the public standard. So there's a free server and all kinds of free tools, but there's also these commercial series of tools.
Can you use it today?
Yes, you can use it today. You can get your pod. You can do it. You know, you kind of got to be a techie to use it today. It's like the early days of the web. You had to program HTML to use it. The early browsers were not at all intuitive. So it's early for regular people. It's like, here's my data, and I have integrations to that data, and I can give them permission. I mean, having everybody's data is a huge liability for Marriott hotels.
What they actually want is access to your data when they need it.
If they knew they had that, they wouldn't need to store a copy locally.
Because that is just dangerous.
Right.
But you can't guarantee access, so they need to store a copy locally.
Fixing that, I think, is important.
Have you ever had to write a contract and have somebody sign it?
Yeah.
It's challenging though, right?
Oh yeah. Right. Because contracts are very human.
Right. So the reason why they're five years and not one year is because every time you
have to go back, you're reminding them. Right. And so I think maybe the challenge, however,
with Solid might be that, okay, Hilton wants access, but man, they're accessing it
quite often, way more than I want. Whereas if they actually had it, they could do whatever they wanted
and access it whenever they wanted, which we don't want, actually, right? We don't want to have
that kind of thing. Now, I would like to know when they're using it and what they're using it for.
Yeah, it seems fair. Yeah. And maybe you could even
build payment layers on top. So now, instead of, you know, Facebook selling my data... I just changed banks, and I had to give them a whole lot of data. Why can't I just
say, here's my pod, you now have access to all that data? Right. Done. Or, for a dollar,
you can have access to my data. Well, you know, but I'm opening a bank account.
I kind of want the data.
Oh, I know.
I'm just thinking in general.
So there's a transaction here.
I want to give them the data.
Sure.
I just don't want to type all the damn stuff in again.
Right.
That reminds me of a great quote from your book Data and Goliath.
You said, data is the pollution problem of the information age, and protecting privacy is the environmental challenge.
I like that framing. I think it plays well into this whole Solid idea, doesn't it?
I do too.
You know, it's actually a pretty rich metaphor. Because if you think about it,
all computer processes produce data. It stays around, kind of festering. We spend a lot of time
talking about its reuse, how it's recycled,
how it's disposed of, what its secondary characteristics are. And if you
think back to the early decades of the industrial age, we as a society kind of ignored pollution
in our rush to build the industrial age. Today, we are ignoring data
in our rush to build the information age.
And I think just as we look back
at those people 100 years ago and say,
how could you have been so short-sighted?
We will be judged a couple of generations from now
on how we could be so short-sighted.
So I actually think the metaphor is really robust.
The data is the pollution problem of the information age.
Well, I'll be fascinated to see how solid goes.
Hopefully it gets adoption because I do think from what I've read about it
and what you're telling me about it,
I think it has a lot of fundamental things done well.
It is a huge chicken and egg problem in all of these.
But we're getting traction with governments, oddly enough. The notion of a country giving every citizen a pod, because governments also don't want their citizens to have to type the same stuff in again and again, and they want them to be able to share data among different government agencies.
So mostly in Europe, but governments seem to be the early adopters here, which is weird
because government as early adopter is like an insane thing I just said.
Yeah.
Yeah.
That was surprising actually too.
Well, on the note of policy meets technology meets repetition, I would just
like to take a moment to say to the Stack
Overflow folks, I've already accepted your
cookie policy, okay?
I don't want to accept it every single time
I come to your website.
Every single time. It's like, you know, you
should remember, I just said you can keep my
cookies, so put that information in a cookie
and store it, so I don't have to
accept your cookie policy every time. Or just put it into the browser. Bake it into the browser.
They might be required by the regulation to ask you every time. I don't know the answer to that, but that's
interesting. Right. And now, can we solve that by you having your cookie policy in your browser? Yes.
So it would check, you know, what is this person's cookie policy?
Did he change his mind?
I mean, we need to get you to change your mind.
So how do we do that?
So we're solving it the dumb way by asking you every time, do you consent?
Do you consent?
Right. Maybe you can put your consents in some kind of accessible document that they can look at.
But here again, right, this is a problem.
Yeah.
We have to ask you every time something we've asked you already.
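The "consent preference in the browser" idea being floated here isn't purely hypothetical: browsers can already send the Global Privacy Control signal as a `Sec-GPC: 1` request header. The banner logic below is my own illustrative sketch, not any site's actual implementation, showing how a site could consult a machine-readable signal and its own remembered answer before re-asking.

```python
def needs_consent_banner(headers, remembered_consent):
    """Decide whether to show a consent banner on this visit.

    headers: dict of HTTP request headers from the visitor's browser.
    remembered_consent: the visitor's earlier yes/no answer, if we
    stored one (e.g. in a cookie), else None.
    """
    if headers.get("Sec-GPC") == "1":
        # Browser already carries a machine-readable "do not sell/share"
        # preference, so there is nothing to ask.
        return False
    if remembered_consent is not None:
        # We already asked once and remembered the answer.
        return False
    # No signal at all: ask once, then remember the answer.
    return True


print(needs_consent_banner({}, None))              # True: no signal, ask once
print(needs_consent_banner({"Sec-GPC": "1"}, None))  # False: browser signal
print(needs_consent_banner({}, False))             # False: answer remembered
```

The point is the one the hosts are making: either a browser-level signal or a stored answer should suppress the banner, so "ask every time" is a policy choice, not a technical necessity.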
Sounds like something that they might build into the Google Chrome browser.
And I know from our previous conversations that you refuse to use such things.
I'm curious.
Yeah, but if you build a Chrome browser, it would be like default spy on you.
Default spy.
Yeah.
Firefox fan over here.
I am a Firefox user, yes.
Fair enough.
Fair enough.
All right, Bruce, well, we've used lots of your time.
I really appreciated this conversation.
Adam, any other questions before we let him go?
I'm clear.
This was fun.
This was a lot of fun, Bruce.
I appreciate it.
I'm going to catch up on one of your books.
All right.
I mentioned them at the top of the show.
That's cool.
Which is the one that we should read?
I love them all for different reasons.
So the newish books that are worth reading, I'm staring at my shelf.
Data and Goliath is about data and privacy.
After that, I wrote a book called Click Here to Kill Everybody,
my favorite title, which is really about the internet of things and safety. And before that,
I wrote a book called Liars and Outliers, which is about trust and how systems enable trust.
So those are my three most recent. I'm coming out with a book next year, which is due in two weeks.
So I'm kind of panicky about this, which is really about hacking society, broader social systems.
Wow.
What's that one going to be called?
Can you tease us?
Probably A Hacker's Mind, though we're still finalizing the title.
Nice.
Very cool.
Yeah.
Bruce, thank you so much for all your wisdom.
And you know what,
honestly, with book writing, while you may not become rich and famous because of it,
you will become rich in terms of helping other people. I think that nobody writes books to make money, with the exception of, like, the top New York
Times bestseller thrillers, right? Nobody writes books to make money. Your wealth is in the
appreciation of the knowledge you're sharing. That's my point.
Right.
For someone to say, I read your book in college and it changed my life,
that's like the best compliment you could ever get.
Yeah.
So that's the point: thank you for sharing your wisdom.
I appreciate that.
Hey, thank you for having me.
All right.
That's it for the show.
Thank you for tuning in.
Thanks again to Bruce for joining us to talk about such an important subject.
What are your thoughts on the security of tomorrow's software?
Let us know in the comments.
Links are in the show notes.
If you haven't yet subscribed, now is the time.
Head to changelog.fm for all the ways.
And if you dig what we're doing on this show, you might enjoy our other pods in the ChangeLog podcast universe.
For fans of my show, Founders Talk, I've been cranking out some awesome episodes.
Here's a clip from episode 89 with Sid Sijbrandij, CEO of GitLab, on the biggest difference between GitLab and GitHub.
Yeah, there's strong network effects around open source projects.
So if you're going to host your open source project somewhere, you can pick either.
But there's an incentive to be on GitHub because a lot of open source developers are already there.
That network effect is much reduced if you're talking about a company.
If it's a company choosing a platform,
they can just tell all the people in the company working on the proprietary code to use it.
So that's something where we specialize. GitLab is an open source
platform that mostly hosts closed source code. GitHub is the opposite. It's closed source,
and they're really good at hosting open source projects. So we've chosen different adventures,
and we're really comfortable with our adventure: making companies more productive
by having a DevOps platform that allows them to go quicker from planning something to getting it out
there and getting the feedback, by integrating all the steps of the DevOps lifecycle in a single
application, a single data store, and making that work really, really well.
Continue listening to that pod
at founderstalk.fm/89. That is 89. Big thanks again to our friends and partners
at Fastly. Check them out at fastly.com. Also to Breakmaster Cylinder for making those awesome
beats. They're fresh, they're banging, and we love them. All right, the show's done.
Thank you again for tuning in. We'll see you next week.