Bankless - Cryptographer David Chaum | Layer Zero
Episode Date: August 9, 2022

David Chaum is a computer scientist and an OG member of the cryptography community. His innovation has driven much of the mathematical foundation of cryptocurrency, and he continues to work on projects like xx Network. In 1985, David issued this warning to the public: "Computerization is robbing individuals of the ability to monitor and control the ways information about them is used. Already, public and private sector organizations acquire extensive personal information and exchange it amongst themselves… The automation of payment and other consumer transactions is expanding these dangers to an unprecedented extent." In today's surveillance capitalism, it's clear that we need cryptography now more than ever.

------
📣 Forta | Help Make Web3 a Safer Place https://bankless.cc/Forta
------
🚀 SUBSCRIBE TO NEWSLETTER: https://newsletter.banklesshq.com/
🎙️ SUBSCRIBE TO PODCAST: http://podcast.banklesshq.com/
------
BANKLESS SPONSOR TOOLS:
🚀 ROCKET POOL | ETH STAKING https://bankless.cc/RocketPool
⚖️ ARBITRUM | SCALING ETHEREUM https://bankless.cc/Arbitrum
❎ ACROSS | BRIDGE TO LAYER 2 https://bankless.cc/Across
🦁 BRAVE | THE BROWSER NATIVE WALLET https://bankless.cc/Brave
🌴 MAKER DAO | DECENTRALIZED LENDING https://bankless.cc/MakerDAO
🔐 LEDGER | SECURE STAKING https://bankless.cc/Ledger
------
Topics Covered:
0:00 Intro
6:00 David Chaum
11:30 60s Technology
20:00 60s Culture
23:55 A New Direction in Cryptography
33:30 The Fork in the Road
41:50 Minimum Disclosure
48:25 Traffic Analysis
56:05 Surveillance Capitalism
1:02:00 The State of Data Today
1:12:00 New Ideas and Society
1:15:40 Advice to Crypto
------
Resources:
David on Twitter: https://twitter.com/chaumdotcom?s=20&t=u3ULosGOPIOTPZobdnMKkw
David's Website: https://chaum.com/
------
Not financial or tax advice. This channel is strictly educational and is not investment advice or a solicitation to buy or sell any assets or to make any financial decisions. This video is not tax advice.
Talk to your accountant. Do your own research.
Transcript
Welcome to Layer Zero. Layer Zero is a podcast of unscripted conversations with the people that make up the cryptography community.
Cryptography is of course built by code, but it's composed by people.
And each individual member of the cryptography community has their own story to tell.
Cypherpunks understood that the code they write impacts the people that use it.
And Layer Zero focuses on the people behind the code because cryptography is people all the way down and it always has been.
Now, you might have noticed a little difference in that intro.
And that's because I have on not a member of the crypto community,
not a member of the Ethereum community, but a member of the OG cryptography community.
And I'll admit, it doesn't make complete sense to say that cryptography is people all the way down,
because actually cryptography is one of the few things that is strictly math through and through.
Today on the show, I have on David Chaum.
If you don't know who that is, you need to do a little homework, because we live in an industry that is built upon David Chaum's work.
He's the founder of DigiCash, one of the early experiments into online cash, but mainly,
one of the big proponents of privacy at all costs, where not just the data about a message is
private, but all of the metadata is private too. And I'd like to read a quote out to help emphasize
the importance of who David Chaum is and what he's done for this industry. This is Chaum's warning to the
world in 1985, where he talks about the dangers of user data that is building up around computing
systems. He says, computerization is robbing individuals of the ability to monitor and control the
ways information about them is used. Already, public and private sector organizations acquire extensive
personal information and exchange it amongst themselves. Individuals have no way of knowing if this
information is accurate, outdated, or otherwise inappropriate. New and more serious dangers
derive from computerized pattern recognition techniques. Even a small group of them tapping into
data gathered every day in consumer transactions could secretly conduct mass surveillance,
inferring individuals' lifestyles, activities, and associations. The automation of payment and
other consumer transactions is expanding these dangers to an unprecedented extent.
Projecting the vision of these two futures, one built with current technology and one
built with decentralized services, David Chaum saw that the two approaches appear to hold
quite different answers. Large-scale automated transaction systems are imminent, and as the
initial choice for their architecture gathers economic and social momentum, it becomes increasingly
difficult to reverse. Whichever approach prevails will likely have a profound
and enduring impact on our economic freedom, democracy, and our informational rights. David
Chaum said this in 1985, before we had surveillance capitalism, before we had Web 2, and before we
had the state of, you know, Facebook, Instagram, all this collection of metadata and the
reselling to targeted ads and all these other practices that we now deem evil in this world,
but we now also have to live with. And so David Chaum saw this in 1985, which is one of the reasons
why he's such a fantastic character, and he's been dedicating his life to building private
cryptographic systems that protect users' data and not just the contents of messages, but all
of the data that those messages leak, as in who is the sender, who is the receiver, how big is
that message, what time was that message sent, all of these bits of data that create a
Sudoku puzzle that's solvable to track who we are. We live in the world of what we now call
the metaverse, but what was previously called just our online footprint. David Chaum, his
name is on this shirt.
If you aren't watching the podcast, this is the cryptographer shirt that we made with MetaFactory.
It's the shirt with all the cryptographers that this industry is built on.
And so I will stop with my gushing and allow this conversation to progress into one of the core cryptographers
that this entire crypto industry is built on right after we get to some of these fantastic sponsors
that make the show possible.
There is a brand new staking feature in the Ledger Live app today.
We all like staking the assets that we're bullish on.
And now you can stake seven different coins inside the Ledger Live app:
Cosmos, Polkadot, Tron, Algorand,
Tezos, Solana, and of course Ethereum.
With Ledger Live, you can take money from your bank account,
buy your most bullish crypto asset,
and stake that asset to its network,
all inside the Ledger Live app.
Through a partnership with Figment,
Ledger also lets you choose which validator you want to stake your assets with.
And Ledger is running its own validator nodes,
offering a convenient way to participate in network validation,
and it even comes with slashing insurance.
Ledger Live is truly becoming the battle station for the bankless world.
So go download Ledger Live.
If you have a Ledger, you probably already have it, and get started securely staking your crypto assets.
MakerDAO is the OG DeFi protocol.
The first DeFi protocol to ever exist, even before we called it DeFi.
MakerDAO produces DAI, the industry's most battle-tested and resilient stablecoin.
Using Maker, you don't need to sell your collateral if you need liquidity.
Instead, you can spin up a Maker vault and use your collateral to mint DAI directly.
With Maker, the power to mint new money is in your hands.
And there's something new in the MakerDAO ecosystem.
Every time a new Maker vault is opened, the owner can claim a POAP, which contributes funds to One Tree Planted, an organization with ongoing global reforestation efforts, creating a world where digital participation and the health of our environment can live side by side.
Soon, Maker will be present on all chains and layer 2s, bringing the biggest and best DeFi credit facility everywhere there is DeFi.
So follow Maker on Twitter at MakerDAO and learn from the oldest and most resilient DAO in existence.
Rocket Pool is your decentralized Ethereum staking protocol.
You can stake your ETH in Rocket Pool and get rETH in return, allowing you to stake your ETH and use it in DeFi at the same time.
You can get 4% on your ETH by staking it with Rocket Pool, but you can get even more by running a node.
Rocket Pool is the only staking provider that allows anyone to permissionlessly join their network of validating Ethereum nodes.
Setting up your Rocket Pool node is easier than running a node solo, and you only need 16 ETH to get started.
You get an extra 15% staking commission on the pooled ETH that uses your node to stake.
You also get RPL token rewards on top.
So if you're bullish on ETH staking, you can boost your yield by adding your node to the decentralized Rocket Pool network, which currently has over 1,000 independent node operators.
It's yield farming, but with Ethereum nodes.
You can get started at rocketpool.net.
And you can also join the rocket pool community in their Discord.
You can find me hanging out there sometimes in the chat.
So I'll see you there.
David Chaum, I am so honored to have you on this Layer Zero podcast.
Welcome to Bankless.
It's so great to be here with you, David.
So, David, there was this episode that we did on Bankless a while ago called Before Bitcoin.
And it was just a reading of one of my friends' explorations into the early days of cryptography.
And I think that's something that this crypto industry really forgets to pay attention to
as to the significance of the shoulders of giants that this industry stands on.
And so I really want to go and explore that part of crypto's history, like the pre-currency side
of this industry, going back to our roots and going back to the cryptography side of this
industry.
Are you ready to go down that journey with me?
Sure, sure. Yeah, I think you're right. It's something that most people don't pay enough attention to these days, although it is a layer sort of below what we're using today, in a sense. But, yeah, we can talk about it, sure.
Yeah, and I actually want to start at the very beginning. Cryptography has many different parallels that are less about math, right? Like, it's a puzzle, it's a lock that you try and break. I'm wondering, when you were growing up as a child, what were, like, the early indications or characteristics, the things that you liked to pay
attention to, that later would obviously point toward cryptography?
Oh, yes.
Well, you know, really the truth is that I appreciated the super strength of secrets as a child.
And, yeah, of course, I was interested in all kinds of security mechanisms and this sort
of thing, but I really had a reverence for how powerful it is if you know something and
no one else does. And I thought that maybe there was a way to leverage that to help make the world a
better place. And what were some of those first things that captured your attention when you were
young? Well, you know, it was a different growing up than a lot of, I think, your listeners have had.
Because, you know, there was the anti-war movement and I was, you know, very much
caught up in all that kind of thing. And, you know, I learned how to program computers by punching
the little squares out of the Hollerith cards with a toothpick. Then we'd hand in the deck, and then
the next week, we'd go, you know, back to the science museum and they'd give us the
output from the, you know, Fortran program. And it would say, you know, invalid punch in column six,
failed, you know, and we'd repeat this process. You know, I was interested in all kinds of
things, but I think the really powerful force in my youth was, you know,
the idea that the world was going to go off in the wrong direction and it really
needed some help. And the power of secrecy maybe could be leveraged for that. So that's kind of,
you know, they say people know their life story when they're six or something. If you've read that,
you know, yeah, so something like that. What were the indications that came to you that the world
was about to go off the rails, for example? Well, don't forget. We think that, you know, January 6 was
an unprecedented big deal, but when I was growing up, you know, there was the whole Watergate thing,
and people found that, you know, equally shocking. And I think there's a common thread running
through a lot of that stuff. But, you know, the Vietnam War was not really that popular,
but it was a pretty tragic thing. And I think it polarized and galvanized. But it was a very
different era, you know; people were extremely excited about the future as well, because there were
a lot of other vectors besides, you know, the war and all.
But yeah, the power of technology to influence society, you know,
was very much top of mind, also with nuclear weapons, for example, in those days.
And, you know, Cold War and all these things.
It was, yeah.
So as someone interested in technology, I could see that this was something that, you know,
could be used by the powers that be in their standard way of using these
kinds of mechanisms and so on to sort of buttress their position, and not really innovatively,
or maybe there was a different way through.
Yeah.
I'm wondering if you thought about technology back then, back in the 60s, but also its relationship
to politics and institutions.
I'm wondering, when did you make that connection, or did you?
I'm afraid I did.
I got into a little bit of trouble here and there for kind of, you know, doing stuff.
And I was also like kind of a student of what the government was doing with technology.
you know, and I would call people up and pretend to be like, you know, I don't know, much older than I was and find out all kinds of things.
And then people started like threatening me because they felt that I was, you know, prying into things that I shouldn't and so forth.
You know, sometimes I'd call these companies and then they would kind of send me over to some guy who tried to entrap me.
You know, it was some kind of spook that worked there.
I mean, scary stuff like that happened.
It was a different time, you know; you could order manuals for all kinds
of stuff for free, like how to build the IBM System 360, you know, or how the very
sophisticated burglar alarm technology worked, or all kinds of stuff. It was a little more available
in those days. So I was interested in all these things, but with an eye towards seeing how they
could be deployed in the people's interests as opposed to institutional interests. I'd like to
jump to your time at Berkeley. What was your major at Berkeley? Oh, well, you know, I was a graduate student
there. I had gotten this, like, four-year graduate fellowship.
A Regents four-year graduate fellowship.
They only give one every two years,
for UCLA, and I transferred
to Berkeley after the first quarter.
I was much happier
there, but most of the people who were
doing cryptography there left.
So one guy went to work for the government
and another guy went to
try to influence government. So, you know,
and Merkel had left then and so
on. But it was fantastic
being there. It was the beginning of, you know,
my advisor, you know,
ran the Berkeley Unix distribution project.
And so my office mate was Eric Schmidt, you know, just the two of us sat in this office
looking out at the bell tower, and Bill Joy, another founder of Sun, was, you know,
kind of around the corner; you know, I talked to him a lot.
And so it was a very exciting place.
But there was also this feeling like, you know, the government wants you to work for them, you know.
And a lot of the funding was from DARPA and, you know, all kinds of government agencies.
and military organizations and, you know, a lot of big corporation stuff.
And then there were a lot of people that didn't really want to participate in that.
And the DARPA funding, you know, they always had this tagline on their posters and stuff:
DARPA, a mission-oriented agency.
And so I thought, you know, I'm going to be a mission-oriented agency.
And I really decided just to devote myself to trying to use cryptography,
to make the world more, let's say, democratic.
And that was my mission.
And I've basically stuck with that over the last 40 years, you know.
So there were people who were sort of at, you know,
the layer minus one of cryptography. Let's say that was public key cryptography,
you know, and this was a big deal when it was invented.
And, you know, so I knew all these people; like, Diffie was a buddy of mine.
Then he moved.
He lived right at the border of Berkeley,
but then he moved to Mountain View.
So, Marty Hellman. And Ralph Merkle had invented public key cryptography, but no one wanted to publish his paper.
So it was only published like 10 years later, with apologies.
But Diffie and Hellman, their paper was published at a conference.
And then it turns out that some smart summer intern at the British NSA had also invented public key cryptography earlier than these guys.
And now he's running that place.
So there was the RSA work, you know, as well, and I knew those folks, of course, as well.
But, you know, this was also an era that's hard to appreciate these days because there was government pressure not to do research in cryptography, no, public research.
And in fact, the head of the National Security Agency, the NSA, you know, is a pretty serious organization, started in effect saying that cryptography could.
be considered born classified.
Right.
Like nuclear secret.
So if you just thought of it yourself, it could still be classified automatically.
And so there was this guy who started writing letters to the, you know, IEEE, ACM, the major
scientific organizations, saying, don't have any conferences or sessions on cryptography, or
we're going to throw the book at you.
And this was reported in Science Magazine.
You could read about it on my website, chaum.com.
And so then I thought, well, man, this cannot stand, because cryptography is
too important a technology for empowering individuals.
And so I sat in my apartment with my girlfriend and we stuffed the envelopes for this.
We created a, you know, had a conference on cryptography.
We did it without using the phone.
So we got the addresses on paper from Len Adleman.
And we, you know, went down to this place where they printed the free rags,
a little off of Telegraph Avenue, you know, this little print shop where it was like,
you know, don't ask, don't tell, just, you know, pay us cash, we'll print stuff.
And they printed the invitations and, you know, mailed them out.
So, but most people that were interested that we knew of in the field came to the conference about 100 people.
And, you know, I stood up there because as the general chairman and organizer, I just stood up on the stage and said,
okay, you know, thanks for coming.
You know, lunch will be at 12:30 or whatever, the bathrooms are over there.
But now that you've paid your, whatever it was, like 75 bucks, you're now a member of the International Association for Cryptologic Research, which I secretly founded.
And these are the officers.
The next event will be in Italy and this guy's going to run it.
And so there was a handful of people attending in suits, kind of like, with their name badges that all said, you know, private citizen from, you know, Laurel, Maryland,
where the NSA is headquartered.
And they didn't want to say that they were working for the NSA.
But they were all there in the front row.
They were very anxious.
And you could just see them all turn green because it was over.
You know, they were trying to stop this research.
But at that point, you know, because it was an international organization now,
it's basically backed by the UN.
And, you know, they couldn't stop it.
So they never, like, put me in jail or anything.
So, yeah, I set cryptography free.
And that's still the major organization by far that does research and cryptography.
We have three conferences every year, plus a handful of workshops every year, and publish the
proceedings, you know, the Journal of Cryptology and a bunch of, you know, proceedings.
And it's the main event.
So it really created a community.
That's, you know, pretty vibrant still.
And of course, all of this is going on during the whole counterculture of the late 60s and 70s, right?
And I'm wondering how much of that counterculture zeitgeist of
the 70s worked its way into the cryptography community. Because it sounds like you were quite rebellious,
but I'm wondering, like, if you were a cryptographer, were you likely also a rebel? Or were you more
willing to work inside of state lines? Or was everyone kind of on board with the whole, we are not a part of that,
we don't work for the government, we work for cryptography? So overall, what would you say is the
rebellious nature of the cryptographers back then? Yeah, I don't want to be lumped in with the rest.
Sure. It's fair to do so.
because my, you know, angle, if you will, my vector, my approach, my mission has been, as I've mentioned,
to use cryptography to protect members of society and to protect democracy and to allow democracy to flourish.
And that entails what is called privacy technology, basically mixing and blind signatures,
to put it very simply.
I refer back maybe to your earlier show you mentioned.
So, you know, the rest of the community was not working on that kind of stuff.
They're working on the layer below that, which is, you know, encryption and public key encryption.
It's a lot of good work and hard work.
But it's all about communication security of the very basic type that you understand when you're a little kid, you know.
You have the key, you encrypt the message, you send the message. The
counterparty has the key.
They decrypt it and they get it.
And the eavesdropper can't hear it.
They can't understand it.
You know, they can't decipher it.
So that was the thing that, you know, Zimmermann and all these guys focused on.
And the government was trying to block this sort of thing.
They didn't want people to use it.
And they were trying to wheel out other things.
Since they couldn't do the born classified thing, they were saying, well, now it's, you know, export controls and this and that.
It's military equipment.
And you can't.
That was a whole war, crypto wars.
It's unbelievable. It went on for a long time and there were all kinds of advocacy groups and people focused on that. I steered clear of all that. So I was pretty much the only one working on privacy technology, the layer above that. So hiding who talks to who and when, the social graph, what's called traffic, preventing traffic analysis, and allowing payments in such a context not to unwind the protections. And so it was a bit different. And only now with the xx
network. I don't want to jump forward, but you know, we have founded this project, the xx network,
and it has a messenger and it protects privacy and so forth. Now, you know, I have to include
the end-to-end encryption as well and be involved in that, but that's another story, because none of
the messengers today, you know, even though people moved to them thinking they were more private,
none of them has quantum resistant end-to-end encryption, and we do. And we announced it very widely
and no one followed suit.
So it's like, I think that proves basically that that has never really been the issue.
The real issue is traffic, is the metadata.
Right.
It's what the military used to call traffic analysis.
Right.
Because it's far more revealing and harder to, you know, compromise by lying or, you know, misleading people and so on.
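For readers who want the shape of the idea, a mix node can be sketched as a toy in Python. The "layers" here are plain tuples standing in for per-hop public-key encryption, which is an illustrative simplification, not Chaum's actual construction:

```python
import random

def wrap(message, route):
    """Build a toy 'onion': each hop on the route becomes an addressing layer.
    A real mix net encrypts each layer to that hop's public key."""
    onion = message
    for hop in reversed(route):
        onion = (hop, onion)
    return onion

def mix_node(name, batch):
    """A toy mix node: strip this node's layer from every onion in the batch,
    then shuffle, so input order cannot be matched to output order."""
    stripped = [inner for hop, inner in batch if hop == name]
    random.shuffle(stripped)  # breaks the sender/receiver ordering correlation
    return stripped

# Three senders, all routing through mixes A then B.
route = ["A", "B"]
batch = [wrap(m, route) for m in ["msg1", "msg2", "msg3"]]
out = mix_node("B", mix_node("A", batch))
# All payloads arrive; which sender produced which output slot is hidden.
```

The point of the batch-and-shuffle step is exactly the traffic-analysis resistance discussed above: an observer watching a node's input and output links learns nothing from message ordering.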
There was a paper that you wrote on this titled Untraceable Electronic Mail,
Return Addresses, and Digital Pseudonyms. But there was a paper.
Right. But I want to get there. But before, there was a paper that you didn't write
that was monumental in the world of crypto, called New Directions in Cryptography, from Martin
Hellman. That was the Diffie-Hellman paper. Right. Yeah. Yeah. No, I knew those guys very, very well.
But that was about, you know, end-end encryption. So basically, it's a simple, it's the thing I was
mentioning before. Conventional cryptography. Right. You basically have to drag the key that you make up
over to the other person, and then you two can use it to keep the eavesdropper from listening in.
With the advent of public key cryptography, which I mentioned, you know, the different people that came up with it,
you could publish one of the keys and anyone could use it to, let's say, encrypt a message that you could
receive, that you could uniquely receive. And that made it so you didn't have to physically transport the keys.
You still had to make sure you were using the right keys. It's not a,
complete panacea, but that's right, yes.
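The key agreement that New Directions in Cryptography introduced can be sketched in a few lines. These are toy parameters purely for illustration; real deployments use vetted, much larger groups:

```python
import secrets

# Public parameters, agreed in the open (toy-sized for illustration).
p = 0xFFFFFFFB  # a prime modulus
g = 7           # a public base

# Each party keeps a random exponent secret...
a = secrets.randbelow(p - 2) + 1   # Alice's secret
b = secrets.randbelow(p - 2) + 1   # Bob's secret

# ...and publishes only g^secret mod p.
A = pow(g, a, p)
B = pow(g, b, p)

# Both sides derive the same shared key without ever transmitting it.
shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob = pow(A, b, p)     # (g^a)^b mod p
assert shared_alice == shared_bob
```

As Chaum notes, this removes the need to physically transport keys, but it is not a panacea: you still have to be sure the public value you received really belongs to your counterparty.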
Yeah, that paper and I think the RSA paper were, you know, widely heralded and influential,
sort of brought cryptography out from the little codes that you did as a kid out into this modern
cryptography world, but it was thought by governments and, you know, it was a whole war.
Right.
You sort of think the government lost, but really they won because they were able to delay this
dramatically.
And I think then in the end, misdirect people into focusing on that stuff and ignoring the metadata for the longest time until now.
Right.
Certainly.
Yeah.
And this goes to what you were saying where this was the foundation that cryptography is built on.
But what you were doing is something a little bit more applied, a little bit more on the data and privacy layer on top of like this basic cryptography.
You needed new kinds of cryptography in order to do the privacy stuff.
So the mixing paper that you referenced, you know, you could build that just from public key, just from the basics.
But for the payments, which are also needed, you have to have privacy in payments.
Otherwise, it'll ruin the privacy, the unlinkability that you would have in communication, right?
You'd be linked by your payments.
In order to achieve payment privacy, at least in those days, it was thought that you needed
an innovation, the blind signature, which I created.
And I created a whole bunch of other kinds of signatures.
And no one else really did; you know, the undeniable signatures and all this other stuff.
So it extended the functionality of the primitives with new features that have.
proved to be of enduring value.
And so it wasn't just really building on an earlier layer.
It used those things, but it also had fundamental cryptographic insights of its own
that enabled its unique protections.
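The blind signature Chaum describes can be illustrated with textbook RSA. Tiny numbers, no padding; this shows only the blinding algebra, not a secure implementation:

```python
from math import gcd

# Textbook RSA key (toy-sized: n = 61 * 53).
n, e, d = 3233, 17, 2753

m = 42   # the message the user wants signed (e.g. a coin serial number)
r = 7    # blinding factor; must be invertible mod n
assert gcd(r, n) == 1

# 1. User blinds the message: the signer sees only m * r^e mod n.
blinded = (m * pow(r, e, n)) % n

# 2. Signer signs the blinded value with the private key d,
#    learning nothing about m itself.
blind_sig = pow(blinded, d, n)

# 3. User unblinds: (m^d * r) * r^-1 = m^d mod n, a valid signature on m.
sig = (blind_sig * pow(r, -1, n)) % n

# Anyone can verify against the public key e.
assert pow(sig, e, n) == m
```

This is what breaks the link between issuing and spending in DigiCash-style cash: the bank's signature on the coin is valid, yet the bank never saw the coin it signed.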
There's a quote that I want to read that I'm hoping that you can help elaborate for the listener.
This is coming from Chaum's warning to the world in 1985, again from this Before Bitcoin series,
where you say that there are dangers of user data that is building up around computing systems.
I'm wondering if you could unpack this quote and tell us what this means.
Well, you know, basically it was my realization early on that there were only two ways.
You can read my early papers in Scientific American, and the earlier version of that,
which was published in academic journals.
It basically always said there's only two ways that technology could go.
And we see it today.
You know, it's one way is where the powers that be.
can see everything you do, you know, and the other way is where you have the keys to prevent that.
And you run the show.
You control the use of your own information by possession of your own keys.
And those are the two competing scenarios.
There's really no halfway middle ground.
And, you know, it's really starting to come to a head now because the fundamental difference between China,
as we perceive it, you know, in the West here,
and our ethos of personal liberty and rights and so on liberal democracy
is crystallized in the simple question of,
does your messenger allow the government to see who you talk to and when or not?
That's basically what it comes down to.
WeChat, people in China know that WeChat, you know,
can see everything they do and that information is available to the government.
So it's a kind of panopticon.
You know, it's, it has a chilling effect on society, but that's the way they want to do it.
And what I said in 2018 was that the xx network project, and the xx messenger,
was going to be a WeChat with privacy built in.
And that's what we've done now.
And that's the fundamental difference.
So, you know, you hear a lot about CBDC.
Is that something you talk about?
Yeah.
Well, you know, there's only two kinds of CBDC.
There's the kind with privacy and the kind without.
And, you know, when the European Central Bank had a whole big comment period on CBDC, you know,
the majority of the comments were about privacy, that people wanted it.
And so it's not easy to get privacy in a CBDC.
But, you know, if we do a CBDC that is fully surveillance capable,
that doesn't give the actual privacy keys to the users, allow them to create them,
then, you know, we're not
going to have the moral high ground over China. I mean, people are trying to precipitate some kind
of new Cold War or something. I'm not a big fan of that, but it would seem that the basis of it
is that it's not a free society and they're spying on people and, you know, the state has too much power.
And that really comes down to, can WeChat see who you talk to or not? What kind of CBDC are we
going to build, and what kind of messenger are we going to have? The one that actually lets you protect
that, or is it going to be like, trust us, you know? Or, the government
has the keys because we might need to check on you, you know. And all that's, you know,
that's basically the fork in the road. And I wrote about that and saw that very early on
in no uncertain terms. And it remains the central issue because it's so fundamental.
There's no way to protect your own information unless you have the keys to it.
You know, it's, you just can't trust other people with your keys because you don't know
if they leak, what they leak and what they do. And so it's as simple as that.
This is one of the immaculate parts of your story, I think, is that in 1985, you talked about this fork in the road where one would lead to a panopticon and one would lead to user privacy.
And it seems to have come true.
Yes, certainly.
And it seems to only get more and more and more true as our technology these days becomes more and more powerful.
People are realizing it.
Now the surveys show it. You know, we hear a lot about Web 3.
But look at the surveys. International and U.S. surveys, the major ones, you know, giant surveys of thousands
of people, show that three quarters of people find privacy on the internet, so Web 2, to be the
biggest problem.
The one thing they really don't like.
It's in the top three things they don't like, you know, under different names, you know.
So it's like the world has come around to, you know, so if you're going to build Web 3, you
better build something that people want, which is something that allows them to protect their own
information. It's not good enough that you say you're going to protect it or that, you know,
there's a presumption that you will. That's just not going to cut it these days. And that's the
hope for me for Web 3. And that's basically, WeChat with privacy inside. That's the thin end
of the wedge. And then from there, you can build out more of a privacy platform. And then, you know,
there's other things that you want to decentralize, of course. And I'm not trying to diminish the
significance of, you know, the new kind of models that have paradigms that have emerged for the
crypto and blockchain space. I mean, there's a lot of interesting new stuff. But that's the
fundamental divide, yeah, still. Can you take us back to David Chaum in 1985 and tell us what made
it so obvious that there was this massive fork in the road? Like, what did you see that no one else
saw back in 1985? I saw it in the late 70s.
Late 70s?
Yeah, it was the first, I published the mixing article, which now has 65,000 references to it in technical papers.
I don't know what that translates into in terms of likes, but I think it's a lot.
Because, you know, they had to say something about it, I guess.
So I was sitting in the hot tub in Berkeley with my professor.
Yeah, okay, you know, we were in swim trunks or whatever.
It was like that.
But, you know, he had this hot tub and he said, come on over to the house, you know,
we'll talk, you know, about your research.
And, you know, that's the way it was.
You know, it's not like now, you know, where people are afraid.
You know, you'd go over to the professor's house and hang out.
And so he had like the redwood trees in his backyard, you know, you sitting there and looking up at the redwood trees and try to figure out, well, if we really need to solve the privacy problem, how can we work on that?
Well, let's see.
What would be the simplest kind of basic problem that you could start with, you know, the kind of toy problem.
and to work on it. You know, that's the way often you sort of get, you know, make progress in science.
And so, you know, I said, well, voting, you know, we'll work on voting. And so I came up with
mix nets to solve voting. And it's still in that mix paper, there's a couple paragraphs on the
voting, but it's a little bit broader than that. You know, then I read this article by this guy named
Paul Baran at the RAND Corporation, you know, one of these kind of government-sponsored think tanks that,
you know, was involved in all kinds of spooky stuff. You know, they were talking about traffic
analysis and all this, and I'm like, this is bad news. So yeah, I was like doubly motivated to work on it
in those days. So, you know, I think what gave me the insight, to circle back to your question,
is that I knew that you could solve it. I think that it's still the case. A lot of people
think, oh, you know, I can't, you know, they're spying on me. They're spamming me with fake news.
They're this and that. They're exploiting my data and making money off it. But, you know, what can I do?
I need to be on this or that platform.
You know, there's nowhere else to go, really.
You can't really vote with your feet.
I mean, some of the platforms that have held themselves out, like I said, to provide privacy,
I now really question.
I don't want to, you know, malign them, but I challenge you to find any messenger
besides Elixxir that actually uses quantum-resistant security on the end-to-end encryption.
So they've always been saying, use end-to-end encryption.
That's the be-all,
that's why you should move to ours. But it's fake, because they're not upgrading it.
And the U.S. government has, you know, there's a presidential directive.
All government has to use quantum-resistant VPNs and everything, because otherwise, you know,
the point is, you know, for privacy, you may think quantum resistance has some tinfoil-hat
connection.
But actually, for privacy, it's extremely conservative and solid, because it's not in
question that quantum computers will ultimately be able to break these codes that are used,
you know, because people use the public key to set up the communications, and quantum computers can break
it. Ironically. So, you know, Ed Snowden told us that the government was involved in what they
called the full take. They capture everything and they save it. They've always been doing that. We knew
that back in the day. The NSA said, oh, send us your used magnetic tapes with all those phone
records on them, you know, we'll give you brand new tapes in exchange. And the phone company said,
oh, that's great. It's like, yeah, they have these huge facilities, so you could, you know,
look them up, it's frightening. Like, they store all this stuff and they're just waiting, or maybe
they already have quantum computers or whatever. When they're interested enough, they'll be able to
look into that data retroactively. And so governments have taken firm direction to stop this
possibility, and I think that the public should as well. But the people that we're trusting
to help us seem to be asleep at the switch. And that's leaving the whole
metadata issue out there. There is no messenger out there besides Elixxir that's protecting your
metadata. So they all just keep talking about how they have strong end-to-end encryption, but it's
not that strong anymore, you know. It's public-key based, which means it can be broken by
a quantum computer.
So, I mean, roughly speaking,
it's based on, you know, earlier
government-standardized public-key systems. Anyway, the more I looked into
it, okay, and I said, well, great, I've got this communication system that can solve
the metadata problem, that can solve, keep confidential who you talk to and when.
But then I have to find a way to do payments over that system that doesn't undo the privacy.
And I found that.
And then the third and final piece, with the blind signatures, was that blind signatures could actually be generalized into what I call the credential mechanism,
which you might think of as a kind of zero-knowledge, minimum-disclosure proof or something like that, but it's a little different than that.
That allows you to basically, you know, keep the full database of information about yourself that normally, under the current paradigm, would be maintained piecemeal by different
organizations, you know, government agencies and companies and so on.
And they could potentially link all your records together by your identity, right?
You could maintain all that data yourself.
They wouldn't have it because you would deal with each of them under a different digital pseudonym, a different public key.
And then if they wanted to know something about you, they could ask you.
And then you could answer if you chose to in whatever way you wanted.
But then you could prove that your answer was correct,
with, like, zero-knowledge, minimum disclosure.
This is the cryptographic stuff.
So basically you could organize society that way.
Only you would know who you talk to when.
You could do payments privately.
And the database of information about yourself would be your exclusive property.
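As a rough illustration of the blind-signature primitive these credentials build on, here is a toy RSA sketch. All numbers are invented and far too small to be secure; real systems use large keys and padded messages:

```python
# Toy sketch of a Chaum-style RSA blind signature: the signer signs a
# message without ever seeing it. Illustrative parameters only.
import math

# Signer's toy RSA key: modulus n = p*q, public exponent e, private d.
p, q = 61, 53
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))   # modular inverse (Python 3.8+)

def blind(m, r):
    """User blinds message m with a random factor r coprime to n."""
    assert math.gcd(r, n) == 1
    return (m * pow(r, e, n)) % n

def sign(m_blinded):
    """Signer signs the blinded message; it learns nothing about m."""
    return pow(m_blinded, d, n)

def unblind(s_blinded, r):
    """User strips the blinding factor, leaving a valid signature on m."""
    return (s_blinded * pow(r, -1, n)) % n

m, r = 42, 7                         # message and blinding factor
sig = unblind(sign(blind(m, r)), r)
# The unblinded result verifies as an ordinary RSA signature on m:
assert pow(sig, e, n) == m
```

The point of the construction is in the last line: the signature checks out against the original message even though the signer only ever processed the blinded value.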
But I argued in those papers in the mid-80s that the data that organizations would get would actually be better and more useful to them.
Because right now they ask you to voluntarily tell them all this stuff and you could lie to them and so on.
and then they have to sift through all this.
And whereas this way, you know, you just answer the exact question that they want to know,
are you allowed to be here in this event?
You know, do you have a driver's license?
You paid your car insurance, or you're at least 18 years old, you know, whatever.
And you just say, yes, I am qualified.
And that's it.
They don't need to know all the supporting information.
And why do they need it?
It doesn't really help them.
And then so it's a better way to do it.
And I showed how to do that in the Scientific American article,
"Achieving Electronic Privacy," and its more technical antecedents.
So once I realized that you could do all three parts,
which is all you really need,
that's when, you know,
that caused me to double down on my commitment to try to push this all forward.
But the public awareness wasn't really there, of course.
You know, back in the day, you'd talk about privacy.
It sounded like a very kind of wimpy, you know, issue or some kind of thing.
But now, you know, there's a massive recognition that this is the whole
game, I think. That's what I've always believed.
You said the phrase minimum disclosure.
I'm wondering if you could just unpack and elaborate on what that means and how you might
organize around it as a principle.
Well, it's a different kettle of fish.
So I was at an event in Marseille, at a university campus called Luminy.
It was a little mansion, and there were these ladies there in cook outfits making this
giant kettle of Marseille fish soup to celebrate
the end of our one-week conference on randomness.
But at that conference, I presented a technique based on circuit
diagrams, and you can still see it on my website, in my publications at chaum.com,
that's with Crépeau and Brassard, I believe, that included both models, both what's called
zero knowledge and what's called minimum disclosure.
And the zero knowledge people just hated us and they kept trying to say, oh, you didn't really do it and, you know, suppressing our papers and the references.
It was so painful and, you know, dishonest in my view.
But in any case, we cut a deal with them that they would call the one model minimum disclosure and they would call the model that they created zero knowledge.
So I showed you could do both.
One construction that could do either, or mix and match them.
And then also, if you look on my website, there's the spymaster's double agent paper that showed how you can combine all the models and get the best of both worlds.
So that's why.
So it was a weird political thing that happened in science.
And that's why it's called minimum disclosure.
You know, it just depends on whether you want the secrets to be protected unconditionally.
In other words, against a quantum computer or infinite computing power.
Or whether you're willing to allow the secrets to be protected.
by some problem that you assume is hard to solve.
And so, of course, in my early career,
I was all about protecting secrets unconditionally
because I argued that, you know,
how could an individual know what kind of computing power,
you know, governments would have or in the future and so on.
But it turns out that in voting,
what you really want to be sure of is that the outcome is correct.
And so what we proved is you can't have it both ways.
You know, both the secrecy of the secrets and the correctness of the outcome.
Only one of them can be unconditional, and that at best forces the other one to be based on a computational assumption, assuming a certain kind of, you know, problem can't be solved.
And so for voting, you want to do it the other way.
And can I just interject something about voting?
I just want to mention this because it's another thing a lot of people don't realize,
and it's very interesting to me,
because we just got this paper
accepted in the last couple of weeks.
So there's this longstanding issue with voting technology.
You want to vote online, you know, you're not in a booth.
So someone could bribe you or coerce you, like threaten you.
And, you know, you could even live stream your voting act.
So there has not been a satisfactory solution to that in the literature.
You look at my website, you see I did a lot to try to create a voting technology community as well.
Look at the books that I published and stuff.
So it turns out that we finally found a way to do it.
And it's called VoteXX.org.
You can look at it, you can see it.
But basically it just says it's kind of like quantum mechanics.
It's an interesting result.
I won't go into it here, but it's quite practical.
And we've built the software for it.
And we're going to let a lot of people help us.
And some people have volunteered to incentivize that process.
And so we're going to try to make
it really usable, but the basic algorithms are running. So we proved it with the strongest
type of proofs, which is the so-called universal composability framework proof. So it's really
strong. But basically it renders voting not improperly influenceable. You can't coerce or bribe voters
really effectively. And that's necessary if you want democracy to flourish, because, you know,
I think you need a little more voting, and, you know, random
sample voting, you can read about it on my website, you know, that's the way I think it really
needs to be done, because that scales with the complexity of society. And I've got some new
results coming out that, you know, obviate a lot of these theorems, like Arrow's theorem and stuff
like that, you know, big problems with social choice theory. So sample voting is really,
really nice, but it means that people aren't always voting at the same time, so you can't really
go to booths. You've got to be able to do it remotely. That's what everyone realized.
You can't have the booths,
because booths, sometimes they spy on you,
people harass you, you can't get there, all this stuff.
And here, like in LA, you know,
there's a lot of polling places that just never open every election, a
handful of them, you know,
because that locks out a certain group of voters.
So in any event, we solved that recently in the Vote XX,
and it's a subtle thing, but it was the final sort of missing link
to allow democracy to flourish
because it's not enough just to be able to vote securely online.
You have to stop this improper influence.
And so it's proven elusive, but now we got it.
Certainly.
We nailed it.
Certainly.
The layer two era is upon us.
Ethereum's layer two ecosystem is growing every day.
And we need layer two bridges to be fast and efficient in order to live a layer two life.
Across is the fastest, cheapest, and most secure cross chain bridge.
With a cross, you don't have to worry about high fees or long wait times.
Assets are bridged and available for use almost instantaneously.
Across's bridges are powered by UMA's optimistic oracle to securely transfer tokens between
layer 2s and Ethereum.
Across is critical ecosystem infrastructure, and Across V2 has just launched.
Their new version focuses on higher capital efficiency, layer two to layer two transfers,
and a brand new chain with Polygon, all while prioritizing high security and low fees.
You can be a part of Across's story by joining their Discord and using Across for all of your layer two transferring needs.
So go to across.com to quickly and securely bridge your assets between Ethereum, Optimism, Polygon, Arbitrum, or Boba networks.
Arbitrum is an Ethereum layer 2 scaling solution that is going to completely change how you
use DeFi and NFTs. Some of the coolest new NFT collections have chosen Arbitrum as their home,
while DeFi protocols continue to see increased liquidity and usage. You can now bridge straight
into Arbitrum from more than 10 different exchanges, including Binance, FTX, Huobi, and Crypto.com. Once on
Arbitrum, you'll enjoy fast transactions with cheap fees, allowing you to explore new frontiers
of the crypto universe. New to Arbitrum? For a limited time, you can get Arbitrum NFTs designed by
the famous artists Ratwell and Sugoi for joining the Arbitrum Odyssey. The Odyssey is an eight-week-long event,
where you can play on-chain activities and receive a free NFT as a reward.
Find out more by visiting the Arbitrum Discord.
You can also bridge your assets to Arbitrum at bridge.arbitrum.io
and access all of Arbitrum's apps at the Arbitrum portal,
in order to experience DeFi and NFTs
the way they were always meant to be: fast, cheap, secure, and friction-free.
The Brave browser is the user-first browser for the Web3 Internet,
with over 60 million monthly active users.
And inside the Brave browser, you'll find the Brave wallet,
The secure multi-chain crypto wallet built right into the browser.
Web3 is freedom from big tech and Wall Street.
More control and better privacy, but there's a weak point in Web3, your crypto wallet.
And most crypto wallets are browser extensions, which can easily be spoofed.
But the Brave Wallet is different.
No extensions are required, which gives Brave browser an extra level of security versus other wallets.
Brave Wallet is your secure passport for the possibilities of Web3 and supports multiple chains, including Ethereum and Solana.
You can even buy crypto directly inside the wallet with RAMP.
And of course, you can store, send, and
swap your crypto assets, manage your NFTs, and connect to other wallets and DeFi apps.
So whether you're new to crypto or you're a seasoned pro, it's time to ditch those risky
extensions and switch to the Brave Wallet. Download Brave at brave.com slash bankless
and click the wallet icon to get started. The other thing I'd like to define is the traffic analysis
problem. We talked about that a number of times. Could you just talk about traffic analysis
and just the nuances that it brings to the table when it comes to, like, privacy?
Yeah. Well, thanks for asking. It's, you know,
basic traffic analysis.
That's what the military used to call it.
And let me just fill in a little bit of history here, to give you the sense of this.
Because one of the really crazy things about trying to find out who talks to whom and when
is that it's always been legal for any government agency to ask for that information.
So there was something called a mail cover.
You know, any agency of the U.S. government could fill out a form and make your postman
write down every single address you send to and every single
address that sends to you, and the weight of each envelope, but not open them. Opening them, that's
different. That requires, you know, a court order. And the same thing with the phone company,
you know, like, oh, sure, you know, we can tell you who calls and who you call and so on.
And, you know, mobile phones sort of seem to fall into that category
too, right? And so, you know, most people are unaware of this. And they think that that information
is somehow protected. And so when a company says, oh, we don't respond, you know,
unless they have a subpoena, we're not going to tell them, you know. But the traffic data,
they give away. You know, Apple famously said, oh, we're not going to give the keys to open this phone,
you know, the password for that phone. But they gave tons of information about all the
transactions that were done with that phone. You can read it, it's in the court records. It was known.
You know, this is the double thing that I was referring to, right, that people say, oh, you know, it's like, don't look behind the curtain.
Don't look at that.
You know, the man behind the curtain.
You know, you've got strong end-to-end encryption here.
Look at the Zuckerberg, you know, statements on privacy that he's made, several really long statements even.
And it's all about that.
Well, you know, it's strong end-to-end encryption.
We're going to move to that.
But don't worry about the traffic data.
We'll take care of that.
We may need that to stop spam, you know.
something like that. So it's always been
something that's never really discussed in a clear
open manner. But yeah, but basically
the social graph, who your friends are
and when you talk to them and so on is so revealing. Like I said
earlier, you can't lie about that really. You can't mislead,
you know? So for instance, when I was at Berkeley,
I went over to the, you know, the Doe Library, the main library, and they had the Congressional Record in the stacks, you know, in the basement there.
And you could read the hearings where, maybe you recall, during that period the CIA was compelled to reveal their methods in one example of their covert operations.
And the Congress was going to get to choose which instance they would look into.
That wouldn't really set them back too much, but it would give the Congress this oversight.
And so they chose the coup in Chile.
And so that you could read the sworn testimony of the CIA about how they did it.
How did they go in there and just take it over and switch the government there?
And how do they do it?
Traffic analysis.
They just put software in the phone exchange at the presidential, whatever, you know,
palace or whatever facility it was, because it was an American-made phone exchange, probably, right?
They just put some, like, malware in there that would every night call Washington, D.C. at some
random number and upload all the traffic data: which phone called which phone, at what time, and
how long they spoke. And then the CIA, it's also public, they paid all these
universities to develop sophisticated analysis of data like that.
You know, I don't know exactly, I think it was in advance, so that they had this capability,
social graphs. So basically you could look and see, oh, who called whom just before
this event, and then who called whom after it, and so on. They knew exactly who was running the country
and they just went in there and, you know, surgically took it over. No big deal. Traffic analysis.
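The kind of call-metadata analysis described here can be illustrated in a few lines. The call log below is entirely invented; the point is that contents are never needed:

```python
# Sketch of social-graph traffic analysis: from nothing but call metadata
# (who called whom, and when), the most central parties fall out at once.
from collections import Counter
from itertools import chain

# (caller, callee, hour) records: metadata only, no call contents.
call_log = [
    ("aide", "general", 1), ("general", "colonel", 2),
    ("aide", "general", 5), ("minister", "general", 6),
    ("general", "colonel", 7), ("general", "minister", 9),
]

# Degree centrality: who appears in the most calls?
degree = Counter(chain.from_iterable((a, b) for a, b, _ in call_log))
hub, n_calls = degree.most_common(1)[0]
print(hub, n_calls)   # → general 6
```

Real analyses of course go further (timing around events, community detection), but even this toy version identifies who is running things in the invented log.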
That's documented. I want to pause and just, like, double down on this letter metaphor really
quick, where we have this envelope that has a packet of data inside of it, just a letter, for example.
And end-to-end encryption assures that that envelope is going to stay closed and it's going to
protect the data that's inside of that message. But what it doesn't do is protect all the rest of the
data that's outside of the envelope, which is who it's coming from, where it's going, the time
that it's sent. And then, what you said also, like how much of a payload that data is,
like, is there a little bit of data or a lot of data. And what you're saying is that, well,
sure, encryption hides the actual message. But when you have so much of the metadata,
the peripheral data, and you collect all of that, it's a little bit like a Sudoku puzzle.
You can kind of fill in the blanks just by kind of figuring it out.
Oh, yeah. Oh, yeah. You can link it to all kinds of other databases, like this data about
where your phone is at all times. That's like you can buy that for almost nothing from these
companies that just capture it. You know, any other data you have, you can start to correlate
and link in.
And so let me mention something else real briefly.
I just presented this at a conference in Berlin a couple days ago for the first time.
But let me just point out to you that if you are listening, you know,
maybe some messenger service, let's just say,
is not going to give you the traffic data, which they know because they route all the messages,
right?
You just listen near their location, because you have the full take, right?
Or whatever, you just tap the internet near their location,
and you just notice all the IP addresses coming in and going out.
Well, it doesn't take long to use statistical techniques
to find out how many, let's say, server processes they are running,
how many threads, how many cores and threads are running in there,
and you'll start to see that certain IP addresses always end up talking to other ones.
So then, assuming that the processing is first in, first out,
which of course it is, all of a sudden now you
can see the full interconnection, which IP addresses are talking to which IP addresses.
And then you can link that with the phone number stuff. And you've got it all. You don't have to,
you know, get the cooperation of the messenger services. So that's a programming project for undergraduates,
in my opinion. That's not like, you know, that's not really that hard to do. So this traffic data is not
well protected, not under law and not in practice. The only real way is mixing. Is this what
supports this, like, surveillance capitalism that we have today, where, you know, I get targeted ads
based off of certain data that I, like, exhaust into the internet? Is this that same problem that we're
talking about? It's tightly coupled. I mean, yeah. There, you know, in a lot of cases, you know, in Web 2,
you have no real choice. You sign up naively for this service, so you start exposing a lot of data to
it. They have a lot of data internally. They don't need to do traffic analysis. I mean, they can just
link all kinds of things. They have the data. And that's, I think, you know, a part of
it. But yes, those surprising little ads you get or whatever these funny things, yeah, that's when
one service links to another just based on your IP address, for example. Yeah. So it's extremely
porous. It's maybe the wrong analogy. It's easily interlinked based on, like just
riffing on what I was saying about breaking a messenger's traffic data. But there's a lot of
different sources of data and it's not hard to link them all together because they're all based
in effect on your identity. It could be your IP address, you know, your name, your Google,
you know, your location. There's a lot of different clues and you can just link it all together.
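The first-in-first-out correlation attack on a messenger's traffic, as described above, can be sketched directly. All addresses and timestamps here are made up:

```python
# Sketch of the FIFO correlation attack: a relay that processes messages
# strictly in arrival order lets an eavesdropper, who sees only
# (sender IP, arrival time) on one side and (departure time, recipient IP)
# on the other, pair inputs with outputs without the service's cooperation.
arrivals = [("10.0.0.1", 0.10), ("10.0.0.2", 0.12), ("10.0.0.3", 0.15)]
departures = [(0.21, "93.184.216.1"), (0.23, "93.184.216.2"),
              (0.26, "93.184.216.3")]

# FIFO assumption: the i-th message in is the i-th message out.
links = [
    (src, dst)
    for (src, _), (_, dst) in zip(sorted(arrivals, key=lambda x: x[1]),
                                  sorted(departures, key=lambda x: x[0]))
]
print(links)   # pairs each sender IP with a recipient IP
```

A mix defeats exactly this: it collects a batch, shuffles it, and emits the batch in a new order, so the arrival order no longer predicts the departure order.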
That's why, like you said, in the mid-80s, I said, look, we got to, the only way so that you
can control your own information is just to make sure that you talk to everyone under a different
pseudonym and you talk over a mix network. So there is no
metadata linking. No one knows your IP address.
That's how people were surprised to see how much you can tell about
what they're doing with Bitcoin, right? That's the IP address.
You know, no one sees your IP address because you speak over a mix network
and use a different pseudonym. So yes, you can authenticate yourself as the owner of that
account or whatever, and you can prove that your answers are correct with minimum
disclosure, as we discussed, right? But you don't have to allow those entities that you talk to
to know enough about you to link whatever each one is collecting
about you to the stuff that the other ones have collected.
That's precluded by that technique.
You divide them up.
You know, you partition them.
You're building, you know, firewalls between them because you have different identities
with each one.
That's the only way.
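The partitioning idea, a different pseudonym with each organization, can be sketched from a single master secret. Real credential systems use blind signatures and mix networks; this only illustrates the unlinkability, and all names are invented:

```python
# Sketch of per-counterparty pseudonyms: one master secret yields a stable
# but mutually unlinkable identifier for each organization you deal with.
import hashlib
import hmac

master_secret = b"my-one-master-secret"   # invented for illustration

def pseudonym(service: str) -> str:
    """Stable identity with one service; services can't link them together."""
    return hmac.new(master_secret, service.encode(),
                    hashlib.sha256).hexdigest()[:16]

# The same service always sees the same pseudonym...
assert pseudonym("bank") == pseudonym("bank")
# ...but two services see unrelated-looking identifiers.
assert pseudonym("bank") != pseudonym("insurer")
```

Because HMAC is a keyed one-way function, neither service can recover the master secret or compute the pseudonym used with the other, which is the firewall between databases that Chaum describes.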
Right.
I think the metaphor in the crypto world is that it would be like if you made a new transaction
with a brand new wallet every single time.
Well, no, but that's not quite it.
You can do it, yes, but with a different IP address.
With a different IP.
Well, yes.
The principle is, the only way to get that different IP address is to use a mix network, to appear somewhere else in the world.
So basically, you're going to have mixing and you're going to have digital pseudonyms.
And you have to have a way to prove stuff between them.
If you want to, you know, you can do everything totally à la carte.
But if you want to be able to say, well, look, I paid my taxes,
I've got this much money in the bank,
I paid my insurance, I've got this advanced degree or whatever it is, this passport, this,
you know, I'm at this age or whatever, then you need to have a credential mechanism that allows
you to prove things about the statements that each individual organization gives you under
the respective pseudonym that you uniquely use with them.
I want to go back to this Fork in the Road comment.
And there's another quote that I want to read that I thought was really, really powerful.
this is from an article that says: projecting the vision of the two futures, one built with the current technology and one built with decentralized services, David saw that the two approaches appear to hold quite different answers.
Large-scale automated transaction systems are imminent, and as the initial choice for their architecture gathers economic and social momentum, it becomes increasingly difficult to reverse.
Whichever approach prevails, it will likely have a profound and enduring impact on economic freedom, democracy, and our informational rights.
Now, sadly, I already kind of know, David, which path we went down.
It's kind of the one that we give away all of our privacy.
But I'm wondering, like, if we go back to 1985 and then zoom forward to where we are today,
is it as bad as you worried about?
Is it not as bad?
Currently, the state of, like, user sovereignty and user data today.
Is it as bad as you thought it could get?
Is it worse?
Is it better?
Just like, how bad is it?
And, like, what were your thoughts on where we are today?
Well, it's pretty bad.
Currently much worse than it seems to be, because, you know, as they say, they keep it all
hid. You know, if you had a lot of massive data, you're not going to announce it. And there's a lot
of progress in machine learning and so on that's unreported. And, you know, these people have
benefited enormously economically from this data capture, as you know, and they're, you know,
going ahead very aggressively. So it's worse than you think. However, in my own
defense, I want to say that if you go back and read those, you know, those statements, what I actually,
let's say, prophesied was not so much an inevitability. I mean, I was trying to, you know,
scare people, say, you know, it could get really bad, hard to dig your way out of. But what I
prophesied was that it would be kind of a Hegelian, you know, thing. You know, the people would get all
riled up and push back and win some stuff back. It would become a bit of a
struggle. But make no mistake, you cannot win a war locally. You know, it's one thing: all that data
is linkable or not. You're either hosed or you're not. You can't have it halfway. You know,
it's not like, well, it's pretty much okay because in this part, it's, you know, okay. But what I will say
also, though, to qualify that briefly, and I think, and importantly, is that, and this is an
insight that I've had subsequent to those publications, you know, I mean, because of the
internet of things, let's say, and all the cameras that people seem to have put up with and all
this existing data that's out there, right? There are certain things that are pretty much
hopeless and probably not worth fighting about. And there's certain, you know, when you're walking
down the street, are you going to wear a bag over your head or what, you know? It's kind of hard
to really win
that domain. So what I argue now is that what is strictly needed for democracy to prevail
is that individuals must have a protected sphere where they are certain that they are free
to discuss political and other matters and pay for information and be paid for information
with their, let's say, friends and family and their people that they want to communicate with.
You must have that protected sphere, otherwise you cannot have democracy.
It's a necessary condition.
And then now with the stuff you know about voting and so on, it's actually sufficient to have democracy.
And then that democracy can later, you know, take the cameras down and do this.
But you've got to get over the idea that the government needs to see everything you do.
You have to give people a protected sphere, or
there is no possibility that they can really be participants in a democracy.
So, you know, if you want to say that we're different from China, then you either have to provide
the protected sphere wholeheartedly, you know, no secret little holdbacks or just give it up
and say, okay, you know, we're just going to spy everybody just like they do, and there's no real
difference. I think that's a very pragmatic approach, so it's very optimistic because
the XX network has those capabilities already.
We have a way to have those protected spheres.
You know, a thing, just to drive it home for a minute,
you know, a colorful thing is, you know,
I don't know how many of your listeners realize that coffee was criminalized,
both in England and also in the Middle East.
For the longest time, by rulers who felt that if people got together in coffee houses,
who knows what they would be talking about.
Look it up.
It's startling.
Revolutionary thought.
Yeah.
It's really quite surprising.
And that was like 500 years ago.
So I think it makes the point.
You know, there was this bookstore in San Francisco.
I think it's defunct now, but it was called a clean, well-lighted place for books.
I always like the name of it.
You need a protected sphere, a place where you can be you, where you can have an actual political consciousness
and development and discuss issues and actually meaningfully participate in governance.
Absent that, there's no democracy.
And it's not much to ask for.
And, you know, if you try to criminalize that, claiming, you know, some kind of, you know,
abuse of this or that, I mean, you're throwing the baby out with the bathwater, you know?
Yeah, sure, there might be some tradeoffs.
And I don't want to get into all this kind of, you know, "guns don't kill people,
it's the people," whatever.
But, no, you want democracy.
that is a necessary condition.
So just understand that and get over it.
And the thing is that, you know, the NSA said,
oh, we need to spy on everybody.
And they had all these programs, spent billions and billions of dollars.
They never caught anybody.
And if you read the reports on that, it's just ridiculous.
You know, they had all this surveillance.
So they never really found anything, you know.
So it's like, okay, you know,
if there are a few people that are really doing evil,
then follow them around.
You don't need to use the new techniques to stop that.
It isn't at that scale.
There's really no rationalization for it.
And then if you take it to the next level,
I mean, those countries that try to suppress this will lose out, in my view,
in the medium term, because there's so much economic opportunity
in creating, let's say, a level playing field for financial transactions and services
globally, we're paying basically $3 trillion annually in fees for financial services.
It depends on your school of economics, but that's a huge cost to society.
And one thing that blockchain has done is shown that you don't actually need that.
You can do financial services.
I mean, $2 trillion of that is payments.
You can do it without this whole huge overhead.
And so if you unleash this, there's a lot of economic
opportunity there. And those governments that don't have a way to allow people to pay their
taxes meaningfully in that context will suffer. They'll either try to stop it, and that'll be horrible
for them, or they won't get the revenue, and that could be a big problem. And so those kind of
techniques that I mentioned, the credential mechanism will allow you to say, okay, I'm proving
that I'm paying the correct taxes without telling you anything about what I'm doing. And that's way
better, just like I say, for companies, because now they actually
have a proof, you know, that you sent the taxes.
Because who knows if it's really true, whether someone was fudging on it.
It depends on the country, right?
In Brazil, it's like 50% fudging.
I think that's what I heard when I was there.
So it's, you basically prove that you're paying the right amount and that's it, done.
That's far better strategy for liberal democracy than trying to say, oh, no, no, we can't let you have your, you know, your protected sphere because it's just too dangerous to society.
You know, that's really a doomed take.
So, you know, we are at a critical juncture.
Certainly. I'd like to actually zoom back in time really quick, because there's this line that stuck out to me while I was doing some of my homework, from one of your head teachers, I believe, at Berkeley. And it was while you were working on your mix networks. Yeah. And your head teacher said... Manuel Blum. Manuel Blum was the head of the department. He's a theoretical computer scientist. You know, I took his graduate theory seminar, and I'm the only one who got an A plus in it,
because I proved a more general result in half a page.
All his star students, like Micali and so on, they didn't do anywhere near as well on it.
But I wasn't a theory student.
Right, right.
But yeah, Manny was quite a, you know, a presence in the department.
And again, this is going, I believe, with mix networks, which is the concept of, like, decentralization, but, you know, all the way back then.
And he said, don't work on this, because you can never tell the effects of a new idea on
society. That's right. What did you think when your professor told you this?
You know, well, this was about my dissertation topic, which, you can see it on my website,
is actually blockchain. It was called Computer Systems Established, Maintained, and Trusted
by Mutually Suspicious Groups. It was a way to do a consensus, you know,
with cryptography. There were a bunch of nodes, and they could add nodes and take them out
and so on. This was, you know, aimed at providing multi-party computation, which later I found a
theoretical way to solve. And you can also read about that on my website, the Spymasters
double-agent problem that I mentioned earlier. But yeah, so it's in the acknowledgment of my dissertation.
I thanked Manuel Blum for, you know, trying to tell me that you could never anticipate the
effect of technologies on society. This is a common thing that's believed, you know: because it's
too hard to anticipate, I should not work on things that I thought would make the world
better, because who knows, they might make it worse.
But I dedicated it
to him, because I said it was the rejection
of that principle that
was the motivation for doing this dissertation,
because I think that
privacy technology
is different
from other technologies
in that respect. The only
known use is
really to allow
liberal democracy to flourish.
It's really only about empowering
people. You don't need
anything fancy to allow, like, what I call the monolithic model of security, where, you know, it's all controlled by the government and everything's hierarchical. And so there's no real evil use for privacy technology, in my view. It's our main way through. It's the way to make information technology create a new world that we would like to live in, as opposed to a new world that we really will not like to live in. When I was running the DigiCash
company in Amsterdam, you know, eCash was used, you know, around the world. It was
issued by Deutsche Bank in Deutsche Marks, and we had it in dollars and Australian dollars, and the
world was using it and everything. But this guy I knew in elementary school, a really
smart guy, wrote to me, and he said, David, you know, I'm a drug
addict and I'm in prison, and you're my main chance. You know, that always kind of resonated with
me. It's our main chance. It's the main way through. This is it. You want liberal democracy
to flourish, then this is the only way and it is the way. We've proven it because we've built
the stuff. It works. So, yeah. David Chaum, privacy is the way to have a flourishing liberal
democracy is definitely one of the subtexts that we talk about on this Bankless podcast. So I
could not have said it better myself. If you had a message for the cryptocurrency
industry, not the cryptography industry, what would it be? What would you say to the collective
cryptocurrency industry? Well, I mean, it's been a blast. It's been so, you know, what has been
achieved is so fantastic, because you've created so much outside of the control of government.
It's caused a lot of heads to turn and possibilities to be opened up. And it's a very exciting
community and I'm thrilled that I, you know, could have been doing some of the early work that led to
this.
But, yeah, I think it's also: don't forget about the kind of future we're trying to build here.
I mean, it's not front and center, you know; it's not
everyone's focus.
It is mine.
I mean, I'm not naive enough to think that, you know, one can force this through.
I mean, you have to find sort of a way to incentivize
everyone, and then sort of spread the word and create a situation where everyone's pulling together
to make it happen. But I guess right now we have the wind at our backs, because the vast majority
of the public realizes this is what's needed. It's quite different from just even a few years ago.
It's really, it's a profound opportunity at this moment. And yeah, so I hope we can seize it.
Yeah, trying to get the crypto world to zoom out and focus on the long term sometimes only works
in the bear markets and not the bull markets.
Yeah, well, that's an optimistic way to look at the current situation.
Yeah, people can work on stuff now because, yeah, when things are really, yeah, I hear you.
That's a good point.
Yeah, this is the time to do it.
And, you know, I'm certainly trying to double down and move this all forward.
And, yeah, I'm excited.
I mean, yeah, it's a little bit unfortunate that we can't all, like, you know, meet,
and it's otherwise kind of wearing me out, you know.
You know, but I believe it's certain that we're going to pull through this.
And I think you're right.
We will probably come out way more medium-term focused than we were when we went in.
And so that's a huge opportunity right there.
Yeah.
Certainly.
The other fantastic thing I really enjoy about this industry is that the cryptocurrency industry
stands on top of the cryptography industry, but the cryptography industry was built in the
60s, thrived in the 70s, and then came alive in the 80s,
and that is not that long ago.
And so it's an honor, David, to be able to talk to you and host you on this podcast
because we stand on top of the shoulders of giants, and you are one of those giants that the
entire industry stands upon.
So just thank you so much for everything you've done for the world of data, privacy,
cryptography, and allowing us to have this flourishing ecosystem to play fun games in.
Well, you know, thanks for recognizing it, David.
And I have to say, watch this space, as they say.
You know, I've got some new stuff which I'm very,
very excited about. Some of it you can see on my website. Some of it's not up yet. But even though we
solve the basic problems and our current stuff is really enough to move things forward, there's
some interesting new developments that I'm also super excited about. And I urge you, and
our listeners, to keep their ears open for some new stuff coming out soon. It's pretty exciting
stuff. Certainly. We will put all of the links to chaum.com in the show notes, and also
links out to the xx network as well. Awesome. This was so great.
Thank you so much.
Likewise. Thank you, David.
Super pleasure. Bye.
