The Breakdown - Riccardo 'Fluffypony' Spagni on How Coronavirus Could Impact Privacy
Episode Date: March 7, 2020

As the Coronavirus took hold in China, officials in Hubei province tracked potential patients by examining purchase records for cough and flu medicine for the previous month. Welcome to the new... frontiers of privacy. In this wide-ranging episode, @NLW chats with former lead maintainer of Monero and Tari co-founder Riccardo Spagni, aka @FluffyPony on Twitter, about privacy in the context of:

- The recent arrest of DropBit CEO Larry Harmon surrounding bitcoin mixer technology being used for illicit purposes
- The US govt's battle against end-to-end encryption
- Central bank digital currencies
- At-home devices like Alexa and Google Home
- Clearview AI and facial recognition
- China's response to Coronavirus
- Why individual apathy is the greatest threat to privacy in the world
Transcript
Welcome back to The Breakdown, a daily analysis breaking down the most important stories in Bitcoin, crypto, and beyond, with your host, NLW.
The Breakdown is distributed by CoinDesk.
Welcome back to The Breakdown.
It is Friday, March 6th, and today we have a very special episode.
So for a little bit of context, I want to go back to February 17th.
That was the point at which most people in America were not paying attention to the coronavirus yet.
but it had become a huge emergency in China already.
On that day, China's English-language newspaper, The Global Times, reported that authorities
in the province where the coronavirus epicenter of Wuhan was located had launched a very extensive
search of fever patients.
They were using financial transaction information about people who had bought fever and cough
medicine since a month earlier, January 20th, to go look for and find people who had symptoms of
coronavirus.
Now, I tweeted that day, when someone tells you that only criminals should care about financial
privacy or says something like, I have nothing to hide, send them this.
Privacy is the topic of our conversation today, and I am joined by an illustrious guest in that
context, Riccardo Spagni. Many of you know Riccardo better as his online avatar, Fluffypony.
Many of you know him as the former lead maintainer of Monero, which is a privacy-centric
cryptocurrency. Some of you know him as the co-founder of Tari, which is a new digital asset platform.
But whatever the case, however you know Riccardo, what everyone knows about him is that he is an
extremely poignant observer of privacy in the larger global context. In our conversation today,
we really take a look across the globe at the state of privacy in the context of a number of issues
that have come up over the last month. Some of them are specific to cryptocurrencies. We touch on
the rise of central bank digital currencies and what they might mean for privacy and surveillance.
We touch on the case of a developer of a popular wallet app, Dropbit, who was recently arrested and accused
of money laundering having to do with Bitcoin mixing from a few years ago.
Most of our conversation, however, focuses on the larger global context for privacy.
In that light, we talk about the attack on encryption from the powers that be in places like
the United States.
We discuss what the response to coronavirus might mean for privacy, and how it could in fact not do what I was suggesting, in the sense of warning people about what surveillance can do even if they're normal, law-abiding people, but instead have the opposite impact, where China ends up looking like a paragon of response as compared to the United States.
Finally, we talk about what it will take for citizens to actually care about their own privacy
and why the apathy of regular people might in fact be the greatest threat to privacy in the
world.
Now, if this sounds bleak, fear not: Riccardo actually has a really interesting point for optimism that we get into towards the end.
So I hope that you enjoy this conversation.
I know, for me, it's one of the most important conversations that we can be having over and over again.
One last note, I wanted this interview to be as raw and as much like the conversation as possible.
So it is much more lightly edited than our normal episodes.
So without any further ado, let's dive in.
All right, we are here with Riccardo, the man, the myth, the legend himself.
Hey, thank you so much for spending some time today.
And thanks very much for having me.
Okay, so as we were just talking about, you know, the provenance of this conversation was a comment around an article on, basically, Bitcoin mixing and whether there were implications for developers of privacy technology.
And what I actually want to do in this conversation is maybe zoom out and get your take on
kind of the state of privacy across a couple different dimensions. But maybe let's start with
where we actually started in that conversation. So basically, a few weeks ago, Larry Harmon, who is the CEO of Coin Ninja, which most people know through Dropbit, was arrested on money laundering charges that had to do with his participation in AlphaBay and a Bitcoin mixing software a few years ago. And one of the strands, one of the narratives after, was that, based on people's reading of some of the actual formal statements from authorities, maybe they were interested in developers of privacy technology. And your point was that that's not how you read it at all, that this was a specific case where there was much more going on than someone just building a privacy technology.
So I guess maybe let's get your take on that specific situation first.
Sure.
You know, I think the problem with a lot of this stuff is, like, I'm not a lawyer, and most of the people commenting on these things are not lawyers. And so that can often lead to, you know, really weird interpretations.
And I think the point that I made there was if regulators were so anti-privacy,
like if privacy was just a thing that they bucked against with every fiber of their being,
we would see evidence of it in other ways.
We would see, for example, Tor developers getting arrested. We would see all sorts of evidence of a very, very strong anti-privacy stance, and we haven't seen that.
So sure, privacy coins, or privacy-enhancing cryptocurrencies, are maybe a bit of a new world. You know, it's something that regulators have not previously had to think about.
But certainly when it comes to things like even WhatsApp and Signal, and obviously Tor and I2P and Freenet, there have been all sorts of horrendous things that have been stored on them and published on them.
So, you know, if regulators were going to take like a very strong anti-privacy stance,
I really believe we would have seen evidence of that already.
I think this is a good point and something that's important.
You know, one of the temptations, I think, when there are legitimate concerns about a regulatory response, or a potential regulatory response in the future, is that just the natural social media cycle amplifies it.
So we focus on boogeymen that aren't there when there are actually plenty of good things
to talk about that are right in front of us.
So by way of extending this conversation: in some ways, it feels like when we talk about implications around privacy technology, or new privacy technology in the form of privacy-preserving coins, there are actually things that are happening already in terms of encryption. To me, looking at the U.S. at least, that's where a lot of these battle lines are: whether technology companies that exist right now, that have nothing to do with cryptocurrencies per se, are going to be able to extend and continue with end-to-end encryption.
Yeah. And I think that, to me, is a much bigger battlefield. In fact, it's a
battlefield where we've already fought and largely lost in Australia, where regulators in Australia
have made a case for backdoor encryption, a strong case, and other regulators have bought into that.
So, you know, I think if you want to pick a fight and you want to pick a side, fighting against so-called responsible encryption is a fight worth fighting.
And privacy coins, I do not think even feature on the radar of those regulators.
They're too busy fighting things that they view as more insidious. You know, WhatsApp being used by terrorists, as if. And, you know, like, what are we going to do, force everyone to forget that encryption exists? It's not even feasible.
Yeah, I mean, that's the interesting thing to me. I guess, you know, what is your take, if you have one right now, on the state of that battle?
I mean, I guess in the U.S. specifically is the context that I'm looking at.
You know, the Electronic Frontier Foundation published another kind of warning signal a couple of days ago about the Graham-Blumenthal bill, which they call a new path for the Department of Justice to finally break encryption. And it's all kind of part of this same trend, where particularly Attorney General Barr's office has been holding up crime as a mechanism to try to get this objective, which has been, I think, a pretty bipartisan objective for many years: to have these backdoors.
Yeah, I mean, I feel like a lot of it is just a basic misunderstanding of the consequences of doing something like that. So the argument that I've made to people in discussions is: if you had to ban encryption, what would you imagine is the logical consequence? Does encryption disappear?
You know, when I say encryption, I mean encryption that isn't backdoored. So you add these magical
backdoors and then you ban all other forms of encryption. And then, you know, you basically just
end up with a scenario where criminals and terrorists and, you know, people that are trying to
serve nefarious ends, they're just going to use strong encryption anyway. Because again, the math
behind it doesn't disappear. The pieces of software that have been written thus far do not magically
disappear. So in view of all that, basically all you're doing is saying that ordinary people might want to use encryption for totally normal reasons. Maybe they don't want their boss to know every message that they send. Maybe they don't want their email being read by their ISP. Totally normal things are now viewed as problematic.
Because, of course, why would you want that?
You know, you must be evil.
You must be wanting to do something bad.
Yeah.
Well, this is the classic argument of, you know, only criminals need privacy, right?
Only people who are doing bad things need privacy.
I don't have anything to hide.
I was thinking about this argument recently in the context of coronavirus, when you started to see these tracking tools where governments were actually making visible people who had purchased flu and cough medicines, right, at pharmacies in the context of the outbreak.
And this seemed to me to be this interesting moment of seeing, I guess, what might happen in the world where there's a context all of a sudden for someone to care about what even regular people are doing with their money, with their time, with their communications.
Yeah, and I think someone pointed out the other day that China's response to corona has been, in some ways, really amazing, because they have just this incredible access to information, which is totally dystopian and not something that I think any of us find at all appealing. And yet their ability to rapidly respond to evolving scenarios is definitely something that weighs in their favor.
At the same time, you know, hard pass. I'll live with my non-dystopian future where they don't exercise as much control.
But it is interesting to see.
And I think that we are at a sort of weird place in time where people are torn, at least in Western civilization, not so much in the East, between still believing that the government is looking out for your best interests and is, you know, absolutely incapable of doing any wrong, and at the same time sort of realizing that maybe that's not the case, and starting to distrust governments. And so then they're torn. I mean, in most countries, you need the government to look after you and help you in situations like this; you don't have much choice. At the same time, if there's a general distrust of the government, that's just not a great position to be in.
Yeah, I mean, it's really fascinating, I think, in some ways to look at this coronavirus response
and weigh what it might do in terms of the public conversation about privacy.
Because I do think that there's a couple dimensions of this.
We kind of started with what regulators are thinking about privacy technology and what their various motivations are.
But then there's this other side, which is to what extent citizens care.
And another classic, almost a cliché at this point, is that people don't care enough about their privacy to agitate for it.
And because of that, we're just the frogs boiling in the pot.
And maybe we're already cooked, you know, and we don't even know it.
But one of the weird things that could happen is if you do see a kind of large-scale epidemic
in the U.S. and a real public health crisis, especially,
with how it's been handled, people might actually look over at China and say, hey, well,
maybe they have something right.
So it could actually have the reverse impact from the one it first had on me, when I saw that people were being tracked for having gone to their equivalent of Walgreens, right?
And that's the frightening thing, right?
So, I mean, a thing that people have asked me over the years, from 2014 onwards, is:
what do I think is the biggest risk to Monero surviving a decade or two?
And my answer has always been and remains apathy, that people ultimately might not care enough about privacy.
And if that's true for Monero, it's certainly true for everything, that people might just feel apathetic towards privacy.
And you can see that in their choice of devices,
that they're putting, you know, Alexa and Hey Google in their house,
and they're just not really giving much thought to whether, you know,
that's being listened to on the other end or anything like that.
And where it becomes really frightening is, I've got a friend who has probably four or five Alexa devices in various rooms. No, sorry, not Alexa, the little Google Home pods, whatever they're called. And the other day we were talking, and he's like, man, I swear Google's listening in on me, because I was talking with my wife about getting diapers, and then Google was serving me ads for diapers. Now, of course, I know there's a bunch of research in this area, and most of the time, if you actually look at the person's browsing history, they actually did look for diapers two days ago and forgot that they looked at it on some other site, and then Google's remarketing and super-cookie stuff picks up the fact that you were on this website looking at that page, and now they know what to serve you. But of course, in his mind, no, Google must be listening. And yet he won't get rid of the devices. A total paranoid overreaction, and he still won't get rid of them. And so that is the level of apathy towards privacy that we have reached. It's literally 1984, but instead of the government going and sticking microphones everywhere, we've gone and stuck them everywhere ourselves.
Well, this is the fascinating thing in how this played out differently from 1984 in some ways: these tools came in through the Trojan horse of convenience, right? Instead of it just being this horror state, it's this unbelievable deliver-you-anything-you-want state, whether it's information or actual products, you know? And so there's also a tradeoff element of people being paranoid but saying, well, what can I do? I guess it's worth it, at least, or something like that.
Yeah. And I think that's really the problem: some of these arguments, like the nothing-to-hide argument, have been made in a very public way. And so people go, oh, well, I have nothing to hide, and so therefore. And the reality is that it's, of course,
more nuanced than that. Sure, most people might be law-abiding citizens and have nothing to hide. But what law? Whose law? And when the law arbitrarily changes and you don't know about it, and now you're contravening it, and there's evidence of it floating around, not only for law enforcement to see but for your neighbors or your boss to see, then it becomes problematic. And so I've always struggled, and I still struggle, to find ways of explaining this to people so that they understand exactly how dangerous the situation is.
And an illustration that I've used recently to explain it to people is: what if you made a donation to, it doesn't matter, it could be to, you know, a save-the-dogs foundation, Planned Parenthood, a political party, whatever. And your boss found out about that donation, and he does not share your views, and because of it he either makes your life really difficult or he fires you.
That is a privacy leak.
You're not going around bragging about the donation you made, at least I hope most people don't.
But because of a lack of privacy, you know, your name got stuck on a website somewhere.
And people do this all the time, right?
They back things, not on Kickstarter, but on some of the other platforms for backing donation-driven or nonprofit initiatives. And then, when they get to the screen that says display my name on the website or on the
campaign, they go, yes, yes, please, I want my name on there. And then they don't think about the
fact that that's going to be cataloged somewhere that any 13-year-old with Google can go and find,
you know, things that they've donated money to and make judgment calls about them. And that is
just open, in-the-air stuff, what we call OSINT, open-source intelligence.
You know, not even like stuff that law enforcement has access to.
And yet people keep doing this over and over again without thinking through the consequences of these actions,
without thinking about the very real issues that it can create in their life,
not because they are doing anything bad.
They're doing something good.
They're giving money to someone to help them, to an organization to help them,
but that other people might disagree with those choices.
It's a really interesting hypothetical, because this is something that I share. Where you started with this line of thought had to do with places where the law might just be wrong, right? Where doing something out of sync with the law, or having nothing to hide, isn't just a matter of, you know, your browsing history or something like that, but actually could have major implications because you live in a more authoritarian state. And obviously there's a big portion of the world for whom that is the normal state of things, and it's much harder to understand that if you've grown up in, you know, pro-markets, liberal kinds of systems. And so I struggle with that analogy as well. But the hypothetical that you're posing, I think, is a lot more realistic for a lot of these folks. And I think it might be particularly acute now. We're obviously seeing a much greater polarization in American politics than we've seen before, one that extends from
the ballot box and political issues to actual identity issues in a whole different way, right?
You have a very strong kind of cancel culture on really every side, whatever different sides call it, where you are either in and you think the same things, or you're out because you don't.
You have a heretical point of view.
Well, obviously people are complex, and most people, I think, probably have some heretical view that is just a little outside the orthodoxy of whatever group they're around,
right? So maybe you're in the South and you are supporting Planned Parenthood, right? Maybe you're in the North and you are supporting a pro-life group, right? Maybe it's somewhere else and you have an
anti-vax position or whatever, right? Whatever the context is, if you have a heretical point of view
and that can be made public, there could be serious social ramifications to say nothing of, to your point,
if your boss finds out and just doesn't really like it and uses it as a pretense.
So I think it's a really interesting hypothetical that maybe gets a little bit closer to
the lived experience of people rather than forcing them to empathize with a type of experience
that they haven't had.
Yeah.
And I mean, it's good because that's really the issue, right?
It's oftentimes when we're trying to convey how problematic this is, we use weird
analogies or hypotheticals like, oh, what if you live?
in Thailand and it was illegal to throw bubble gum onto the street and then you did that and
you were arrested and people are like what so you know I I try to find ways of um of
really empathizing with people with the fact that this is a super weird thing to think about
and and hopefully you know that helps them really come to terms with with how how important it is
to grab hold of your privacy and really protect it, really shield it.
So I've got just a couple more questions for you.
And so picking up on that thread, have you been watching the conversation around Clearview AI at all?
And if so, do you think that there's any potential that that company in particular freaks out people sufficiently to start paying attention?
Or is it likely to be just another thing that kind of doesn't register on most people's radars?
You know, I think, so I guess to some degree it's a little bit like China, right, where they are so far advanced when it comes to stuff like this, when it comes to facial recognition, and pumping that into systems that are centralized, that give them access to all sorts of information on people.
And at the beginning, that certainly, I'm sure the argument was like, look at all the good
that this does.
But now over time, people are starting to realize how incredibly dangerous it is and how
incredibly dangerous it has become.
And so why have they realized it?
They've realized it because of stuff like the social credit score, you know, the fact that
something that they did three years ago now suddenly turns around and bites them.
And they never maybe expected that before.
They certainly shouldn't have expected that.
And so I think that to a large degree, we're going to see the same, right?
Stuff like Clearview, certainly really powerful software and a really powerful system.
and absolutely, I have no problem with law enforcement advocating for stuff like this,
but it is going to eventually become a cat and mouse game like most technology does, right?
Stuff like this is quite invasive.
If you think about back in the day when the internet was new and young
and privacy was kind of the last thing on anyone's mind,
there were no VPNs, you know, there was no need to hide your IP address or concern about, you know, browsing habits or anything like that.
Over time, it became an issue. And so now you sort of, you know, fast forward to today where VPNs are not uncommon.
And, you know, even people who aren't particularly obsessed with privacy will use a VPN because they understand the security and privacy risks of using the internet at, like, a coffee shop.
And so I think to some degree, that's because of shared experiences,
because of what they've seen on the news and so on.
So stuff like Clearview, great, fantastic.
But eventually people are going to start using technology to counteract it.
Whether it is, you know, walking around with face masks on or whatever it is, people will start rocking the boat.
And of course, those early adopters of anti-facial-recognition technology will be people who are a little bit paranoid and so on. But it trickles down, just like VPNs have trickled down, just like end-to-end encryption in messaging has trickled down to WhatsApp. Okay, yeah, WhatsApp's another conversation, but still, the fact that this technology trickles down gives me hope for the ability of people to reclaim their privacy despite the existence of systems like Clearview.
I think that's a great point, that part of the answer might not even be a mass shift in consciousness, but giving people tools that are just as good, just as convenient, that have this built in, so that making the shift isn't disruptive to the way that they operate day to day.
Well, I've kept you for longer than I intended to because I think this is such an important
conversation. I have just one more. I think I share your sense that obviously the conversation
about privacy is much larger than crypto and digital currencies. However, there is one part of this industry that I'm watching with particular regard to privacy this year, and that's the emergence of central bank digital currencies and what they might mean for this larger battle of privacy versus surveillance. So I wonder if you are spending any time thinking about the implications of a Libra, or more likely the implications of, you know, DCEP, a Chinese digital yuan, or a digital dollar, and what that might do for this whole conversation with regard specifically to people's financial privacy.
Yeah. So CBDCs, again, you know,
there's definitely some level of like, oh, wow, that's kind of interesting going on in my head.
not so much from a sort of anti-censorship or censorship-resistance perspective, but just from a making-digital-currencies-the-norm perspective. And I think that sort of paves the way for Bitcoin and, of course, other currencies like Monero that offer something slightly different, to be more widely accepted.
I would be very, very, very, very surprised if, you know, any CBDC or any sort of basket-based centralized currency like Libra has any meaningful privacy.
You know what I mean?
If you look at the backlash that Facebook had from regulators, it is clear that regulators do not want that to be a thing.
And I think that that's what we're going to see.
I tend to agree. We're actually producing a big podcast series around the idea of this battle for the future of money, and certainly, going back and looking at the immediate regulatory response to Libra, the privacy issue and the unwillingness to trust Facebook are right at the center of it.
Well, listen, I really appreciate you taking the time.
I think this is such an important set of conversations for right now with some interesting new context over the last few weeks.
So Riccardo, thank you so much, and really appreciate you.
Thanks so much for having me.
It was a lot of fun.
I think, for me, the most optimistic takeaway from this conversation is Riccardo's sense that
there is this set of technologies that are becoming mainstream, even though they don't seem
to be dominating news cycles.
So VPNs have become something that even normal, non-paranoid people use.
Face masks are becoming more common, and that could certainly be accelerated by issues like the coronavirus. The idea that perhaps the
technologies to prevent privacy loss could become as convenient as the technologies that destroy and
threaten our privacy is something that is worth holding on to. That's going to do it for this
week of episodes of The Breakdown. Next week, we have an even more interview-packed week, with some really interesting people looking at a huge array of global issues and beyond. I hope you are heading off to a great weekend wherever you are, and I hope that
you stay safe doing it. Until next week, peace.
