Bankless - Is Privacy A Winnable Battle? | Andy Yen, Founder of Proton
Episode Date: December 15, 2025

AI is becoming the most powerful surveillance machine ever built and most people are feeding it their deepest secrets without realizing who can read them. Proton founder Andy Yen joins Bankless to explain how Big Tech captures your identity, why AI chat logs are a privacy disaster waiting to happen, and what a self-sovereign, user-aligned future could look like. We explore how AI models profile you, why subscription business models don't guarantee privacy, how Proton built Lumo without retaining user data, and why encrypted email remains the cornerstone of your digital identity. Andy also breaks down EU "chat control," Apple's privacy-washing, the crisis of trust in crypto, and the practical steps anyone can take today to reclaim their digital freedom.

---

📣SPOTIFY PREMIUM RSS FEED | USE CODE: SPOTIFY24 https://bankless.cc/spotify-premium

---

BANKLESS SPONSOR TOOLS:
🔵COINBASE | ETH & BTC BACKED LOANS https://bankless.cc/coinbase-borrow
🪙FRAXNET | MINT, REDEEM, & EARN https://bankless.cc/fraxnet
🦄UNISWAP LABS | SWAP NOW https://bankless.cc/uniswap-labs
🛞MANTLE | GLOBAL HACKATHON 2025 https://bankless.cc/mantle-hackathon
💤EIGHT SLEEP | IMPROVE YOUR SLEEP https://bankless.cc/eight-sleep

---

TIMESTAMPS
0:00 Intro
2:18 Why Privacy Matters More Than Ever
3:24 How Screwed Are We? Andy's 1–10 Privacy Score
5:43 Who Can Read Your AI Chats?
12:22 Worst-Case Scenarios: AI Breaches & Behavioral Profiles
15:08 AI Knows You Better Than You Know Yourself
16:55 Privacy as a Digital Civil Liberty
20:26 Can Big Tech Ever Offer Real Privacy?
26:28 Why Lumo AI Can Be Private (and Big Tech Won't Copy It)
31:08 What Lumo Runs On & How It Competes With Frontier Models
35:45 The Case for a Self-Sovereign AI
38:46 How Proton's Foundation Model Really Works
45:47 Can Private AI Be Sustainable?
49:47 The Full Proton Privacy Stack
54:59 The Messy State of Encrypted Chat Apps
58:49 Browsers, Search, and the Dark Patterns of AI Integration
1:02:53 Are iPhones Actually Private?
1:12:42 The Fight Over Encryption: EU Chat Control & Beyond
1:26:08 "A Backdoor for the Good Guys" — Why It Can't Work
1:29:42 Financial Privacy = Freedom
1:32:58 Crypto's Reputation Problem
1:39:24 Practical Privacy: The One High-Leverage Step
1:45:15 Two Futures: One Dark, One Bright

---

RESOURCES
Andy Yen
https://x.com/andyyen

---

Not financial or tax advice. See our investment disclosures here: https://www.bankless.com/disclosures
Transcript
AI like social media is intentionally designed to be addictive.
In fact, some of these AIs will even change the way they talk to you
to cater to your personality and sort of what it senses is the sort of answer that you want to hear.
And that's a really scary thing.
You mentioned that AI maybe knows you better than some of your best friends.
I would actually argue that in not so long from now,
the AI could know you better than even you yourself.
A lot of us humans are not so self-aware about who we are, right?
AI will actually be able to exploit the weaknesses of personality
that even you are not aware of in order to compel you
to keep using it and do what it wants.
Welcome to Bankless, where we explore the frontier
of digital privacy.
This is Ryan Sean Adams.
It's just me here today, so I am here to help you become more bankless.
The question, is privacy a winnable battle?
This is a very important conversation
between myself and Andy Yen.
Andy is the CEO of Proton.
They are the makers of Proton Mail.
I'm sure you've heard of it.
Few things we discuss.
Solving privacy in AI.
How tech companies are screwing us.
EU chat control legislation.
Encryption as a civil liberty.
His views on crypto.
Also, I think my favorite part of this episode
is that it comes with a little bit of homework.
There's some level ups.
And I genuinely want you to consider
the homework. I want you to consider leveling up your privacy in 2026, like make it a project.
Because going bankless is about freedom and you lose your freedom when you lose your privacy.
I think a number of factors have accelerated this, most notably AI coming on the scene.
And I want you to accept for a minute the framing of this episode as we get into it.
What if your AI chatbot isn't your friend? What if it's a sycophantic super genius designed by a company
to trick you into feeding it more data
and then using that data to gain more leverage over you?
Even if that's a little bit true, don't you think it's concerning?
There are ways to opt out, and we explore them in today's episode.
Let's get right to it.
So Bankless Nation, there is a confluence of things happening in the world right now.
There was a Coinbase data breach earlier this year
that leaked customer email address information, phone number information.
I know the crypto industry has been hit with,
in real life attacks on crypto people, particularly those who haven't been able to keep their
information private. We live in a world of AI tools. We don't know what they're doing with the
data. Like what is ChatGPT doing with the data? I don't feel like I have much control.
I have personally adopted the entire proton stack. I've done this lately to drastically
improve my privacy and security posture. And I think every crypto user should go investigate it and
take a look at it. So all this to say, it's a very good time to have an episode with the co-founder
of Proton. Andy, welcome to Bankless. Hey, thanks for having me. It's a pleasure to be here. And I think
between the world of crypto, the world of encryption, which you're in, and the world of privacy,
a lot of intersections. So hopefully we'll have a fun conversation to go through some of these
pretty important topics, as you say. Very important topics. And definitely some common cause. So you're
the founder of Proton. This is one of the world's largest consumer digital privacy companies. I believe
there's 100 million users of Proton worldwide.
On a scale of 1 to 10, I feel like you're the perfect person to ask.
So, Andy, on a scale of 1 to 10, how screwed are we on digital privacy today?
For a normal person, they're using the typical web stack of Gmail.
They've got an iPhone, Instagram, whatever, ChatGPT.
How screwed are they?
I think it really depends on who you are, right?
The average person is probably quite screwed if I'm being completely honest.
and I would say the more technically sophisticated people have more means to protect themselves,
but it's going to become harder and harder if the current trends go on.
So what we see actually, people talk about AI, and I think you mentioned AI as well.
And what AI is actually doing is it's simply an extension of a trend that's been going on for 50 years.
Because fundamentally, AI is actually a better and more efficient way for
humans to communicate with computers. And so it's not dramatically changing any of the business models,
but actually it's accelerating the models that already exist. So if you think about
search, the information that you give into Google search, which allows Google to build a very
detailed profile about who you are, well, an AI conversation with, say, Gemini, that is way more
intimate, that is getting much more information about you. So what Google is able to do is they're
probably able to accelerate by a factor of five or ten, their existing business model with
the advent of AI. And this is something that is happening, you know, sort of across the board.
So then it becomes harder and harder because AI now becomes the central tool of our lives.
It's everywhere. So do you live in the Stone Age, so to speak, from a tech perspective,
or do you participate in the acceleration of the loss of our privacy globally? And this is why I
think in some extent, the average person who doesn't have any knowledge of this is pretty
screwed. They can't take means to protect themselves.
But if you're aware, then you can do something.
The problem is, I would say probably 80% of the population,
90% of the population simply doesn't understand some of the issues that we talk about,
for example, on this show.
Well, this episode is definitely for awareness,
but also I hope toward the end we prompt to some action,
some small steps maybe and first steps you could take,
because privacy is definitely a journey,
but there are some steps you can take that are relatively easy,
and you can start them now,
and then you can keep improving over time.
That's the journey that I've been on personally.
But since you brought up AI again, let's talk about AI.
So ChatGPT, tools like Gemini, if I'm using a chatbot-type tool, like something like ChatGPT, who can see my chats?
So can employees at these companies see them?
Can the government see them?
Are they subpoenable?
Are they like open to everyone?
Or is it like only under certain circumstances kind of like they break the glass in order to get access to the chats?
Who gets access to this?
Unfortunately, it's kind of all of
the above, and that's kind of the
scary thing. So let's
break that down piece by piece.
Of course, tech companies can
see it because they are recording
and essentially, you know,
analyzing and saving
every single conversation you ever have.
So everything you ever type in
is being recorded and
more or less permanently retained.
Precisely. They're not
anonymizing it. They're not
doing something to kind of mix it in.
Because you're logged in.
It's your user ID.
Okay.
And they're actually looking at this information
because that is how they try to improve these programs.
So it's there.
And they actively look at it and they use it
because that is part of their business.
And it's also being used to profile you
and send you advertisements.
And in the case of your chat GPT,
even shopping recommendations,
so they can tell you what to buy
and recommend your products.
And you can actually directly buy them
through ChatGPT now, right?
So the companies definitely see all of that.
Now, the thing about what that implies is anything that a company has,
it's actually obliged to give up to law enforcement.
So the FBI asked for it, if the feds come asking for it,
if any police or prosecutor requests it,
that information is also available.
The government has access to this information.
But a private party that sues you can also get access to the information.
And there's also, in fact, a famous case
where the New York Times, I think, sued OpenAI.
And as part of the lawsuit, they, you know,
tried to require OpenAI to actually retain all the conversations
because they wanted to use those, you know,
conversations as evidence in their lawsuit against OpenAI.
So that's the thing.
Now, what is even worse sometimes is then there's also the inadvertent breaches that happen.
When you give something to, you know, into OpenAI,
into ChatGPT, you're actually contributing to the knowledge of
ChatGPT. So the information that you give it becomes part of its brain, so to speak. And if it's
talking to somebody else, there's actually a very real possibility. And in fact, quite high likelihood,
that information that you've given it can then be regurgitated out and shown to somebody else
as part of another conversation. Because that information is now in the corporate information that is
used to train and give answers to these models. And that has happened, right? If you put a password
into ChatGPT and someone's very smart at prompt engineering, they can get ChatGPT to spit out the
information that you gave it by accident. And then, of course, there's also bugs. I think there was
a couple examples of data breaches where an AI company accidentally left something open and revealed
all the chats or sometimes the chats were, I think there's even one case where the chats were
accidentally opened and indexed by Google, in which case anybody can get access to them. And that's the nature
of information. Once you put it out there, it's out there. You cannot really take
it back. You may be able to
sometimes force them to delete it, but
that may or may not be too late depending on
your threat model. So, unfortunately,
the answer is, it's all the above
in that whole list of things that you gave me.
Because when you put it into
ChatGPT, it is unfortunately no longer
your data. You can now borrow USDC
against your Ethereum and Bitcoin on Coinbase.
Crypto-backed loans on Coinbase
make accessing liquidity seamless for crypto hodlers.
Powered by Morpho, Coinbase crypto-backed
loans give you direct access to
on-chain financing, allowing you to take out
loans at competitive rates using your crypto as collateral.
Over $1 billion in loans has been opened through Coinbase to date.
On the Coinbase app, eligible users can borrow up to 1 million USDC using Bitcoin or Ethereum
as collateral.
Users can convert their USDC into fiat to make down payments, refinance debt, or cover urgent
expenses and more.
The benefits are numerous.
Interest rates are variable, typically between 4% and 8%, and respond to market conditions.
Loans are approved in seconds without credit checks.
Repayment schedules are variable, meaning there are no fixed
deadlines. The kicker: Coinbase will not treat borrow transactions as taxable events.
Manage loans directly in the Coinbase app with ease. It's currently available to U.S.
customers, except New York. And additional collateral types and increased loan limits are coming
soon. Want to learn more? Click the link in the show notes or visit Coinbase.com
slash borrow. Introducing FRAXUSD, the genius aligned digital dollar from FRAX. It's secure,
stable, and fully backed by institutional grade real world assets, custodied by BlackRock,
Superstate, and Fidelity. It's always redeemable one-to-one, transparently
audited, and built for payments,
DeFi, and banking.
The best of all worlds.
At the core is FraxNet,
an on-chain fintech platform
built to align with emerging
U.S. regulatory frameworks
where you can mint, redeem,
and use FRAXUSD with just a few clicks.
Deposit USDC, send a bank wire,
or tokenized treasuries,
and receive programmable digital dollars
straight to your wallet.
FRAXNet users benefit from the underlying
return of U.S. treasuries
and earn just by using the system.
Whether you're bridging, minting,
or holding, your FRAXUSD works for you.
FRAX isn't just a protocol. It's a digital nation, powered by the FRAX token and governed by its global communities.
Join that community and help shape FRAX nation's future by going to frax.com slash R slash bankless.
FRAX, designed for the future of compliant digital finance.
Uniswap Labs is built for DeFi because Uniswap Labs built DeFi.
We've been creating powerful tools to make crypto easier and safer since 2018.
And it's more than just smooth trading across 15 chains and counting.
It's some of the deepest liquidity in crypto.
It's a seamless app experience for everything you do on chain.
Discover new tokens.
Research confidently.
Swap instantly.
Manage it all securely in one place.
Experience how well crypto works when it's built by the pioneers of DeFi.
What would be the worst case?
Let's talk about data breaches for a minute.
Because at the beginning of the episode, I talked about a data breach at Coinbase,
which revealed a lot of customer AML/KYC type of information.
So this would be like, you know, name, email address, physical location.
all of the personal data,
if something like Google Gemini or ChatGPT had a data breach,
maybe this is at a level of sophistication where
maybe this would be a well-funded state actor or something.
Or like, I don't know, there are some parties
that could probably do this type of thing.
What's the worst case scenario?
Would they really have access to every, like, users, logs,
chat logs, and be able to leverage that in the future?
Yeah, they would have your chat logs.
and depending on what you say to ChatGPT,
that could be quite compromising.
And, you know, there are some people today
who use ChatGPT for relationship advice,
for personal advice.
It is their psychologist.
It is maybe even their virtual girlfriend
or boyfriend as a case may be.
Sure.
The information that you give ChatGPT is incredibly intimate.
It's literally a private conversation
with somebody who, in some cases, is your best friend.
And that is all potentially leakable, accessible, subpoenaable,
and also available to hackers as well.
There's been a scaling of this since you started Proton,
I believe in the early days of, you know, it was 2013 or so.
2014, 2014, okay.
So a long time ago, but we have put more and more in digital format.
So back in 2014, it would be the most sensitive things I could imagine for myself online
would probably be my email address, maybe my search history.
Now in 2025, it would be everything that I've ever said to ChatGPT
or that it's been able to divine somehow, you know, depending on my usage pattern of
ChatGPT. I mean, there's a real case for many users of these AI tools that the AIs know them
better than most of their closest relationships in their lives. Like, they know everything about them
and they can also divine things about them based on particular patterns. So can you talk about
that? Like, as we are increasingly going into the digital age,
it seems like we are giving more and more to the machine.
And I guess if knowledge and information is power,
then the machines become much more powerful,
or the corporations that control those machines
become much more powerful relative to the people.
Like, I almost feel helpless with this.
And I know I could stop using it at any time.
And yet it is so useful for everyday life
and to create economic output
that that's not an option for me
or most people listening to this.
Can you talk about that?
AI, like social media, is intentionally designed to be addictive.
In fact, some of these AIs will even change the way they talk to you
to cater to your personality and sort of what it senses is the sort of answer that you want to hear.
And that's a really scary thing.
You mentioned that AI maybe knows you better than some of your best friends.
I would actually argue that in not so long from now,
the AI could know you better than even you yourself.
A lot of us humans are not so self-aware about who we are, right?
AI will actually be able to exploit the weaknesses of personality that even you are not aware of
in order to compel you to keep using it and do what it wants.
Because what is the purpose of something like ChatGPT and Gemini?
Well, at the end of the day, it's engagement.
They want you to keep using it.
It's sort of a hamster wheel.
Once you get on, they never want you to come off.
So it is designed to tell you what you want to hear, to keep you coming back, and to ultimately
make you dependent.
That is not a bug.
That's actually the core feature of this product.
And we've seen this play out with social media algorithms,
which are just basically AI light.
Yes.
But social media, when you do it, you sort of understand it's public.
In the back of your mind, you know,
if I share something on Facebook,
I'm sort of expecting people to see it.
But in AI, it's like a chat,
which we assume by default is private,
but actually it isn't.
And this is why I do think it's quite scary.
the consequences of this. And it's, of course, the machine knowing about you, but it's who
controls the machine and who controls the machine are giant corporations that they don't really
have your best interests at heart. They're here to make money and they're here to make as much money
as possible by exploiting your data and by exploiting you ultimately. That's why privacy is very much
tied to this. It's sort of, it's a political idea, isn't it? It's like,
because what we're describing is a world where there is great power asymmetry. And
the large corporations with the information chatbots and data centers,
they have all the power.
And the individual citizens and the individual users don't have that power.
And so privacy is part of, that's what we'll talk about a bit more,
but privacy is part of correcting that power asymmetry.
I mean, do you see privacy as almost like a modern day digital civil liberty?
It is.
And it's also a fundamental human right.
So privacy in many ways is our last defense against the encroachment of surveillance capitalism, which today dominates the world.
If you look at the largest companies on the stock exchange globally by market cap, they are really all companies who are actively engaged in AI.
That is the biggest business.
And the market cap of these companies added together is bigger than most countries.
You add them up, you get, for example, you take your top two or three companies, that's bigger than the GDP of Germany.
So we are really at this stage where these companies have gotten so big that they're actually probably in many cases more powerful, more influential than even governments themselves.
So the ability of governments to even regulate these companies is quite limited.
You know, we often think about privacy as: we need privacy and encryption to serve as a last safeguard against the encroachment of the power of government on individual freedoms.
But these companies today are bigger and more powerful than most governments.
So you can almost say that the government part is irrelevant.
It's the corporates that are probably even worse in many cases.
And government actually in some sense, at least in a democratic world,
is supposed to serve the will of the voter.
So you have some control over that.
But for an OpenAI or Google,
your vote doesn't count.
You don't have a vote.
So it's actually even worse.
Let's talk about AI maybe a bit more
because I was starting to get some glimmers of hopes
when you listen to some of the CEOs.
And when Chad GPT first came on the scene,
it was actually refreshing to see a subscription-based business model for me,
who's, like, very aware of surveillance capitalism
and kind of the Google type ad, you know, like the model where they're harvesting your eyeballs
and mining your information, that's how they monetize you.
It was great to see ChatGPT and it was a subscription service.
So I was like, okay, we're going to monetize this in a different way.
Then I started to see like, you know, months later and years later, the tremendous amount
of CAPEX that is being spent by OpenAI and all of these models.
And when you kind of run the math, I don't see how it's possible to sustain that level of
CapEx and investment from a subscription-based model just because advertising and being able to
take out all of the information about individuals and groups and segments and then sell them
something based on that. It's got to be always a more revenue producing and more profitable
endeavor. And so I've even seen ChatGPT pivoting towards that. There was one other thing
that gave me some hope. And I want you to comment on all of this. But Sam Altman said recently that
He wished that ChatGPT and AI models had more privacy and more confidentiality.
He said that unlike a doctor or a lawyer, we probably need legal protections because ChatGPT
doesn't have that, but we're sending it information.
We're having conversations as if we were talking to a doctor or lawyer.
And those classes are protected.
You know, you got an attorney-client privilege and that sort of thing.
We don't get that with ChatGPT.
And so he was pushing for more privacy regulation in that direction.
I haven't seen any of that legislation come forward,
but it was nice to hear him at least acknowledge
that there is a privacy problem.
Anyway, take all of this.
Do you think that there is a world
where the existing AI companies
and maybe the governments can come together
and say, okay, no, this is too much.
There need to be some privacy regulations here
or maybe the AI companies decide
not to monetize our data for advertising
and it's a subscription service
and it works a bit more
like Proton does.
I think a subscription doesn't mean
that they won't violate your privacy.
Honestly,
from, like, a big tech perspective,
if I can, you know,
trick this person into giving me his data for free,
but I can also make him pay me for that privilege,
why wouldn't I do both, right?
I was being optimistic here, I suppose.
Well, he'll actually pay me to abuse his data?
Hell, you know, I'll take the
money. I'm not going to leave the money on the table, right? And I'll also harvest his eyeballs
and sell them some ads. Yes, exactly, right? Why not? We can get even more money.
And that's what these businesses are about. It's all about money. So fundamentally,
it's a question of business model. The business model here is monetization at all costs.
These are profit-driven companies that care only about profit, and they will squeeze a dollar
out of you any way they can. So they're happy to take your subscription money and abuse your data at the same
time, and this is what they do. So subscription, I think, is not a strong enough safeguard to say,
you know, oh, it's private. You really have to go down to the business model, the business
ethics, also what the business stands for. That's the key thing. Now, I think Sam is talking about,
oh, you know, it'd be great to have some government regulations around privacy, etc., etc.
He's probably more thinking about his New York Times lawsuit, right? He wants the government to protect
him from, you know, third parties subpoenaing his information that he's collecting. So what he's
basically saying is I want regulation to ensure that the only person that can abuse your data is me
and nobody else, right? That's effectively what he's saying. So it's regulation in sort of a,
let's say, very self-serving way, right? He's definitely not asking, oh, let's have regulation
that prevents me, myself, from abusing your data. No, he just wants everybody else to be locked
out of his ecosystem so he can have a monopoly on, you know, your information. And that's,
sorry, so yeah, I wouldn't say that, you know, just because he says that he's actually going to go
in that direction, because history has proven this over and over again. And Sam's a known quantity, right? He's been in
the Valley for a long time. People know what he's about by now. Andy, would you go as far as maybe
my intuition did and say that the only way AI companies are going to be able to show some return
on investment for all the CAPEX they're spending is the surveillance capital business model? Like,
that's the only way you can actually make investors whole and continue paying for
the chips and data centers and energy.
I think in the long run. So here's the interesting thing about AI and sort of all technology shifts.
In computing, there's a concept called Moore's Law.
Are you familiar with Moore's Law?
It's basically that computing power doubles every 18 months.
And somehow it's amazingly held up for the last 30 years.
It's a bit similar with AI.
AI as technology is going to rapidly commoditize.
What would today cost maybe a billion dollars to
train may in, you know, five, six years' time only cost 10 million. So I think there's two sides.
One is the cost of AI is going to go down probably exponentially with time. So these giant cost
projections that people are putting out today of how much it costs to build AI, that may not even
be true. They may say, oh, it's a trillion dollars, but that could end up being only
100 billion over time.
So I do think there is a way to make the business model work over time.
This, I think, we can be pretty confident about.
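To put rough numbers on the cost-decay argument, here is a minimal sketch under assumed figures. The $1B starting cost and the 18-month halving rate are illustrative, borrowed from the Moore's Law analogy Andy invokes, not numbers stated in the episode. Note that at this rate a $1B run only falls to about $62.5M after six years, so "a billion to 10 million in five, six years" would imply an even steeper decline than Moore's Law.

```python
# Illustrative only: assumed halving rate and starting cost, not a forecast.

def cost_after(initial_cost: float, years: float, halving_months: float = 18.0) -> float:
    """Cost after `years` if cost halves every `halving_months` months."""
    halvings = (years * 12.0) / halving_months
    return initial_cost / (2.0 ** halvings)

start = 1_000_000_000.0  # assume a $1B training run today
for years in (3, 6, 9):
    print(f"after {years} years: ${cost_after(start, years):,.0f}")
# 3 years = 2 halvings -> $250,000,000
# 6 years = 4 halvings -> $62,500,000
# 9 years = 6 halvings -> $15,625,000
```

The shape of the curve, not the exact constants, is the point: under any exponential decay, today's headline cost projections overstate what the same capability will cost a few years out.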
But I do think some of the promises that have been made and advertised today in terms of how much
money we're going to spend, how much we're going to invest, how much we're going to build.
Those are unrealistic, not granted in reality.
And the businesses today, most AI businesses, are highly unprofitable.
And that's okay in the early phase,
but at some point, investors are going to want to return.
And investors may want these returns before the costs have dropped far enough
on the exponential scale.
And then you enter into a situation where these companies are going to be under increasing
pressure to generate as much money as possible.
And this is why ChatGPT gets into shopping.
It gets into promoting different things to you.
This is why they get into browsers so they can track your browsing activity
and pop up other things to sell you things because they're under this pressure
to generate money.
But I do think long-term the business model works.
It's just going to take a while.
And these companies may not have that long,
so they're forced to be increasingly aggressive in abusing your data.
And unfortunately, that's their business.
And this is why at Proton, you know, we built our Lumo AI.
We did it in a different way because we realized that actually you need to have a different business model
or else it simply is going to lead to really bad outcomes for the users.
Let's talk about the differences maybe between Lumo and some of the existing ones.
So Lumo is Proton's AI.
Yeah, Lumo, with an O at the end.
Lumo is Proton's AI, and it preserves all privacy, encrypts all data.
Let's say Sam Altman or Sundar from Google, they listen to this interview, and they call your bluff, and they say, no, Andy's got the wrong idea.
This was never about surveillance capitalism.
This was, like, we always want to kind of, like, protect users.
Could they just turn on encryption
and have none of their employees have access to any of the data
and have none of it be subpoenable?
And would that be legal?
Is there anything technically or legally preventing them
from just flipping a switch,
or at least in user preferences,
allowing users to say,
hey, I want all of this to be fully encrypted in private?
Yeah, there's no technical limitation, in fact,
that prevents them from doing what we're doing.
What really prevents it is business model limitation.
and if we're being completely frank, it's a problem of capitalism.
Capitalism drives them to make the highest possible profits.
And Proton, of course, being predominantly owned by a nonprofit
doesn't have those same type of constraints.
But there's no technical barrier to say why you couldn't do the same thing as Lumo.
Is there any legal barrier?
There's also no legal barrier as well, in fact.
Okay, so governments around the world aren't saying,
hey, we need subpoena rights just in case, you know,
we're dealing with someone who is cooking up a
bioweapon that's going to, like, kill thousands of people.
Well, it depends on the government.
Russia and China would definitely have those requirements.
But in the U.S., here in Switzerland, here in Europe, we thankfully are not there yet.
I say yet because it's hard to tell the future, but we're at least not there yet.
Let's put it that way.
So what Lumo does that is quite unique is, number one, we don't keep a record of any of your conversations.
Anything that is in your chat history is encrypted in a way that we cannot decrypt.
So we are technically prevented from accessing your history.
Number two, we don't use any of your conversations or any of your prompts to do model training
and refinement.
So there's no chance that your information can get leaked out.
And that also means, number three, our staff, you know, don't read your conversations
because they can't.
And if we can't get access to your information, it means the government coming with a subpoena
or a court order, even if fully legal, well, we cannot
disclose information that we ourselves don't have access to. And so it's sort of the only, you know,
chats bot, AI chat bot where there's a strong technical guarantee that your conversation stays
private. And that is what is different. And to your point, Google could technically build a similar
system. They just don't have the economic or business model incentives to do so because their business
model is flipped. I'm a subscription business and I take all of my money directly from subscribers.
And that means that my incentives are aligned with our customers. Our customers pay us because
we're private. And the instant we're not private, they stop paying us. So I have a financial
incentive to keep doing that. But if you use Google, you're not actually Google's customer.
You're the product. They're selling to the real customer, which is the advertiser. And that is a
misalignment of incentives, which is never going to really
put any real pressure on them to protect your privacy, because protecting your privacy,
unfortunately, goes against their fundamental business interests. You know, Google used to have this
expression, like, don't be evil. And I think in crypto, we adopted this expression, can't be
evil, because this is what encryption really allows us to do. And I want to ask you, so when you're
talking about the chats, you don't have access to them, so just to be clear, they're fully
encrypted, and you have no way to access them. Proton has no way, no employees of Proton,
no government subpoenas have any ability to actually decrypt the encrypted conversations.
Is that correct?
Yeah.
So when you talk to Lumo, your chat history is saved, but it's encrypted in a way that we cannot
decrypt it.
So it's encrypted essentially with your private key that we don't have access to.
We just have an encrypted copy of that, you know, that we cannot decrypt.
And that is the technical difference.
And by the way, it's all open source so people can, you know, take a look and see how it works.
But yes, your history, you know, whatever sensitive discussions you have had with
Lumo, that's your data, and we cannot get access to it.
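What Andy describes is zero-access, client-side encryption: the server stores only ciphertext it cannot decrypt, because the key stays with the user. The sketch below illustrates the general idea only. It is not Proton's actual implementation (their real code is open source), and the hash-based toy stream cipher is purely for illustration; a real system would use a vetted AEAD cipher and proper key management.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: concatenated SHA-256(key || nonce || counter) blocks.
    # Illustration only -- never use a homemade cipher in production.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

# The user's key never leaves the client; the server stores only `blob`.
user_key = os.urandom(32)                 # held client-side only
blob = encrypt(user_key, b"my chat history")
# The server (or a subpoena) sees `blob` but has no key, so it cannot decrypt.
assert decrypt(user_key, blob) == b"my chat history"
```

The point of the structure is that a court order served on the server operator yields only `blob`: without the client-held key, there is nothing to disclose.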
Very cool.
All right.
What model does Lumo run under the hood?
We use all of them.
Okay.
But only open source models.
We're strict on open source models.
So we have basically, for example, models from Mistral.
We even actually have OpenAI's open source model.
We've got the Chinese open source models as well.
Like the DeepSeek models.
Yes, yes.
And Kimi K2, you know, models like that.
Yeah.
But of course, then we bring them in, we modify them,
and we make sure we actually use the best model for every single query.
And this is important because all the models have their own biases.
So if you were to ask a Chinese model certain questions,
it wouldn't give you, let's say, the correct answer, right?
And we ensure that we do give the correct answer.
And we also try to be as neutral as possible.
You know, we don't want to be a right-wing model.
We also don't want to be a woke model.
So we try to train and calibrate our systems to be sort of, let's say,
as neutral as we can with as few biases as possible.
And this is something that we actively do.
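The per-query routing Andy mentions can be pictured as a simple dispatcher. This is a hypothetical illustration, not Proton's actual architecture: the model names and keyword heuristics below are made up for the example, and a production router would more likely use a learned classifier than string matching.

```python
# Hypothetical per-query model router. Model names and keyword
# heuristics are illustrative only, not Proton's real setup.
MODELS = {
    "code": "mistral-codestral",
    "reasoning": "deepseek-r1",
    "general": "kimi-k2",
}

def route(query: str) -> str:
    """Pick which open-source model should answer this query."""
    q = query.lower()
    if any(k in q for k in ("def ", "function", "compile", "bug")):
        return MODELS["code"]
    if any(k in q for k in ("prove", "step by step", "why")):
        return MODELS["reasoning"]
    return MODELS["general"]

assert route("Fix this bug in my function") == "mistral-codestral"
assert route("Explain step by step how tides work") == "deepseek-r1"
assert route("What's for dinner?") == "kimi-k2"
```

A router like this is also where the bias correction he describes could live: queries on topics where one model is known to answer poorly can simply be steered to a different one.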
The open source models are pretty fantastic.
I mean, it is the case that they're close to kind of what the frontier models,
the closed-source models, are actually providing.
If a user was switching maybe from Gemini or ChatGPT to Lumo,
would they notice anything?
Would they like lose anything?
What would they lose?
I think one of the ways ChatGPT has people locked in is
it now has kind of a memory of all of their chats.
Yeah, yeah.
And so it can recall context.
It's also very, like, the interface works fantastic.
It's very easy to use.
How about Lumo?
How closely is it able to replicate all of the bells and whistles of some of these frontier chats?
Well, Lumo has been in the market for five or six months,
and these guys have, let's say, been around now at this point for over three years.
So we will always be a little bit behind, let's say,
cutting edge because we simply haven't been around as long, right? But as you say, what is
interesting about the AI revolution is the gap between the best open and the proprietary is really,
really quite small. And that means that even if we're not 100% the way there, it's actually
pretty close. And we're quickly adding to the feature set to, you know, complete the product.
So you talk about memory. Memory is something we're actually working on right now that is probably going
to come to Lumo within the next month or two. But we have a major release essentially every two months,
and it's a field that is improving very rapidly.
So I think it's a gap that we close,
and it's a gap that is pretty easy to close
because of how good open source solutions are
in this particular tech revolution,
which is quite rare when it comes to tech revolutions overall.
And I think that's quite important.
On the topic of memory and sort of personalization,
in fact, there is a possibility for you to personalize Lumo.
You can tell Lumo how you want it to talk to you
and how you want it to respond.
Okay.
But there are certain, let's say, features that
we are going to decline to copy because we don't think they're right for society.
So I don't really want Lumo to begin to change its answers and behave differently because of
inferences it has made about your personality.
You can do that optionally if that's what you really want by opening up the personalization
and instructing it to do that.
But by default, it's not going to do that because I want to avoid some of the filter bubble
stuff that we had on social media.
So manipulation? That's what we call it in a human relationship.
Yes, yes, yes, yes.
Manipulation, right?
I want to avoid the manipulation, because I think that's harmful.
And social media, the problem was if you were on the left,
it kept feeding you more left content and it made you more and more extreme.
And it did the same thing to people on the right.
And then we live now in a completely polarized world and then we wonder why, right?
So I think Lumo, and AI that's done responsibly,
should not cater to and try to reinforce our worst impulses,
even if that gives higher engagement.
It should actually try to do more
to steer people in society
towards the center, to seeing both points of view.
And that means, I would say,
probably less personalization by default,
because personalization is what gives rise to a lot of that.
So some people will say,
oh, well, the fact that you don't do this is a bug.
It may be, let's say, a product bug,
but it's probably a feature for society.
And this is a tradeoff that we try to get right.
I think it's a long-term feature for the individual person too, right?
I mean, a lot of social media, the dopamine, all of these things, the sycophancy from these chatbots.
It's not good for people.
It might feel good in the moment and might cause you to spend more time with the chatbot,
but it's not necessarily good for your overall well-being, right?
This really gets back to kind of what I think we all need and what we want from AI
and what I'm not sure that we will get at the end of the AI rainbow
with some of the large companies pursuing this
is I want an individual AI and hopefully at some point an AGI
that protects my best interest and works for me
and represents me and doesn't work for some other company
and doesn't try to manipulate or harvest me or sell me something
or use me for some sort of purpose
or influence me.
Like if an AI is trained as a lawyer, let's say,
I want to be able to trust my AI as that lawyer.
I want it to work for me on behalf of me
to protect my civil liberties in a court case.
I could tell it confidential information.
It's not going to rat me out, you know?
And every individual, I think,
actually needs that protection in a world
where everybody, large companies,
are going to have all of these AIs.
Like, that's how you actually make it democratizing.
And I'm not sure that the path that ChatGPT and Google have for us
is going to end up with a self-sovereign AI that sort of works on behalf of the user.
Yeah, it doesn't get there because it's not profitable enough.
It's not their business model fundamentally.
And the reason why at Proton, you know, we transformed the business into having a nonprofit
as its biggest shareholder is because that is the way in
which you resolve this conflict. So it's kind of funny, you know, because you can sort of see
intentions from direction of travel. Sam Altman has spent the last three years trying as hard as
possible to stop being a nonprofit, whereas Proton went in the other direction and actually went
from a for-profit company that could have stayed for-profit into something that is
primarily owned by a nonprofit. And I think that speaks a lot about intentions, of course, but you need to
have that structure. That is the essential structure you need to have in order to ensure that you can
carry out sort of the vision that you spoke of in the long run.
And what a foundation structure really mandates is a legal structure
that obliges you to put society's interests above financial self-interest.
And this is the basis of a nonprofit foundation under Swiss law.
And that's a key innovation, I think, from a business model standpoint,
because it gives you the flexibility and the freedom, actually,
to make the decision that is good for the customer,
but maybe not always best for the bottom line.
Let's talk about that.
So in crypto, we're actually very familiar with foundations.
And a notable foundation
that comes to mind is the Ethereum Foundation.
And I think they might actually be registered in Switzerland as well.
They're all here.
They're all here.
They're all here.
Okay, okay.
And so just like you guys, I guess maybe let's talk about the pros and cons of that.
So the foundation model has many of the benefits that you said,
but sometimes the disadvantages can be that they get stuck in bureaucracies.
They can't move as fast as companies.
They're not as well funded.
They're not as aggressive.
They can't get maybe the talent from outside that they need.
And so when you talk about Lumo being sort of a self-sovereign AI that works on behalf of its
user, I want Lumo to be very well funded because I want it to be a good product.
I want it to be a better product than ChatGPT,
and in order for it to be a better product,
you need the resources to hire the talent
to make it a better product.
And you also need the business model that supports it, right?
The thing with AI is inference costs are high,
the compute costs are high.
And I can see where a proton mail kind of works
on a subscription-based model
because that's relatively static
and it's storage, whatever, costs are low.
When you get to AI, I mean,
does a $20 a month subscription even pay for AI
when I'm going to be cranking on the thing
and boosting up your server costs.
So can you talk about the model?
Yes.
And then how that fits in the foundation
to make something like this sustainable.
So, maybe an unpopular opinion among your audience,
but I would actually push back a bit on the comparison
with the crypto foundations,
because if we're being, let's say, completely frank,
a lot of the crypto foundations
that were created in Switzerland,
and a lot of them were created in 2017,
around sort of the ICO era, as we call it.
Yes. I mean, let's be honest, most of those are scams, right?
Many of them were. The vast majority. And these foundations were not here for social benefit, right?
These foundations were created because it was a convenient legal structure to legally launder large amounts of money received from unwitting investors who 99% of the time were defrauded at the end of the day.
So I actually feel it was a bit of a mistake for Switzerland to cater to this business,
but it was early days.
They didn't realize a lot of those things happened.
And some of these foundations are still around.
They're still around, having cashed out tons of money to the founding teams.
And then what happened?
They went off and they did things that ultimately didn't have utility.
Ethereum maybe is kind of one of the rare exceptions where I suppose some value was created.
but a lot of these promised, you know,
we're going to have this new blockchain
to do this thing that will transform the world,
and then it's vaporware. It hasn't shown up,
and they've just been sitting around collecting salaries year after year,
not shipping anything, and these networks are dead.
Proton, you guys had your opportunity to launch a coin in 2017
and you didn't, right?
We did, we did.
The problem is we had all the bankers and lawyers in Zug.
We're in Geneva, so we're not in the ecosystem,
but Switzerland, same country, right?
They all showed up and
said, oh, you know, we're going to do this great ICO for you.
You're not going to have to give up any equity.
You're going to create a new, you know, token or coin or a blockchain.
And you had to resist this.
And then, we promise you at least a hundred million, you know.
And some of these things are raising billions.
You have a big brand.
You can do all this.
And it's like, well, this is great.
Actually, I went to Zug, in fact, because I was intrigued.
And it's like, if someone offers you a billion, you got to go and hear them out, right?
Yes.
So I went there, and I brought it up with the board.
And of course, it was tempting.
And they were signing entrepreneurs up left and right every single day.
But I realize, like, if I go down this path,
what you're essentially asking me to do is defraud the customers in the community
that has put great trust in the business for essentially personal gain.
And it was something that was simply incompatible with what Proton stands for.
It's something that is incompatible with my values.
I could not personally do it.
So we refused.
And then people looked at us like we were idiots.
Like, okay, why are these people so stupid?
People were raising 100 million left and right, and these guys are going to sit here and struggle in Geneva,
barely having any money to get by and try to grow the business.
Like, we look like idiots, right?
But in retrospect, I think it was the right thing to do because it just wasn't correct.
It was simply a fraud.
And we didn't want to be involved in that at all.
And now, what makes, I think, the Proton foundation structure different?
So first of all, it wasn't created to defraud
people. This is the number one thing. And the assets of the foundation didn't come from outside
investors or users who put money in. That wasn't what we did, right? But you're correct. It has to be
sustainable. This is the key thing. A foundation technically doesn't have a profit interest,
but if all your revenue is coming from a couple of donors, guess what? You're in the pocket of
those donors. You work for those donors. And this is why Proton has what I call a hybrid
structure. It's not purely a foundation. Actually, it's a foundation as the
biggest shareholder of a for-profit company, because profit still needs to be there. But this is
sort of a self-reinforcing setup, in that the two sort of reinforce each other. And the way it works
is the company does not have to take decisions that are bad for users because it's not going to
come under pressure from its primary shareholder, a nonprofit, to do that because a nonprofit actually
cannot do that. And if the company were to go out, and let's say tomorrow we decided to go
into a surveillance capitalism business model,
well, the foundation as the controlling shareholder is just going to block that,
and that's the end of that, right?
It's never going to happen.
So the foundation actually gives the company the freedom to do the right thing.
But the foundation has, as its asset, actually a big shareholding in a highly profitable
business.
And that means the foundation doesn't need to go out, you know, cap in hand, to beg for money
to fund this stuff.
It can collect the money from the company to operate on its own.
And that means the foundation is actually independent, completely independent
of any outside force.
And so I think it's sort of a novel structure.
If it was just a company, it wouldn't work.
If it was just a foundation, it also wouldn't work.
But the combination of the two of them together with this shareholding structure,
this gives us a solution that is, you know, I'd say, self-reinforcing.
And that is what we did that was really kind of innovative from a business, you know,
set up standpoint.
It hasn't actually been done before.
There were really, I think, very few examples of this.
So I'm under some pressure, because if we fail, then we've shown the world that this model
doesn't work. But if we succeed, hopefully others will say, hey, you know, I can do business
differently. This worked for Proton; it could work for us. And I think this is the future of how you get
to responsible capitalism. That's fantastic. And I do wish Sam Altman listens to this episode,
because that would be instructive in his organization of OpenAI, although that ship has probably
sailed. That ship has sailed for them, yes. Yeah, okay, you said highly profitable. So you're
indicating that Proton is highly profitable right now. But specifically when it comes to the Lumo
model, so in kind of the AI feature set, that seems somewhat different than some of the other
subscription offerings at Proton in that it has some higher variable cost when I'm pounding inference on
your model. How is that priced? And how do you expect to make that sustainable?
Proton has from the beginning always been involved in unsustainable businesses that we're able
to continue because we have, you know, stakeholders who are not purely profit-driven. And I'll
give you kind of an example of this. You know, we today run one of the world's largest
free VPN services. And unlike most free VPNs, which monetize your data and abuse your privacy
to make money, our free VPN is in fact not monetized. It doesn't log, doesn't track, doesn't even
have a bandwidth limit, right? You can use it as much as you want. There's no limit.
But not only that, we also spend millions of euros every year on R&D
to ensure this VPN works in Russia and works in Iran.
And by the way, these are two countries, which are under sanctions.
So even if these users wanted to upgrade and pay us, they can't, because there's no PayPal,
there are no credit cards, you know, other than cryptocurrency.
Yeah, they're locked out of the bank.
Yeah, they could pay with crypto, I guess.
But that's kind of, let's say, a small percentage of the population, right?
So we're actively investing millions to be present in a market where there is no prospect for monetization,
because legally it would be illegal for them to pay us.
So that's clearly a money-losing business.
And Lumo, well, it's a bit early to say with Lumo, so it may yet turn a profit at some point.
But what I guess the point is we don't need to turn a profit on all of our businesses
because we are not under pressure to do that.
If I was backed by VCs,
they would have told me years ago,
you know, kill off your Russian business.
This is stupid.
This is a waste of money.
I get that, yeah.
And you saw all the Western companies
simply left Russia when they couldn't collect money anymore
because that was the prudent business thing to do.
But our structure allows us to engage in activities
in business lines that are not profitable,
have maybe very little prospect of being profitable,
but are aligned with our mission and are good for the world.
And this is why I do want people to pay for Lumo.
If you're a happy Lumo user, please pay us, right?
Even though you're not obliged to do it, because we also have a free version of it.
It's a subscription-based version.
Yeah, there's a paid version.
But let's say the free version is pretty good, like all our products.
They're probably too good, to be honest, right?
But that is our mission.
Our core mission is to make privacy accessible.
I don't want privacy to be a luxury good.
This is maybe Apple's model, right?
Although they don't believe in privacy; they use privacy for business marketing.
I think privacy is a fundamental human right that needs to be available for everybody.
It can't be a luxury good.
And I want it to be open for anybody that needs it.
And that comes at a cost, but it's a cost that Proton is willing to pay.
And we are very fortunate to be positioned that we're able to pay that
because we do have other paying customers who make us profitable.
And they are essentially subsidizing the other parts of the business that today do
not make money and cannot make money.
Just for people listening to this, because I think they might be intrigued by the
Proton ecosystem.
What is the full suite of everything you guys offer? Just like, you know, line-iteming
kind of the products.
It used to be so easy.
I used to say we do email, we do VPN, and then that's it.
Yeah.
And that's when I last checked on Proton. The last time was probably, like, you know,
four or five years ago.
And, you know, like, I had just basic email.
And then I went back this year and I was like, okay, I'm going to refocus on my privacy.
and I was amazed by the slew of services that you now have
and actually how much better everything has gotten.
I mean, Proton truly is just as good as Gmail,
and I don't notice a big difference.
So you've made some big strides on the product side.
But what's everything you offer?
Well, it's actually been incredibly difficult,
because you have to keep making the existing products better and better.
But then, people today don't really think about tech as products.
Tech is actually ecosystems.
Products don't exist on their own anymore.
That's right. So you must build the rest of the ecosystem out.
So we have, of course, email.
Then we built calendar.
There's also the VPN service.
There's Proton Drive.
And Proton Drive itself contains smaller products.
So there's Proton Drive.
There's a photos capability on Drive.
There's also Proton Docs, which is like...
It's like the whole G Suite, basically.
You're replicating it, only
it's private, encrypted.
Exactly.
There's Proton Sheets, which is the Excel equivalent or the Google Sheets equivalent.
But then there's also...
more things as well. There's also the password manager, and I think it's honestly the best free
password manager, because again, we're not so concerned with monetization. And then when we built the
password manager, we had demand for a two-factor authentication app that was actually open and secure. So there's
actually a Proton Authenticator. Really? Rather than Google Authenticator, you have the Proton Authenticator?
And there's also even a Bitcoin wallet, Proton Wallet, as well. Yes, yes. Right. And then there's
Lumo, which is our privacy-focused AI. And something that is in beta, but not yet released, is actually
Proton Meet, which is sort of a Zoom competitor that is end-to-end encrypted.
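For context on the authenticator app mentioned above: apps in this category generally implement TOTP (RFC 6238), a code derived from a shared secret and the current 30-second time window. The sketch below is a minimal stdlib illustration of the algorithm, not Proton Authenticator's actual code, checked against the RFC's published test vector.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, period=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    # Decode the base32 shared secret (pad to a multiple of 8 chars).
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    counter = int((t if t is not None else time.time()) // period)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890"
# (base32: GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ) at T=59s gives "94287082".
assert totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8) == "94287082"
```

Because the code depends only on the shared secret and the clock, an open-source app can generate it entirely offline, which is the "open and secure" property being described.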
I love this.
Yeah.
You guys are, it's fantastic.
It's fantastic to see the growth here.
Well, I'm glad people appreciate it because it is not easy to do so many things at the
same time.
And I don't want to do them for the sake of doing them, right?
I want to do them and do it well.
I know I may not be able to do it perfectly well in year one or year two, but I do eventually
want all of these products to be best in class.
And that takes probably a decade in general, because that's how long it takes to mature
a product, but we're committing on each product that we release to actually make it
eventually best in class.
And it's a lot of complexity.
There's a lot of hard work.
A lot of people will tell you that as an organization gets bigger, you sort of have,
like, you know, more free time, you can relax a little bit, you can have a little bit less
stress.
It's simply not true at Proton, because the complexity continues to increase as you get
bigger and bigger.
But I'm excited to do it because it's something that's exciting for me and,
I think, for the team. But it's also something that we owe the community. And this has to go back
to the history of Proton. Many people sort of forget about this, but Proton started through a crowdfunding
campaign. It was, you know, people, ordinary citizens and users, many of them, in fact, from the
crypto and Bitcoin space who took their hard-earned money and made a crazy bet on, you know, a PhD student,
a bunch of PhD students. Think about a very wholesome ICO. That's a very wholesome ICO. That's
what this was, except there was no token, right?
Yeah, there was no token, but actually it
was a crazy bet because it wasn't even a
speculative investment, right? They didn't
get equity. Product.
What they got was a
promise that when we eventually
built a product, if we built a product,
they would have a credit to use
to subscribe to the product that
at the point they gave the money still
didn't exist yet
and was being built by people who had no
track record of building a product
with any success.
So I'm very very thankful and very grateful to those initial users who took that crazy leap
of faith in us.
And I think we need to keep working for those customers because they have put their
faith in us and we need to show hopefully eventually that we have earned that trust they
put in us.
And that's where we keep pushing.
And that is, I think, a very strong motivation to also keep going.
and also because I think we are doing the right thing.
I think our vision of what the internet should be is the correct one.
And I believe most people out there, if you ask them,
do you believe more in, you know, Sam Altman's vision of the future,
or Google's vision of the future, or Proton's vision of the future?
They do align with our, you know, vision.
So it's encouraging to know that, you know,
I think most people out there would support and back what we're doing.
And that really keeps us going.
And that keeps us driven even after all this time to keep going as fast as possible
to build more and more things and, you know, lose more sleep because we're, you know,
doing too many things at the same time.
One thing you didn't mention, and I don't want to add more pressure, product pressure, to
you, but here's another: peer-to-peer chat.
So, you know, kind of, I think about tools I use every day, Telegram, for instance,
Discord, for instance, I noticed X added a new chat feature that is encrypted, air quotes.
Yeah.
Maybe just give me a rundown, because I know with something like Gmail,
if somebody's using Gmail, Google can actually read your email, right?
It's the same sort of thing we talked about with Gemini and ChatGPT.
Yes, exactly the same. Exactly the same.
Now, let's talk about chat.
So Discord, Telegram, WhatsApp, Signal, the new X-Chat feature, what can the company see?
What's subpoenaable?
What's private?
What's not in the status quo?
And then also Proton chat, anytime?
Well, I'll give you the rundown of all the ones you talked about.
So Discord: no encryption, everything visible, everything discoverable, subpoenaable, fully open.
Okay.
Telegram.
Okay.
I might get in trouble for saying this.
Actually, I won't get in trouble if I'm saying this.
Yeah.
I'm not afraid of people.
It's advertised as encrypted, but it's not encrypted, not by default.
Right.
So 99% of Telegram is simply not encrypted, right?
And defaults matter.
It's unencrypted by default.
You have to go do a separate
setting and create new specific, like, chat rooms with encryption on.
Yes. Yes.
And I've never been involved in one of these chat rooms, because, like, of the hundreds
of Telegram chat rooms I'm in, it's all the default, which is not encrypted.
Yeah. And actually there are some people who say, you know, Telegram is like an intelligence
op of the Russian secret services or something like that, because it looks like a
honeypot, right? Now, look, I'm not going to speculate. But I would
say a lot of people use it believing there's encryption, but in actuality
there isn't. So, yeah, let's call a spade a spade, right? Signal, actually: very well
encrypted, you know, everywhere, but encrypted to an extent that there's probably some
significant usability tradeoffs. It's maybe not the best for group collaboration, you know,
group chat histories don't really appear as you expect, right? Because of, I mean, there's valid
encryption reasons for that. And WhatsApp, actually, is also encrypted in many ways. The communities are
not, but at least the DMs and small group chats are encrypted. But it's owned by Meta, which is probably
going to do everything they can with your metadata to try to mess with you and make money off of you
somehow. So unfortunately, that's kind of not so great. There is actually a gap in the marketplace, because
all these solutions are sort of imperfect in some way.
I don't think it's possible to build a perfect solution either
because there's always compromises.
But I do think, when it comes to, like, Telegram and Discord,
I assume that's probably where you spend a lot of your time, right?
Yeah.
I think there's a gap there.
And I think someone can do a better job.
Will it be proton that does a better job?
Well, we got a lot of things spinning already right now.
And users would probably kill us if we went off and did even more things
before we improved some other stuff.
So I would say not immediately, but yeah, we work for the customer.
We work for the user.
If users tell us, and you're a user as well, if you tell us
Proton should do it, and enough people say it, well, guess what?
If you work for the customer, you're obliged to listen to the customer.
And that's how we decide how we build.
If enough people say they want it, then actually we do it.
Let's talk about some other pieces of the digital stack,
maybe where you don't have product ambitions, but honestly,
I'm just looking for advice here because I don't know what I don't know.
So let's talk about browsing.
and search. So there's the search engine, and then there's also the browser that I use. So something like
Chrome versus maybe a Firefox versus maybe a Brave. It was interesting, Andy. So Gemini actually
added a feature inside of Chrome such that, when you turn it on, Gemini can look at all of your
tabs, all of your open tabs, and actually allow you to ask questions based on what you're
reading. This is like a power user type feature, right? So how brilliant would it be if I could be on a
web page and I have a question to Gemini about something I'm reading or some graph that I don't
quite understand? I say, hey, Gemini, like, what's this mean? Tell me about it. Give me the
history of it. Like, it's interactive like that. The tradeoff, though, is, I guess, Gemini would be able to
incorporate every single tab that I see and everything I'm looking at and incorporate that into
its model of me in the world.
Anyway, browsers.
What do you recommend here?
What's good?
What's not?
Yes.
Well, what you see now is all the AI companies building browsers.
Yes.
And that's not, like, an accident.
They didn't just, you know, do it for fun.
They did it because they realized that if they integrate into your browsing, they
can combine your chat history with all your browsing activity.
It's, like, an exponential increase in the data that they have.
So just like, you know, Chrome is
going to integrate Gemini, ChatGPT also wants you to be on a ChatGPT browser because it's
just a more effective way to suck up your data. And so I do think the choice of browser is very
important. And again, here we kind of live in a sort of imperfect world. Chrome is, well,
if we're being honest here again, and this is not because I dislike the other browser options
out there, but Chrome is the most performant browser. It's the most stable
one. It works for the most websites. Now, that's true also because they're sort of anti-competitive,
right? They also do things to make certain sites not work as well on Firefox, to really cripple Firefox.
So, yeah, they didn't play fair with Firefox, I would say. And Mozilla, being fully funded by
Google had probably limited recourse against that. But in terms of overall reliability
performance, from is unfortunately, number one. That's just the way it is.
Firefox has some performance issues, but honestly, they've closed the gap.
It's quite good now.
The interesting thing about Firefox, of course, is they've been involved in some
controversy recently, because they've started getting into advertising
and starting to get into AI.
They removed some of the promises that they made to customers to shift
towards a more commercial model.
And that's raised some concerns in the community.
Brave is, I think, a good option, but Brave has always sort of had
their Basic Attention Token and their crypto add-on,
which I suppose the crypto community likes,
but people who don't want to be involved in that
maybe don't like it.
So I'm going to give you sort of an outside choice here,
one that doesn't come up very often,
even though it's been around for a long time.
I'm currently liking Vivaldi.
That's an odd choice.
Okay.
Why? It's Chromium-based.
Okay.
It isn't doing some of the crypto stuff that Brave is doing
that some people maybe aren't the biggest fans of.
And it's open source.
And it actually works pretty well.
But the space is constantly evolving, right?
So, you know, if you ask me this question a year from now,
maybe my answer is different.
But this is the one that I have,
that I think is actually a solid option among browsers.
Let's talk about our phones.
So one of the reasons I like my iPhone
is because Apple places a priority on privacy.
But I think you're going to tell me,
some of that is smoke and mirrors and propaganda.
And maybe not as true as I hope it is.
But when it comes to a choice of a phone stack,
what are like the pros and cons?
What should privacy conscious consumers and digital natives be wary of here?
Yeah.
Well, the first thing I might say is if somebody is spending billions of dollars
putting up giant billboards, you know, saying privacy,
you probably should be a little bit suspicious
of why they need to spend so much money
to convince you that it's private
if it's actually private, right?
Actually, you know,
Apple has the same definition of privacy
as probably OpenAI,
where, and it's kind of funny,
every single company has its own definition of privacy,
and what they're really trying to do
is redefine privacy.
And I can give you some example.
Google, if you go to their webpage
or to any of their product pages:
privacy, encryption, security.
It's all over the place, right?
And I call it privacy washing.
But what is Google's actual definition of privacy?
The definition is: we're going to give you more options over how we abuse your data.
Privacy for them is about all the different options they give you.
That's the Google definition.
It's so cynical, but I think it's true.
Right, right?
And then Apple's definition is we're going to be the only ones who are allowed to abuse your data.
No one else is allowed, just us, right?
So third-party cookies, all this other stuff.
No, just us.
And Apple has a giant ad business.
They do lots of advertising, and they're putting it into their products.
It's a $30 billion business today, the Apple advertising business.
And they do a bunch of things that just run counter to their privacy claims.
So I'll give you a quick example: App Store fees.
If you charge apps that take subscriptions on mobile 30% of revenue,
what you're basically doing is incentivizing a surveillance-capitalism business model,
because a free app, like Facebook, pays zero.
Oh, no, actually, they pay $99 per year,
which is the developer fee, but that's it.
So if you charge Proton 30%
and you charge Facebook $99 a year,
you clearly don't care about privacy,
because you are essentially making that privacy business model
a lot harder to sustain.
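Andy's incentive argument can be made concrete with back-of-the-envelope numbers. A minimal sketch, where the 30% commission and the $99/year developer fee are the figures he cites, while the subscriber count and price are purely hypothetical:

```python
# Illustrative comparison of what each business model pays the platform.
# Assumptions: a subscription app with 100,000 subscribers at $5/month
# (hypothetical numbers, except the 30% cut and $99/yr developer fee).
subscribers = 100_000
price_per_month = 5.0
annual_revenue = subscribers * price_per_month * 12  # $6,000,000/yr

subscription_app_cost = annual_revenue * 0.30 + 99   # 30% commission + dev fee
ad_funded_app_cost = 99                              # free app: dev fee only

print(f"Subscription (privacy) model pays the platform: ${subscription_app_cost:,.0f}/yr")
print(f"Ad-funded (surveillance) model pays the platform: ${ad_funded_app_cost:,.0f}/yr")
```

Under these assumed numbers, the subscription-funded app pays roughly $1.8 million a year while the ad-funded app pays $99, which is the asymmetry Andy is describing.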
And so Apple has a bunch of things
that clearly contradict
their privacy advertising.
So in the end, they don't care about privacy
as much as all the ads say; they care about money.
And this is very clear when you look into it.
Unfortunately, today, mobile is a monopoly of two players.
Every single mobile phone is either Android or it's iOS.
And I think this is one of the hardest monopolies to break
because it cannot be broken with less than, I would say, probably,
you know, anywhere from $5 to $10 billion.
Why?
because the device manufacturers themselves are also complicit in maintaining the monopoly.
These device manufacturers are paid by these big tech companies to pre-install certain applications
and certain software as a condition for getting access to Android, for example.
And this is the whole thing behind the Epic lawsuit against Google, right,
with these deals that were being made.
And so I feel the only way we actually resolve this is,
and I'm not, let's say, a very pro-government person in general, but in a monopoly situation,
you've got to have regulators come in and say: this is a monopoly, and here are certain things
that you cannot do because you are a monopoly. It's the only way, because it's gone too far now.
There are only two left in the whole mobile space. There used to be BlackBerry, there used to be
Nokia, these other options. But now there's literally just two.
I guess, Andy, tell me, what areas is Apple really breaching privacy on?
So if my data is, let's say, in iCloud, they say it's encrypted.
Is that all fully encrypted?
Or does Apple have any access to data on my phone?
And how well does Face ID work? Can third parties really crack the encryption that's on my iPhone?
I mean, there was some case back in the day.
I think it was the FBI
trying to get into an iPhone.
I think Apple made much propaganda about this:
the FBI couldn't crack our phones.
We held firm.
But you know how that story ended?
No.
That story ended because the FBI dropped the case,
because they found a way to crack it
without Apple's cooperation, right?
With their own three-letter-agency stuff,
they could figure it out.
So they basically said at the end,
never mind, it's okay.
You can have the court win.
We found a way in.
So don't worry about it.
Great.
And that's why I say:
Look, I think among the big tech companies,
Apple definitely is the best from a privacy standpoint.
They do have a different business model of selling hardware,
which allows them to do that.
But Apple is a company that cares first and foremost about profit.
Other principles kind of fall by the wayside after profit.
If you look at what they've done from a competition standpoint,
they were referred for criminal prosecution by a court
in California for how blatantly they breached the court ruling that asked them to play fair
with Epic and other developers. It's clear that this is a company that only cares about money.
And every single time a court asks them to try to open up their ecosystem to allow
other privacy players like Proton to have a chance to succeed,
they essentially engage in malicious compliance. If you go to a Wikipedia article on malicious
compliance, some of the examples are Apple, right?
And so it's a company that I think, yes, is more private than Google, but it doesn't really have a moral compass in many senses.
It doesn't behave in a very ethical way.
And you see this in the way they act in every single case, right?
Like, the European Union said, look, you need to stop being abusive towards developers and you need to open up your app ecosystem.
And to give a quick example: if you're in certain countries, you can't get Proton.
You can't get Proton because the App Store is the only way to get Proton,
and Apple has decided to comply with a dictator in some country in removing certain apps,
because that is more profitable for them than trying to fight it or pulling out of the market, right?
So there are all these different examples where I just think Apple has strayed very far from what it originally stood for.
It's almost been contaminated or infected by this relentless
pursuit of money and only money above all else.
It's lost some of its fundamental values.
So I do think it's more private than Google, but I find it very hard to put my trust in a
company that has engaged in all sorts of behavior that, if you look at it from the outside, is really
quite despicable.
Mantle has launched a global hackathon until the end of 2025.
The focus is on building the future of real-world assets. From now until December 31st, Mantle is inviting developers,
founders, and innovators around the world to design and launch new real-world asset and
DeFi products on Mantle. The reason to build here is simple. Mantle is not just another
blockchain. It is an ecosystem built for builders who want real distribution and real users.
Projects on Mantle have access to tap directly into Bybit, one of the largest exchanges globally,
giving teams exposure to more than 70 million verified users and potential listings through
Bybit Launchpad and Launchpool. The Mantle ecosystem is backed by a $4 billion
treasury that supports growth with grants, liquidity, and venture investment. And all of it
runs on a modular Ethereum layer two stack that delivers high performance, low fees, and
full EVM compatibility. The hackathon features $150,000 in prizes, plus grants, incubation,
and direct access to top VCs across six tracks, including real-world assets,
DeFi, AI, ZK, infrastructure, and gaming. If you're ready to build where real-world finance meets
on-chain innovation, join the Mantle Global Hackathon at mantlenetwork.io/hackathon or click the link in the show notes for more
information.
Crypto is risky.
Your sleep shouldn't be.
Eight Sleep's mission is simple.
Better sleep through cutting edge technology.
Their new Pod 5 is a smart mattress cover that fits on the top of your bed.
It automatically adjusts the temperature on each side so you and your partner can both sleep
the way that you like.
It's clinically proven to give you up to one extra hour of quality sleep per night.
Eight Sleep's Pod 5 uses AI to learn your sleep patterns, regulate temperature, reduce
snoring, and track key health metrics like HRV and breathing.
With a new full-body temperature-regulating blanket and built-in speaker, it is the most complete
sleep system. Upgrade your sleep and recovery with Eight Sleep.
Use code BANKLESS at 8sleep.com/bankless to get up to $400 off the Pod 5 Ultra during
their holiday sale.
That's 8sleep.com/bankless.
You also get 30 days to try it risk-free.
Link in the show notes for more information.
The question is where do we put our trust?
I will tell you the crypto consensus on this is you've got to trust the cryptography,
and that's the only thing you can trust, basically.
And maybe this brings in the conversation of governments, right?
So we've been talking about a big surveillance capitalism and some of the corporations.
And there are ways where our governments are a check on that power.
And they're democratic institutions, at least the democratic republics that we have are.
And so they should represent the people and they should be there to support civil liberties.
But we don't always find that that's the case with respect to privacy and
encryption. And maybe I'm much more familiar with some of the battles that are going on in the
United States. And we've had battles with cryptocurrency and financial surveillance and all of these
things. And we could talk about that. But there are a couple that have popped up on my radar
that I want to find out from you a bit more about. Maybe you would know which ones we should
focus on. But the EU chat control legislation keeps popping up. And I think, Andy, maybe you could
describe this. But does this not give EU countries the ability, basically,
to pre-check communication inside of a chat and just make sure it's not child pornography
or any illicit behavior, insert whatever the government thinks is illicit, illegal here.
And essentially, doesn't it break cryptography?
Can you talk about that legislation and anything other that you're seeing coming from
world governments?
That's pretty alarming.
Yeah.
But actually, before I jump on that, I want to go back to the point you made about
who we trust, and the crypto point that we should trust in the cryptography.
Actually, sorry to break it to you,
but crypto is written by people,
and people manage these crypto systems
and the infrastructure on which your crypto runs.
So I don't think you can say
just trust in the crypto.
At some point, you also need to trust the people as well.
And so the people are important.
You can do the crypto correctly,
but if the person behind it is a scammer, well,
to go back to the crypto scam examples
in the ICO era:
those were open-source projects.
The crypto was probably correct
in many cases,
but the people were scammers.
And unfortunately,
the crypto was correct,
but the guy that ran it was a scammer.
What can you do?
So as much as we try to remove the people from it,
there's ultimately an element
of trusting a person,
trusting a team.
And when I look at the services I trust,
I also look very closely at
who runs it, what they stand for, what they've said, and what their values might be.
And by the way, the first kind of red flag is if you can't even find out who they are, right?
Because if they're not very visible, that's like, okay, why are they not discoverable?
What are they trying to hide from, right?
Interesting, interesting.
Yeah, so I think the people are very important.
Now, going to EU chat control: chat control is not a new concept.
The file has been open, debated, and kicked back and forth at the
European Commission for close to three years now.
And what they want to do, well, I'll tell you what the pitch is, and I'll
tell you why the pitch is bullshit, right?
The pitch is basically: we need to be able to prevent child pornography and
terrorism.
And the best way to do that is that every single message on an encrypted app, before it is sent,
is going to be scanned and sent to a government database, where it can be compared against
known bad things.
And then if it's a match,
your phone needs to
phone home to the government
and report you to the police,
and the police will look at your stuff
and maybe come and get you.
That's really what it is.
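The scheme being described is usually called client-side scanning, and it can be sketched roughly as follows. This is an illustrative simplification, not any actual proposal's specification: real proposals contemplate perceptual hashes of media rather than exact digests, and the blocklist contents here are made up.

```python
import hashlib

# Hypothetical database of hashes of "known bad" content distributed
# to every device (real proposals would use perceptual media hashes,
# not exact SHA-256 digests of text).
BLOCKLIST = {
    hashlib.sha256(b"example-known-bad-content").hexdigest(),
}

def scan_before_send(message: bytes) -> bool:
    """Client-side scan: runs on the device *before* encryption.
    Returns True if the message may be sent, False if it matches
    the blocklist (where, in the scheme described, the device would
    'phone home' and report the user)."""
    digest = hashlib.sha256(message).hexdigest()
    return digest not in BLOCKLIST
```

The key point Andy is making is visible in the structure: the check happens before any encryption is applied, so the message is inspected in plaintext on the device, and "end-to-end encrypted" no longer means only the recipient can read it.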
And by the way, would this include email?
It obviously covers chat conversations.
By the way, chat control would force you to do that.
What is kind of ironic, and again,
I may sound anti-Apple here,
which I'm honestly not,
because I try to be objective in general,
and I would say they're better than Google,
but chat control is basically doing what Apple volunteered to do a couple of years ago.
I remember this.
And then they got a massive backlash for doing it.
But actually, in some sense, this was almost an Apple invention, right?
Apple was the one that kind of said, hey, we're happy to do it voluntarily.
We will voluntarily scan your shit and then call the police on you.
To protect the kids, of course.
That was the reason.
And I thought that was a giant breach of privacy.
I don't know why Apple proposed it, but Apple made this proposal voluntarily already several years ago,
and then they pulled back from it.
But that's what chat control in effect is doing.
Now, fortunately, chat control keeps popping up and then getting killed.
So it's a zombie.
It keeps coming back alive.
But it's not being very successful.
And what happened recently was the Danish presidency of the Council of the European Union tried again
to introduce it and push it forward.
And predictably, it ran into opposition, because it's deeply unpopular
across Europe. In fact, someone even set up an email campaign to bombard people in the
European institutions over this, which was super effective. But in the end, the Danish presidency
removed the mandatory detection orders from the current text. So presently, it's actually not
possible for mandatory scanning to be enforced on tech companies, but it is possible for
tech companies to do it voluntarily. So Apple's crazy scheme, if they wanted to bring it back,
would definitely be possible, and they could do that under the new legislation,
but the mandatory part has been removed.
And that doesn't mean it won't come back; these things have a way of coming back every couple of years.
But I consider, knock on wood here, the chat control issue to actually be almost resolved in Europe,
and we're not going to have mandatory breaking of end-to-end encrypted communications for this purpose.
And that, I think, is a huge win for Europe. It's the outcome we were hoping for, and it seems like
we've finally gotten there. So I think it shows that human rights still survive today in Europe.
How do we harden this a bit more? Because, as you point out, these
types of issues are zombies that come back from the dead and keep haunting us. We're always
playing whack-a-mole. If it's not in one jurisdiction, it's in another. If it's not one
bill, it's another. If it's not one attempt at cracking encryption and breaching privacy,
it's another. Is there some way to enshrine these civil digital liberties in some sort
of a bill of rights? You know, the U.S. has kind of the Bill of Rights and these are things that are
baked into the Constitution. They don't really address privacy. And that's something you have to,
like, read into it. And furthermore, they don't really, they're not as adapted to the digital world.
For example, if I were to think about a modern-day digital bill of rights, the number one thing would be: you can't outlaw encryption.
Every single citizen should always have the ability to encrypt their data, and the government shall have no ability to breach that, interfere with that, or make it illegal.
Is that kind of what we need in order to have the zombies stay dead?
And is anybody working on that project?
Well, if you look closely at the law, this is enshrined in certain laws.
There are people that say encryption is speech,
and if that's the case, then the First Amendment does protect you.
And if you look at European law, mass surveillance is illegal
in Europe, because the European Court of Human Rights has interpreted some of the European statutes
as saying that.
So there is sort of a legal basis for a lot of this.
But I do think it has to be strengthened, because even if there's a legal basis,
you can always find creative ways to go around it.
And for data retention, the argument is always:
oh, we're just going to do it in certain situations
when there's a state of emergency, whatever, right?
But then you have countries like France,
which are perpetually in a state of emergency.
We've been in an emergency for 10 years, right?
And so that's bullshit, right?
So I do think we need to have new legislation
that protects us and enshrines these rights.
But legislation is done by legislators,
and most of the people in government today
are actually tech-illiterate.
They don't know anything about tech.
So I hate to say it,
but I think the solution is that some people need to die.
And let me qualify that statement, right?
Not kill them,
they need to die of old age.
Progress moves one funeral at a time.
That's sort of the idea.
Exactly.
Of natural causes, of course.
Yes.
Yes.
And then new people need to come in from the more tech-native generation,
who understand things better, understand these issues better,
and put in place proper legislation.
For example, if I were to go today to the European Parliament or even the Swiss parliament
and ask them to write new legislation for privacy, encryption, and security,
I probably wouldn't do it.
I wouldn't do it because I would be more worried about them screwing it up.
Yeah, about them screwing it up.
So maybe you just don't do it at all, because you could actually make it worse.
You know, the saying is they will kill us with their good intentions.
So, yeah, I think it needs time.
We need a new generation of legislators who understand this better, who are more tech-native,
and who you can have this conversation with.
But Andy, it's not just being tech-native. That is an essential and important component,
but you also have to have these kind of classical liberal values of enshrining ideals like privacy
as an individual right. Because somebody might push back and say: Andy, no, privacy is great,
this is fantastic, all we're asking for is for governments to have a special key that they can use
with court approval to unlock the data. And you don't
support child abusers or terrorism, do you, Andy? Can you talk about that type of
objection? Because that's a common objection, even from people who, quote unquote, understand the tech
might make. Yeah. And the answer I would give is: I've never seen a backdoor that only lets
the good guys in, because it doesn't exist. I wish it did, but the reality is it doesn't. And then the
question I would ask is: which government? Are you talking about the government that is
in power in your country today, or the one that could be in power five or ten
years from now. The point of civil liberties, fundamental human rights, the reason they're
called fundamental is because they're also here to protect us from the tyranny of the government.
And the government, even in a democratic society, is often just one election away from changing.
And I'm saying this here in Europe, but in the U.S., what I saw after the last election was that
half the country became terrified: now our civil liberties are being infringed.
But the election before that, the other half was terrified that theirs were going to be infringed, right?
And that's a perfect example.
You need to have these things in place so that no matter how the election goes, you are not terrified, because your fundamental rights are going to be protected.
So you're not doing it to protect just the present; you're doing it to protect against all possible futures.
And that's the forward-looking notion of why fundamental rights are required,
and why even if you don't feel an imminent threat from your government today,
you should still fight for this right.
And by the way, this is also not saying that we are going to allow criminality
to run rampant on the internet with no checks and balances whatsoever.
That's not what we're saying either.
We're saying it has to be proportional.
And the reason people object to mass surveillance,
and the reason people object to chat control, is
that it essentially says everybody is under surveillance by default,
even if they're not under criminal suspicion.
And that is undermining the fundamental presumption of innocence,
which is a cornerstone of democracy.
Because we as a society say that you're innocent until proven guilty.
If you're guilty until proven innocent, then actually that's fascism, right?
That's not democracy anymore.
And that's the value that we need to defend.
So it's fundamental to democracy.
Without this, you don't have democratic society that survives in the 21st century.
And this is the argument that I always give people who bring this up.
Because, yes, I hear this often, but it's simply not true when you think about it.
Well, here's another related objection that they might make.
And we certainly hear it in crypto with peer-to-peer transactions: bad guys use this stuff to do bad things.
And I'm sure that there are bad guys who have used Proton to hide their activity.
In fact, maybe that's the only place they can get it, or one of the few places they can get it.
And so they might say: Andy, just like cryptocurrency, what you're doing with privacy is, you might have good intentions, but you are empowering criminals and terrorists and child abusers and all of these things.
And you're equipping them with privacy technologies.
Over at Gmail, if they used those tools, a government subpoena could access their emails;
with Proton, governments don't have that ability.
So what you're doing is kind of a net bad.
What do you say to that?
Yeah.
Well, the funny thing is, this is something that can be proven with data.
If you look at the number of law enforcement requests that Proton gets,
and you look at the ratio of that compared to the number of users,
it's not worse than Google, in fact.
This is very surprising, but the data is clearly there; it's public.
So the notion that criminals strongly prefer platforms like Proton
is kind of a false one.
And the reason for this is:
let's say you're using Proton to send a bomb threat.
Well, actually, you don't care that your message is encrypted.
In fact, you want the other side to read it.
So, Andy, to steelman this a bit more: I might do this on a Proton VPN so that authorities couldn't track my traffic.
And I might use other privacy tools that allow me to do this kind of nefarious thing.
Yeah, yeah, so I agree.
But the first point is, the stats don't show that's the case, right?
But let's assume the stats were to show the opposite.
Yeah, let's assume that.
And actually, there were more criminal users on encrypted platforms.
Well, I'll give you another stat.
I'm willing to bet that the rate of cybercriminality in a fully surveilled society
is lower.
Pretty sure the rate of criminality
is going to be lower in China
and in North Korea
compared to the U.S. and Europe.
But you definitely pay a very high price for that.
And if you were to ask the people in North Korea,
do they feel more secure
because of the total surveillance
their society provides?
Well, they would tell you in public, yes.
But in private, when you can ask them
and actually give them real privacy
to express their true feelings,
they would probably say no.
And this is the core concept.
In a democratic society,
in a society that gives privacy
to its citizens,
there is always a negative externality.
There is a cost to that.
But it's a cost that we should be willing to bear
because the cost of the alternative,
a society without privacy,
is actually so much higher.
And that is all there is to it, right? We can never get to a world where we can prevent all crimes that occur online. But we don't
want to get to that world because that's actually a much worse world than the one that we live in
today. Well said. And I think our politicians also need to understand that aspect of it in addition
to understanding the technology to pass some good privacy legislation and regulation.
Andy, we've been talking mostly about the proton stack of tools, which is around what I would
call communication types of protocols. And so it's email or it's a chat back and forth with an
AI, that sort of thing. In the crypto world of things, and I mean cryptocurrency here, not general
cryptography. In the crypto world of things, we think a lot about financial privacy. And this is,
I would argue, a subset of communication, right? You're communicating economic value, basically.
But I'm wondering if you would go that far because there are some people in the privacy space and some jurisdictions that treat financial privacy as different from communication privacy.
So, for example, they might respect citizens' rights to communication privacy. But when it gets into financial privacy territory: we can't have that. You must have AML/KYC. We're not necessarily comfortable with the whole peer-to-peer thing. We need to know who the person sending the money is and who
the receiver is at all times. We need the ability to blacklist, whitelist, and pull the plug.
In the crypto community, we say no. I mean, financial transactions should be peer-to-peer.
And by the way, they can be and should be private. Some jurisdictions also don't like that aspect
of it. We've had many court cases around that. What's your take on this on financial privacy
specifically? It's a matter of perspective. Today we see a lot in the news about Venezuela,
and one of the countries with the highest Bitcoin adoption in the world, in fact, is Venezuela.
And there are three reasons for that.
One, massive inflation running out of control.
Two, the government suppressing all dissent, also through controlling financial institutions.
And three, just privacy.
You need to be able to move money in and out of the country
without being detected, because otherwise the government may tax it,
they may steal it, they may use it to target you;
a bunch of things can happen.
There is really no difference between freedom and financial freedom.
If you don't have financial freedom, I would argue you don't have actual freedom either.
And so I see it as kind of a similar concept.
And you need both.
So my view actually is we must have financial freedom.
This is something that we should fight for.
And it wasn't so long ago that we had this.
We had cash for many years.
And cash, well, maybe this is not popular with the crypto crowd,
but I think cash is one of the best privacy technologies out there.
It's actually extremely popular in the crypto crowd,
and we would 100% agree.
Yes, yes.
So banning crypto is a bit like saying,
I'm going to ban cash.
And that's the analogy that we give.
You would never ban cash.
It would be unacceptable.
So I think that's an argument that, you know,
we should advance in this space.
And we should advance it because it's the correct argument, actually.
I agree with that argument. And I'm curious your take on this. So you're definitely a privacy
advocate, many shared values with many people in crypto in general. What's your take on
crypto right now? I imagine that you probably, like me and many listeners, see some of the
benefits here. Also have seen some of the scams and the downsides here as well. But what's
your take on it as you look at crypto right now? I think the biggest challenge in the space,
and we're also in this space as well, in fact,
is that the ratio between the legitimate and the illegitimate-slash-scammy is wrong.
Yeah.
And at Proton, what we also guard very carefully is that ratio.
We know there will always be illicit uses of our platform,
but we need to keep that ratio as low as possible.
And in our case, we're talking a small fraction of a percent,
because above that, you actually get tainted in a way that is not conducive to the future success of the movement.
And in crypto today, if we're talking about illegitimate uses, scams, etc.,
it's not a fraction of 1%.
It's unfortunately probably, I don't know, 30, 40%.
It's substantial.
And I think crypto is always going to have a limit to its influence,
its growth,
its scalability,
and how mainstream it can be,
if we as a community do not tackle that problem.
I don't have the answer,
actually,
for the best way to address it.
That probably requires someone smarter than me
who has thought longer about this problem,
but I think we need to do it.
If we don't,
we can't move it into the mainstream.
And we need to do it in a way
that preserves our values as well.
How do you do it at Proton?
Is it a matter of attracting the good guys,
more good guys, because then the good guys
outweigh the bad guys if you're able to bring them to the platform?
It's attracting the good guys,
but it's also making it crystal clear
that we are not here to serve the bad guys.
And so it's as much positioning as it is a technology.
And of course, you do everything you can
to try to block abuse.
You try to find the suspicious user patterns,
things that don't look right.
Without breaking encryption.
Yes, without breaking encryption, you do what you can.
And also reacting quickly.
When I'm made aware of users who are using it for illicit purposes,
there's no tolerance.
They're gone.
I don't care if they're paying me or not paying me.
That's a breach of terms of service.
It's a breach of the law.
They're banned from the platform.
That's it.
And I do think if you look at crypto,
there are too many platforms that probably became aware of illicit activities
that were happening on their platform.
But these activities were profitable for them.
So they probably tolerated the activities for far too long.
And that made it sort of an environment that welcomed other illicit actors
because they felt safe within the space.
So it's about actually creating an environment that is hostile to actors
who are not going to be good for the long-term reputation
and long-term brand of the crypto space as a whole.
We need to, maybe we need to call out scammers for being scammers.
Instead of feting them at crypto conferences, maybe they should be blacklisted and not
allowed to occupy our public spaces and the public imagination.
I always say, like, some of the most famous figures in crypto all have convictions.
That is not a good thing.
Some of them say it's a badge of honor that we fought the system, but no, you're just
criminals. And that's not positive overall if you want crypto to become mainstream.
Are you planning to do more, Andy, with your crypto wallet in particular? I believe it's a Bitcoin
wallet now. There's obviously, you could expand that to, say, Ethereum, you could get into
decentralized finance. One of the interesting things about Bitcoin is there's no privacy
on Bitcoin. So there is pseudonymity, of course, but if you send from one Bitcoin address
to another, it's all on chain. There are various organizations that can kind of data mine
that and figure out who the underlying wallet identity is. So it's not truly private. It is peer-to-peer.
But in general, when you think about crypto products at Proton, what's the idea here?
We also believe very strongly in focus within the company. And I would want to be, let's say,
the world's best Bitcoin wallet before I go off and add other things. And the other things that
we add also would depend on what is the demand from the community, what people are looking
for, what the key things are. Bitcoin today actually is the most commonly used coin within our user
community. So it makes sense to support that. Everything else actually is a really, really far,
you know, distant second or third. So we think about our mission of best serving our community.
The best thing to do right now is actually to take Bitcoin and make it as good as possible
within our wallet because that is what the vast majority of people today on Proton are actively
using. And this could change over time. You know, blockchains, they come and go. So there's no saying
that something new could come out in a few years, become very big because it has better
qualities than Bitcoin. And then maybe a big proportion of our community starts to use, you know,
this new thing. At which point, we would also be obliged to adopt it because we're here to
serve the community ultimately. So that's kind of how I look at it. I look at what is the work that
brings the biggest benefit and biggest value to our user community,
and what they are asking for.
And it's a very simple community-driven decision-making.
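The earlier point about Bitcoin's pseudonymity, that separate addresses can be linked by data mining the chain, can be sketched with the well-known common-input-ownership heuristic: addresses spent together as inputs of one transaction are assumed to share an owner. This is a minimal illustration, not any firm's actual pipeline; the transactions and addresses below are invented.

```python
# Toy sketch of the common-input-ownership heuristic used in chain analysis:
# addresses that co-spend as inputs of a transaction get clustered together.

class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

def cluster_addresses(transactions):
    """Group input addresses that co-spend in any transaction."""
    uf = UnionFind()
    for tx in transactions:
        inputs = tx["inputs"]
        uf.find(inputs[0])            # register even single-input spends
        for addr in inputs[1:]:
            uf.union(inputs[0], addr)
    clusters = {}
    for addr in uf.parent:
        clusters.setdefault(uf.find(addr), set()).add(addr)
    return list(clusters.values())

# Hypothetical transactions: A and B co-spend, then B and C co-spend,
# so A, B, and C are inferred to belong to one wallet owner; D stays alone.
txs = [
    {"inputs": ["addr_A", "addr_B"], "outputs": ["addr_X"]},
    {"inputs": ["addr_B", "addr_C"], "outputs": ["addr_Y"]},
    {"inputs": ["addr_D"], "outputs": ["addr_Z"]},
]
print(cluster_addresses(txs))  # two clusters: {A, B, C} and {D}
```

Real chain-analysis firms layer many more heuristics (change-address detection, exchange tagging) on top, but this transitive linking is why reusing or co-spending addresses erodes Bitcoin's pseudonymity.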
So, Andy, let's get practical now as we maybe bring this to a close.
So somebody listening to all of this so far is just like,
I agree with the principle, I'm busy.
Like perfect privacy in today's day and age, it's impossible.
It's incredibly difficult, very time-consuming, too much mental overhead.
Why bother?
What do you say to that?
Is there anything like any advice you'd give to just a normal person for like three to five things maybe they can do to improve their privacy posture right now that aren't overly burdensome?
Well, people are lazy.
So I think three to five is already too much, right?
Maybe we leave them with one.
And the one that I would give is actually, a lot of people have asked me, you know, Proton always had a vision to do many things, a whole ecosystem.
Why did you start with email?
It intuitively makes no sense,
because email is sort of a dying medium
of communication that's...
Well, the demise of email has been predicted
continuously for 30 years and it's still around.
I'm still an email maxi myself,
but I get that people aren't.
Yes, and you know what, email
will still be here 30 years from now,
in fact, I predict,
because email is not a means of communication.
Email is actually identity.
It is your digital identity,
which I would argue, in the 21st century,
is the only identity
that matters.
And when you switch from Gmail to Proton Mail,
what are you actually doing?
You're not finding a new method to communicate.
Hardly anyone communicates on email anymore.
Even I don't communicate so much on email.
But making that switch is incredibly powerful
because Gmail isn't email, right?
Gmail is identity.
It's a login state.
It's your account.
It is the thing that allows Google to consolidate your data
from across the entire web,
all the sites you visit that run Google Analytics,
all the cookies that are dropped across the web,
all the files you upload,
all the communication you have,
everything you do across Google's services,
every video that you watch on YouTube.
It's all linked to your Google account.
And you know how you can prevent Google
from having that information?
Just log out.
So switching from Gmail to Proton Mail
is simply saying,
I'm going to erase my identity from Google.
I can still go on YouTube and watch videos and whatever, right?
But it is no longer having all the information on the internet
tied to a single profile of who I am,
and you have effectively opted out of the Google system
by moving your identity to a different provider who you trust more.
And thanks to GDPR, this is pretty easy now.
There's the Easy Switch functionality.
Google is required to let you export.
So you could go to a proton account.
You can link your Gmail account, move all your data over,
and it's just a few clicks.
And will Google truly delete your data?
Well, Google will let you transfer everything.
And they're also obliged under European law to delete your data as well.
How about U.S. law?
Would that be for U.S. listeners as well?
In the U.S., it is maybe not obliged in the same way,
but there are many state laws which do require it.
Okay.
So Google does do it as well in the U.S.
Because they have to.
So moving from Gmail to Proton Mail is sort of opting out of the Google ecosystem.
It's logging out of Google.
It's preventing them from having a profile.
It doesn't mean you can't use any Google services.
If you're not logged in, it's completely different.
You know, the amount of visibility they have on you is different.
So, and it's easy now.
It's a couple of clicks and then you're done.
So that's actually how you start.
Then, of course, there's all these other things you can do, right?
But that's the main one.
I think the main one is to protect your identity and separate your identity from a big tech ecosystem.
Yeah, and you can still have, actually, like I still have, for example, an old Yahoo account, right?
You know, when I'm at McDonald's and they ask me to, you know, sign up for the free Wi-Fi,
I'm not giving them my Proton address.
I'm going to give them that one.
That's how I cut the spam, right?
But there's things like this.
I think that that's one thing that you do
that probably makes a big, immediate impact.
And that's how we started with email in 2014,
which everybody said was already dead back then,
but it's not, because it's your identity.
I think you are absolutely right.
I think that is golden advice for all bankless listeners, actually.
So I've done this.
I haven't fully ported my Google information over,
but what's fantastic about Proton is you can also create different aliases.
Yes.
So if you go to McDonald's Wi-Fi, you could just spin up a different alias in Proton and use a fake alias. So it's not tied
to kind of your main alias. And you have any number of aliases that you could spin up. Also,
when I was setting up in Proton, just the emphasis on security. It was really important. I mean,
we have a lot of email accounts that get hacked in crypto so that people can go get your
identity, go get access to your exchange, go get your recovery password, right? If they don't know your
Proton email, then they can't get that. And also, you can two-factor authenticate
with things like passkeys and YubiKeys
and completely lock your email down,
and Proton kind of makes that easy to do.
You do it that way too.
So locking down your identity is pretty key.
I think that's great advice, Andy.
And there's something that we have called Proton Sentinel,
which is pretty unique to Proton,
but we do it because we have a lot of activists,
journalists, crypto-high profile people who use Proton.
And it's a way that even if your YubiKey is stolen
or your 2FA is stolen and your password is stolen,
if you enable this feature,
we will still sort of detect logins that we find suspicious and block them.
So it practically secures your account even in the case that you have been fully compromised.
And that's something that actually we added in because we got demand from crypto users on Proton
who said, hey, you know, this is happening a lot in our space. We need more security.
So we actually built that. It's called Proton Sentinel. And it's, yeah, you can read up on it.
It's kind of interesting as well.
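Proton has not published Sentinel's internals, but the idea described above, holding back a login that passes every credential check yet deviates from the account's history, can be sketched generically. The signals, weights, and threshold below are invented assumptions for illustration, not Proton's actual logic.

```python
# Hedged sketch of generic suspicious-login scoring (NOT Proton Sentinel's
# real, unpublished algorithm): even with a valid password and 2FA, a login
# is challenged when it deviates too far from the account's history.

from dataclasses import dataclass

@dataclass
class LoginAttempt:
    country: str
    device_id: str
    valid_password: bool
    valid_2fa: bool

def risk_score(attempt, history):
    """Score 0-100; higher is more suspicious. Weights are made up."""
    score = 0
    if attempt.country not in {h.country for h in history}:
        score += 40          # never seen this country before
    if attempt.device_id not in {h.device_id for h in history}:
        score += 30          # unknown device
    if not (attempt.valid_password and attempt.valid_2fa):
        score += 30          # failed a credential check
    return score

def decide(attempt, history, challenge_at=60):
    return "challenge" if risk_score(attempt, history) >= challenge_at else "allow"

history = [LoginAttempt("CH", "laptop-1", True, True)]

# Stolen credentials: password and 2FA both pass, but the new country plus
# the new device push the score to 70, so the login is challenged anyway.
stolen_creds = LoginAttempt("XX", "dev-9", True, True)
print(decide(stolen_creds, history))   # "challenge"
print(decide(history[0], history))     # "allow"
```

Production systems would add many more signals (IP reputation, timing, impossible travel), but the principle is the same: correct credentials alone are not sufficient proof of the account owner.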
That's fantastic. Yeah. I love the stack. And thank you so much for your time today.
And it's been great. I guess, as we close, one last question. It does seem like, I mean,
You've been doing this for 10 years, right?
And the internet has come a long way since then.
Now we have AI and everything that that will bring.
So paint a future of 2030, maybe one where privacy wins and we're doing okay.
Or another maybe a darker future where we'll kind of lose this fight and it continues
on the trajectory that it's been on.
And like, what do the two different worlds look like?
Well, I think losing the fight would be if big tech companies decided to engage in anti-competitive
practices, which regulators don't block, and they do that to wipe out companies like, you know,
Proton or the privacy companies. To give you kind of the very basic example, they could say,
we're not going to allow privacy companies on the app stores. And if they did that,
because they're not declared as monopolies right now, there's nothing that prevents them
from doing it. And then companies like Proton would not be able to exist in such an outcome.
So I think that is, you know, the risk is that big tech is so emboldened by the lack of regulation and lack of government oversight that they just go off and do completely blatantly unfair things to kill off the space.
I suppose they could even buy the regulators at that point.
I mean, they could get involved in lobby groups.
Yeah, if you look at the U.S., I think they've already bought a couple regulators and a couple, you know, maybe even a couple people high up in government, right?
Definitely a few senators, let's say.
So this is what the future looks like.
So this is the future where government is subservient to big tech.
And big tech controls our government and our democracy.
And democracy effectively ceases to exist because governments don't work for people anymore,
but governments work for big tech companies.
And we are, to be frank, pretty dangerously close to that, at least in certain countries.
So that's the, you know, a dystopian view.
The alternative is that this space survives: companies like Proton, and not just us, but the entire space of privacy-protecting services,
the entire crypto and Bitcoin space that is working on financial freedom.
It continues to develop and grow.
It provides a viable alternative because, again, it's not enough to exist.
You need to be an alternative that is viable.
You need to have a feature set that someone can credibly switch over and not be so burdened by the lack of features and the poor user experience that they cannot, you know, stay on your platform.
So it means that we create a user experience that is good enough across our entire ecosystem, not just Proton, but also all the crypto and Bitcoin ecosystem, that is a viable replacement for traditional finance and traditional big tech companies.
and we win the argument in the public mind, in the public space,
where people understand that this is the better future,
and at that point we would probably achieve a substantial market share.
So crypto could go from maybe less than a percent
to perhaps 20 or 30 percent of finance overall,
and maybe Proton, instead of having one percent of the market,
has 20 or 30 percent.
And at that point, that is scale.
That is a viable fraction of the world population
where you have enough of a base
where actually you can win in the long term, right?
If we get to that scale,
getting past the 50% tipping point
is conceivable.
So these are sort of the two paths,
and the path that we end up going on
really depends on us as individuals
because we live in capitalism,
and even China, which claims to be communist,
is today capitalist.
And the most powerful force in capitalism
is you.
It's the individual consumer
making the right choices,
steering the economic, and also technical
and political, future of our societies
through the choices that we make every single day
in our daily consumption of services.
And if we make the right choices now,
the next five years,
then we take the world on a different path.
And, you know, so I suppose the positive note
that we can end on is, yes,
it seems depressing, it seems scary,
it seems very difficult,
but actually we have the power
and we can do this if we want to.
I love that, perfect way to end.
Andy, thank you so much for joining us today.
Yeah, thanks for having me. It's really been a pleasure, and I hope to be back some time to share some more thoughts.
Absolutely.
Guys, got to let you know, of course, none of this has been financial advice, although it was some fantastic privacy advice. We are headed west.
This is the frontier. It's not for everyone, but we're glad you're with us on the bankless journey.
Thanks a lot.
