Big Technology Podcast - Is It Worth Switching To Encrypted Email And Messaging? — With Andy Yen
Episode Date: December 28, 2022. Andy Yen is the CEO of Proton, the maker of encrypted email service ProtonMail. He joins Big Technology Podcast to discuss the merits of encryption, why his service has doubled over the past two years, and whether it's something we actually need. This episode gets into a fun debate, covering both the pros and cons of the movement. If you like Big Technology Podcast, please rate it five stars ⭐⭐⭐⭐⭐ in your podcast app of choice. For weekly updates on the show, sign up for the pod newsletter on LinkedIn: https://www.linkedin.com/newsletters/6901970121829801984/
Transcript
LinkedIn Presents
Welcome to Big Technology Podcast, a show for cool-headed, nuanced conversation of the tech world and beyond.
So why isn't all of our online communication encrypted?
I mean, there are some good reasons not to do it,
especially when it comes to things like online safety,
but also maybe we've gotten just a bit too comfortable with letting companies and governments
just kind of look through all of our most private conversations.
So it's the end of 2022.
Let's bring on Andy Yen.
He's the CEO of Proton, which makes the encrypted email service Proton Mail.
And it's a product that's expanding, actually, into things like docs and calendar, wink, wink, Google.
And it's also seen its user base basically double over the past few years, past two years, really.
So as you listen, I would ask you to consider, why would we want this?
Why would we want an encrypted email service?
What are the benefits that we have from it?
What are the benefits we get for keeping everything open?
It's not an easy interview.
It's pretty tough, actually.
And I didn't just want to let Andy walk into the end zone.
But I think by pressing him a little bit, we really start to put these arguments for encryption to the test.
And hopefully, by the time you finish listening, you'll come away with a better understanding of where this push, this movement, is, and whether we should get on board or not.
So I hope you'll enjoy.
Here's my conversation with Andy Yen.
Andy, welcome to the show.
Hi, thanks for having me.
It's good to see you again.
Good to see you again.
So for those who are new to Proton or have just a vague idea of what it is, what is Proton?
Well, I think, you know, Proton is a company that is trying to offer the same services as, say, you know, Google, Apple or some of the big tech companies you might know, but with a different business model.
You know, we simply believe in putting privacy first, giving you control over your data.
And we do that by having rigorous encryption across all of our services.
So whether it's Proton Mail, Proton Calendar, Proton Drive, or, you know, our Proton VPN service,
what Proton is, is an alternative to big tech services that really put you in control of your data.
Okay, great.
And so can you give like just a 101 definition of encryption?
Maybe you can do it like in 30 seconds or so.
This is a tough one.
Well, I think encryption is a complicated topic to explain, but it's math.
And the rules of mathematics are kind of constant over time: once they work, they work.
I think what is more valuable to explain is what encryption does for you, right?
So when you talk about end-to-end encryption, which is what Proton uses in all of our services, take Proton Mail, for example: what end-to-end encryption does is ensure that we as a service provider don't have the ability to read your emails or your data. And that means I cannot sell your data to advertisers. And I can also not give your data to, say, third parties like governments that may request it. So it's a technical
means to ensure the privacy of your communications. So, Andy, I use Gmail. I view email kind of as a
postcard, right? I send it. I'm fully aware that people can, you know, potentially read it. I don't
want the federal government snooping in or any government snooping into my email. But it's not like
I have like, you know, state secrets in there. And I don't mind if something like Gmail advertises to me. And I feel like that's how most people feel. That's why I see most of the emails, at least, you know, that come in to me are Gmail, when they're not corporate. So why would someone go with an encrypted email service over that? Like, I guess, it seems like it's a lot of
effort to switch. So why would someone do that? Well, I think the question is, you know,
should we accept the status quo, right? You're right. Email is a postcard, but why should it be a
postcard? The technology to make email not a postcard definitely exists. You know, the encryption
is possible. So if you have the choice between sending postcards versus sending encrypted
sealed letters, why would you choose the postcard option? And that's really what Proton is about
is to give people this alternative. And, you know, actually switching is not very hard.
If you want to use ProtonMail today, we have an easy switch tool.
You can actually, you know, log in with one click, putting in your Gmail information.
And we'll pull, thanks to, you know, GDPR, your contacts, your emails, your calendar from Google
directly to the Proton ecosystem.
So, in fact, it's not hard to switch.
And I think we shouldn't be so accepting of the status quo: that emails have to be, you know, postcards, that we must have advertisements in order to get email, and that we need to let Google have access to all of our information just to be online.
You know, this is, I guess if you have no choice, you will live with that.
But if there is a choice, I don't see why anybody would tolerate that.
Well, I'll give you one example, which is convenience.
I mean, if I'm in Gmail, it links to calendar.
Sometimes I book a flight and we'll auto-populate in Google Calendar.
The search is really good.
I mean, it's created by a search company.
Search is good.
And I like the interface.
So, again, like, yeah, we don't accept the status quo, but the status quo does give us convenience.
Yes, for sure there is convenience, but you're also giving up a lot, right?
You're giving up your most sensitive, intimate data.
You're opting into a surveillance machine, which has, you know, very negative consequences for
society and also for the security of your data.
And that is actually Proton's mission.
Our goal is to make the user experience functionally equivalent or as close to as possible
where you don't have to give up convenience to also get privacy.
I think the kind of, you know, the notion that you need to trade off privacy in order to have convenience is ultimately a false notion. And this is one that, through technology, you know, we should be motivated to solve. And I believe in the
past there definitely was a gap. But I think as, you know, private technologies, Proton Mail and others, continue to develop, you'll see that over time that gap actually shrinks, right? You know, the search experience, the mail-calendar integrations, you know, these are all things that,
you know, today, Proton can do, and we'll only do better in the future. So if I get an email, like a flight confirmation, that goes to my Proton Mail, will it show up in the Proton Calendar?
We're not reading your emails to do that right away. But we are now starting to auto-add the ICS calendar invites. So in fact, today the integration is already working: you know, we are adding events directly into the calendar. And we will continue adding more and more of these features. And that's part of Proton's move towards an ecosystem, right?
The reason for integrating mail, calendar, and drive together is to actually support those integrations.
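As an aside for the technically curious: the auto-add integration described here works because the event travels with the message as a structured ICS (iCalendar, RFC 5545) attachment, so a client can add it to the calendar without the provider mining the email body. Here is a minimal, hypothetical sketch in Python — a real parser must also handle line folding, time zones, and many more fields:

```python
# Minimal sketch of pulling an event out of an ICS (iCalendar) invite
# attached to an email. Field names follow RFC 5545; the sample invite
# below is made up for illustration.
from datetime import datetime

SAMPLE_ICS = """BEGIN:VCALENDAR
BEGIN:VEVENT
SUMMARY:Flight LX318 ZRH-JFK
DTSTART:20221228T093000Z
END:VEVENT
END:VCALENDAR"""

def parse_invite(ics_text: str) -> dict:
    # Scan line by line for the two fields we care about.
    event = {}
    for line in ics_text.splitlines():
        if line.startswith("SUMMARY:"):
            event["summary"] = line[len("SUMMARY:"):]
        elif line.startswith("DTSTART:"):
            # Basic UTC form, e.g. 20221228T093000Z.
            event["start"] = datetime.strptime(
                line[len("DTSTART:"):], "%Y%m%dT%H%M%SZ")
    return event

event = parse_invite(SAMPLE_ICS)
# A calendar client can now add `event` without anyone reading the
# email body -- the structured invite travels with the message.
```

Because the invite itself is structured data, this kind of integration doesn't require the provider to scan prose in the email, which is the distinction Andy is drawing.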
Yeah.
Now, you talk about opting into a surveillance machine.
You know, we hear a lot of this, like, and sometimes this language gets alarmist,
like surveillance capitalism, for instance.
So how much surveillance is actually happening to the average user?
Well, I think it's not so much the surveillance that has actually happened, but the potential surveillance that this enables. And if you, for example, look at the amount of data that Google or Facebook or even Apple
has on you today, it is far more data than what the East German government had on East German
citizens, you know, through the Stasi, right? It's, you know, way more. It's funny, I'm here in Germany, and I was just at the Stasi Museum and the Stasi headquarters like a week and a half ago. And I was like, oh, this is all going to be good material to speak with Andy about.
Because that level of surveillance was unbelievable.
But, like, that was done to individuals in dossiers that they would then go and use to sort of keep the general population in line.
And that's not exactly what's being done here with, you know, ads: showing ads based off of even some of the information that you've sent. And, you know, I would say it seems like most people that are emailing don't have the government reading their emails the same way the Stasi would read and wiretap everything.
They were dressing up as tourists with cameras to take pictures of people in the street.
I don't know if that's the same thing.
Well, you know, Google doesn't need to dress up as a tourist in the street, right?
Because they're already on your phone and on your devices and scanning all your information anyways, right?
They have automatic ways of doing that.
And I think the way to look at this is, you know, imagine the real-life version of Google.
It's someone who follows you around, listening to all your conversations, writing down everywhere you've been,
keeping the record of all your purchases, and putting it into a little notebook. You would never tolerate the real-life version of Google. So why should we tolerate
the online version of Google? Because I'll give you an example. I mean, by the way, I'm not,
I'm just trying to, you know, hash this out because I actually think there's a lot of merit to
what you're doing, but I want to make sure that you get the arguments that Google would make
just so, just so that we're, you know, able to really get to the heart of the matter.
At the end of the day, the Google that's taking notes on us is not a person, you know, it is an algorithm. And if a Google engineer were to access that, you know, notepad of data, which I don't deny they have, they would get fired. So it seems to me like it's a lesser degree.
Well, this, you know, this goes to the topic of, kind of, you know, algorithms, right?
And, you know, what is more responsible, you know, a person or a computer algorithm?
I would argue that actually the algorithm is probably the bigger risk.
Because as we have seen, it can do things that are completely, you know, unintended in terms of consequences. And, you know, today we're in a world that's run by algorithms.
And I would think that actually, you know, the person would be able to make a moral judgment.
The algorithm simply does what it is trained to do.
And, you know, the fact that it's an algorithm and not a person on the other end doesn't
actually comfort me.
And I don't think, you know, it should comfort us.
So what are the sort of unintended consequences that the algorithms have done when they've
gotten a hold of people's data?
Well, let's take social media and Facebook, for example, right?
You know, an example of an algorithm is the so-called notion of, you know, filter bubbles.
So let's say you are a right- or left-leaning, you know, political person.
Well, you will end up clicking on content that, you know, more aligns with your views.
And guess what?
The content that gets the most views and gets the most traction on social media is the stuff
from the far left and from the far right.
So the algorithm says, okay, you know, you clearly lean that way.
and then it inadvertently shows you
more and more extreme content
and this over time
leads to the polarization of society
because people on the right have more right views
people on the left have more left views
and it's an algorithm that's constantly feeding them
information to make their views more extreme
so I think this is an example of how algorithms
can go out of control and this is just
I've been scratching the surface here
Yeah, and I wouldn't call, I mean, that's not... I think filter bubbles have been somewhat disproven: that you do come into contact with a lot of viewpoints from the other side on social media.
The thing is, though, and I think you hit on it, that social media does amplify the stuff
that's the most radical, and that's the type of stuff from the other side that you see,
and it makes you not be able to find common ground with people.
We've had a couple discussions about that here on the show over the past year.
But that being said, it's very different.
Like, your personal data is, you know, that's part of the business model there.
I don't think Gmail, you know, tracking your data is going to make you send more and more extreme views out to people.
So what could be the potential unintended consequence here?
Well, I think, you know,
what Google is doing is not just Gmail, right?
They're taking all of your information
from diverse sources,
which is every single platform that you use.
And essentially, they're building a profile on you.
And then they're using that information
to decide, you know, what they show you,
what they don't show you,
and to kind of, you know, inform your worldview.
So when you think about search and advertising,
this is basically, you know, control over what you buy, what you think, what political ads you see.
This is a high level of control over, you know, our day-to-day lives.
And of course, you know, today it's Google.
Maybe you trust Google, right?
But the data is out there forever.
You know, there's...
For the record, I do not trust Google.
You know, it's a trade-off for sure.
Yeah, of course.
And, you know, your information is literally one merger and acquisition away from going overseas. I think the classic example of this is, you know, Grindr, the gay dating app, got bought by the Chinese, right, and became a national security issue.
So the thing about data online is you can't take it back. You know, once it's out there,
it's out there. And the ways that it can be used and misused in the future are, you know,
completely out of your control. I think when people were signing up for Facebook and joining Facebook groups back in 2010, 2012, they didn't even imagine that Cambridge Analytica
could be the outcome of that data. And this is the thing is that, you know, we
don't really understand how some of this data can be used in the future. And we're today
freely giving it up. And I think that is, you know, really the behavior that I think we need
to take a step back from and say, hey, you know, maybe this is too much. Maybe we should have
control of our data instead of just freely, you know, giving it up. Right. Cambridge Analytica, I mean, this stuff might influence, to some extent, like, the way that you think. But they're not, it's not like, you know, in that Social Network movie where they're, like, controlling you. Wouldn't you agree?
Well, depending on who you believe and who you talk to, you know, Cambridge Analytica could have swung some elections, right?
And I think that's highly doubtful.
But sorry, go ahead.
Yeah, yes, it's true.
But I think it goes without saying that this data could...
I don't think what they did was good. But yeah.
Yeah. But I think it goes without saying that this data could swing elections, you know, if misused.
And, you know, Cambridge Analytica, of course, is one example.
but you can kind of see this taken to its logical extreme, right?
And we don't need to look very far or even imagine too far
to see what this extreme looks like.
If you look at a country that also has tech giants,
that are pervasive in everyday life with basically monopoly power,
that's China with, you know, Alibaba, Tencent,
and these are the big, you know, Chinese tech giants.
And there you clearly see an example of what happens
if a tech company's data is put at the service of a government actor that is, you know, bent on control and supervision of a population of one billion
people. So this is a logical conclusion. This is the risk of what could happen. And of course,
that seems very far away from us, you know, maybe here in Europe and here in the U.S. But why would you, you know, freely give this power over us out there in a way that we cannot
control, right? I think it's something that should concern all of us, really.
Andy, there is a difference, though, between the U.S. and China case, because it's the government, right?
At the end of the day, the United States government or governments in Europe aren't going to have population control the same way that the Chinese Communist Party does.
So that's where the distinction is.
It's the party, really, not the technology.
Or maybe a combination of both, but it starts with the party.
Yes, it starts with the people.
But technology enables you to do this, you know, at massive scale and at very high risk, right?
I can give you the example of another country that 30 years ago, you know, on paper and also in practice, was a democracy with competitive elections.
This was Russia, right?
And, you know, you see how this can actually shift over time.
I'm not saying that, you know, in Western Europe, we're destined to follow the same trajectory.
But there are definitely some countries, you know, which in the past you might have considered to be democracies, which today you'd be hard-pressed to consider it a democracy.
And you can see kind of the danger of this data, because this data, you know, that's out there today will still be there in 30 years and can potentially be misused.
So this is why I think, you know, from a society, you know, standpoint, privacy does matter.
You know, we should care about it because without privacy, you don't really have true freedom of speech, freedom of information.
And without that, you know, actually democracy doesn't really work.
So I find it as fundamental to safeguarding democracy, you know, in the 21st century, which is the digital century.
That's right. So why, so you started with email. I get Proton Mail emails. Like, people sign up to Big Technology with it. I have an account. I don't use it very often. Maybe I should use it more often. Why did you start out with email? Because I feel like, with messaging, you know, I use Signal. And Signal has disappearing messages. It's encrypted. I think that's much better compared to the other alternatives. But email is less of a compelling event for me to move over. So why start with email?
It really comes down to the notion of what email represents.
Today, when you look at email, it's not a communication tool.
People think it's a communication tool, but actually email as a communication tool probably
stopped 10, 15, maybe even 20 years ago.
What email actually is, is identity.
And if I were to give you the choice between losing your passport or losing your email,
losing your email is probably far more inconvenient because it is the thing that you have online
that links together the rest of your online life, your whole
digital existence. So when you choose to use email from Google and you use Gmail, what you're
doing is not so much, you know, using a Google tool to communicate. What you actually are doing
is saying, I'm going to give Google control of my identity because that Gmail account is your Google
account. And that is a thing that Google uses to tie the information they have on their entire
platform across all their services to your real-life identity. And in some ways, you know, the use of Proton Mail instead of Gmail is to say, I'm going to opt out of this system, and I'm going to own my identity, and I refuse to have a file on myself, you know, in Google. And that's why
we started with email because it's, in my opinion, the highest leverage way for people to break
free of the system. So what can Google do with Gmail that you can't do? I'm not talking about
functionality. I'm talking about privacy-wise. So like what do you mean by that exactly?
Like, they can read our messages. Or, when you have an encrypted email service, what can't you do that others can?
Because your selling point is what you can't do.
So from a technical standpoint, I would say that, you know, Google could actually encrypt
all the emails on their system.
They could do, you know, everything that we do because technologically nothing prevents them, right?
I would say the key difference between Proton and Google is really the incentive structure that is in place.
You know, Google is a company that today, you know, is making 90% of the revenue from advertising.
And advertising works best when it's contextualized and when it's personalized.
So in other words, Google has a $200 billion incentive to really get up to the line of creepy in terms of, you know, using your data, because that's what makes the money.
So when it comes to, say, making advertisers happy, as opposed to protecting your privacy,
well, the people that are paying $200 billion are the ones that are going to get the final say.
So I think the difference is business model and alignment of interest between, you know, users and companies.
Okay, but more to the question in terms of, like, practically what they can do.
So do they have like product managers that can go on and read your emails?
Do they have like an ad system that targets based off of the emails?
And can that, or can't that happen in Proton Mail?
Well, Proton, for one, is not an advertising business.
So, you know, we don't have user profiles.
We don't link any information to real-life identities.
You know, we're not looking closely at, we're not scanning your information.
We're not monetizing any information.
We're not serving any advertisements.
So, you know, all the things that Google does to enable its ad business, you know, we don't do any of that.
And I think that's, you know, a very clear and very big distinction.
And we also cannot one day, say, change our mind just because we want to do that.
Because once data has been encrypted with zero access encryption, once your inbox is encrypted,
because it's zero access, we don't have the means to say, oh, you know, Proton's been sold, the new owners really want to mine data and do that as a business model.
It's not possible to actually decrypt the data after it has been encrypted.
So I think it's not only that we don't do it today. We are prevented from doing that in the future by the nature of how the tech works.
So nobody inside Proton can read the emails of people.
Yeah, yeah, we have no means.
Yeah, exactly.
We have no means to decrypt because we simply do not possess the encryption keys that would be required for us to decrypt user inboxes.
But Gmail, they can.
Yeah, of course.
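For readers curious what "zero access" means mechanically, here is a toy sketch: the key is derived client-side from the user's passphrase and never sent to the server, so the server stores only ciphertext it cannot open. This is illustrative only, not Proton's actual scheme (Proton uses OpenPGP-based encryption with vetted ciphers); the SHA-256 counter-mode keystream below is a stand-in, not production cryptography:

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. A stand-in for a real
    # cipher (e.g. AES-GCM or OpenPGP) -- illustrative only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes):
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    # XOR stream ciphers are symmetric: decryption = encryption.
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Client side: the key is derived from the user's passphrase and
# never leaves the device.
key = hashlib.pbkdf2_hmac("sha256", b"user passphrase", b"per-user salt", 100_000)
nonce, stored_on_server = encrypt(key, b"private email body")

# "Zero access": the server holds only (nonce, ciphertext). Without the
# key, neither the provider nor any future owner can recover the text.
assert stored_on_server != b"private email body"
assert decrypt(key, nonce, stored_on_server) == b"private email body"
```

This mirrors the point in the conversation: once data is stored this way, a change of ownership or business model cannot retroactively expose it, because the decryption key was never on the server.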
How big of an issue is the government?
I want to go back to this government question, because we've talked a lot about surveillance data, but we've talked about all of that in the context of advertising.
Let's talk about it in terms of governments being able to access email.
There was the whole Snowden revelations that showed that the government was spying on a lot more of us than we thought.
Do you think that's still the case and how often is the government going into people's email inbox and just reading whatever they want?
I think they're doing it probably way more frequently than we imagine.
And if you take the U.S., for example, Google is U.S.-based, Google is an American company, right?
There's things like national security letters, there's things like FISA courts,
there's many ways in which they can, you know, get direct access to people's inboxes and get
this information. So I would say actually in the U.S. it's probably quite common, because the means of doing it are quite easy. And, you know, FISA courts are rubber-stamping exercises.
Do you have any data in terms of how, like, is there any publicly available data in terms of how often this actually happens? Like, if you're an average user, and you're not, like, you know, in ISIS, how often is this, how easy is this to do?
Well, Google does publish statistics, so they do have data on this, and we know it's, you know, in the millions of requests per year.
This is the ones that they publish and the ones that they're able to be transparent about, right?
There could also be some that, you know, they're simply not allowed to disclose because they receive a national security letter.
So, you know, clearly there are, you know, a large number of government access requests, and it is something that happens, you know, quite routinely.
Now, who they're targeting, you know, how they're targeting, are they doing it within, you know, the confines of the law?
This is hard to say, right?
I think, you know, Snowden showed that at least in 2013, it was being done on a widespread basis without very, you know, active, very practical checks and balances.
Maybe it's different today.
But, you know, I would be, you know, skeptical if it has radically changed since, you know, Snowden made his revelations public.
Okay.
Andy Yen is with us. He is the CEO of Proton, the maker of Proton Mail.
We're going to take a quick break, talk a little bit more about the company that he operates
and whether its incentives are different than the other big tech companies, incentives
that we've discussed.
We'll also take a look at, you know, some of the other products they offer.
And then we're going to get into some of your Twitter questions, some good questions from Twitter that showed up when I asked for them.
So thank you for those.
And then, finally, we'll talk a little bit about Apple and its decision to encrypt
iCloud. So lots of interesting stuff. Don't go away. We'll be back right after this.
Hey, everyone. Let me tell you about The Hustle Daily Show, a podcast filled with business,
tech news, and original stories to keep you in the loop on what's trending. More than
two million professionals read The Hustle's daily email for its irreverent and informative takes
on business and tech news. Now, they have a daily podcast called The Hustle Daily Show,
where their team of writers break down the biggest business headlines in 15 minutes or less and
why you should care about them.
So, search for The Hustle Daily Show in your favorite podcast app, like the one you're using right now.
And we're back here on Big Technology Podcasts, second half, our last half of the show for
the year.
So we're closing out 2022 strong here with Andy Yen, CEO of Proton.
Andy, first of all, why did you start this company?
Did you, like, see what Snowden did and say, okay?
I have to create an antidote to that.
What was your, what was the compelling event to start the company?
Well, I think like many good tech products, ultimately, Proton Mail was a service that I myself wanted.
You know, back in 2014, when we first started, if you wanted to have, you know, privacy for your email, you didn't really have much of a choice, right?
You could self-host your email, which meant, you know, finding a cloud service provider, which back then wasn't even that common, installing your own email software, and then being the administrator of that.
Or, if you wanted to be online, you could go to Google or Yahoo or some of the other companies that all monetize in the same way.
So my thinking was simply there must be a third way.
There must be a better way.
There must be a way for me to, you know, have something that's so easy to use my parents can figure it out, but at the same time still preserves, you know, privacy.
And that was the concept of Proton Mail: to see if we could, you know, bring together the business model, the technology, and also the user experience to make that happen. Okay, cool. And so when did you found it? This was back in 2014.
So it was about a year after the Snowden leaks. And that, you know, I think that really got people
thinking, you know, myself included. How many people does the company have? Today we have slightly
over 400 employees around the world. Okay. So it's a sizable company. Yeah, it's grown a lot in recent years. And your business model? Like, you're also a company. I mean, Google's a
company, you're a company. So, you know, your objective also is to make money off of email users
just a different way. Yeah. Yeah. And I think that's extremely important, right? Because if you don't
have a sustainable way to make money, actually the business won't be around for a long time. And,
you know, I think as a user, you shouldn't actually trust a company that doesn't have a way to make
money. Now, the question is, how do you make money, you know, in a way that aligns incentives with
customers. And in Google's case, as a Google user, you're not Google's customer. You're actually
the product that Google is selling to its real customers, which is the advertisers. And that's not the best alignment of interest. And in Proton's case, of course, our users, they actually pay us.
So we keep the lights on through the users that pay us. And because of that, my only incentive
is to protect user privacy, because that's why they pay us. And I think that alignment is very
important to, you know, make sure that you can always keep users best interest at heart.
Okay. So it's a paid product. What has been the growth curve? I mean, I get a sense that as people
have become more aware of this, people are more and more moving towards encrypted services.
Like, I, just in terms of messaging on the phone, I never would have thought to use an encrypted messaging
app a few years ago. And now almost all of my messaging happens in Signal, like I mentioned.
So are you seeing a similar growth curve?
Yeah. Well, I think just to come back on your previous comment, it's not exactly a paid
product. It's a freemium product. And that means most people actually don't pay us, right?
But the power users that want more features will pay us. And, you know, like most freemium models, there's a small percentage of paying users. And I think that is very, very important because that ties directly to growth. The fact that it's a freemium model
means that we've basically made privacy and encryption technology very accessible. So anybody, anywhere in the world, even if they don't have any financial resources, can actually get on the system and get their private information, you know, protected.
And this has, you know, led to what I can only call, you know, very exponential growth.
When we started in 2014, there were maybe, you know, 10,000, 20,000 people on the platform.
Today, if you look at all Proton services, we're talking in excess of 70 million accounts.
And most of these came on just in the past couple of years.
So I think you're right, there has been a dramatic shift in people's attitudes towards privacy.
And there is now more and more demand for services that actually give you control over your data.
And I think that's a positive thing.
It's long overdue, but it's great that it's actually happening now.
You said most of these came on in the past couple years.
When did you hit 35 million users?
Was it like last year?
This was probably, you know, within the last two years, let's say.
Two years. Okay. That's wild.
Okay. And the last thing I want to ask you about your product, then we're going to move on to some Twitter questions.
You don't just do email.
Actually, when we met, you mentioned that you do, like, Google Drive-like capabilities. Like, one of the things that would prevent me from switching over is the fact that I love that, like, you know, I have docs, Google Docs, and all that stuff. But you have a version of that as well.
Yeah, well, I think I first want to kind of maybe cover the notion of switching.
You know, people tend to think of privacy and not private as kind of mutually exclusive, right?
Like, you know, if I'm a privacy person, it means I don't have Facebook, I don't have Instagram, you know, I don't have, you know, LinkedIn.
I don't use any other services.
And I think the more practical way of looking at this is really a coexistence.
So I'll give you an example.
Of course, I use ProtonMail.
But, you know, I've got my old Yahoo account as well.
And when I'm in the McDonald's, let's say in Zurich, trying to get access to the public Wi-Fi, I'm not giving them my ProtonMail account.
I'm giving them my Yahoo account because I know they're going to sell that and it's going to get spammed, right?
So I think this idea of private versus not-private is kind of a false dichotomy.
I think it's a coexistence, you know, people...
So it's a tough sell to say, hey, let's, you know, add one more inbox.
I mean, a lot of people have, like, a Yahoo, a Hotmail, a Gmail, a couple of Gmails, juggling, like, the different inboxes.
But if you have so many, what's one more, right?
And I think it's more work.
But answer that question about the drive stuff, because I feel like that's important.
Yeah, yeah.
So coming to the question of Drive, you know, of course, we need to have documents and all that functionality.
Now, you know, Google Drive has been out for, you know, more than a decade, right?
And they have a big head start on us.
So, of course, it'll take us some time to get all this functionality.
But the Drive product, as it is out there today, is actually already doing quite well.
We're already past, you know, one million files uploaded, you know, per day.
And it's only been, you know, a few months since we launched the product.
And what Drive is doing is it gives you a way to, you know, store stuff on the cloud
with end-to-end encryption. And I think that's actually, you know, fantastic.
It doesn't have document editing yet, but that's, of course, on the roadmap.
And say you have a very sensitive document. In the past, you had two choices.
You could put it locally on your phone or on your computer or on a USB stick, and that keeps it safe.
But if you lose your phone or lose your USB stick, you're screwed. If you're in a different country without access to your USB stick,
you're also in trouble, right? So, you know, cloud is, of course, convenient. But the issue with
Google Cloud is, of course, then you're giving Google access to all your files. And I think,
you know, the innovation really, what makes, you know, Proton Drive super interesting is
you keep the privacy, your data is still confidential and, you know, we cannot decrypt it.
But it's accessible anywhere in the world from any device.
And I think there definitely is a happy medium that, you know, can be achieved there.
When you get convenience and privacy at the same time, and this is why we're pursuing that product.
And I think it's, you know, something that could be very valuable.
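As an aside for readers, the end-to-end model Andy describes can be sketched in a few lines: the file is encrypted on the client with a key the server never sees, so the server only ever stores opaque bytes. The cipher below is a toy construction invented purely for illustration; it is not Proton's actual cryptography, which relies on vetted, audited primitives (OpenPGP-based in Proton's case).

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key + nonce + counter.
    # Toy construction for illustration only; real systems use AES-GCM etc.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Client-side: fresh nonce per file, XOR with the keystream.
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

# The "client" holds the key; the "server" only ever stores the blob.
key = secrets.token_bytes(32)
blob = encrypt(key, b"very sensitive document")
assert decrypt(key, blob) == b"very sensitive document"
print("server stores", len(blob), "opaque bytes")
```

The point of the sketch is the division of labor: upload and download move only ciphertext, so a provider compelled to hand over stored data has nothing readable to hand over.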
So you don't have like a composer yet, but that's something you're working on.
Yes, yeah, yeah.
It's something that is in progress.
Okay. What's the ETA on that?
It's hard to say, right?
You know, we're a community-driven company.
It depends on what community wants.
And I can tell you, from the kind of initial reaction on, you know, Twitter, Reddit, and from people who have emailed us, they want the composer and the document editor.
But a lot of people just want better photo support.
So I think we have to balance the two to see what is, you know, the higher priority.
Talk a little bit about that, about how your community drives what you build.
Is it a forums or something?
Or like, how does that work?
Well, it comes back to the history.
Proton, at the very beginning, was a crowdfunding project.
So it was launched from, you know, 10,000 people donating
half a million dollars to crowdfund ProtonMail.
Not each, right?
No.
Unfortunately, no, but overall, right?
And that was, I think, a super fascinating way to begin the business.
And today we're probably one of the only crowdfunded companies that, you know, started
crowdfunded and then, you know, was able to be around and grow over the years.
Actually delivered.
Yeah, yeah, actually delivered on the promise, right?
I'm still waiting for my fridge with the boombox Kickstarter.
Yeah, I remember it's funny because, you know, back when we,
you know, were doing crowdfunding, we were selling
what we call lifetime accounts, right?
Which, of course, is highly speculative, to buy a lifetime account
for a crowdfunding project that hasn't even launched yet.
But I think those people, at the price we sold them at,
definitely got their money's worth.
And being a crowdfunded company means that it's very community-driven,
because the community is what pays us.
So every product that we build, every feature that we build,
it's really driven by community feedback.
It's whatever people say they want the most, we build it.
So, you know, when they ask for calendar, we did calendar.
They asked for file storage.
We did file storage.
And even today on Drive, it's, you know, through Reddit, it's through Twitter, it's
through people contacting us in customer support.
It's through people who answer our annual surveys and our polls.
You know, literally, you tell us to do it and then we're going to do it.
And I think that's how, you know, companies should be run.
It makes product management, you know, very easy, of course, because my roadmap is very clear,
right?
It's what the community tells me they want.
It's like the dream of Web 3, right?
The users become stakeholders, except
you don't, you know, I don't think you're issuing any of those FTX-style tokens.
Yeah, I think you don't need a, you know, scammy cryptocurrency in order to achieve this.
You know, you can just ask them directly.
And even without coins, they'll tell you what they want.
Yeah.
Okay, cool.
So let's talk a little bit about the news.
So Apple is offering encryption to iCloud.
And people like Walt Mossberg talked about this iCloud loophole, where, like, Apple talked about, like, what
happens on your iPhone stays on your iPhone.
But because iCloud wasn't encrypted,
there are, you know, different government entities,
I believe, that could have gotten in there.
At least, I think, that was the argument.
What's your take on the fact that Apple is going all in on
encryption for the iCloud backup?
Good sign?
Yeah, I think, of course, it's a positive sign.
It's probably, you know, good for, you know, consumers.
But I think as with anything that Apple does,
you got to take it with a grain of salt, right?
And kind of understand, you know, their motivations.
And at the end of the day, I think the question that, you know,
we ought to ask ourselves is, you know, what does Apple stand for?
What is Apple's values, right?
And when it comes to privacy in Apple, the way I think about it,
and the way I've seen it and kind of what their actions have demonstrated,
is it's privacy when it's convenient and doesn't hurt the bottom line.
And I'll give you kind of a quick example of this, right?
You know, this end-to-end encryption of, you know, iCloud,
at least the file storage.
I'm not really sure.
I would be highly doubtful that Apple will roll this out in China.
China is a country where, for example,
Apple is already putting data centers with Chinese joint venture partners
in order to give the Chinese government access to Apple customer data in China.
So Apple's relationship with privacy seems to be, you know,
we'll do it when it's convenient and when it makes us more money.
But when it comes to, you know, being private or being people first
versus, you know, profits.
Profits tends to win most of the time.
And there have, you know, been many examples in recent years that, you know,
really show kind of how they take that approach, right?
You know, I think, how does Apple define privacy?
Apple really defines privacy as, you know,
nobody can abuse your data except for us, right?
And the real definition should be,
the proper definition is, no one can abuse your data, period.
So I think it's very important when we look at privacy announcement
from big tech companies to not let them redefine what privacy means,
because that's actually the fastest way to lose privacy.
Yeah, it's interesting.
There's no way that they're going to end up closing that off to China.
I mean, I feel like reporting has suggested that that's basically the deal for
them, that there is a backdoor there for the Chinese Communist Party.
Yeah.
And then, you know, Apple is also not open source, right?
So, you know, how it works, what it actually does.
There's no way to independently verify.
So, you know, I think that's also something that makes it
a bit different from other, you know, encryption software.
Like, you know, Signal, Proton Mail, Proton Drive.
These are open source.
And, you know, that way you as a consumer can be very confident that it actually does
what it's advertised to do.
Yeah.
Okay.
So here is a tweet from Tim Sweeney, who runs Epic Games.
He's a notorious Apple opponent.
Tim, if you want to come on the show, you're always welcome.
He says Apple's move to support iCloud end-to-end encryption is a great one, provided it's
not a gambit to announce this and then
offer to trade it away in secret talks with U.S. officials in exchange for their dropping antitrust
legislation and enforcement. That timing, dot, dot, dot. And then he says, my view is that the proper
boundary for absolute privacy in which a company can't look or listen, even if compelled,
is the boundary between personal effects and personal conversations and published information
and public conversations. What's your take on what he has to say? Well, I think, of course,
Epic and Apple have been locked in a fight for a long time.
They are not friends today.
Yeah, yeah.
We're also locked in that fight as well, right?
Because, you know, we also have been very vocal in criticizing some of Apple's,
you know, App Store policies.
Right.
Do I think it's part of some secret negotiation, you know, with the government to trade away?
You know, my personal opinion, no, right?
Seems a bit out there, yeah.
Yeah.
But I also can't know too much what is happening, you know, behind closed doors.
I think it's quite interesting because, if you remember, back less than a year ago, Apple was saying, you know, we're going to do some special, you know, client-side scanning in order to, you know, make child sexual abuse materials, you know, let's say, detectable by default, and we're going to practically, you know, send that information to law enforcement, right?
So it seems a bit kind of, you know, odd that the same company that was thinking about, you know,
practically reporting their users to the government is now coming around and saying, oh,
now we're going to encrypt everything, right?
So, you know, even for a company like Apple, it's a bit of a, you know, big turnaround.
I think, you know, some people are going to get whiplash from that.
And Apple is huge.
It's based in the U.S.
It clearly has a lot of discussions with law enforcement,
and there could be things, you know, going on behind the scenes.
But I would prefer to, you know, give them the benefit of the doubt.
On that front.
Yeah, but I will say that, you know, Apple's previous privacy moves.
Yeah.
If you look at them a bit more cynically, you kind of see what is, you know, at play.
So I'll give you an example of them kind of locking out third-party trackers, right?
Apple said, oh, this is a move for us to, you know, protect your privacy.
We're going to disable third-party trackers.
And, of course, Facebook, you know, cries foul.
But then you dig deeper.
And it turns out that, you know, Apple's not blocking all trackers.
There are some trackers still there.
It's the first-party ones.
It's the ones that Apple controls.
And then at the same time, you see Apple building an ads business and pushing ads, you know, into the settings, into other applications that they control, into the App Store, right?
And Apple's, you know, ad business grew from $300 million in 2017 to $4 billion today, and a projected $30 billion in 2026.
So when Apple said they were locking out third-party trackers, you know, their actual motivation,
you know, wasn't privacy.
The actual motivation was, I'm going to, you know,
lock out my surveillance capitalism, you know, competition,
so that when I go do ads,
I have a clean run and I've already cleared the field.
And so this is why I think, you know,
Tim is probably right to be a bit suspicious
and some people are a bit suspicious.
Right. Andy, let's go back to the child sexual abuse material.
How does Proton handle that stuff?
Because, you know, it seems like folks that would be in that
world would want an encrypted service.
I mean, it's sort of like you've built the perfect tool for them.
Well, I think it doesn't matter whether it's encrypted or not encrypted.
You know, folks out there who are doing this type of abuse are going to use services.
You know, they use Google Drive, for example.
They for sure use Apple as well.
It's an issue that all platforms have to deal with.
And of course, you know, when we get reports of such things, you know, it's illegal in Switzerland.
We follow the rules around that.
But there's kind of this notion that in order to prevent this entirely,
you must disable encryption or you must backdoor encryption.
And the way to think about this is, of course, if you had a world without privacy,
it's probably more quote-unquote secure, right?
But if you ask the people in North Korea that live in such a regime,
they probably don't feel more secure.
So, you know, as a society, if we are for freedom of speech, freedom information, and democracy, you know, we also need to tolerate some level of privacy.
But how can you catch people?
What methods can you use?
Well, you know, short of breaking encryption, there is many different ways.
So I'll give you a kind of example.
You know, today, if you were to look at, let's say, you know, some file metadata.
So who uploaded the file?
Did it get uploaded over a Tor IP?
What is the pattern of access?
How is it being shared?
Actually, a lot of this information can, in fact, be leveraged to give you very accurate
predictions of which accounts are abusive.
And these are techniques, these are not new techniques, right?
This was used, of course, for anti-spam, because how do you know if an email is spam or
not spam?
And some of these same behavioral techniques can actually get you quite far.
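The metadata-only scoring Andy is gesturing at can be illustrated as a simple weighted-signal heuristic, the same shape as classic spam filtering. Every signal name, weight, and threshold below is invented for illustration; this is not Proton's actual system, just a sketch of the technique.

```python
# Toy abuse-risk scorer over account metadata signals.
# Signal names and weights are hypothetical, chosen only to show the idea.
WEIGHTS = {
    "signup_over_tor": 2.0,     # account created via a Tor exit IP
    "burst_uploads": 1.5,       # many files uploaded in a short window
    "mass_link_sharing": 2.5,   # public share links sent to many recipients
    "no_normal_activity": 1.0,  # none of the usage patterns of a real user
}

def risk_score(signals: dict) -> float:
    """Sum the weights of the signals that fired."""
    return sum(w for name, w in WEIGHTS.items() if signals.get(name))

def flag_for_review(signals: dict, threshold: float = 4.0) -> bool:
    # No file contents are inspected; only metadata, as with anti-spam.
    return risk_score(signals) >= threshold

benign = {"burst_uploads": True}
abusive = {"signup_over_tor": True, "burst_uploads": True, "mass_link_sharing": True}
print(flag_for_review(benign), flag_for_review(abusive))  # False True
```

Real systems would learn the weights from labeled data rather than hand-pick them, but the key property is the same: the classifier sees who, when, how often, and how shared, never the encrypted contents.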
Now, will it be, let's say, as effective as, you know, checking every single file?
No, right?
You know, if you wanted to stop, for example, you know, bombs being sent in the mail,
you would open every single package and read it, right?
But, you know, we don't do that, because that is simply against, you know,
the values of privacy that, you know, we accept in society today.
And this is exactly the thing.
You know, we, of course, know that by allowing
encryption, we will be less good at detecting some of these things. But that's still an overall
benefit to society because privacy gives us so much more in terms of freedom and democracy and
civil liberties. And that's the balance that I think regulators always need to be very careful
about striking. Yeah. So we got some great questions for you from Twitter. One of them is
from Neville Lahiru. And this is a pretty good one. It asks, what are your thoughts on the whole
government versus encryption matter? And is there a
practical middle ground to it, especially when the finger is pointed at national security issues?
So it builds off of what we were just talking about.
Obviously, the government's going to want to have stuff a little bit more open, and there
could be national security risks from people using it. And, you admit it,
you're going to be a little less good at catching that. It probably goes across the board.
But you make a compelling argument for encryption.
Is there some common ground that folks can find?
Yeah.
Well, I think we are already at the common ground.
And, you know, this goes back to...
What is that? Yeah.
Yeah. So this goes back to, I think, you know, the incentives of governments, the incentives
of the private sector, and the role of legislation.
You know, when you pass a law today, it's really to fix, you know, something that markets
are not able to address.
That's the main reason for passing a law in a capitalist economy, right?
And take terrorism or, you know, even child abuse, right?
Is there any tech company in the world, even the most evil one, for example, that would want that on their platform?
I would say not.
It's illegal.
It's a PR issue.
It's a moral and ethical issue.
So the notion that, you know, private sector companies like Proton, like Apple, like Facebook, you know, you name it, are not already doing the best that they can do to try to tackle this problem
is just, you know, folly, right?
You know, there is no need for government to say, oh, you need to do more on terrorism because
everybody is really doing the maximum they can, right?
You know, the absolute disaster for any tech company would be to find that their
technology was used in the next 9-11.
So, you know, in that sense, I think there's a very common ground in that everybody agrees
these things are objectionable, and everybody is doing the best that they can, you know,
within technology possible to tackle that issue.
And that's something that, you know, I'm not saying lets everybody
in D.C. sleep easy at night.
Maybe it keeps the FBI up at night.
But it's something that we in society at large can feel
comfortable and probably confident in, because, you know, everybody, even at Proton, we have dozens
of engineers, you know, working full time just on this issue. And so I, so I believe the common
ground and the alignment actually is already there. Okay, great. Going along these lines,
another question we had, and by the way, shout out to Jay Manchin Wong, who said,
ask about the iCloud Advanced Data Protection. So we got that in. And now Richard Porter
also asks about how Proton and the Swiss authorities interact.
And there's this article that he links to, which he says shows that a court order led to the arrest of a French climate activist, when it came to Proton.
So what's going on with your relationship with the Swiss government?
And do they have undue power over your company?
I think unless your company is based on a boat 15 miles offshore, you're under the jurisdiction of some country.
Right. So, you know, given that you accept you'll be in the jurisdiction of some country, it's a question of, you know, picking the jurisdiction. And in Proton's case, you know, we chose Switzerland because it is quite privacy protective. It has a long tradition of privacy. And the laws generally are quite good. But that doesn't mean anything goes. Switzerland is not Somalia, right? It's not a lawless, you know, country. And I think, you know, Proton as a service, if you use Proton, it also doesn't give you, let's say, immunity from criminal prosecution.
So in situations where someone is breaking Swiss law, you know, we do have to comply.
And actually, in, you know, this particular case...
Yeah, what happened with the climate activist, yeah.
You know, I'm not even sure they were climate activists, to be honest, right?
You know, what happened in this case was, you know, there were some folks who had been illegally, you know, squatting in a building.
And they had done, you know, extensive property damage and had been, you know, committing theft, right?
So the issue here is the destruction of private property, theft. These are things that actually are illegal in Switzerland, and you can't do things like that. When you do something like that, you will get in trouble. Now,
what I find interesting about this case is people were so
focused on, oh, you know, Proton cooperates with the Swiss government,
which to me was pretty obvious.
But what was interesting is
this was actually probably one of
the first times that Proton's encryption was tested in court. If you think about this case,
you know, people have been destroying a building, they're squatting a building. The police knew
who they were, right? They weren't trying to find them. They simply had to go to the address
and they were there. The folks actually had already been previously been arrested. So in that case,
you know, why did police come to Proton in this case? Well, they were trying to get incriminating
information from the emails in order to build a case. And what this case demonstrated,
you know, clearly, you know, through the legal system, was even under legal pressure, even
under legal coercion, even with the Swiss, you know, government coming down and ordering us,
there is no way to, you know, break Proton's encryption. There is no backdoor. Otherwise,
we would have been legally compelled to use it. And I think it's, in many ways,
I think it's good to have that reassurance, right,
to know that our assurances of encryption
have been legally tested.
And if these folks had been using, you know,
Gmail or something else, the outcome would have been very different, right?
All the evidence would have been there.
But in our case, you know, nothing could be decrypted.
And I think, you know, it proves that, you know, this works.
And Proton does fight in court.
In fact, you know, about a month after this case,
we actually won a major legal victory.
So, you know, what this means is unless you're based, you know, 15 miles offshore, you're in a country, right?
But what you can do is change the laws of a country through the way that they're interpreted by fighting court cases.
And, you know, we actually won a court victory about a month after this case that, you know,
dramatically limits the amount of information that email providers are required to retain and handover.
And, you know, in democracy, you can do that.
So, you know, it's not like you just have to give in to whatever the government asks.
You can sue the government.
You can fight them in court.
And in many cases, you actually can win like we did, you know, in that particular case.
Andy, last question for you.
You mentioned you have 70 million users.
And, you know, compared to Gmail, which has a billion-plus, it sort of feels like a hobby site.
And I'm not saying that it's a hobby site.
But like the benefits are obvious.
The sell is not going very well if you look at the sheer numbers.
Why is that and can that ever change?
Well, you know, if I look at privacy, it's really a bit like electric cars, you know, 20 years ago.
The environmental, you know, benefits were obvious.
The fact that it's the future, well, it wasn't obvious back then to anybody other than Elon Musk.
This is why he's the richest man in the world, right?
But at some point, going electric on automobiles becomes inevitable.
And I think it's the same for privacy, right?
You know, at some point, when the private service is just as good as a non-private service,
why would anybody not pick the private service?
And just like today, why would you not pick an electric car when everything is basically equivalent
to, if not better than, you know, the non-electric version?
And I think you look at a service like Proton where we don't have advertisers
and we can really do everything in service of the users.
in the long run, being privacy first
actually will lead to a better user experience
because we don't have to make the compromises
that Google and other companies are making
to satisfy advertisers.
So if you take the longer view of this thing,
I do believe in 5, 10, maybe even 20 years,
we will get to a place where people look at this and say,
well, this is obvious.
Why didn't we go privacy first?
Now, it'll take a while to get there.
There's a lot of work to do.
Tesla didn't take over the automobile industry overnight.
It took 20 years.
but if the core philosophy is correct and if the alignment is there with the customers,
I think, you know, this is going to be inevitable, but it will take some time to get there.
Andy, thanks so much for joining.
Yes, thanks for having me.
It's really a pleasure to be on.
Great chatting.
And that'll do it for us here on Big Technology Podcast.
Thank you, Andy, so much for joining.
Really great conversation.
We have to do it again for sure.
Thanks to all of you, the listeners.
This is going to do it for us for 2022.
We have some big news coming up.
In 2023, this podcast is going to get even better, and I'm very excited to make an announcement
in the first week of January, so stay tuned for that.
It's going to be super fun.
I really feel like I could tell you now, but I'll keep you in suspense.
Anyway, thank you so much for listening.
Really, it's amazing that people come back week after week, and I'm going to do my part
to make sure that your dedication to this show is rewarded with some great interviews coming
up next year.
Okay.
thank you, Nate Gwattney, for the amazing job that you've done this year. We've really, you know, done some last-minute shows, and I think it's really paid off. We've been able to be timely and tight in the work that we do here and couldn't happen without you. So thank you, Nate. Thank you, LinkedIn for having me as part of your podcast network. Here is to the next year. We're going to keep doing this next year. So that's awesome. And once again, thanks to all of you. Wow, 2022 is over. Next show is
in 2023. We will see you then and there on Big Technology Podcast.