Big Technology Podcast - Signal’s President on AI, Advertising, and Running a Popular Messaging App — With Meredith Whittaker
Episode Date: January 3, 2024

Meredith Whittaker is the President of Signal and chief advisor to the AI Now Institute. She joins Big Technology Podcast for a lively discussion about the state of Google, whether AI is for real or a marketing gimmick, whether the online advertising business model is ethically broken, and the state of the Signal messaging app. Stay tuned for the second half, where we discuss the mysterious nature of Telegram. And enjoy the cool-headed arguments throughout.

Enjoying Big Technology Podcast? Please rate us five stars ⭐⭐⭐⭐⭐ in your podcast app of choice. For weekly updates on the show, sign up for the pod newsletter on LinkedIn: https://www.linkedin.com/newsletters/6901970121829801984/ Questions? Feedback? Write to: bigtechnologypodcast@gmail.com
Transcript
Signal president Meredith Whittaker comes on to talk about the popular messaging app and what's real or not in the world of AI.
All that and more coming up right after this.
Welcome to Big Technology podcast, a show for cool-headed, nuanced conversation of the tech world and beyond.
We're joined today by Meredith Whittaker.
She's the president of the Signal app and the chief advisor to the AI Now Institute.
Meredith, welcome to the show.
So happy to be here.
Thank you for having me.
You bet.
So we're going to cover a lot of ground today.
I thought it would be a missed opportunity to start without asking you briefly about Google.
You spent 13 years there. What's your just, like, high-level view of the state of Google today?
You know, look, I'm not at Google anymore.
I hear the rumors, but, I mean, history teaches us empires die, right?
You know, the winners don't stay winning always.
And I think, you know, management at Google has always been a bit fuzzy. There's always been — you know, the cloud business has, you know, consistently struggled to catch up to Amazon and Microsoft.
You know, none of this is new news. I think, you know, for a very long time, Google was sort of figured as just the default leader in AI.
A lot of the sort of early techniques around parallel processing were being kind of innovated in labs at Google.
And, you know, what you have now is — you know, I don't even know if it's a struggle to innovate, right? Because I think we need to back up and look at, like, what happened with ChatGPT. Because ChatGPT itself is not an innovation, right? It's an advertisement — a very, very expensive advertisement — that was placed by Microsoft to, you know, advertise the capacities of generative AI and to advertise their Azure GPT APIs, which they were sort of selling after, you know, effectively absorbing OpenAI as a kind of Microsoft subsidiary. But the technology, the frameworks on which ChatGPT is based, you know, date from 2017. So, you know, Microsoft
puts up this ad. Everyone, you know, gets a little experience of, you know, communicating with
something that seems strikingly like a sentient interlocutor. You kind of have a, you know,
a supercharged chat bot that everyone can, you know, experience and have, you know, have, you know,
have a kind of story about. It's, you know, a bit like those kind of, you know, viral, like,
upload your face and we'll tell you what kind of person you are, data collection schemes that we
saw, you know, across Facebook in the 2010s. And then an entire narrative of kind of innovation and,
you know, or a narrative of like scientific progress gets built around this sort of chat GPT moment, right?
Suddenly generative AI is the new kind of AI; suddenly claims about sentience, and about, you know, superintelligence, and about AI being on the cusp of breaking into full consciousness and perhaps endangering human life — all of this almost religious rhetoric kind of builds up in response to ChatGPT. So yeah, I mean,
I'm not, I'm not a champion of Google, but I think we need to be very careful about, you know,
how are we defining innovation and how are we defining progress in AI? Because, you know, what I'm seeing is kind of a reflexive narrative building around, you know, what is a very impressive ad for a large generative language model — but not anything that, you know, we should understand as constitutively innovative, right?

Right. And so, okay, well, I definitely have
some questions about that for you. We're going to get into it for sure. But just to focus a little
bit more on this line — I mean, they are adding generative search to the top of search pages right now in a Labs feature, but obviously that's something that they're thinking about rolling out
more broadly. And it's not that they didn't have this innovation, if you want to call it that,
or that they didn't build this product in-house. They did. It was, I mean, as we're both aware, right, called LaMDA. We just had the first product manager on LaMDA. And they decided not to ship it, which did lead
to, you know, even if it's not a big technological breakthrough that's going to kill us all,
it led to giving Microsoft the chance to really outflank it, to cause a scramble. People called it a code red within Google. I mean, those were the words they were using.
And they didn't release it from my understanding because of safety concerns.
So do you, I'm kind of curious what you think about it.
Like, was that the right move?
Or does it leave them behind, if that's where things are going to be?
Well, I mean — right as measured by what? As measured by, you know, the demands of shareholder capitalism that require constant growth
forever and ever and ever unstopping, as I said recently at a Washington Post event, you know,
the definition of metastasis. Like, as measured by the requirement that, you know, the shareholder
driven corporation sort of metastatically continue growing their market share, their user base,
their revenue, yeah, maybe that is a problem.
But I don't know that we should, you know, we should be measuring the benefits of certain decisions based on that metric, right?
So what I'm saying is, like, I think we need to reevaluate how we're measuring progress, how we're measuring innovation, and recognize that it's a problem when a massive surveillance corporation like Google overrides an entire, you know, ethics team in order to push to release something because they feel that there is a competitor who is releasing something dangerous ahead of them, and, you know, thus there is no value in safety anymore, because the real metric, the real objective function, is revenue and growth and shareholder
returns. But from the framework that you laid out, right, I mean, so describing growth as
metastasizing something — I mean, what should a company like Google do? Like, should they just kind of curl up and fold their cards and become that declining empire? I mean, from your perspective,
What is the right path for that type of company?
I think, again, the issue is that the actual objective function, the actual goals that this company must prioritize above all others are revenue and growth.
And so my critique here, if we're going to be real, is of, you know, this form of capitalism.
Of the system.
So then pragmatically, like if you were running Google, would you just hop out of the system?
Well, I wouldn't be running Google, because they don't put people like me in charge of running Google.

Let's go hypothetical here, though.

Yeah, hypothetical. They give me Google — I think, you know, like,
how do those resources need to be redistributed? What does sort of technology in the public
interest actually look like? What forms of technology are constitutively not in the public interest?
How do we look at something that is, you know, a bit closer to a, you know, democratic governance
of the role of these resources in, you know, human life? You know, I think all of those questions
are questions I would want to raise, and I think those are, you know, these questions obviously
have to be answered beyond Google, right? And so this is, you know, this is the role of, you know,
popular movements like the Writers Guild of America and others who are, you know, I think,
doing the best job of regulating AI of anyone out there right now. This is the role of, you know,
some forms of state intervention, you know, depending on what those look like. And I think this is
the role of, you know, kind of, you know, theorists who are capable of reimagining what role
if any computational technology has across different, you know, aspects of our lives, our institutions,
our economy.
But tech — I mean, so this is a big topic of debate right now, the role of tech growth. Now, I won't deny the fact that there have been some bad outputs from it, but, I mean, it seems like, on balance, it's been good.
I mean, curious what your perspective is.
and we're talking through, I'm using a Chrome browser to speak with you right now.
Like, obviously, Google has put, like, so much information at people's fingertips.
It seems like it empowers people, I don't know, maybe as much or more than it would take away from folks.
But you have a different opinion.
Well, I mean, I guess let's get down to it — like, what do you mean by tech growth?
Do you mean the commercialization of network computation that happened in the mid to late 90s?
No, no. I mean the various products that these tech companies have built.
Yeah, so the commercialization of network computation.
Okay.
Which is, I mean, which is kind of, like, you know — you took these sort of research networks, these, you know, military communications networks that developed through the mid-century. And in the early 90s, the Clinton-Gore administration — Bush was involved in this in the sort of late 80s — put all the kind of economic eggs in the basket of, you know, "the internet will be a balm" to an ailing economy that had just seen neoliberal policies, you know, slash manufacturing, that had seen the Democratic Party sort of, you know, move away from seeing its base as, you know, labor and workers to seeing its base as, you know, high tech and, you know, workers, more or less. This is a very, you know, kind of quick summary of a much more complex era, of course. And then really pushed for neoliberal
policies — and neoliberal policies, that's not, like, a fancy word. That just means, like, an ideology that sees the private sector as, you know, ideally responsible for everything, right? So, you know, privatize, privatize, privatize — as few regulations, as few, you know, interventions from the state as possible. That ideological framework, you know, gets applied to this commercialization. And, you know, I would say Matthew Crain's work, particularly the book Profit over Privacy, is a really good, you know, historical analysis of this time, which sees, you know, this sort of
discussion throughout the 90s, this policymaking process where the technology and advertising
agencies are leaning very heavily on the Clinton administration to ensure that, you know, there are as few regulations as possible — so it's industry-led, it is open for innovation — and to ensure that there is, you know, an explicit endorsement of advertising as the
business model of commercialization. And this is, you know, this is borne out. You see the framework that was published — I think in 1997, although it went through many drafts — you know, explicitly endorse advertising as the business model of, you know, the commercial internet. And, of course,
advertising requires surveillance, right? The more you know about your market, the more you can
sort of target them, the more you can sort of make good on the, you know, promises of market segmentation
and targeting. So this is, you know, this is the framework that then births these products that seem really innovative and cool. Like, yes, they have, you know, reshaped our relationship to information. But, you know, this sort of surveillance advertising business model also gutted local media. It also gutted our news and information ecosystem. There are very few business models now
for independent news and, you know, local news, which in itself has sort of, you know,
redounded to, you know, let's say a decrease in the strength of, you know,
democratic processes and institutions at the local level.
In the U.S., we have, you know, a crisis of gerrymandering where it's very difficult to claim
that sort of elections are—

But was it surveillance, or was it just aggregation of audiences?
What's the difference?
I mean, surveillance is gathering all this data on people. And, you know, aggregation of audiences — like, look, advertisers want to buy big audiences; that's why they love TV. And you can buy big audiences on Facebook and Google, as opposed to a local newspaper. So if you're, like, an ad buyer, you can go to, let's say, you know, Google or Facebook and hit all your objectives, as opposed to having to go to a disjointed group of, like, 500 local newspapers to do the same thing.

Well, I think those two went hand in hand, right? The
sort of, you know, because it didn't start as, you know, one platform to rule them all, right?
You know, you've had kind of, you know, search and other things, but, you know, this business
model was developing throughout the 90s. And it isn't just a mass audience, right? We don't all see the same ads. We are segmented into, you know, micro-targets that are, you know, like — you know, I am a woman of a certain age, based in the New York area, and, you know, I go to yoga frequently, and that's why you saw this ad, right? So, you know, there is an imperative for data collection — which, you know, again, data collection on audiences and market segmentation was not new to the internet. But the, you know, surveillance capabilities of network computation, and the ability to sort of, you know, link databases that existed before this commercialization, with, you know, the imperative for advertisement and, you know, the ability to quickly, you know, through cookies — which was another thing — collect a bunch of data on people: you know, that overlaid those two things.
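The cookie mechanism mentioned here is worth making concrete. A minimal, purely illustrative sketch (no real ad platform's code; all names and sites are made up) of why a third-party cookie lets one ad server join visits from unrelated sites into a single profile:

```python
# Illustrative sketch: the same cookie ID accompanies requests from every page
# that embeds the ad server, so visits to unrelated sites can be joined into
# one browsing profile on the ad server's side.
profiles = {}  # cookie_id -> list of (site, page) visits seen by the ad server

def serve_ad(cookie_id, site, page):
    """Record the visit under the cookie ID, then target using the full profile."""
    profiles.setdefault(cookie_id, []).append((site, page))
    interests = {s for s, _ in profiles[cookie_id]}
    return f"ad targeted using {len(interests)} sites of browsing history"

# The same browser (same cookie) visits two unrelated sites,
# both of which embed the same ad server:
serve_ad("abc123", "yoga-studio.example", "/schedule")
serve_ad("abc123", "news.example", "/local-politics")
# The ad server has now linked both visits to one person.
```

The point is structural: the "linking of databases" described above falls out of the protocol almost for free once one server is embedded everywhere.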
Many of the local newspapers used the same technology.
They just didn't have big enough audiences.
So I guess I'm losing your argument, though.
What's the argument here?
Specifically on your point about the local news ecosystem, it's more complicated than simply
saying that it was surveillance that put them out of business.
But I guess I was building on your argument that, like, wasn't it a net benefit, right?

And I'm saying we would have to go into the weeds and actually analyze, like, what were the collateral consequences, beneficial to whom, and recognize, like, we are in a crisis in terms of our information ecosystem, our media ecosystem, our ability to sort of access a shared reality, to access, like, credible empirical information about our world.
And that is in part because of the dynamics of this business model, which is self-reinforcing, right?
Which does, you know — once you have a sort of winner in this business model, it is very difficult for another competitor to break in. And that, you know, that speaks to
your point around, you know, what I would call like platform dominance, right? There's one Facebook
instead of, you know, a million heterogeneous little local news, you know, news outlets, which
themselves can take different positions, can report on different things, can sort of provide a much
richer information ecology than, you know, one Facebook that decides a certain type of, you know — pivot to video; now don't pivot to video, right? — and makes that, you know, determination in ways that are profoundly undemocratic.
I mean, definitely a rough transition for a lot of news sources to this new world, but it was also a product of, like, a lot of terrible business decisions as well, from newspapers and magazines not accustomed to having to adapt and innovate.
And I think that there are some examples now.
But this is why we have to ask, you know — like, do we want our media to be driven by the same sort of, you know, business imperatives, right?
If a very valuable source of news and investigative reporting is not able to turn a profit, does that make it less valuable?
No, no, definitely not.
But I don't think they should — this is the thing — I don't think they should have to go through the same processes as, like, a Google. Like, you can build a successful local news brand, and you don't have to follow the same, you know, targeting criteria that Google did. You can build it, you can do it. They just had these legacy costs, legacy structures, and they couldn't adapt fast enough to, like, figure out a way to do business on the internet. So that was sort of what was happening. I mean, I think you don't agree, you know?
Like, it's something around 70% of, you know, ad buys from media go to Facebook — or, sorry, Meta — right? Like, that's not... you know, I don't think we can say, like, there was a way to pivot but the old guys with their gray beards and their cups of coffee and their teletype machines just didn't figure it out, right?
I think there was a predatory business model that, you know, effectively pulled the rug out from under these actors — who were themselves, you know, advertising funded, not funded by, as in the UK and many other places, kind of, like, you know, state media funding arms, right? So there was a choice way back when that media and news should be advertising funded, which itself was, you know, a thing.
But, you know — and then, you know, the continuation of that model into commercial network computation, the sort of internet business model, you know, displaced that. And we didn't replace that, we didn't fill that, and we didn't sort of, you know, effectively take measures to preserve our information ecosystem from that transformation.

Yeah, I will agree with you that it's not in a good place right now, and our society has definitely lost because of the fact that we don't have as good local reporting as we did previously. It's just enabled much more corruption and allowed behavior to go unchecked. So you also said that you found generative AI is not actually that useful. And I think maybe that's right in the consumer sense — like ChatGPT, you know, it had
declining usage across the summer. People have talked about how this could actually be very
helpful in the enterprise sense. For instance, if you're a lawyer, using a chatbot to query, like, 500 different documents to find relevant information for your case might be interesting. Or, you know, in various other enterprise circumstances, you have a bunch of PDFs, you just got to be able to talk with them — you know, have the bot read the data and then be able to pull out insightful information — you know, that could be helpful as well. So, I mean, talking about ChatGPT as an ad for Microsoft services — potentially, but there also were some actual interesting, and I would say innovative, uses that you're seeing right now with the technology. What do you think about that?

I mean, I didn't say useless, right? I said not that
useful in most serious contexts. Or that's what I think. And what I'm saying is that, like — oh, it can do, you know... if it's a low-stakes sort of lit review, a scan of these docs could point you in the right direction. It also might not. It also might miss certain
things because, you know, you're looking for certain terms, but actually there's an entire field of
the literature that uses different terms. And, you know, actually, if you want to research this
and understand it, you should do the reading, you know, not maybe trust a proxy, you know, that
is only as good as the data it's trained on and the data it's trained on is the internet
plus whatever fine-tuning data you're using, right?
So I don't, you know, I'm not saying it's useless.
I'm saying it is vastly overhyped.
And sort of the claims that are being made around it are, you know, I think, leading to a kind of regulatory environment that is a bit disconnected from reality, and to, you know, a kind of popular understanding of these technologies that is far, you know, over-credulous about the capabilities, right?
You know, like, again — any serious context where factuality matters is not somewhere where you can trust one of these systems.
What's your perspective on open AI?
I feel like we're going to find common ground here.
I mean, I think OpenAI has, like, a really annoying name, because every time you talk about it, like, you know — it's sort of, you know... but it's not open. Insofar as there is such a thing as, you know, quote-unquote "open" AI — not the company — which I write about with some co-authors elsewhere, it's, you know, not a clear term. And I think, you know, we need to be careful about that assertion. But, you know, I think OpenAI is very good at marketing.
I think, you know, they are now effectively a part of Microsoft, which, you know, points directly to the fact that this bigger-is-better paradigm for, you know, quote-unquote AI is, in fact, you know, kind of centering and privileging a handful of actors — the ones who have the infrastructure, the data, the talent, the market reach to be able to actually, you know, create and deploy these systems at scale. So, you know, the fact that Anthropic is sort of, you know, tethered to Google and now Amazon; the fact that OpenAI, you know, couldn't stay independent because it became clear they needed the, you know, compute resources of one of these giants — and it's a lot easier to get those when you sort of, you know, absorb yourself into one of those giants, benefiting from their economies of scale, than it is to license those from outside.
It's very, very, very expensive.
So I think, you know, OpenAI becoming effectively part of Microsoft is an object lesson in, you know, just how few actors are actually able to create these systems and just how concentrated the power in the AI industry actually is.

Where do you think this all goes? I mean, if you're so skeptical of this wave of generative AI and you think it's a big wave of hype, do you think it just fizzles in a similar way that Web3 did, or do you think that it's actually going to lead somewhere?
I think it's being hyped, right?
So hyped does not mean it's useless.
It does not mean it's—

It seems much more useful to me than the Web3 stuff, for, you know, for much work.
Yeah, well, I mean, look, Web3 was built on techniques for doing cryptographically, you know, assured append-only logging, which is hyper useful, right?
Certificate transparency, very, very useful.
There are, you know, many approaches to using those techniques to do something that is useful.
but Web3 was sort of built on, you know, a lot of hype about, you know, effectively unregistered securities; a mad rush to sort of fill in the bottom of an ever-expanding — until it collapsed — Ponzi scheme; and, you know, the kind of hype around, like, NFTs and other, you know, applications of those technologies that were, effectively, not very useful to many people, right? And you have, you know, like, DAOs and smart contracts, et cetera, but those were not — like, you know, again, it was a hype. And what was the motivation for that hype? In my view, it was, you know, a lot of big players in the tech industry were heavily invested at the top of that Ponzi scheme and were hoping for people to sort of fill in the bottom — until FTX collapsed, and then, you know, kind of the whole thing fell apart. So what I'm doing there is distinguishing between what the hype was predicated on — what the interests of the hype were, the narrative of the hype — and actually the technological affordances that underlie the hype, which themselves
are not useless, right? But the claims made about them were deeply dishonest and misleading.
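The "cryptographically assured append-only logging" credited here can be illustrated with a toy hash chain — a sketch of the core idea only (production systems like Certificate Transparency use Merkle trees rather than this linear chain):

```python
# Toy append-only log: each record stores the hash of the previous record,
# so rewriting any past entry invalidates every hash that follows it.
import hashlib

GENESIS = "0" * 64  # placeholder "previous hash" for the first record

def append(log, entry):
    """Append an entry chained to the hash of the previous record."""
    prev_hash = log[-1][2] if log else GENESIS
    record_hash = hashlib.sha256((prev_hash + entry).encode()).hexdigest()
    log.append((entry, prev_hash, record_hash))

def verify(log):
    """Recompute every link; False means history was tampered with."""
    prev_hash = GENESIS
    for entry, stored_prev, record_hash in log:
        expected = hashlib.sha256((prev_hash + entry).encode()).hexdigest()
        if stored_prev != prev_hash or record_hash != expected:
            return False
        prev_hash = record_hash
    return True

log = []
append(log, "cert issued for example.com")
append(log, "cert issued for example.org")
assert verify(log)

# Retroactively editing the first entry breaks the chain:
log[0] = ("cert issued for evil.example", log[0][1], log[0][2])
assert not verify(log)
```

This tamper-evidence is the "hyper useful" primitive being separated from the speculative hype built on top of it.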
All right. Meredith Whittaker is here with us. She's the president of the Signal app, chief advisor
to the AI Now Institute. On the other side of this break, we're going to talk about the app itself,
Signal. Back right after this. Hey, everyone. Let me tell you about the Hustle Daily Show, a podcast
filled with business, tech news, and original stories to keep you in the loop on what's trending. More than
two million professionals read the Hustle's daily email for its irreverent and informative takes on
business and tech news. Now, they have a daily podcast called The Hustle Daily Show, where their
team of writers break down the biggest business headlines in 15 minutes or less and explain why you
should care about them. So, search for The Hustle Daily Show in your favorite podcast app, like the one
you're using right now. And we're back here on Big Technology Podcast with Meredith Whittaker, the president of the Signal app. Let's talk about the Signal app. I mean, what is the state of the Signal app right now? How many users does it have? How do you feel about, you know, the way that it stacks up against the other messaging apps? Is it in a state of positive momentum now? What's going on with the app?

Certainly in a state of positive momentum.
It is the world's most widely used, truly private messaging app. We have many, many millions of
users. We don't give a round number, but you can cheat a bit if you want and look at the app store — sorry, the Play Store; the App Store doesn't give those numbers, the other one does — and see that it's been downloaded over 100 million times. And I, you know, I'm just really happy about the
state of the app, about the state of the organization, and about, you know, many of the things we
have, you know, moving forward, including, you know, the launch of usernames — we're aiming for early 2024 there — and anyone who is interested in more on that can become one of the Signal code watchers, who can see the development of usernames happening in real time in our repos.
How big is the team building the app?
We're a little under 50 people now, so bigger than we've ever been, and also incredibly small.
And is the app funded by donations, or do people pay to use it?

It's funded by donations. So we had, you know, a large tranche of money from Brian Acton — the WhatsApp co-founder, yeah — which has helped give us a runway. As we build up a donor-supported model, we're looking to, as much as possible, subsist on small donations, because we think that is the safest way to generate revenue.
We don't want a handful of large donors
who can have every right to change their mind
at some point about their patronage.
We don't want to be reliant there.
So you will see in the app, if you go to settings,
there's a little donate button.
We encourage you to click it.
And you will see periodically kind of a small message appear that asks, if you're able to, to donate a little bit to Signal. And you get a badge — which is this little, you know, it's like a little icon that appears near your profile photo; it's very cute — if you donate through the app. So, yeah, we're looking at the donation model. But, you know, let's be real: it is very expensive to run technology like Signal, and I think that part of the equation is often missing, because, of course, like, you know, consumer technology is often, or almost always, experienced as free, right? And, you know, that is because we are, you know, surveilled and subject to advertisements and, you know, data breach and privacy violations and change of terms of service to sort of, you know, target us with AI. All of those social ills come along with that. And, you know, just because Signal has a different model — you know, collecting as little as possible, being incredibly staunch about privacy — doesn't mean it's less expensive for us to, you know, develop and maintain the app. So we have a piece coming out, kind of talking a bit more about the cost of Signal, and, you know, one of the estimates we give is that by 2025, we anticipate that it will cost around $50 million a year to develop and care for Signal. And that's lean compared to, you know, a Meta or a Google or, you know, another large consumer tech company.

Yeah. And the reason why I ask about resources is
because it does seem to me like it's going to require more resources to compete with these other apps, because for a while messaging apps seemed fairly static, right? They were a place to message your friends. But now they're starting to be a place where, I'll use the word, so much innovation is happening, right?
Like, you see broadcast channels, for instance — something that started to come to places like Telegram and WhatsApp, where basically people can disseminate information to people who subscribe to them — which is interesting.
And you're starting to see AI bots. Right now, Meta has 28 bots that it's testing in Messenger. In WhatsApp, you have things like stories. I know Signal's put in, like, stickers and stuff like that. So, I mean, of course, like, the privacy message is something that's going to appeal to some, but to actually compete shoulder to shoulder, I'd imagine that you have to put in some of these bells and whistles. So how do you think about that when it comes to prioritizing what to build, and just, like, the pace of change that happens inside these apps?

Yeah, well, I mean, I think that's a great question, and
one of the first answers is that we're lucky to not have to compete on their terms. We're not doing engagement hacking. We're not trying to show a board
that we're, you know, collecting more data or, you know, keeping people glued to the app for more
hours a day so that they can see advertisements or, you know, be part of our ecosystem where, say,
we join their metadata with other personal information we have or, et cetera. You know, we are aiming
to continually grow the number of people who are using Signal, but that's because Signal is not
useful if none of your friends use it. So we need that network effect of encryption so that, you know, when people really need Signal — when, you know, privacy is valuable, and privacy is always valuable, post hoc or otherwise — you know, people have it installed and can reach
the people that they care about. So, you know, one of the reasons we are a nonprofit — and see our nonprofit status as, you know, not a nice-to-have but actually essential to ensuring that we're focused on privacy — is because we will not be pushed by profit-minded board members to, you know, collect a little data, to, you know, prioritize revenue over privacy, in an industry where the business model is undermining privacy. So we do, you know, we do think
about that landscape, but we are, you know, first and foremost, we're thinking about privacy, right?
And we are not a social media app, so we're not adding things like channels, like media broadcast functionality. We see that as distinct from the service we provide — you know, we are an
interpersonal communications app. And we do see a lot of people, you know, reaching out to us, people
who talk to me when I'm, you know, out and about in the world who are very grateful that they don't
have to sort of, you know, open a truly bloated app where, you know, millions of features have
been added, making it look kind of like Microsoft Word, because, you know, they are incentivized
to constantly add new things to try to grow, to, you know, meet their OKRs, to meet their company
goals as, like, you know, AI becomes trendy or what have you. We can actually
stay lean and focused. So I don't actually see that as a competitive disadvantage. I see these apps
as becoming, you know, oftentimes bloated and, you know, kind of infrastructural single
points of failure that people are grateful to not have to rely on for everything in their lives.
I mean, one of the reasons why they've become bloated, I imagine, and I'm curious what
your take is on this, is because the social network has moved to messaging apps in many ways.
Like actually all the action that was happening on like a Facebook news feed is now happening
in like a WhatsApp or a messenger where people instead of sharing with everybody share with
a group chat. So I'm curious to hear your perspective on that. Are you finding that also, like,
in an app like Signal, sort of, these messaging groups replace the old-school social network that,
sort of, the social web emerged with?
Well, I think, you know, we are not a social network and we're not a broadcast.
Of course.
We don't provide broadcast functionalities.
We also have a cap on our group size.
So, you know, we don't, we're not looking to enable the kind of mass virality that a social
network enables or to, you know, in any way replicate the functionality of, you know, social
networking whereby I can, you know, sign up and then you have a directory of users effectively,
you know, kind of attached to that and you can sort of post to strangers and have that
post.
But the energy, though, the energy has gone from the social network to the group chat.
It's just morphed effectively into a new format, which is happening in your app.
Well, I don't, I mean, I actually, one, we don't collect data on our users if we can help
it.
So we don't have the kind of analytics and telemetry data that, you know, a surveillance app
would have. And I'm not sure, you know, I need to sit with it, but I'm not sure I agree that the
social networking energy has moved, right? You know, you still see people using, you know, all kinds:
Instagram, TikTok, Snap, you know, even Twitter, you know, or X, née Twitter. You know, all of these
still have user bases, and then people have their group chats with their friends, with their family,
with, you know, whatever, their church group. And I don't think, you know,
I don't think those things are the same.
And I think, you know, again, you know, there is a collapse in some of these apps, like in Telegram or, you know, most recently WhatsApp, kind of adding channels and adding social media functionality.
But I don't, I'm not sure I'm aligned that, like, the energy of social media has moved to group chats because I think they are doing kind of discrete things.
Okay.
Yeah.
But I do, you know, yeah, go on.
What's your perspective on Telegram and how do you compete with them?
And you mentioned that they are this kind of fascinating app that seems to be picking up a lot of steam.
I mean, has been for years.
But they're kind of mysterious to me still.
Yeah, they're mysterious to a lot of people.
There's a lot of, it's hard to find a, like, well-cited claim about Telegram.
But what we do know is that, you know, Telegram talks a big game about privacy, about sort of defending human rights.
They have a very mythologized origin story with their, you know, founder breaking
with the Russian government, which adds a kind of, you know, dissident flair to the whole
endeavor. But, you know, they are not encrypted. They are not a private app. They provide,
you know, encryption as an opt-in feature for one-to-one chats only. And my concern with
Telegram is that that rhetoric and some of the bombast around human rights and privacy
serves to give the impression that the app is much more safe and secure than it is, driving
people to use Telegram in situations where it's actually not very safe. We know that
Telegram, you know, even though they talk a big game, they cooperate with governments when
they're forced to or, you know, maybe when they're not forced to, I don't know, and, you know,
have very bad data collection and security practices. So I would, you know, Telegram may have a lot
of features that people like, you know, I'm not going to judge that, but I would say in terms of
privacy, I would stay as far away from Telegram as possible, because it simply can't be considered
a private app. If they are Russian dissidents, which kinds of governments do you think they're
closest to? Because I have no idea. I mean, I have, I have no, like really I don't know much about
them and I distrust what I have seen because oftentimes it's sort of circular citations, right? It goes
back to, you know, a comment made by the founder or, you know, something similar that can't really be
verified. So I really want to stay away from judging or offering any speculation there. I just
don't know. And I think that's probably the position of most people right now because it's not a
very transparent organization. Yeah. They definitely, I mean, I've been covering the space for quite
some time and trying to get to the bottom of what the deal with Telegram is has been one of the
toughest stories to crack. Well, if you do, let me know because I don't know. All right, Meredith.
So great to have you. Thank you so much for joining. Really appreciate it. Yeah, great to be here,
Alex. Thank you. Yeah, thank you. And love the app. I'm a big Signal guy for all my friends.
They know that well. So me too. All right. Sounds great.
Text me on Signal. Okay. Sounds great. Thank you, Meredith. Thank you, everybody, for listening.
And we will see you next time on Big Technology Podcast.
Thank you.