Behind the Bastards - Behind the Bastards Presents: Better Offline
Episode Date: December 8, 2024. Here are a couple of our favorite episodes of Ed Zitron's Better Offline podcast series: "The Man Who Killed Google Search," "Sam Altman Is Dangerous to Society," and "The Rot Economy." Available on Apple Podcasts, Spotify, and iHeart. See omnystudio.com/listener for privacy information.
Transcript
Hey everybody, Robert gosh darn Evans for you here.
For the end of the year to celebrate and stuff, we've got our normal Behind the Bastards content
coming to you.
Do not worry, that's all going to continue as normal.
We also wanted to highlight some other shows in our network, most of which are new and
launched this year.
We've got some compilation best of episodes that we think the Bastards audience is going to
love and we're delivering to you now in a special format with fewer ads.
So today you're going to hear some episodes of Better Offline, Ed Zitron's excellent critical tech industry podcast, which has taken the tech world by storm. And I'm excited for you to learn about the man who killed Google search, about Sam Altman, the CEO of OpenAI, and why he's dangerous for society, and about what Ed Zitron calls the rot economy.
Hello and welcome to Better Offline. I'm your host, Ed Zitron.
And in the next two episodes, I'm going to tell you the names of some of the people responsible
for destroying the internet.
And I'm going to start on February 5th, 2019, when Ben Gomes, Google's former head of search,
well, he had a problem.
Jerry Dischler, then the VP and GM of ads at Google, and Shiv Venkataraman, then the VP of engineering, search and ads on Google properties, had called something called a code yellow for search revenue due to, and I quote emails that came out as part of Google's antitrust hearing, steady weakness in the daily numbers and a likeliness that it would end the quarter significantly behind, in metrics that are, well, kind of unclear.
For those unfamiliar with Google's internal kind of Scientology-esque jargon, which means
most people, let me explain.
A code yellow isn't a terrible need to piss or some sort of crisis of moderate severity.
The yellow, according to Steven Levy's tell-all book about Google, refers to, and I promise this is not a joke, the color of a tank top that a former VP of engineering called Wayne Rosing used to wear during his time at the company.
It's essentially the equivalent of DEFCON 1, and activates, as Levy explained, a war-room-like
situation where workers are pulled from their desks and into a conference room where they tackle the problem as a top priority.
Any other projects or concerns are sidelined.
And independently, I've heard there are other colors, like purple.
I'm not going to get into that though, it's quite boring and irrelevant to this situation.
In emails released as part of the Department of Justice's antitrust case against Google, as I previously mentioned, Dischler laid out several contributing factors.
Search query growth was significantly behind forecast, the timing of revenue launches was
significantly behind, and he had this vague worry that several advertiser-specific and
sector weaknesses existed in search.
Now, I want to cover something because I've messed up, and I really want to be clear about
this.
I've previously and erroneously referred to the code yellow as something that Gomes raised as a means of calling attention to Google's ads side getting a little too close to search.
I'm afraid the truth is extremely depressing and so much grimmer.
The code yellow was actually the rumble of the goddamn rot economy, with Google's revenue
arm sounding the alarm that its golden goose wasn't laying enough eggs.
Gomes, a Googler of 19 years who basically built the foundation of modern search engines, should go down as one of the few people in tech who actually fought for an actual principle, and he was destroyed by a guy called Prabhakar Raghavan, a computer scientist class traitor who sided with the management consultancy sect. More confusingly, one of their problems was that there was insufficient
growth in queries, as in the amount of things that people were asking Google. It's a bit
like if Ford decided that things were going poorly because their drivers weren't putting
enough goddamn miles on their trucks. This whole story has personally upset me, and I think you're going to hear that in
this, but going through these emails is just very depressing.
Anyway, a few days beforehand, on February 1st, 2019, Kristen Gil, then Google's VP Business Finance Officer, had emailed Shashi Thakur, then Google's VP of Engineering, Search and Discover, saying that the ads team had been considering a code yellow to close the search gap it was seeing, vaguely referring to how critical that growth was to an unnamed company plan. To be clear,
this email was in response to Thakur stating that there is nothing that the search team could do
to operate at the fidelity of growth that the ads department had demanded.
Shashi forwarded the email to Gomes asking if there was any way to discuss this with Sundar Pichai, Google's CEO, and declared that there was no way he would sign up for a high fidelity business metric for daily active users on search.
Thakur also said something that I've been thinking about constantly since I read these emails. That there was a good reason that Google's founders separated search from
ads. I want you to remember that line for later.
A day later, on February 2, 2019, Thakur and Gomes shared their anxieties with Nick
Fox, a vice president of search and Google Assistant, entering a multiple day long debate about Google's sudden lust for growth.
This thread is a dark window into the world of growth-focused tech, where Thakur listed the multiple points of disconnection between ads and search, discussing how the search team wasn't able to finely optimize engagement on Google without hacking it, a term that effectively means tricking users into spending more time on a site, and that doing so would lead them to, and I quote, abandon work on efficient journeys.
In one email, Fox adds that there was a pretty big disconnect between what finance and ads
wants and what search was doing.
Every part of this story pisses me off so much.
When Gomes pushed back on the multiple requests for growth, Fox added that all three of them
were responsible for Search and that Search was, and again I quote, the revenue engine
of the company and that bartering with the ads and finance teams was now potentially
the new reality of their jobs.
On February 6th, 2019, Gomes said that he believed that search was getting too close to the money,
and ended his email by saying that he was concerned that growth is all that Google was thinking about.
On March 22nd, 2019, Google VP of Product Management, Darshan Kantak, would declare the end of the Code Yellow.
The thread mostly consisted of congratulatory emails, until Gomes made the mistake of responding,
congratulating everyone, saying that the plans architected as part of the Code Yellow would
do well throughout the year.
Enter Prabhakar Raghavan, then Google's head of ads and the true mastermind behind the
Code Yellow, who would respond curtly, saying that the current revenue targets were addressed
by heroic RPM engineering and that the core query softness continued without mitigation.
A very clunky way of saying that despite these changes, query growth was not happening at the rate he needed it to.
A day later, Gomes emailed Fox and Thakur an email he intended to send to Raghavan.
He led by saying that he was annoyed both personally and on behalf of the search team.
In this very long email, he explained in arduous detail how one might increase engagement with
Google search, but specifically added that they could increase queries quite easily in
the short term, but only in user negative ways, like turning off spell correction or
ranking improvements, or placing refinements, effectively
labels, all over the page, adding that it was possible that there are trade-offs here
between the different kinds of user negativity caused by engagement hacking and that he was
deeply, deeply uncomfortable with this.
He also added that this was the reason he didn't believe that queries, as in the amount of things people were searching for on Google, were a good metric to measure search, and that the best defense against the weaknesses of
queries was to create compelling user experiences that make users want to come
back. Crazy idea there: what if the product was good? Not good enough for Prabhakar. So, a little bit of history about Google here: they regularly, throughout the year, do core updates to search.
These are updates that change the algorithm that say, okay, we're going to suppress this
kind of thing, we're going to elevate this kind of thing.
And they are actually the reason that search changes.
It's why certain sites suddenly disappear or reappear.
It's why sites get a ton of traffic, some don't get any, and so on and so forth.
But they do a lot of them.
The one that's really interesting, and I'm a little bastard and I went and looked through pretty much the last decade of these, the one that stood out to me was the March 2019 core update to search, which happened about a week before the end of the code yellow, meaning that it's very likely that this was a result of Prabhakar's bullshit.
So this was expected to be one of the largest
updates to search in a very long time, and I'm quoting Search Engine Journal there. Yet when it
launched, many found that the update mostly rolled back changes and traffic was increasing to sites
that had been suppressed by previous updates like Google Search's Penguin update from 2012
that specifically targeted spammy search results.
There were others that were seeing traffic as well from an update that happened on the
1st of August 2018, a few months after Gomes became head of search.
While I'm guessing here, I really don't know, I do not work for Google, I do not have
friends there, I think the timing of the March 2019 core update, along with the traffic increases to
previously suppressed sites that 100% were spammy SEO nonsense, I think these suggest
that Google's response to the code yellow was to roll back changes that were made to
maintain the quality of search.
A few months later in May 2019, Google would roll out a redesign of how ads were shown
on Google search.
Specifically on mobile, replacing the bright green ad label and URL color on ads with a tiny little bolded black note that said "Ad" in the smallest font you could possibly put there, with the link looking otherwise identical to a regular search link.
I guess that's how they managed to start hitting their numbers, huh?
And then in January 2020, Google would bring this change to desktop, and The Verge's Jon Porter would suggest that it made Google's ads look just like search results now.
Awesome!
Five months later, a little over a year after the Code Yellow situation, Google would make Prabhakar Raghavan the head of Google search, with Jerry Dischler taking his place as the head of ads.
After nearly 20 years of building Google search, Gomes would be relegated to the SVP of Education at Google.
Gomes, who was a critical part of the original team that made Google search work, and who has been credited with establishing the culture of the world's largest and most important search engine, was chased out by a growth-hungry managerial type, several of them actually, led by Prabhakar Raghavan, a management consultant wearing an engineer costume.
As a side note, by the way, I use the term management consultant there as a pejorative. While he exhibits all the same bean counting, morally unguided behaviors of a management consultant, from what I can tell, Raghavan has never actually worked in that particular sector of the economy.
But you know who has?
Sundar Pichai, the CEO of Google, who previously worked at McKinsey, arguably the most morally
abhorrent company that's ever existed, having played roles both in the 2008 financial crisis,
where it encouraged banks to load up on debt and flawed mortgage-backed securities, and the ongoing opioid crisis, where it effectively
advised Purdue Pharma on how to growth hack sales of Oxycontin, an extremely addictive
painkiller.
McKinsey has paid nearly $1 billion over several settlements due to its work with Purdue.
But I'm getting sidetracked.
But one last point.
McKinsey is actively anti-labor.
When a company brings in a McKinsey consultant, they're often there to advise on how to cut
costs, which inevitably means layoffs and outsourcing.
McKinsey is to the middle class what flesh-eating bacteria is to healthy tissue. And what happened at Google is a stark example of the monstrous, disgusting rot economy,
the growth at all cost mindset that's dominating the tech ecosystem.
And if you take one thing away from this episode, I want it to be the name Prabhakar Raghavan
and an understanding that there are people responsible for the current state of the internet.
These emails, which I really encourage you to look up,
and if you go to wheresyoured.at, you'll be able to see a newsletter that has links to them.
Well, these emails tell a dramatic story about how Google's finance and advertising teams,
led by Raghavan, with the blessing of
CEO Sundar Pichai, the McKinsey guy, actively worked to make Google worse to make the company
more money.
This is exactly what I mean when I talk about the rot economy, an illogical product-destroying
mindset that turns products you love into torturous, frustrating quasi-tools that require
you to fight the company to get the thing you want
Ben Gomes was instrumental in making search work, both as a product and a business. He joined the company in 1999, a time long before Google established dominance in the field, and the same year when Larry Page and Sergey Brin tried to sell the company to Excite for one million dollars, only to walk away after Vinod Khosla, an Excite investor and co-founder of Sun Microsystems who's now a VC who tried to stop people going to a beach in Half Moon Bay, well, he tried to lowball them with a $750,000 offer, also known as a 100 square foot apartment in San Francisco.
In an interview with Fast Company's Harry McCracken from 2018, Gomes framed Google's challenge as taking the PageRank algorithm from one machine to a whole bunch of machines, and they weren't very good machines at the time.
Despite his impact and tenure, Gomes had only been made head of search in the middle of 2018, after John Giannandrea moved to Apple to work on its machine learning and AI strategy.
Gomes had been described as Google's search czar, beloved for his ability to communicate
across Google's many, quite decentralized departments.
Every single article I've read about Gomes and his tenure at Google spoke of a man deeply
ingrained in the foundation of one of the most important technologies ever made.
A man who had dedicated decades to maintaining a product with a, and I quote Gomes here,
guiding light of serving the user and using technology to do that.
And when finally given the keys to the kingdom, the ability to elevate Google search even further, he was rat-fucked by a series of rotten careerists trying to please Wall Street, led by Prabhakar Raghavan.
Do you want to know what Prabhakar Raghavan's old job was?
What Prabhakar Raghavan, the new head of Google search, the guy that ran Google search, that
runs Google search right now, that is running Google search into the goddamn ground.
Do you want to know what his job was?
His job before Google?
He was the head of search for goddamn Yahoo from 2005 through 2012. When
he joined the company, when Prabhakar Raghavan took over Yahoo Search, they held a 30.4%
market share, not far from Google's own 36.9% and miles ahead of the 15.7% that Microsoft's
MSN search had. By May 2012, Yahoo was down to just 13.4% and had shrunk for the previous nine consecutive
months and was being beaten by even the newly released Bing.
That same year, Yahoo had the largest layoffs in its corporate history, shedding 2,000 employees
or 14% of its overall workforce.
The man who deposed Ben Gomes, someone who worked on Google search from its very beginnings,
was so shit at his job that in 2009 Yahoo effectively threw in the towel on its own
search tech, instead choosing to license Bing's engine in a 10 year deal.
If we take a long view of things, this likely precipitated the overall decline of the company, which went from being worth $125 billion at the peak of the dot-com boom to being sold to
Verizon for $4.8 billion in 2017, which is roughly a 3,000 square foot apartment in San
Francisco.
With search no longer a priority and making less money for the company, Yahoo decided
to pivot into Web 2.0 and original content, making some bets that paid off but far, far too many that did not.
It spent $1.1 billion on Tumblr in 2013 only for Verizon to sell it for just $3 million
in 2019.
It bought Zimbra in 2007, ostensibly to compete with the new Google Apps productivity suite, only to sell it for a reported fraction of the original purchase price to VMware a few years later. That's not his fault.
But nevertheless, Yahoo was a company without a mission, a purpose or an objective. Nobody,
and I'll speculate, even those leading the company, really knew what it was or what it did.
Anyway, just a big shout out right now to Kara Swisher, who referred to Prabhakar as well-respected when he moved from Yahoo to Google.
You absolutely nailed it Kara.
Bang up job.
In an interview with ZDNet's Dan Farber from 2005, Raghavan spoke of his intent to align the commercial incentives of a billion content providers with social good intent while at Yahoo, and his eagerness to inspire the audience to give more data.
What?
Anyway, before that, it's actually hard to find out exactly what Raghavan did, though according to ZDNet, he spent 14 years doing search and data mining research at IBM.
In April 2011, The Guardian ran an interview with Raghavan that called him Yahoo's secret weapon, describing his plan to use rigorous scientific research and practice to inform Yahoo's business, from email to advertising, and how under then-CEO Carol Bartz, the focus had shifted to the direct development of new products.
It speaks of Raghavan's scientific mindset and his steady, process-based approach to innovation that is very different to the common perception that ideas and development are more about luck and spontaneity. A sentence that I'm only reading to you because I really
need you to hear how stupid it sounds and how specious some of the tech press used to
be.
Frankly, this entire article is ridiculous, so utterly vacuous, I'm actually astonished.
I don't want to name the reporter. I feel bad. What about Ragavan's career made this
feel right? How has nobody connected these dots before? I have a day job. I run a PR
firm. I am a blogger with a podcast. And I'm the one who said, yeah, okay, Dracula is now
the CEO of the blood bank.
Nobody saw this.
Nobody saw this at the time.
I just feel a bit crazy.
I feel a bit crazy.
But to be clear, this was something written several years after Yahoo had licensed its search technology to Microsoft, in a financial deal that Marissa Mayer, who would later replace Bartz as CEO, was still angry about for years.
Raghavan's reign as what ZDNet referred to as the Searchmaster was one so successful that Yahoo's search ended up being replaced by a search engine that not a single person in the world enjoys saying out loud.
The Guardian article ran exactly one year before dramatic layoffs at Yahoo that involved firing entire divisions' worth of people, and four months before Carol Bartz would be fired by telephone by then-chairman Roy Bostock.
Her replacement, Scott Thompson, who previously served as president of PayPal, would last a whole five months in the role before he was replaced by former Google executive Marissa Mayer, in part because it emerged that he had lied on his resume about having a computer science degree.
Hey, uh, Prabhakar, did you not notice?
Anyway, whatever. Bartz joined Yahoo in 2009, so about four years into Prabhakar's reign of terror I guess, and she joined in the aftermath of its previous CEO Jerry Yang refusing to sell the company to Microsoft for $45 billion.
In her first year she laid off hundreds of people and struck a deal that I've mentioned before to power Yahoo Search using Microsoft's Bing search engine tech, with Microsoft paying Yahoo 88% of the revenue it gained from searches. A deal that made Yahoo a couple hundred million dollars for handing over the keys and the tech to its highest-traffic platform.
As I previously stated, when Prabhakar Raghavan, Yahoo's secret weapon, was doing his work, Yahoo Search was so valuable that it was replaced by Bing.
Its sole value, in fact.
I mean, maybe I'm being a little unfair.
But there's a way of looking at this.
You could say that Yahoo's entire value at the end of his tenure was driven by nostalgia and association with the days before he worked there.
Anyway.
Thanks to the state of modern search, it's actually very, very difficult to find much about Raghavan's history. It took me hours of digging through Google,
and at one point Bing, embarrassingly, to find three or four articles that went into any depth
about him. But from what I've gleaned, his expertise lies primarily in failing upwards,
ascending through the ranks of technology, on the momentum from the explosions he's caused.
In a Wired interview from 2021, glad-hander Steven Levy said Raghavan isn't the CEO of Google, he just runs the place, and described his addition to the company as a move from research to management.
While Levy calls him a world-class computer scientist who has authored definitive texts in the field, which is true, he also describes Raghavan as choosing a management track, which definitely tracks with everything I've found out about him. Raghavan proudly declares that Google's third-party ad tech plays a critical role in keeping journalism alive, a really shitty answer to a question that also came at a time when he was aggressively incentivizing search engine optimized content, and a year after he deposed someone who actually gave a shit about search.
Under Raghavan, Google has become less reliable and is dominated by search engine optimization
and outright spam.
I've said this before, but we complain about the state of Twitter under Elon Musk, and justifiably so. He's a vile, antisemitic, racist bigot.
We all know this,
it's fully true, we can say it a million times. However, I'd argue that Raghavan, and by extension
Sundar Pichai, deserve a hundred times more criticism. They've done unfathomable damage
to society. You really can't fix the damage they've been doing, and the damage they'll
continue to do, especially as we go into an election.
Raghavan and his cronies worked to oust Ben Gomes, a man who dedicated a good portion
of his life to making the world's information more accessible, in the process burning the
Library of Alexandria to the goddamn ground so that Sundar Pichai could make more than
$200 million a year. And Raghavan, a manager hired by Sundar Pichai, a former McKinsey man, the king of managers,
is an example of everything wrong with the tech industry.
Despite his history as a true computer scientist with actual academic credentials, Raghavan
chose to bulldoze actual workers, people who did things and people that care about technology
and replace them with horrifying toadies that would make Google more profitable and less
useful. Since Prabhakar took the reins of Google search in 2020, it has dramatically declined, with the core search updates I mentioned, allegedly made to improve the quality of results, having the adverse effect of increasing the prevalence of spammy, shitty, search optimized content.
It's frustrating.
The anger you hear in my voice, the emotion, is because I've read all of these antitrust
emails.
I have gone through this guy's history and I've read all the things about Ben Gomes
too.
Every article about Ben Gomes where they interviewed him is this guy just having these dreamy thoughts about the future of information and the complexity of delivering it at high speed. Every interview with Raghavan is some vague bullshit about how important data is. It's so goddamn offensive to me.
And all of this stuff happening is just one example of what I think are probably hundreds of things happening across startups, or that have happened across startups, in the last 10 or 15 years, and in Big Tech too. And it's because the people running the tech industry are no longer
those who built it. Larry Page and Sergey Brin left Google in December 2019, the
same year by the way as the code yellow thing, and while they remained as
controlling shareholders they clearly don't give a shit about what Google
means anymore.
Prabhakar Raghavan is a manager, and his career, from what I can tell, is mostly made up of: did some stuff at IBM, failed to make Yahoo anything of note, and fucked up Google so badly that every news outlet has run a story about how bad it is.
This is the result of taking technology out of the hands of real builders and handing
it to managers at a time when management is synonymous with staying as far away from actual
work as possible.
When you're a do-nothing looking to profit as much as possible, who doesn't use tech,
who doesn't care about tech, and you only care about growth, well you're not a user,
you're a parasite, and it's these parasites that have dominated and are now draining the tech industry of its value. They're driving it into a goddamn ditch.
Raghavan's story is unique insofar as the damage he's managed to inflict, or, if we're being exceptionally charitable, failed to avoid in the case of Yahoo!, spans two industry-defining companies, and the fact that he did it without being a CEO or founder is remarkable.
Yet, he's far from the only example of a manager falling upwards.
I'm going to editorialize a bit here.
I want you to think about your job history.
I want you to think about the managers you've had.
I've written a lot about management, specifically to do with remote work and the whole thing around guys who don't do work, who are barely in the office, telling you you need to be in the office.
This problem is everywhere.
Managers are everywhere.
And managers aren't doing work.
I'm sure someone will email me now and say, well, I'm a manager and I do work all the
time.
Yeah, make sure you do.
That's why you're emailing me telling me how good you are at your job.
People who actually do work don't feel defensive about it.
People who do things and are part of the actual profit center, they don't need a podcast to
tell them they're good at their job.
What I think the problem is in modern American corporate society is that management is no longer synonymous
with actually managing people. It's not about getting the people what they need. It's not
about organizing things and making things efficient and good. It's not about execution.
It's about handing work off to other people and getting paid handsomely. And if you disagree, email me, ez at betteroffline.com. I will read your email, maybe I'll even respond.
But the thing is, management has become a poison in America. Managers have become poisonous.
Because managers are not actually held to any kind of standard. No, only the workers
who do the work are. What happened to Ben Gomes is one of the most disgusting, disgraceful
things to happen in the tech industry. It's an absolute joke. Ben Gomes was a goddamn hero, and I really need you to read the newsletter and read these emails. I need you to see how many times he and Thakur, great guy as well, were saying, hey, growth is bad for search.
The thing that Ben Gomes was being asked to do was increase queries on Google, the literal amount that people search.
There are many ways of looking at that and thinking, oh shit, that's not what you want.
Surely you don't want zero queries, you don't want people not using it at all, but queries going up linearly suggests that, if you're not matching it to user growth, at least that people are not getting what they want on the first try. Which, by the way, kind of feels like how Google is nowadays, when you go to Google and the first result and the second result and the fifth result and the tenth result just don't give you what you need, because it's all SEO crap.
Now this is all theorizing, but what I think Prabhakar Raghavan did was, I think he took
off all the fucking guidelines on Google search.
I think he rolled back changes specifically to make search worse, to increase queries,
to give Google more chance to show you adverts.
I am guessing, I don't have a source telling me this, but the pattern around the core search
updates, the fact that Google
search started getting worse toward the middle and end of 2019 and unquestionably dipped
in 2020, wow, that's when Prabhakar took over.
That's when the big man took the reins.
That's when Dracula got his job at the blood bank.
And this is the thing.
There's very little that you and I can actually do about this, but what we can do is say names
like Prabhakar Raghavan a great deal of times so that people like this can be known, so
that the actions of these scurrilous assholes can be seen and heard and pointed at and spat
upon.
I'm not suggesting spitting on anyone, no violent acts.
No. You can just be pissy
on the internet like the rest of us. Now I'm ranting, I realize I'm ranting,
but this subject really, really got to me. But it's not the only one. In the next episode,
I'm going to conclude this sordid three-part fiasco with a few more examples. These managers,
these bean counters, are devoid of imagination or ability or anything of note, save for that
utter slug-like ability to protect oneself. I want to talk about how these people managed to
obfuscate their true intentions by pretending to be engineers, by pretending to be technologists and pretending to be innovators.
I want to tell you all about how Adam Mosseri destroyed Instagram.
And I want to tell you how little Sam Altman has achieved other than making himself and his
friends rich.
See you next time. Music and audio projects at mattosowski.com, M-A-T-T-O-S-O-W-S-K-I dot com.
You can email me at ez at betteroffline.com or check out
betteroffline.com to find my newsletter and more links to
this podcast. Thank you so much for listening.
Better Offline is a production of Cool Zone Media. For more from
Cool Zone Media, visit our website, coolzonemedia.com, or
check us out on the iHeart Radio app,
Apple Podcasts, or wherever you get your podcasts.
Hello and welcome to Better Offline, I'm your host, Ed Zitron. As I discussed in the last episode, Sam Altman has spent more than a decade accumulating
power and wealth in Silicon Valley without ever having to actually build anything, using a network
of tech industry all-stars like LinkedIn co-founder and investor Reid Hoffman and Airbnb CEO Brian
Chesky to insulate himself from responsibility and accountability.
Yet things are beginning to fall apart as years of half-baked ideas and terrible, terrible product decisions
have kind of made society sour on the tech industry.
And the last month has been particularly difficult for Sam, starting with the chaos caused by
OpenAI blatantly mimicking Scarlett Johansson's voice for the new version of ChatGPT, followed by the resignation of
researchers who claimed that OpenAI prioritized, and I quote, shiny products over AI safety,
after the dissolution of OpenAI's safety team.
I know, it's almost cliche.
Shortly thereafter, former OpenAI board member Helen Toner revealed that Sam Altman was fired
from OpenAI because of a regular pattern of deception, one where Altman would give inaccurate info about the
company's safety processes on multiple occasions. And his deceit was so severe that OpenAI's board
only found out about the launch of ChatGPT after it happened. Which, by the way, is OpenAI's first product that really
made money, arguably the biggest product in tech.
Want to know how they found out about it?
Well, they found out when they were browsing Twitter.
They found out then, not from the CEO of OpenAI, the company which they were the board of.
Very weird.
Toner also noted that Altman was an aggressive political player, with the board, correctly
by the way, worrying, and I quote again,
that if Sam Altman had any inkling that the board might do something that went against him, he'd pull out all the stops, do everything in his power to undermine the board,
and to prevent them from even getting to the point of being able to fire him.
As a reminder by the way, the board succeeded in firing Sam Altman in November last year,
but not for long, with Altman returning as CEO a few days later, kicking Helen Toner off of the
board along with Ilya Sutskever, a technical co-founder that Altman manipulated long enough
to build ChatGPT, and turned on the moment he chose to complain.
Sutskever, by the way, has resigned now.
He's also one of the biggest technical minds there. How is OpenAI going to continue?
Anyway, last week a group of insiders at various AI companies published an open letter asking
their overlords, the heads of these companies, for the right to warn about advanced artificial
intelligence, in a genuinely impressive monument to the bullshit machine that Sam
Altman has created.
While there are genuine safety concerns with AI, there really are, there are many of them
to consider, these people are desperately afraid of the computer coming alive and killing
them when they should fear the non-technical asshole manipulator getting rich making egregious
promises about what AI can do.
AI researchers, you have to live up to Sam Altman's promises. Sam Altman doesn't.
This is not your friend.
The problem is not the boogeyman computer coming alive.
That's not happening, man.
What's happening is this guy is leading your industry to ruin.
And the bigger concern that they should have should be about what Leo Aschenbrenner, a
former safety researcher at OpenAI, had to say on the Dwarkesh Patel podcast, where he claimed that
security processes at OpenAI were, and I quote, egregiously insufficient, and that the priorities
at the company were focused on growth over stability of security. These people are afraid
of OpenAI potentially creating a computer that can think for itself that will come and
kill them at a time when they should be far more concerned about this manipulative con artist that's running OpenAI. Sam Altman is
dangerous to artificial intelligence, not because he's building artificial
general intelligence, which is a kind of AI that meets or surpasses human
cognitive capabilities, by the way. Kind of like Data from Star Trek. They're
afraid of that happening when they should be afraid of Altman's focus. What does Sam Altman care about? Because the
only thing I can find reading about what Sam Altman cares about is Sam bloody Altman. And
right now, the progress attached to Sam Altman actually isn't looking that great.
OpenAI's growth is stalling, with Alex Kantrowitz reporting that user growth has effectively
come to a halt based on a recent release claiming that ChatGPT had 100 million users a couple
of weeks ago, which is by the way the exact same number that the company claimed ChatGPT
had in November 2023.
ChatGPT is also a goddamn expensive product to operate.
With the company burning through capital at this insane rate, it's definitely more than
$700,000 a day.
It's got to be in the millions, if not more.
It's insane.
And while OpenAI is aggressively monetizing ChatGPT, to both consumers and businesses,
it's so obviously far from crossing the break-even Rubicon.
They keep leaking and they'll claim, oh, I didn't put that out there.
They keep telling people, oh, it's making billions of revenue, but they never say profit.
Eventually, someone's going to turn to them and say, hey, man, you can't just do this
for free or for negative.
At some point, Satya Nadella is going to call Sam Altman and say,
Sammy, Sammy, it's time.
Sammy, it's got to be a real business.
I assume he calls him that because of Supernatural.
But as things get desperate, Sam Altman
is going to use the only approach he really has, sheer force of will.
He's going to push OpenAI to grow and sell into as many industries as possible.
And he's a specious hype man. He's going to be selling to other specious hype men.
The Jim Cramers of the world are going to eat it up. And they're all, all of them, the Marc
Benioffs, the Satya Nadellas, the Sundar Pichais, they're all desperate to connect themselves with
the future and with generative AI. And those that he's selling to, the companies brokering deals, yes, even Apple, they're desperate to connect their
companies to another company which is building a bubble, a bubble inflated by Sam Altman.
And I'd argue that this is exceedingly dangerous for Silicon Valley and for the tech industry
writ large as executives that have become disconnected from the process of creating
software and hardware follow yet another non-technical founder hawking unprofitable, unsustainable
and hallucination-prone software.
It's just very frustrating.
If there was a very technical mind at these companies, they might walk away.
And I'm not going to give Tim Cook much credit, but looking into it, I can't find any evidence
that Apple is buying a bunch of GPUs, the things that you use to power these generative
AI models.
I found some research and analysts suggesting that they would buy a lot, but now OpenAI
is doing a deal with Apple to power the next iOS and it's interesting.
It is interesting that Apple isn't doing this themselves. Apple, a company with hundreds of
billions of dollars in the bank, I believe, that pretty much prints money. That alone makes me
think it's a bubble. Now I might look like an arsehole if it comes out they have, but also,
why are they subcontracting this to OpenAI
when they could build it themselves as Apple has always done? Very strange. It's all so peculiar.
But I wanted to get a little bit deeper into the Sam Altman story and as I discussed last episode,
Ellen Huet of Bloomberg, she's been doing this excellent reporting on the man
and joins me today to talk about the subject of a recent podcast, Sam Altman's rise to
power.
So tell me a little bit about the show you're working on.
The show is the new season of Foundering,
which is a serialized podcast from Bloomberg Technology.
So this is season five.
And in every season, we've told one story of a high stakes
drama in Silicon Valley.
I was also the host of season one,
which came out several years ago.
It was about WeWork.
And we've done other companies since then.
And season five is about OpenAI and Sam Altman.
And I think we really tried to cover the arc of the company's creation and where it is now.
But in doing so, we really tried to do a character study of Sam Altman.
Like, he's a very important person in the tech industry right now with a lot of power.
And we really wanted to ask ourselves the question and to help listeners ask themselves the question,
should we trust him?
Should we trust this person who is currently
in a position of a lot of influence
and about whom there have been very serious allegations
and questions raised about,
to put it in the words of the OpenAI board,
his not consistently candid behavior.
And I think it's, you know, my hope is that we
give listeners a chance to hear kind of the whole
story and this like broader, you know, when
there's news that's happening, it can happen so
quickly, it's hard to get a step back.
And I think what the show really does is it collects
a lot of information in one place.
And we also have lots of new information
that you won't hear anywhere else.
And interviews with people who have worked with Sam,
who knew him when he was younger.
We have an interview with Sam's sister, Annie,
from whom he is estranged.
And there's a lot of material in there, I think, that tries to get closer to this
answer of like, what should we make of this person?
How should we think about checks and balances of power when we have these
companies that are, by all appearances, gathering a lot of power and therefore
the people who are running them have a lot of power as well.
So we have, it's a five episode arc, five episode season,
and the first three episodes are out now to the general public, and the last two
will come out on subsequent Thursdays, and if you would like to binge the
whole season right away, the episodes are available early to Bloomberg.com
subscribers. So you've just started this series about Sam Altman and his upbringing and also the growth
of OpenAI and Loopt and everything.
Who are the people that have helped him get where he is today though?
So the making of Sam Altman is a really interesting part of the overall story of Sam Altman. You know, many people know him as the CEO of
OpenAI because that's the role he's been in when
he has risen to prominence, you know, beyond
Silicon Valley.
Like I think for many years, he was well known
in Silicon Valley, but this is like, now he's
kind of a household name.
And so it's important to understand where
Sam came from.
You know, he's been in the Valley for, you know, since 2005, I think is when
he started college, 2004, 2005 at Stanford.
And he dropped out and then he joined Y Combinator, the now famous startup
accelerator, but he was actually part of the first cohort of founders ever in YC, along with-
It included Twitch as well, right?
Yes, including the, the co-founders of Twitch
and of Reddit.
And so Emmett Shear, you know, for those who
know Emmett Shear has a like very short 72 hour
cameo in the OpenAI Sam Altman firing saga.
Um, but yes, Emmett and, and Sam were both in
the same YC batch.
So when we think about Sam's early career in Silicon Valley, I think what's important
to know is that he rose very quickly in part because he was very successful in making these
strategic advantageous friendships and connections with already established people in the Valley.
The most important one is Paul Graham, who is the,
you know, one of the founders of Y Combinator and, you know, basically like immediately took Sam under his wing when Sam joined this first batch of YC. And yeah, Paul's a really important mentor
to Sam. He's kind of the first person who really sees in Sam this, you know, ambition, this hunger for power,
this like drive to really build bigger and bigger companies.
Even when, you know, they met when Sam Altman was 19.
So, Paul like sees him as a teenager and sees this future potential.
And so, yes, you know, not only did Paul become a mentor to him and sort
of helped build Sam's profile over those early years because he would, you know,
Paul Graham is very famous for writing these essays about how to build startups
and how to build the best startups. And if you're at all interested in building
startups, you've read many of them. They're kind of like almost like a
startup Bible. And in many of them, he extols the virtues of Sam Altman.
He talks about Sam's ambition. He talks about Sam's cunning, his ability to like, you know,
make deals and like think big.
And so-
But never anything Sam Altman has actually done, is what I've found.
Yeah. There are some interesting, you know, I've read many of the things Paul has written
about Sam. Some of my favorite ones include Paul writing
that within three minutes of meeting Sam, this was
when Sam was 19, uh, Paul thought to himself, ah,
so this is what a young Bill Gates is like, or,
you know, this is what Bill Gates was like at 19,
I think is the exact quote.
Um, so, you know, he really builds him up in this
way and, and I do think Paul had like unique
insight into Sam, like they were close.
They, in many ways I'm sure still are.
Um, but it is this interesting role where, you know, Paul met Sam when he really
didn't have much to his name and he really elevated him early on through his
writings as this like startup founder to emulate, right?
That other founders should be emulating Sam.
And then of course, as Sam progresses in the Valley, he also starts to write
these like
startup wisdom essays.
These vacuous startup Bible things.
In a similar style to Paul.
And then of course, the most important thing that happens is that in 2014, when Paul decides
he no longer wants to run Y Combinator, which at this point is a much bigger vehicle than
it was when Sam first started.
It's no longer just a few startups.
Totally, it has produced Stripe, Dropbox, Airbnb.
Tens and now hundreds in the cohorts.
This is a big job, right?
Like running Y Combinator.
And when Paul wants to hand it off to someone, you know, he has said that the only person he
considered giving this to was Sam.
So in 2014, when Sam is, I believe, 28 years old, he becomes president of Y Combinator.
And this is, you know, he had started a startup.
It didn't really work.
He sold it and was starting to dabble in angel investing.
And at that point, Paul really elevated Sam to this new position of power. And then he ran YC for a while and then started OpenAI.
And in starting OpenAI, he also leveraged these like very useful connections with particularly powerful people who could help him, such as Elon Musk,
who was able to give the vast majority of the pledged funding to start OpenAI. Later,
when Elon Musk splits from OpenAI, Sam makes this very powerful partnership with Satya Nadella to
help fund OpenAI. Another important partnership that Sam has made much earlier on was this
friendship with Peter Thiel. And one of the things Peter
Thiel does is also, you know, gives him millions
of dollars to start investing.
This is like before Sam takes over at YC.
With Hydrazine, right?
Yeah.
And you know, another thing that Paul did that
really Paul Graham did that really helped Sam
was also he gave, Paul had the opportunity to be one of the
first investors in Stripe.
Yep.
He was offered the chance to invest $30,000
for 4% of Stripe, which of course, now that
Stripe is enormous, we all know how valuable
that was and Paul split it with Sam.
He was like, oh, I might as well share this with Sam.
So Sam has said that that $15,000 for a 2% of
Stripe has been, you know, one of his best
performing angel investments ever.
That was something he got.
My question is always where he got 15 grand from.
He was still working on Loopt at the time.
It's funny how privileged, anyway.
Yeah.
My guess is 15 grand was, I don't actually know
this, but my guess is 15 grand was not hard
for him to pull up.
And it's one of those things where it really is, um, you know, access
and relationships are the sorts of things that can build, um, a career and
can lead to great wealth, right?
Like Sam is now, you know, by our own internal accounts and by other lists, a
billionaire and this money comes from, you know, not from open AI, but from these angel
investments that he's made early on that have been enormously successful.
So you called him in one of the titles, the most Silicon Valley man alive. Is this what you're getting at, this kind of power player mentality?
Yeah, I think it's, it reflects a few things. One, that even though he's, you know, he's in his late 30s, he's been a player in Silicon Valley for such
a long time, you know, close to two decades. And also that he's just someone who is extremely
well connected. So even before he took over Y Combinator, which I think you could argue is
like kind of king of the startup world in some sense, like Y Combinator is like, you know, the
top accelerator.
At least the early stage.
Totally.
Even before he took over at Y Combinator, I think he was extremely well connected.
He's very social, he's very helpful, he's very efficient.
Like many people have told me stories in which he, you know, calling Sam and
talking for five minutes has solved their problem because he knows exactly
the right person to call to fix it.
Or, you know, he's really good at making deals.
I think it's just clear he's extremely well
integrated into this world, um, and has very
successfully moved up the Silicon Valley status
ladder to the point where he is now, which is
kind of, you know, one of the, you know, he's
the CEO of the, one of the arguably hottest
companies in the Valley right now.
And I think
that that's not luck, right?
He didn't just come up with, he's not like a nobody who came up with an idea.
It's like he has the connections and has parlayed his connections into power to bring him to
the point he is now.
So in your experience talking to people about Sam Altman, how technical is he, do you think?
What have you heard? Because you say there
he wasn't lucky, but he also does not appear to have successfully run a business. Because
Loopt shut down, two people, well, two executives tried to get him fired from there, he got
fired from Y Combinator, which did very well, but at the same time YC was basically a conveyor
belt for money at one point. Not so much recently. It just feels weird that
this completely non-technical, semi-non-technical guy has ascended so far.
My sense is that's not maybe the most fair description. I think Sam is incredibly smart
and people say this a lot and I believe them. I think his special skill, he obviously knows
how to, he's an engineer, he has
training, I'm sure he can build a lot of stuff.
It seems like his comparative advantage, his
special skill is relationships, deal making,
figuring out who exactly is the right person to
help him in whatever he's really trying to get
done and figuring out the best way to get
something to happen.
You know, one of the people I spoke to is someone who knows Sam
from when he was younger and knows him personally
and said that his superpower is figuring out
who's in charge or figuring out who is in the best position
to help him and then charming them
so that they help him with whatever goal
he's trying to get done.
And I think that like, yeah, one could argue
that that's actually a really good skillset
if you want to build a very big company,
which I think at this current moment he has, right?
Like OpenAI is really, you know, you can,
there's a lot that you can say about
whether they're upholding their original mission
or that's up for debate,
but I think that they've obviously been commercially successful so far.
So it feels like Silicon Valley on some level.
Just to give some thoughts here, within the two episodes I'm doing here, the pattern
I've seen with Sam Altman is that everyone seems to want him to win.
And there's almost a degree of they will make it so.
Have you seen anyone who's really a detractor or anyone who's not pro Sam Altman?
Because it's interesting how few people are in tech.
Well, there is.
I won't get too into it because this is in some future episodes, which will drop in future weeks.
But I would say, in some of the conversations I've had off the record with people about Sam, I think
my general impressions are people often do find him impressive in terms of what he has
gotten done.
You know, the size and scale of his ambitions and the way that he has generally been able to make that happen.
I think there's also a lot of people who, you know, are willing to privately share some gripes that they might have about him.
Also, you know, in recent weeks, we've seen people
be a lot more public about some of those gripes.
We have Helen Toner, a former board member at OpenAI
who voted to remove Sam last November,
saying publicly in the last few weeks
that Sam lied to her and the other former board members,
that his misdirection made them feel
like they couldn't do their jobs.
And she has also said that people were intimidated to the point where they did not feel comfortable
speaking more publicly about negative experiences they'd had about Sam,
that they are afraid to speak more publicly about times that he has not been honest with them
or in which they've had challenging experiences.
And that has also been reflected in some of the private conversations I've had in which
people, you know, they might have complaints or they might have had like challenging situations
with him.
And I think they just feel like the risk calculus is not worth it to come out and say something
like that.
But, you know, there have been bits and pieces where people have come out and said things that
Sam has, you know, another thing that the board members have said was that Sam had been
deceptive and manipulative.
And that's also followed up by, or not followed up, there was also, I think back in November,
a former OpenAI employee who had tweeted something publicly about that, saying that Sam had lied to him on occasion, even though he had also always
been nice to him, which I think is a very interesting combination of characteristics.
Isn't that Silicon Valley though?
A little bit maybe.
I'm afraid of dealing with them, but they were so nice to me.
And yeah, of course, that person has not elaborated more publicly about what they meant.
For fear of...
But I think this is why people are asking themselves these questions, which is like,
the more that we hear about what the board was thinking before they decided to fire Sam,
I think the more people are wondering about what are the patterns of behavior that he shows
that led to the board trying to make
this drastic move?
Yeah, that's actually an interesting point.
So when Sam Altman was fired from OpenAI, there was this very strange reaction from Silicon
Valley, including some in the media, where it was almost like Hunger Games, everyone doing the symbol thing, where
everyone's like, oh, we've got to put Sam Altman back in.
Isn't it kind of strange we still don't know why he was actually fired though?
I mean, Helen Toner has elaborated.
Like, I've never seen, have you seen anything like this in your career?
I think that it has been surprising that there
has not been more of a clear answer.
Um, I think, you know, as time has gone
on, like we have heard a little bit more. Like I
think Helen Toner has, you know, to her credit,
tried to give more information in
recent weeks about what happened.
I think, you know, people were obviously asking this question six months ago,
and so I think there's been a little bit of a delay in trying to get this answer.
And I wonder if maybe there just isn't a very neat answer to it.
And so then in that absence, we get this kind of more of a like murky, multifaceted,
multi-voiced answer.
But yes, I agree that it is sort of surprising that there hasn't been more clarification
on what exactly happened or a little bit more granular detail about what led up to it.
So onto the AI hype in general.
Said that a bit weird, but I'll keep going.
Why do you think there's such a gulf between what Sam Altman says and what ChatGPT can actually do?
What Sam Altman says? What are you talking about specifically?
As in he says it will be a super smart companion.
Ah, yeah, yeah, yeah.
That he'll be all of these things.
Well, this is something that we get into in episode three, which is a personal interest of mine,
which is kind of the psychology of the AI industry right now.
And, you know, what I find so interesting about this
and what we try to delve into in episode three
and kind of throughout the series
is these kind of like extreme projections about AI.
And in the industry, you see both positive ones
and negative ones.
And I think, you know, the negative ones,
that's what looks like AI-doomerism, AI existential risks,
sometimes called AI safety, depending on your point of view.
But, you know, it's these projections that, you know,
super intelligence might very quickly and very soon
learn to self-improve in a way that allows it to rapidly outstrip
our control and our capabilities and could lead to the extinction of humanity.
There are so many interesting things to say about the psychology of believing that our
human race might either be wiped out or incredibly changed within our lifetimes.
And we get into that in episode three. I think I really wanted to get
into the psychology of someone who believes that AI doom is just around the corner. And so we talked
to someone who sort of became convinced of this belief soon after the 2016 AlphaGo matches in which
the Go-playing AI beat, you know, the world's champion in Go. And he talks about, yeah, deciding not to make a retirement account.
Because he was like, what is the point?
By the time I reach retirement age, either the world will be dramatically different
and money won't matter or we'll all be dead.
And I think that even though some people might scoff at that,
that's like a real belief that people believe that these extreme
possible scenarios are in our near future. And on the other hand, we also see extreme projections
in a positive direction, this idea that AI is going to unlock a whole new era of human
flourishing that we might expand beyond our planet, that we might be able to give-
Abundance.
Say what?
Abundance.
Right, exactly. You know, one of the things we do, I believe in episode three, is a little bit of a supercut of Sam Altman talking about abundance. It's pretty clear that this is a way that he likes to frame our AI future: it's going to be this future in which everyone has plenty, right? Everyone has, uh, you know, access to intelligence,
abundant energy, abundant access to, um,
super intelligence that can help us live kind of
our best lives and beyond our wildest dreams.
Um.
Right.
And I, you know, obviously Silicon Valley is a place
where people like to make grandiose statements,
but this is beyond that, right?
This is not just, this is not just like, you know,
we joked about WeWork's mission statement was to
elevate the world's consciousness.
Like, well, galaxies of human flourishing for
eons beyond us, like that is on another scale,
right?
Like we're talking about something that is sort
of at an unprecedented level of extreme rhetoric.
I think that's really interesting.
I think it is a very powerful motivator, both in the doomer sense and also in the abundance
sense.
People believing that what they're working on is the most important technological leap
forward for humanity.
Talk about a motivating reason to work on this technology, right?
Talk about a way to feel powerful, feel like you're making a huge difference.
I think that's a really key part of what's driving a lot of work in AI right now.
It's driving a lot of work, sure, but with Altman himself, there is this gulf.
It is a million-mile gulf between the things he says and what ChatGPT, even on the most basic level, is capable of doing and will be capable of. It almost feels like he's become the propagandist for the tech industry.
And it's very strange to me how far that distance is, because you've got the AI doomers and the AI optimists, I guess you'd call them.
But Altman doesn't even feel like he's in with either.
He's just kind of, he'll say one day that he doesn't think it's a creature.
The next one he'll say it's going to kill us all.
It all just feels like a PR campaign, but for nothing.
Yeah, it has been interesting to try to answer the question.
You know, one of the questions we tried to answer in the podcast is, does Sam actually
believe, you know, because as you mentioned, there are some early clips of him, you know,
when I say early, I mean around the time of the founding of OpenAI, 2015 or so.
There's some clips of him talking about, you know, saying somewhat
jokingly that AI might kill us all. But there's also this, you know, very famous
blog post that he wrote in 2015 in which he says that, you know, basically
superintelligence is one of the most serious risks to humanity, you know,
full stop. And so it's clear that at some point in his life, he believed kind of what we might now call a more doomer-y outlook.
But as time has gone on, he has offered views
that are a little bit more measured and more positive.
In his big media tour of 2023, he tended to talk about how his projection was that AI would radically transform society, but that it would be net good, right?
That like overall, we would be glad that this happened and that it would improve lives,
even if in the short term, or for some people it might prove to be bringing a lot of challenges
as well.
And so it is, you know, I think one of the interesting things about him is it is a little
hard to pin down exactly what he thinks.
I think you're right that I wouldn't consider him like a gung-ho effective accelerationist.
I would not consider him a doomer.
He is like somewhere in this large gulf in between there.
But I think he's also smart enough to know that making grandiose projections about what
AI could bring is a compelling story, right?
Is a story that he can help sell by being like a spokesman for it.
Often that is the role of a CEO: to be a really good storyteller, to bring the pitch
of the company to the public, to investors, to potential
employees, to customers to try to sell them on this vision of the future.
And I do think Sam is good at that.
There is an interesting tidbit in episode three in which we interview a fiction writer
who was actually hired on contract by OpenAI to write like a novella about AI futures and things like that.
And yeah, he just talks a little bit about, you know, the novella is not, I think, in active use within OpenAI, but they did at some point see value in commissioning it.
And I think, you know, something that the author, Patrick House, explains to us is,
you know, that OpenAI, just like many other startups, is really motivated by story, right? And
that Sam Altman is inspired by fiction, you know, is inspired by certain kinds of sci-fi. I think
this is not unique to Sam. Many founders in Silicon Valley, you know, Elon Musk has talked about
this as well, are driven to create things in part
because of what they read about when they were younger, these dreams of the future.
And so it's just interesting to get his perspective on how motivating a story can be and how motivating
this compelling story of like, oh, we're building something that's going to change the course
of human history.
Like, you just can't ask for a more powerful motivating force.
So as Altman accumulates power and as he kind of ascends to the top of open AI, do you think
he's done there?
Do you think there's going to be another thing he starts?
Because it feels like you've discussed like UBI and all these other things.
Do you think he has grander ideas that he wants to pursue?
Well, obviously I can't speak to what's inside Sam's head.
No, you don't know the man's mind.
But I mean, past indicators would suggest yes.
Like I think he has proven pretty consistently that he's someone who, as much as he might focus on
one project with a lot of effort, like he is cooking things on the side. Like this is a man,
this is going to be an extended metaphor, but this is a man working at a stove that has like
six burners, not one. What'd you say?
Sorry. He's got a big house. He's got multiple houses.
We already know that, in addition to running OpenAI, he has funded, helped prompt the founding of, or been very involved in investing
in and supporting other startups that, you know, are
part of this kind of ecosystem of
businesses that are connected to an AI future or might benefit in an AI future. So for example,
Helion, which is a nuclear fusion company, which he has invested a ton of money into,
I think he has said publicly that his vision is that this is a potential way to provide abundant
energy that could then power the technology that we need to improve AI to the level that
we're hoping that it can get to or that he's hoping that it can get to.
At the same time, we've talked a little bit about universal basic income.
This has been something that Sam has been a proponent of and an advocate of since at
least 2016 when he was running Y Combinator and they started a side research project to
study universal basic income by giving cash payments to families in Oakland of, I believe,
$1,000 a month.
That research project is still ongoing.
It's now moved away from Y Combinator and is
associated with Open Research, which is, I
believe, funded by OpenAI.
And so it has kind of moved with Sam to his new role.
And of course he also co-founded this company called Worldcoin, which uses these silver orb machines to take pictures of your iris, to register every individual human as, like, a unique human individual and create this eyeball registry by which one could, in the future, distribute a universal basic income. So
he's funding these energy companies, he's involved in this sort of crypto eyeball registry project that will help distribute UBI in this future that he's imagining. I think it's safe to say he's definitely thinking about things beyond just OpenAI for the future, and imagining, okay, well, if we have this piece that's growing, what else will we need to support it? And I'm sure there are other things he's working on that we don't even know about, right? Like, I know he has also funded some longevity bioscience projects and things like that. I guarantee he's thinking about stuff beyond what we know about.
Final question: why do you think the entire tech industry has become so fascinated with AI? Do you think it's just Altman, or is it something more?
I do think ChatGPT started
heating up this interest that was already percolating a little bit in the tech industry,
but it does seem like something about ChatGPT captured the public imagination, made people imagine very seriously, for the first time, how
AI could affect their lives, their lives individually.
It used to be kind of this abstract thing that
was a little farther away.
Or maybe you understood that like you were
interacting with AI sometimes like when you
would look at like flight price predictors or
things like that.
Yeah, exactly.
But, you know, we talk about this in episode three: ChatGPT wasn't even new technology. It was actually just a different user interface on a model that already existed, GPT-3.5.
And so to me, that actually speaks, I guess, to
the power of like making a technology accessible
to everyone in a way that was, like, easy to use, and, you know, for better or for worse, that kind of created this public momentum of people thinking about AI,
you know, just feeling like it had rapidly increased its capabilities in a short period
of time.
And yeah, something about that really captured not just the minds but also the hearts of
people and getting them really thinking about what could a future like this look like.
And I think while some people were excited, a lot of people also reacted with fear, right?
And I think in the Valley, you will hear a lot of people more openly discussing
their fears of sort of like job loss or just like dramatic social change that might come
about in the next 10 or 20 years.
The feeling I get in conversations that I have in and around San Francisco is, you know, even people who are pretty deep in this technology are uncertain about whether
it's going to be overall good or bad.
Like, they're just uncertain of how to look back on this time, like whether
it will have ended up being a leap forward for humanity or something different.
Altman has taken advantage of the fact that the tech industry might not have any hypergrowth
markets left, knowing that ChatGPT is, much like Sam Altman, incredibly adept at mimicking depth and experience by
parroting the experiences of those that have actually done things.
Like Sam Altman, ChatGPT consumes information and feeds it back to the people using it in
a way that feels superficially satisfying.
And it's quite impressive to those who don't really care about creativity or depth, and
like I've said, it takes advantage of the fact that the tech ecosystem has been dominated and
funded by people who don't really build tech.
As I've said before, generative AI, things like ChatGPT, Anthropic's Claude, Microsoft's Copilot, which is also powered by ChatGPT, it's not going to become the incredible
supercomputer that Sam
Altman is promising.
It will not be a virtual brain, or imminently human-like, or a super smart person that knows
everything about you, because it is at its deepest complexity a fundamentally different
technology based on mathematics and the probabilistic answer to what you have asked it, rather than
anything resembling how human beings think or act or even know things. Generative AI does not know anything. How can a thing think when it doesn't know? Eh? Anyone want to ask Brad Lightcap, Mira Murati, Sam Altman any of these questions just once? Hear what they fart out? No? Ah.
While ChatGPT isn't inherently useless, Altman realises that it's impossible to
generate the kind of funding and hype he needs based on its actual achievements, and that
to continue to accumulate power and money, which is his only goal, he has to speciously
hype it, and he has to hype it to wealthy and powerful people who also do not participate in the creation of anything.
And that's who he is.
I've been pretty mean about this guy.
I really have, but he does have a skill.
He knows a mark.
He knows.
He knows how to say the right things and get in the right rooms with the people who aren't
really touching the software or the hardware.
He knows what they need to hear. He knows what the VCs need to hear.
He knows, quite aptly, what this needs to sound like.
But if he had to say what ChatGPT does today, what would he say?
Yeah, yeah, it's really good at generating a bunch of text that's kind of shitty.
Yeah, sometimes it does math, right?
Sometimes it does it really wrong. Sometimes you ask it to do it, it can draw a picture. Hey, what do you think of that? These are all things, by the way, that if, like, a six-year-old told you, you'd be like, wow, that's really impressive. Or, like, a ten-year-old, perhaps, because that's a living being. ChatGPT does these things
and it does it, I know it's cheesy to say, but in a soulless way. And the reason all of this, the writing and the horrible video and the images, the reason it feels so empty is because even the most manure-adjacent
press release still has gone through someone's manure-adjacent brain.
Even the most pallid, empty copy you've read has gone through someone, a person has
put thought and intention in, even if they're not great with the English language.
What ChatGPT does is use math to generate the next thing, and sometimes it gets it pretty
right.
But pretty right is not enough to mimic human creation.
But look at Sam Altman.
Look who he is.
What has he created other than wealth for him and other people?
What about Sam Altman is particularly exciting?
Well, he's been rich before.
And his money made him even richer.
That's pretty good.
He was at Y Combinator.
Don't ask too much about what happened there.
Just feels like sometimes Silicon Valley can't wipe its own ass. It can't see when there's a wolf amongst the sheep. It can't see when someone isn't really part of the system other than finding
new ways to manipulate and extract value from it. And Sam Altman is a monster created by Silicon Valley's sin.
And their sin, by the way, is empowering and elevating those who don't build software,
which in turn has led to the greater sin of allowing the tech industry to drift away from
fixing the problems of actual human beings.
Sam Altman's manipulative little power plays have been so effective because so many of
the power players in venture capital and the public markets and even tech companies are
disconnected from the process of building things, of building software and hardware,
and that makes them incapable or perhaps unwilling to understand that Sam Altman is leading them
to a deeply desolate place.
And on some level it's kind of impressive how he succeeded in bending these fools to
his whims to the point that executives like Sundar Pichai of Google are willing to break
Google's search in pursuit of this next big hype cycle created by Sam Altman.
He might not create anything, but he's excellent at spotting market opportunities, even if
these opportunities involve him transparently
lying about the technology he creates, all while having his nasty little boosters further
propagate this bullshit, mostly because they don't know.
Or perhaps they don't care if Sam Altman's full of shit.
Maybe it doesn't matter to them.
It doesn't matter that Google search is still plagued with nonsensical AI answers
that sometimes steal other people's work, or that AI in legal research has been proven
to regularly hallucinate, which by the way is a problem that's impossible to fix. It's
all happening because AI is the new thing that can be sold to the markets, and it's
all happening because Sam Altman, intentionally or otherwise,
has created a totally hollow hype cycle.
And all of this is thanks to Sam Altman and a tech industry that's lost its ability to
create things worthy of an actual hype cycle, to the point that this specious non-technical
manipulator can lead it down this nasty, ugly, offensive, anti-tech path.
The tech industry has spent years pissing off customers, with platforms like Facebook
and Google actively making their products worse in the pursuit of perpetual growth,
unashamedly turning their backs on the people that made them rich and acting with this horrifying
contempt for their users.
And I believe the result will be that tech is going to face a harsh reprimand
from society. As I mentioned in the Rot-Com Bubble episode, things are already falling apart,
web traffic is already dropping.
And what sucks is the people around Sam Altman should have been able to see this. Even putting
aside his resume, I've listened to an alarming amount of Sam Altman talk.
I'm a public relations person, who the hell am I?
I'm someone who's been around a lot of people who make shit up.
I've been around a lot of people whose job it is to kind of obfuscate things.
And quite frankly, Altman's really obvious.
I'm not going to do any weird Lie to Me-esque ways of proving he's lying. He just doesn't ever get pushed into any depth. No one ever asks him really
technical questions or even just a question like, hey Sam did you work on
any of the code at OpenAI? What did you work on? I know you can't talk about the
future Sam, but how close are we actually to AGI?
And if he says, oh a few years, that's not specific enough Sam.
How about give me a ballpark?
And then when he lies again you say, okay Sam, how do we get from generative AI to AGI?
And when he starts waffling say, no, no, no, be specific Sam.
This is how you actually ask questions
and when you say things like this by the way to technical founders they don't get
worried they don't obfuscate they may say I can't talk about this due to legal
things which is fine but they'll generally try and talk to you. Listen to
any interview with any other technical AI person.
Listen to them and then listen to Sam Altman. He's full of it. It's so obvious.
And one deeply unfair thing with the Valley is
there are people that get held to these standards.
Early stage startups generally do.
The ones that aren't handed to people like Altman
or Alexis Ohanian of Reddit or Paul Graham or Reid Hoffman.
They don't get those chances because they're not saying the things that need to be said
to the venture capitalists.
They're not in the circles.
They're not doing the right things because the right things are no longer the right thing
for the tech industry.
And when all of this falls apart, Sam Altman's going to be fine.
When this all collapses, he'll find something to blame it on.
Market forces, a lack of energy breakthroughs, unfortunate economic things, all of that nonsense,
and he'll remain a billionaire capable of doing anything he wants.
The people that are going to suffer are the people working in Silicon Valley who aren't
Sam Altman. The people that did not get born with a silver spoon in each hand and then handed
further silver spoons as they walked the streets of San Francisco. People that don't live in
nine and a half thousand square foot mansions. The people trying to raise money who can't right now
because all the VCs are obsessed with AI. The people
that will get fired from public tech companies when a depression hits
because the markets realize that the generative AI boom was a bubble. When they
realize that the most famous people in tech have been making these promises for
nobody other than the markets. Well the markets need you to do something eventually and I just don't think it's
going to happen.
And I think that we need to really think, why was Sam Altman allowed to get to this
point?
Why did so many people like Paul Graham, like Reid Hoffman, like Brian Chesky, like Satya Nadella back up this obvious con artist who has acted like this forever?
And what sucks is, I don't know if the Valley's going to learn anything unless it's really
bad.
And I don't want it to be, by the way.
I would love to be wrong.
I would love for all of this to just be like, Sam Altman's actually a genius, turns out the whole thing was not.
No, it's not going to happen.
And I worry that there is no smooth way out of this, that there is no way to just
casually integrate OpenAI with Microsoft.
Because now there's an antitrust thing going on with Microsoft acquiring Inflection AI, another AI company.
And that's the thing.
It feels like we are approaching a precipice here.
And the only way to avoid it is for people to come clean,
which is never going to happen. Or of course, for Sam Altman not to be lying. For AGI to actually
come out of OpenAI, and by the way, it's going to need to be in the next year. I don't think they've
got even three quarters left. I think that once this falls apart, once the markets realize, oh shit,
this is not profitable, this is not sustainable, they're going to walk away from it. When
companies realize that generative AI has given them a couple percent profit, maybe, they're
going to be pissed. Because this is not a stock-rally-worthy boondoggle.
This is not going to be pretty.
When things fall apart for Nvidia, which is still over $1000, when those orders stop coming
in quite as fast, what do you think is going to happen to tech stocks?
Startups are already having trouble raising money.
And they're having trouble raising money because the people giving out the money are
too disconnected from the creation of software and hardware.
The only way to fix Silicon Valley, perhaps, is an apocalypse. Perhaps it's people like Sam Altman getting washed out.
I don't want it to happen.
I really must be bloody clear.
But maybe it won't be apocalyptic.
Maybe it will just be a brutal realignment.
And maybe Silicon Valley needs that realignment.
This industry desperately needs a big bath full of ice, and it needs to dunk its head in it aggressively and wake the hell up.
Venture capital needs to put money back into real things.
The largest tech companies need to realign and build for sustainability so they're
not binging and purging staff with every boom.
And if we really are at the end of the hypergrowth era, every tech company needs to be thinking
profit and sustainability again.
And that's a better Silicon Valley because a better Silicon Valley builds things for
people. It solves real problems, it doesn't have to lie about what the thing could do in the future
so that it can sell a thing today. And I realize that sounds like the foundation of most venture
capital. That's fine at the seed stage, that's fine at this moonshot stage where you're in the early, early days. It is not befitting the most famous company in tech. It is not befitting a multi-billionaire. It is not
befitting anyone and it is insulting to the people actually building things both
in and outside of technology. The people I hear from after every episode, they are
angry. They are frustrated because there are good people in tech. There are people
building real things. There are people that remember a time when the tech industry was exciting.
When people were talking about cool shit in the future and then they'd actually do it.
Returning to that is better for society and the tech industry.
Just don't know when it's going to happen.
Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song is Matt Osowski.
You can check out more of his music and audio projects at mattosowski.com.
You can email me at ez at betteroffline.com or visit betteroffline.com to find more podcast
links and of course my newsletter.
I also really recommend you go to chat.wheresyoured.at to visit the Discord and go to r/betteroffline
to check out our Reddit.
Thank you so much for listening.
Better Offline is a production of Cool Zone Media.
For more from Cool Zone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

Hello and welcome to Better Offline, I'm your host Ed Zitron.
It's been a hard couple of weeks. It's been pretty hard to focus.
I've written a few newsletters, I've gone to Portugal, I've done a bunch of shit.
Just trying not to think about everything happening outside, but it's time to do so.
Seemingly every single person on Earth with a blog or a podcast or even a Twitter account, or X, the everything app, or whatever it's called now, they've all tried to drill down into what happened on November 5th
to find the people to blame, to explain what could have gone differently.
Really looking for who to blame, though, and to find out why so many actions led to a result that will overwhelmingly harm women, minorities,
immigrants, LGBTQ people and lower-income workers.
It's terrifying.
It fucking sucks.
I'm not going to mince words.
Not that I would usually anyway.
And I don't feel fully equipped to respond to the moment.
I don't have any real answers, at least not political ones.
I'm not a political analyst and I feel disingenuous trying to dissect either the Harris or the
Trump campaigns because I just feel like there's a take Olympics right now.
It's the Dunning-Kruger Festival out there.
Everyone is trying to rationalize and intellectualize these events
that ultimately come down to something quite simple.
People don't trust authority.
And yeah, it's pretty ironic that this often leads them towards authoritarianism.
Now, I don't want to give you the impression that I'm going to go into crank mode and somehow rail against institutions on their face.
I'm not.
But at the same time, understanding this moment requires us to acknowledge that institutions
have failed us and failed most people and how certain institutions' missteps have led
us to exactly where we are today.
Legacy media, while oftentimes they're staffed by people who truly love their readers
and care about their beats, they're weighed down by this hysterical, nonsensical attachment
to the imaginary concept of objectivity and the will of the markets.
Case in point.
Regular people have spent years watching the price of goods increase due to inflation,
despite the fact that the increase in pricing was mostly driven by, get this, corporations raising their prices.
Now that's not to say that external factors like the war in Ukraine or lingering COVID
restrictions in China, these things did play a role in it.
They did.
But the bulk of these price increases were caused by these fucking companies raising
the prices.
It was in their earnings.
It was right there. Pepsi Cola said it on the news. Yet some parts of the legacy media
spent an alarming amount of time chiding their readers for thinking otherwise, even going
against their own reporting, and there will be links in the episode notes I promise, as
a means of providing balanced coverage, insisting again
and again that the economy is actually good, contorting their little bodies to prove that
prices aren't actually higher even as companies literally boasted about raising their prices
on earnings.
In fact, the media spent years debating with itself whether price gouging was actually
happening despite years of proof that it was.
Some of them even reported that the price gouging was happening. So like, get this, I just don't think people trust
authority and they especially don't trust the media, especially the legacy
media. It also probably didn't help that the legacy media implored readers and
viewers to ignore what they saw at the supermarket or at the pump and the
growing hits to their wallets from the daily necessities of life.
It was just a national level gaslighting, and it was disgusting.
And I know some of you might say, you know where to email me, oh it's not just this,
no of course it's not just this asshole.
But I think this is a big thing.
Now before I go any further, I've used the term legacy media here repeatedly, but I don't completely intend for it to come across as a pejorative.
Despite my criticism, believe me, I've got a few of them.
There are people in the legacy media doing a good job.
They're reporting the truth, they're doing the kinds of work that matters, and they're
actually trying to teach their readers stuff and tell them what's happening and giving
them context.
I read and pay for several legacy media outlets, and I think
the world is a better place for them existing, despite their flaws.
The problem, as I'll explain, is this editorial-industrial complex, and how these
people who are writing about the powerful don't seem to be able to, or maybe they
don't want to, actually interrogate the powerful. This could be an entire episode
on its own, but I don't think the answer to these failings is to simply discard legacy media entirely. But I want to implore them
to do better, and to strive for the values of truth hunting and truth telling and actually
explaining what's happening and criticizing the people that don't have PR firms and lobbying
groups and lawyers and the means to protect themselves from the world.
The time for fucking around is over and we're currently finding out.
Now anyway, as you know as a person existing in the real world the price of everything
has kept increasing despite the fact that wages are stagnating and it's forcing many
of the poorest people to choose between food and fuel, or, I don't know, eating
and having heat.
Simultaneously, businesses have spent several years telling workers they were asking for
too much and doing too little, telling people a few years ago they were quiet quitting,
which is a fucking stupid term that just means going to your job and doing the thing you're paid to do.
Anyway, anyway.
And a year later, in 2023, they insisted that the years of remote work were actually bad
because profits didn't reach the same levels as 2021, which was, apparently, something to do with
remote work. Now, did anyone actually prove this? Did anyone actually go and... No, they didn't. They
just... Well, I just listened to Marc Benioff, who's one of the more evil people alive. Now,
I also think a lot of these problems come back to 2021, a year that we really need to
dig into more.
We might not do so today, but we will in the future.
But one of the big things that punished workers and led to so many layoffs in 2023 was the
fact that we couldn't get back to the post lockdown boom of 2021 when everyone bought
everything always as they left the house for the first time in a while.
Now any corporation would be smart enough to know that that was a phase, that that was
not going to be forever, except every single big company seemed to make the same mistake
and say number going up forever, line go up forever.
When it didn't, well they started punishing workers and they started thinking, well could
it be that we as companies, we set unrealistic expectations for the markets and we just thought we'd keep growing forever
or maybe it was the people using the computer at home? Yeah, that seems way better. Anyway,
while the majority of people don't work remotely, from talking to the people I know outside
of tech or business, there's this genuine sense that the media has allied itself with
the bosses, and I imagine it's because of the many articles that literally call workers lazy and have done so for years.
Yet when it comes to the powerful, legacy media doesn't seem to have that much piss and vinegar.
They just have much more guarded critiques. The appetite for shaming and finger-wagging,
it's always directed at middle and working-class workers, and seemingly disappears when a person has a three-letter job title like CEO.
It's fucking stupid, it's insulting, and yes, it's demoralizing for the average person.
Despite the fact that Elon Musk has spent years telegraphing his intent to use his billions
of dollars to wield power, equivalent to that of a nation state, as you may remember from
my first episode of Anything, over on It Could Happen Here.
Too much of the media, both legacy and otherwise, responded slowly, failing to call him a liar,
con artist, aggressor, manipulator, racist, deadbeat dad, you know, all the things actually
happening.
No, no, no.
They kind of danced around him.
They reported stories that might make you think that they maybe noticed it, but there was this desperation to guard objectivity.
And it was just… it lacked any real intent, it lacked any interest in calling account to a man who
has pretty much bought an election for Donald Trump. A racist billionaire using his outsized
capital to bend society to his will just isn't a fucking problem for the media, or at least not as
much of a problem as a worker who might not work 50 to 100 hours a week for a boss who
makes 130 times what they do.
The news, at least outside of the right wing, is always separate from opinion, always guarded,
always safe, for fear that they might piss somebody off and be declared biased.
Something that happens anyway.
And while there are columnists that are given some
space to have their own thoughts, sometimes in the newspaper, sometimes online, the stories
themselves are delivered with the kind of reserved tone that often fails to express
any actual consequences or context around the news itself and just doesn't seem to care about making
sure that the reader or listener learns something.
My mate Casey has a good point about podcasts, and I'd apply it to some of the news too,
that there's too much stuff out there that is there to make you feel intelligent rather
than make you intelligent.
And I think this falls into it.
Now, this isn't to say that outlets are incapable of doing this correctly.
I love the Washington Post, they've done an excellent job analyzing major tech stories. But a lot of these outlets feel custom built to be bulldozed the moment an authoritarian
turns up, a force that exists to crush those desperately attached to norms and objectivity.
Authoritarians know that their ideologically charged words will be quoted verbatim,
with the occasional, huh, this could mean... little dribble, this drizzle of context that's
lost in a headline that repeats exactly what the fucking authoritarian wants it
to.
And guess what?
Some people don't read the article, they just read the headline.
And Musk is the most brutal example of this, by the way.
Despite the fact that he's turned Twitter into a website pumped full of racism and hatred
that literally helped make Donald Trump president, Musk was still able to get mostly positive coverage from the majority
of the mainstream media for his fucking robo-taxi nonsense.
Despite the fact that he spent the best part of a decade lying about what Tesla will do
next, there are entire websites just based on how much Elon Musk lies, yet they still
report this shit.
It makes me very upset.
And it doesn't matter that some of these outlets, by the way, had accompanying coverage
that suggested that the markets weren't impressed by Tesla's theoretical robo taxi
plans or their fake-ass robots run by people. Musk is still able to use the media's desperation
for objectivity against them, and he knows that they never dare to combine reporting on stuff with thinking about stuff for fear that Elon Musk might say they're
biased which he has been doing for years. Do you see my goddamn point yet?
And this by the way is not always the fault of the writers. There are entire foundations
of editors that have more faith in the markets and the powerful than they do the people writing
or the people reading their fucking words. And above them are entire editorial superstructures that exist to make sure that the editorial vision
never colors too far outside the lines or informs people a little too much.
I'm not even talking about Jeff Bezos or Laurene Powell Jobs or any number of billionaires who own any number of publications,
but the editors editing business and tech reports who don't know anything about business and
tech, or the senior editors that are terrified of any byline that might dare get the outlet
under fire from somebody who could call their boss.
It's fucking cowardice. There are, however, I should add, also those who simply defer to the powerful, that assume
that this much money can't be wrong, even if said money, in the case of Elon Musk, is
repeatedly wrong and there's an entire website about the wrongness and the lies and the bullshit.
And I'm talking about Elon Musk still, obviously.
These editors are the people that look at the current crop of powerful tech companies
that have failed to deliver any truly meaningful innovation in years and then go, ooh, send
me more daddy, show me more of the apps.
It's fucking disgraceful.
Just look at the coverage of Sam Altman from last year.
You know, the guy who spent years lying about what AI can do.
And tell me why every single thought he says must be uncritically catalogued, his every goddamn decision applauded, his every
claim trumpeted as certain, his brittle little company that burns 5 billion dollars a year
talked about like it's a fucking living god.
Sam Altman is a liar who's been fired from two companies including OpenAI, and yet because
he's a billionaire with a buzzy company he's left totally unscathed.
The powerful get a completely different set of rules to live by and exist in a totally
different media environment.
They're geniuses, entrepreneurs, firebrands, their challenges are framed as missteps and
their victories framed as certainties by the same outlets that told us that we were quiet
quitting and that the economy is actually good and that we're the problem for high
prices.
While it's correct to suggest that the right wing is horrendously ideological and they're
terribly biased, it's very hard to look at the rest of the media and claim that they're
not.
The problem is that the so-called left media, which usually is just the centre, isn't
biased towards what we may consider left-wing causes like universal healthcare, strong unions,
expanded social safety nets, you know the stuff that would actually be helpful.
No, they're biased in favour of fellating an ever-growing carousel of sociopathic billionaire
assholes, elevating them to the status of American royalty where they exist above expectations
and norms that you and I must live by.
This is the definition of elitism.
The media has literally created a class of people who can lie and cheat and steal, and
rather than condemn them for it, they're celebrated.
While it might feel a little tangential to bring technology into this, I truly believe
that everybody is affected by the rot economy, the growth-at-all-costs ecosystem where the number
must always go up because everybody is using technology all the time and the technology
in question is getting worse. This election cycle saw more than 25 billion text
messages sent to potential voters and seemingly every website was crammed full
of random election advertising. Here's the thing about elections they're not
really always about policy no they're a referendum on the incumbent party or
president and by proxy a poll on how people feel. And the reality is that most people are fucking miserable. There's
this all-encompassing feeling that things are just harder now. It's harder to pay
your bills. It's harder to keep in touch with your friends. It's harder to start a
family. It's harder to buy a house. It's harder to fall in love. It's harder to do
everything. And what we're seeing is an enshittification of existence, to use
Mr. Doctorow's phrase. Everything just... I
don't want to be this much of a
curmudgeon, but everything just kind of
sucks. It's all terrible, it's miserable,
and hardly anyone thinks it's going to
get better. And this creates the kind of
fertile conditions for a strong man to
emerge. One who arises and says that only
he can fix things, even if he spent four
years proving how he could not.
And the problem for Democrats and for institutions more broadly is that the all-encompassing nature of this milieu is kind of hard to solve.
It's hard to change the perception that everything's terrible when you're reminded of it, when you're trying to do the most basic of tasks.
Our phones are full of notifications trying to growth hack us into doing things that companies want.
Our apps are full of microtransactions.
Our websites are slower and harder to use, with endless demands for
our emails and our phone numbers and the need to log back in because they couldn't possibly
lose a dollar to someone who dared to consume a Washington Post article. And yes, I'm talking
about the Post, which I fucking pay for despite the fact it logs me out all the time.
Our social networks are so algorithmically charged that they barely show us the things
we want them to anymore, with executives dedicated to filling our feeds full of AI-generated slop because
despite being the customer, we're also the revenue mechanism.
Our search engines do less as a means of making us use them more, our dating apps have
become vehicles of private equity to add a toll to falling in love, our video games are
constantly nagging us to give them more money, and despite it costing money and being attached
to our account, we don't actually own any of the streaming media we purchase.
We're drowning in spam, both in our emails and our phones, and at this point in our lives
we've probably agreed to 3 million pages of privacy policies allowing companies to use our information as they see fit.
We get one value transaction with every company. They get 11. They get a hundred.
We really actually don't know because there's no legislation to tell us what they're fucking doing.
And these are the issues that hit everything we do,
all the time, constantly, unrelentingly.
Technology is our lives now.
We wake up, we use our phone, we check our texts,
three spam calls, two spam texts.
We look at our bank balance, two-factor authentication check.
We read the news, a quarter of the page is blocked by an advertisement
asking for our email that's deliberately built to hide the button to get rid of it.
And then we log in to Slack and feel a pang of anxiety as 15 different notifications appear
in a way that is really not built for us to find what we need, just to let us know something
happened.
Modern existence is just engulfed in sludge. The institutions that exist to cut through it seem to bounce
between the ignorance of their masters and this misplaced duty to objectivity. Our mechanisms
for exploring and enjoying the world are interfered with by powerful forces that are just basically
left unchecked. Opening our devices is willfully subjecting us to attack after attack after attack
from applications, websites and devices that are built to make us do things for them rather than operate with the dignity and freedom that much of the
internet was actually founded upon.
These millions of invisible acts of terror are too often left undiscussed because accepting
the truth requires you to accept that most of the tech ecosystem is rotten and that billions
of dollars are made harassing and punishing billions of people every single
day of their lives through the devices that we're required to use in order to exist in
the modern world.
Most users suffer the consequences, and most of the media fails to account for them, and
in turn people walk around knowing something is wrong but not knowing who to blame until
somebody provides a convenient excuse.
Like immigrants.
Like the Democrats.
Like whatever fucking works
because we can't actually call the people out, the corporations crushing our
existence. Why wouldn't people crave change? Why wouldn't people be angry?
Living in the current world absolutely fucking sucks sometimes. It's miserable,
it's bereft of industry, and filthy with manipulation. It's undignified, it's
disrespectful, and it must be crushed
if we want to escape this depressing goddamn world we've found ourselves in.
Our media institutions are fully fucking capable of dealing with these problems, but it starts
with actually evaluating them and aggressively interrogating them without fearing accusations
of bias that, as I've said repeatedly, happen either way. The truth is that the media is more afraid of accusations of bias
than they are of misleading their readers. And while that seems like a
slippery slope, and it may very well be one, there must be room to inject the
writer's voice back into their work. And a willingness to call out bad actors as
such, no matter how rich they are, no matter how big their products are, no
matter how willing they are to bark and scream that things are unfair as they accumulate more power and
money.
We need context in our news.
We need it.
We need it now.
We need opinion.
We need voice.
We need character.
We need life.
Because as long as we follow this bullshit objectivity path, we are screwed.
And if you're in the tech industry and hearing this and saying, oh the media's too critical
of tech, you're flat fucking wrong, kiss my asshole.
Everything we're seeing happening right now is a direct result of a society that let technology
and the ultra-rich run rampant, free of both the governmental guardrails that might have
stopped them and the media ecosystem that might have actually held them in check.
Our default position in interrogating the intentions and actions of the tech industry
has become that they will work it out as they continually redefine what work it out means
and turn it into make their products worse but more profitable. Covering Meta, Twitter, Google,
OpenAI and other huge tech companies as if the products they make are remarkable and perfect
is disrespectful to the reader's intelligence and a disgusting abdication of responsibility
as their products, even when they're functional, are significantly worse, more annoying, more
frustrating, and more convoluted than ever, and that's before you get to the ones like
Facebook and Instagram that are outright broken.
And I don't give a shit if these people have raised a lot of money, unless you use
that as proof that something is fundamentally wrong with the tech industry.
Meta making billions of dollars of profit is a sign that something is wrong with society,
not proof that it's a good company, or anything that should grant Mark Zuckerberg any kind
of special treatment.
Shove your chains up your arse, Mark.
OpenAI being worth $157 billion as a company that burns $5 billion or more a year to make
a product that destroys our environment and has yet to find any real meaning isn't a sign that it should get more coverage or be taken more
seriously.
No, it should be a sign that something is broken, that something is wrong with society.
Whatever you may feel about ChatGPT, the coverage it received is outsized compared
to its actual utility and the things built on top of it, and that's a direct result
of a media industry that seems incapable of holding the powerful accountable or actually learning about the subject matter
in question.
It's time to accept that most people's digital life fucking sucks, as does the way
we consume our information, and that there are people directly responsible.
Be as angry as you want at Jeff Bezos, whose wealth and the inherent cruelty of Amazon's
labour practices makes him an obvious target,
but please don't forget Mark Zuckerberg, Elon Musk, Sondar Pashai, Tim Cook, and every single other tech executive
that has allowed our digital experiences to become fucked up through algorithms that we know nothing about.
Similarly, governments have entirely failed to push through any legislation that might stop the rot,
both in terms of the dominance and opaqueness of algorithmic manipulation, and the ways in which tech products
exist with few real quality standards.
We may have, at least for now, consumer standards for the majority of consumer goods, but software
is left effectively untouched, which is why so much of our digital lives are such unfettered
dog shit.
And if you're hearing this and saying I'm being a hater or a pessimist, shut the fuck
up I'm tired of you.
I'm so fucking tired of being told to calm down about this as we stare down the barrel
of four years of authoritarianism built on top of the decay of our lives both physical
and digital, with a media ecosystem that doesn't do a great job explaining what's being done
to the people in an ideologically consistent way.
There's this extremely common assumption in the tech media, based on what I'm really
not sure, that these companies are all doing a good job and that good job means having
lots of users and making lots of money, and it drives tons of editorial decision making.
If three quarters of the biggest car manufacturers were making record profits by making half
of their cars with a brake that sometimes didn't work, that'd be international news.
Government inquiries would happen, people would go to prison, and this isn't even conjecture,
it actually happened.
After Volkswagen was caught deliberately programming its engines to only meet emissions standards
during laboratory testing, allowing them to spew excessive pollution into the real
world, lawmakers responded with civil and criminal action.
The executives and engineers responsible were indicted. One received seven years in jail, and their
former CEO is currently being tried in Germany and has been indicted in the US too.
And here we are in the tech industry. Facebook barely works, is used to enable
genocides and bully people and harass teen girls. Pedophiles run rampant on
there. There was a Wall Street Journal series about it. They're fine.
So much of the tech industry, consumer software like Google, Facebook, Twitter and even chat
GPT, and business software from companies like Microsoft and Slack.
It sucks.
It sucks.
It's bad.
You use it every day.
You've been listening to me ramble for 50 episodes now.
You know what I'm talking about.
It's everywhere.
Yet the media covers it just like, eh, you know, it's just how things are mate.
Now Meta, by the admission of its own internal documents, makes products that are ruinous
to the mental health of teenage girls. And it hasn't made any substantial changes as
a result, nor has it received any significant pushback for failing to do so.
Little bit of a side note here, big shout out to Jeff Horwitz and the rest of the Wall Street Journal people who did the Facebook Files, there are
legacy media people doing a good job on this. Nevertheless, Meta exercises this reckless
disregard for public safety, kind of like the auto industry in the 60s, and that was
when Ralph Nader wrote Unsafe at Any Speed. And his book, it actually brought about change,
it led to the Department of Transportation and the passage of seatbelt laws in 49 states and a bunch of other things that can get overlooked.
But the tech industry is somehow inoculated against any kind of public pressure or shame
because it operates in this completely different world with this different rule book and a
different criteria for success, as well as this completely different set of expectations.
By allowing the market to become disconnected from the value it creates, we enable companies
like Nvidia to reduce the quality of services like GeForce Now even as they make more money.
Or Facebook, which can destroy our political discourse and facilitate genocide in Myanmar
and then, well, get headlines about how good a
CEO Mark Zuckerberg is and how cool his chains are and how everything's just fine with Facebook
and they're making more money. No, no. I actually want to take a little bit of a step back here.
I previously mentioned, I've said it twice now, that Meta enables genocide and destroys our
political discourse.
I want to be clear, when I say that everything is justified at Meta, I'm actually quoting
their Chief Technology Officer.
That's quite literally what Andrew Bosworth said in an internal memo
from 2016, where he said, and I quote, ahem: all the work Facebook does in growth is justified, even if that includes, and I'm
quoting him directly, somebody dying in a terrorist attack coordinated using Facebook's
tools.
Now, the mere mention of violent crime is enough to create reams of articles questioning whether
society is safe and whether we need more plastic in our Walgreens.
Yet our digital lives are this wasteland that people still discuss like a utopia.
Seriously, putting aside the social networks, have you visited a website on your phone recently?
Have you tried to use a new app?
Have you tried to buy something online starting with a Google search?
Within those experiences, has anything gone wrong?
You know it, I know it has.
You know it has.
It's time to wake up.
We, the users of products, we're at war with the products we're using
and the people that make them.
And right now, we are losing.
The media must realign to fight for how things should be.
This doesn't mean that they can't cover things positively, or give credit where credit
is due, or be willing to accept that something could be something cool.
But what has to change is the evaluation of the products themselves, which have been allowed
to decay to a level that has become at best annoying and at worst actively harmful for
society.
Our networks are rotten, our information ecosystem is poisoned with its pure parts ideologically
and strategically concussed, our means of speaking to those that we love and making
new connections are so constantly interfered with that personal choice and dignity are all
but removed.
But there is hope.
There really is.
Those covering the tech industry right now have one of the most consequential jobs in
journalism if they choose to fucking do it. Those willing to guide people through
the wasteland. Those willing to discuss what needs to change, how bad things have gotten,
and hold the powerful accountable and say what good might look like, have the opportunity
to push for a better future by spitting in the faces of those ruining it.
I don't know where I sit by the way, I don't know what to call myself.
Am I legacy media?
I got my start writing in print magazines.
Am I an independent contractor?
Am I an influencer?
Am I a content creator?
I truly don't know and I don't know if I care, but all that I know is that I feel like
I'm at war too and that we, if I can be considered part of the media, are at war with people
that have changed the terms of innovation so that it's synonymous with value extraction.
Technology is how I became a person, how I met my closest friends and loved ones, and
without it I wouldn't be able to write, I wouldn't be able to read this podcast, I
wouldn't have got this podcast.
And I feel this poison flowing through my veins as I see what these motherfuckers have
done and what they're continuing to do.
And I see how inconsistently and tepidly they're interrogated.
Now is the time to talk bluntly about what is happening.
The declining quality of tech products, the scourge of growth hacking, the cancerous growth-at-all-cost
mindset.
These are all the things that need to be raised in every single piece.
And judgments must be unrelenting.
The companies will squeal, oooh, that they're being so unfairly treated by the biased legacy
media.
Ooh, ooh, save me.
Hey, Nilay Patel, in your interview with Sundar Pichai, this is how you sounded when you handed him
your phone. It was pathetic.
They should be scared of you, Nilay.
The powerful should be scared of the media.
They shouldn't be sitting there sending letters to the editor like fucking customer support.
No.
They should see this podcast.
They should see these newsletters.
They should see everything published by the tech media and go, uh-oh.
And there can be good people.
There can be good boys and girls and others.
There can be plenty of people that make good products and get great press for it.
But do you really think Meta, Google, Apple to an extent frankly, do you think Amazon
looks good right now?
Do you think it's easy to find stuff or do you think it's slop, full of more slop?
Mark Zuckerberg said on an earnings call the other day that he intends there to be an AI
specific slop feed. That should... these are harmful things.
This is pouring vats of oil into rivers and then getting told you're the best boy in town.
These companies, they're poisoning the digital world and they must be held accountable for
the damage they're causing.
Readers are already aware. But, and this is really thanks
to members of the media by the way, they're gaslighting themselves into believing that
oh, I just don't keep up with technology, it's getting away from me, I'm not technical
enough to use this. When the thing that they don't get, that the average person doesn't
get is that the tech industry has built legions of obfuscations, legions of legal tricks,
and these horrible little user interface traps specifically made
to trick you into doing things, to make the experience subordinate to getting the
money off of you.
And I think that this is one of the biggest issues in society, and yes, of course I'm
biased, I'm doing a podcast about tech, but for real though, billions of people use smartphones,
billions of people are on the computer every day.
It's how we do everything.
And it stinks.
It stinks so bad.
This is the rot economy.
We're in the rot society.
But things can change.
And for them to change it has to start with the information sources and that starts with
journalism and the work has already begun and will continue.
But it must scale up and it must do so quickly. And you, the user, have the power. Learn to read a privacy policy
and the link in the notes is to the Washington Post. Yes, there are plenty of great reporters there.
Fuck Bezos. You can move to Signal, which is an encrypted messaging app that works on
just about everything. Get a service like DeleteMe. And by the way, I pay for it. I
worked for them a lot, four years ago. I have no financial relationship with them now, but they're great for removing you from data
brokers.
Molly White, who's a dear friend of mine, and an even better writer, you might remember
her from one of the earlier episodes about Wikipedia. She's also written this extremely
long guide about what to do next that I'll link to in the notes, and it runs through
a ton of great things you can do. Unionization, finding your communities, dropping apps that
collect and store sensitive data, and so on. I also heartily recommend Wired's Guide to Protecting Yourself from
Government Surveillance, which is linked in the show notes.
Now, before we go, I want to leave you with something that I posted on November 6th on
the Better Offline Reddit.
The last 24 hours have felt bleak and will likely feel more bleak as the months and years
go on.
It'll be easy to give in to doom, to assume the fight is lost, to assume that the bad guys have
permanently won and there will never be any justice or joy again. Now is the time for solidarity,
to crystallize around ideas that matter, even if their position in society is delayed,
even as the clouds darken and the storms brew and the darkness feels all encompassing and suffocating.
Reach out to those you love and don't just commiserate, plan.
It doesn't have to be political, it doesn't even really have to matter.
Put shit on your fucking calendar, keep yourself active and busy and if not distracted,
at the very least animated. Darkness feeds on idleness,
darkness feasts on a sense of failure and a sense of inability to make change.
You don't know me very well, but know that I'm aware of the darkness and the sadness
and the suffocation of when things feel overwhelming.
Give yourself some mercy, and in the days to come, don't castigate yourself for feeling gutted.
Then keep going.
I realise it's little solace to think, well if I keep saying stuff out loud things will get better,
but I promise you doing so has an effect and actually matters.
Keep talking about how fucked up things are.
Make sure it's written down, make sure it's spoken cleanly and with the rage and fire
and piss and vinegar it deserves.
Things will change for the better, even if it takes more time than it should.
Look I know I'm imperfect.
I'm emotional, I'm off kilter at times, so I get emails saying that I'm too angry.
I'm sorry if it's ever triggered you, I really do mean that.
It's not intentional, I just feel this in everything I do.
I use technology all the time and it is extremely annoying, but also I'm aware that I have privilege.
And the more privilege you have within tech, the more you're able to escape the little things.
Go and buy a cheap laptop today. Try and see what a $200, $300 laptop is. It's slow.
It's full of 18 pop-ups trying to sell you access to cloud storage, to shit that you'll never use,
tricking grannies and people who can't afford better laptops, people that just don't know.
When I see this stuff, it enrages me.
Not just for me, but because I know that I'm at least lucky enough to know how to get around this shit.
Spent most of my life online, spent most of my life playing with tech, I know how it works.
And I know I have my tangents and my biases, but I wear them kind of like my heart on my sleeve.
I care about all of this stuff in a way that might be a little different to some. And it's because I've watched an industry that really made me as a person, that allowed
me to grow as a person, to actually meet people, to not feel as alone.
And I imagine some of you feel like this too.
And then watching what happens to it every day, watching the people who get so rich off
of making it so
much worse and then seeing what happened on November 5th. And you can draw a line from
it. People are scared. They're lost. Their lives are spent digitally and their digital
lives are just endless terrorism, endless harm. Some of you know your way around tech
so you can escape some of it, but it's impossible to escape all of it.
Try meeting people these days. You can't. Everything is online and everything online,
everything on your phone is mitigated and interfered with. It's an assault on your senses, one deprived of dignity.
And I see the people doing this and it fills me full of fucking rage.
And it makes me angry for you and for me.
For my son growing up in what will probably be a worse world.
For my friends and loved ones who are harder to see, harder to speak to, whose lives too
are interfered with.
And there are the millions and millions of people who have no fucking idea it's happening.
That just exist in this swill and this active digital terrorism. Poked and prodded and nagged
and notified constantly. Early on in this I got a message saying don't
tell people to be angry. I stick by that. But I'm not going to hide that I am.
I'm not going to hide the pain I feel. I'm not going to hide the pain I feel seeing this
shit happen. And I've watched this thing that I love, technology. I really do love tech.
I really do, deeply. I've watched it corrupted and broken, and the people breaking it don't just make billions of dollars, they get articles written about them, they get interviewed on the news. Mark Zuckerberg, he wears a chain
and there's articles about how cool he is. He should be in fucking prison. He should
be in a prison on a boat that just circles the world and he shouldn't have air conditioning
or heat depending on how the weather is. And I know that I'm kind of errant and again,
tons of tangents, but look, the reason I'm like this is because I really care.
And I think caring, I think being angry at the things that actually matter
and giving context as a result, I think that's deeply valuable. And I realize I do fly off the
handle a lot, but it really is because I care.
I care about you.
I care about the subject matter.
I'm so grateful and so honored that you spend your time listening to me every week, and I hope you'll continue to do so,
because I'm not going anywhere.
Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song is Matt Osowski.
You can check out more of his music and audio projects at mattosowski.com.
You can email me at ez at betteroffline.com or visit betteroffline.com to find more podcast
links and of course my newsletter. I also really recommend you go to chat.wheresyoured.at to visit the Discord
and go to r slash betteroffline to check out our Reddit. Thank you so much for listening.
Better Offline is a production of Cool Zone Media. For more from Cool Zone Media, visit
our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Welcome to the Criminalia Podcast.
I'm Maria Trimarchi.
And I'm Holly Frye.
Together, we invite you into the dark and winding corridors of historical true crime.
Each season, we explore a new theme from poisoners to art thieves.
We uncover the secrets of history's most interesting figures from legal injustices
to body snatching. And tune in at the end of each episode as we indulge in cocktails
and mocktails inspired by each story. Listen to Criminalia on the iHeartRadio app, Apple
podcasts or wherever you get your podcasts.
Welcome to Decisions Decisions, the podcast where boundaries are pushed and conversations get candid.
Join your favorite hosts, me, Weezy WTF, and me, Mandy B, as we dive deep into the world of non-traditional relationships and explore the often taboo topics surrounding dating, sex, and love.
That's right. Every Monday and Wednesday, we both invite you to unlearn the outdated narratives dictated by traditional patriarchal norms.
With a blend of humor, vulnerability, and authenticity, we share our personal journeys navigating our 30s, tackling the complexities of modern relationships, and engage in thought-provoking discussions that challenge societal expectations.
From groundbreaking interviews with diverse guests to relatable stories that'll resonate with your experiences, Decisions Decisions is gonna be your go-to source for the open dialogue about what it truly means to love and connect in today's world.
Get ready to reshape your understanding of relationships and embrace the freedom of authentic connections. Tune in and join in the conversation.
Listen to Decisions Decisions on the Black Effect Podcast Network, iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.