Grubstakers - Episode 54: Larry Page and Sergey Brin (Part 2)
Episode Date: February 18, 2019. Part 2 of 3 in our dive into Larry Page, Sergey Brin, and how they helped create the eyes that never stop watching us. This episode is more focused on Google's shady business practices and how Google and the US government have been trading employees like Pokémon cards. And we still got a MIDI keyboard to do our drops! Pretty gross.
Transcript
Get lost, please.
Thank you.
I can tell you that every job has its ups and downs, and a union can't change that fact.
I mean, it is the magic elixir of our age and of all ages.
What it does for prostate cancer is amazing.
You get a $200 billion profit and you didn't have to pay any tax. Isn't that true? Listen. Yes. Is that true or not? Yes, it is. You do not pay a profit when someone attacks, when someone makes yourself, for you become Secretary of Treasury, so you didn't have to pay the tax there. Oh, in five out to...
He never gets fully hard.
He never gets fully hard.
He never gets fully hard.
He never gets fully hard.
He never gets fully hard.
He never gets...
He never gets fully hard.
Hello, welcome back to Grub...
I love having the support of real billionaires.
He never...
Welcome back to Grubstakers, the podcast about real billionaires who never get fully hard.
I'm Sean P. McCarthy.
I'm here.
I hope you enjoyed our new theme song.
I'm joined by my friends.
Yogi Poliwog.
Andy Palmer.
Steve Jeffries.
And so this is part two of our Google episode.
Larry Page, Sergey Brin.
Hopefully you listened to the previous one.
A lot of great stuff in there. You learn all about the economy of starting a web series: working on Google Glass, getting close to the family of a billionaire, and then using his capital to start a web series about how it's wacky that you have depression. In an apartment, very specifically.
If there was an Olympic event for getting out of bed with your depression.
You need a RED camera to shoot that, to really capture comedically what that would look like.
You know, the little insider trick on comedy is that
the more expensive the camera, the funnier it
is.
Some of you, just to peek behind the curtain, you're probably like, all right, I listened
to that last week.
Not enough drops.
That drop keyboard was really annoying, but they've had a week, and I'm sure that they're
completely over it now.
Andy is currently handcuffed
with the microphone taped in front of him.
That's right.
You probably also sent us a bunch of angry Twitter messages
about how stupid that was.
The thing is, though,
we're recording these both on the same day,
so it's still novel to us
and we still haven't seen your angry Twitter
messages.
Right.
So, we appreciate your feedback on part one. None of it will be incorporated into part two.
And it's not because we don't appreciate your feedback.
It's just because we haven't heard it yet.
But so where we left you in part one was essentially this... He never gets fully hard. Not that.
Where we left you in part one was essentially the story of Google into 1999.
1999, they get this $25 million capital injection from the venture capitalists in Silicon Valley.
They get this office space.
You know, it's kind of like a fratty environment.
They're hiring a lot of people who are fresh out of Stanford
or Stanford grad school.
Sergey Brin
is spending a lot of time
in the masseuse room
with various employees.
He never gets fully hard.
He never leaves the masseuse room.
It's his office.
I wonder if he has that in common with other
high-profile billionaires.
I think so.
I think he does.
He never gets fully hot.
What?
I'm just imagining Epstein and Sergey Brin having an arms race to build the most luxurious masseuse room.
I assume that Brin and Page don't have Epstein connections just because you can look up Epstein on Google.
I think so.
Yeah, if we have to start doing our Epstein research on Bing,
that will be a very bad sign.
Turns out you can't find it on Bing
because Gates was on the island.
You have to go to the dark web for that.
Speaking of which,
Larry Page got married
on Richard Branson's island
where I guess you guys
talked about in the episode
he groped somebody.
So, you know,
wonderful people, billionaires.
There's also a photo
of Larry Page
at his brother-in-law's wedding
and he's wearing Google Glass
at the wedding.
And it's like,
imagine going to a wedding
in a Bluetooth headset
and not being the biggest douche
at the wedding.
I like the idea of like, they go to the after party of the wedding and they're like, you're going to have to take those Google Glasses off.
Because Epstein provided the entertainment here and we can't have this on recording.
Though he did have a provision that he gets Google Glass.
I know I already made this joke
on a previous episode,
but now I'm imagining
the Eyes Wide Shut mask
with the Google Glass over it.
Like, damn.
These child orgies
have really entered Web 2.0,
haven't they?
It's required.
Everyone at the orgy now has to have Google Glass
so we don't recognize who's who.
Put them on.
Everyone at the Eyes Wide Shut party
has Insane Clown Posse makeup
so that they don't get caught
by the facial recognition software.
Yeah, I was going to say,
the augmented reality system is going to start
asking you to tag people and everything.
But so I guess where we left you on the previous episode chronologically was, as we mentioned, 99, they get this big capital injection.
They're in an office.
It's kind of a fratty environment.
Most people say around 2001, Eric Schmidt gets hired as the CEO, supposed to be the adult in the room.
Sheryl Sandberg of Lean In fame, future episode.
I've got the audio book already.
She also gets hired in 2001.
It's an act of self-harm.
Supposedly, these two hires kind of clean up the environment a bit.
It's less fratty, more directional, less of a party kind of thing.
But what happens importantly, of course, in 2000 is the dot-com crash.
The dot-com boom keeps booming.
Yes.
All of the Wall Street scam of essentially...
The boom of 2000, the boom that never ended.
The Wall Street scam was basically this: they were IPO-ing all these stocks, which had no way of making money whatsoever, but everybody was so hyped about the infinite potential of the World Wide Web that these Wall Street firms would give preferred shares to, of course, themselves and also preferred clients, and then send their analysts out on TV to hype it. And then a bunch of dipshits, or the normal public, or pension funds, or whatever, would buy into these stocks.
And then as soon as the price got ticked up from what the initial buy-in of the preferred shares was,
they would all dump their stock.
So, you know, Wall Street and the connected individuals were doing fine,
kind of pumping and dumping these companies that had no way of making money.
And then about 2000, the bubble crashed.
One explanation I heard, actually from that Google Guys book, was they said companies stopped buying new computers after Y2K happened. Like, everybody was just buying all of this tech shit before Y2K, and then there was no Y2K, so they stopped buying all these new computers, and then that finally popped the bubble.
Yeah, that makes sense.
So that's one explanation I've heard.
But whatever the cause,
the dot-com crash wiped out a lot of companies that had no way of making money whatsoever
and were just complete pump-and-dump scams.
Like Bryan Singer's web video website.
Jesus Christ.
What is this?
It was...
Bryan Singer bought... funded a website that would play web series.
And they were made for kids, made for young teenagers.
Shot entirely on Google Glass. They starred young teenagers, and it turns out that it wasn't profitable to release a web series when the internet couldn't handle video.
And also, that wasn't the point.
One of the creators of that website is now on the run because they systematically molested everyone involved,
and it's only now coming to light for Bryan Singer.
But if you watch An Open Secret, it goes into all that.
And we talked about this a bit on the David Geffen episode
where David Geffen's connection is like he was at some of these parties
that Bryan Singer and the other guy Andy just mentioned would have. Parties where children have alleged that they were molested and raped. You know, fun stuff. All that kind of Eyes Wide Shut Google Glass nonsense.
But anyways, the point is... You know what, I have no idea how we even got on that.
Sorry, we've still got Epstein on the brain, clearly.
Yeah. But who doesn't?
Yeah. But so the point was, there was a dot-com crash, and hundreds of fronts for child molestation were wiped out overnight in the dot-com crash. Webistics was gone. All sorts of things.
But so the dot-com crash happens in 2000, and Google's actually able to weather this storm pretty well. They're getting their initial money from licensing their search engine to other companies. I think they get a licensing deal with AOL eventually. They're also just doing a kind of traditional advertising model. But it's really the dot-com crash, and a lot of people also give credit to Sheryl Sandberg on this. Again, she was hired in 2001. What happens in 2002 is they really rejigger their advertising model.
Well, they kind of came into a crisis, because they were able to survive it, but they were kind of hanging on by the skin of their teeth. There was a lot of pressure from the VC people to make more money, because they were licensing, but they couldn't.
I feel like those people don't have enough money, so they do need more of it, in fairness to them.
And so part of it was, everyone was freaked out by the dot-com bust. So it led to the venture capitalists, who, I mean, I don't know for sure, but they probably lost some money in other ventures, then turning to Google and being like, hey, you're not gonna fuck up, are you? We need you to make some money. And so then Google...
I just got wiped out by Bryan. I'm still investing in X-Men 4.
So then they brought,
AdWords was kind of a marginal venture in Google at the time.
It was the licensing
that was their primary revenue stream,
but they realized
that they had to bring in more money.
And so basically in 2002, they switched to go down a trajectory that they're still on.
Right. And yeah, it is interesting, because we also mentioned on the previous episode, Sergey Brin wrote this paper at Stanford which is essentially like, an advertiser-supported search engine will be inherently corrupted, because it will favor advertisers over other people. And of course their motto, famously, is Don't Be Evil. So it's like, whatever idealist kind of aspirations they got into the company with, I think the real turn happens in 2002, where evil was reinterpreted to mean making the venture capitalists unhappy.
Yes.
Evil means losses for shareholders.
But so the turn really comes in 2002.
And just from the book The Google Guys, excuse me, I mentioned it last episode, it's called The Google Guys, by Richard Brandt, and they have a short explanation of AdWords here.
What they came up with was a system that would let advertisers bid online
to set prices with those ads automatically matched to search terms
without advertisers ever talking to an ad rep.
And again, when we say matched to search terms, what we mean is people will search for things, and then AdWords will say, okay... The example they give in The Creepy Line documentary is: this person is searching for umbrellas, so we will start showing them ads for people selling umbrellas. And it kind of builds from there, because they just need more and more information about you, and they're collecting a profile about you, all of this so that they can show you targeted ads that you are going to interact with.
And this idea kind of started with a series of revelations. At first, what they realized, and this was before they decided to implement it in ads, was that there was what they called data exhaust, which was essentially just excess data that would come from a search, that they could collect, but which they thought was unnecessary. It would be, like, how a search is phrased, spelling, punctuation, how long people would dwell on a certain part of the page, their click patterns, their locations, say, from the IP address. This was all considered basically data exhaust. But then they realized that this stuff could be folded into the search engine to make searches more accurate.
And one of the breakthroughs they realized was, let me see here... They found, in 2002, a bunch of searches for Carol Brady's maiden name. And it was 48 minutes after the hour, and it first appeared on the East Coast, then it appeared in each successive time zone, out to Hawaii. And what they realized is that it was a question at the end of Who Wants to Be a Millionaire.
And so that kind of triggered their realization that they could essentially predict behaviors using past behaviors.
Right,
right.
And so they kind of folded that into the search engine.
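The pattern they spotted, the same query bursting at the same minute past the hour in one time zone after another, is easy to sketch. This is a toy illustration with made-up log data, nothing like Google's actual pipeline:

```python
from collections import defaultdict

# Toy query log: (utc_hour, utc_minute, timezone, query). All entries are
# hypothetical; they just mimic the Who Wants to Be a Millionaire cascade
# the hosts describe.
LOG = [
    (1, 48, "US/Eastern",  "carol brady maiden name"),
    (2, 48, "US/Central",  "carol brady maiden name"),
    (3, 48, "US/Mountain", "carol brady maiden name"),
    (4, 48, "US/Pacific",  "carol brady maiden name"),
    (2, 13, "US/Eastern",  "umbrellas"),
]

def find_cascading_queries(log, min_zones=3):
    """Return queries whose burst repeats at the same minute past the hour
    in successive, hourly-offset time zones."""
    by_query = defaultdict(list)
    for hour, minute, zone, query in log:
        by_query[query].append((hour, minute))
    cascading = []
    for query, hits in by_query.items():
        hits.sort()
        same_minute = len({m for _, m in hits}) == 1
        hourly_steps = all(b - a == 1 for (a, _), (b, _) in zip(hits, hits[1:]))
        if same_minute and hourly_steps and len(hits) >= min_zones:
            cascading.append(query)
    return cascading

print(find_cascading_queries(LOG))  # prints ['carol brady maiden name']
```

The point generalizes: any behavior that repeats on a schedule lets you predict its next occurrence, which is the realization being described here.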
And so then, once they really came under the gun in terms of needing money, they took these ideas of folding in the data exhaust and folded it into ads. And so then they would be able to calculate the click-through rate. Like, if someone clicks on an ad, Google can read that and then mark that as a success. Whereas before that, ads were just billed based on how many times someone would see them. But Google started billing them based on the success of the ad.
Right.
So instead of eyes per ad,
it became clicks per ad? Is that what you're saying?
It became click-throughs.
So they essentially created the clickbait
market?
Sort of, yeah.
I mean, it's 2002.
I mean, they definitely did redefine internet advertising.
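That shift, from charging for eyeballs to charging for clicks, is simple to make concrete. A minimal sketch with made-up rates (CPM being the older "cost per thousand impressions" model):

```python
def bill_cpm(impressions, cpm_rate):
    """Old model: charge per thousand impressions, regardless of clicks."""
    return impressions / 1000 * cpm_rate

def bill_cpc(clicks, cost_per_click):
    """Google's model: charge only when someone actually clicks the ad."""
    return clicks * cost_per_click

# Hypothetical campaign: 200,000 impressions at a 0.5% click-through rate.
impressions = 200_000
clicks = int(impressions * 0.005)              # 1,000 clicks

print(bill_cpm(impressions, cpm_rate=2.5))     # 500.0, whether or not anyone clicks
print(bill_cpc(clicks, cost_per_click=0.50))   # 500.0, but only because 1,000 people clicked
```

Same invoice, very different incentives: under CPC the advertiser only pays for measured "success," which is why the click-through data suddenly became so valuable.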
And essentially another part of the story, which we'll get to,
is they kind of get into an arms race with Facebook
because they invent this kind of AdWords service
where they're showing ads based on this profile they're creating of your searches.
But then, you know, Facebook has even more data on you,
so Google needs more data.
But I mean, that's changing the face of advertising, period.
Because beforehand, you'd put a billboard up
because you can say like,
oh, 800,000 people drive by this billboard every week or whatever.
But to change that from, well, 100 people will click on this instead,
that's a huge shift.
Yeah, yeah.
I mean, we mentioned in the last episode, and on the Facebook episode, or Zuck episode: pretty much all the money in advertising now is going to Google and Facebook.
Yeah, almost three quarters of it in the U.S.
Disgusting.
Yeah. And again, from the Google Guys book: basically, in January 2002, Larry and Sergey give the go-ahead to convert what was their ad system at the time to this new system.
Their old system was essentially fixed fee ads were placed at a box at the top of search results that clearly said, you know, ads here.
And then they switched to the auction-based AdWords.
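A minimal sketch of what an auction-based, per-click system like that looks like. This is a plain second-price auction with hypothetical bidders; the real AdWords auction also weighted bids by ad quality and predicted click-through rate, which is omitted here:

```python
def run_keyword_auction(bids):
    """Simplified second-price keyword auction: the highest bidder wins the
    slot but pays a penny more than the runner-up's bid, so nobody has to
    talk to an ad rep to set prices."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] + 0.01 if len(ranked) > 1 else ranked[0][1]
    return winner, round(price, 2)

# Hypothetical advertisers bidding on the search term "umbrellas"
bids = {"acme_umbrellas": 1.50, "rain_city": 1.20, "parasols_r_us": 0.75}
print(run_keyword_auction(bids))  # prints ('acme_umbrellas', 1.21)
```

The second-price rule is what makes the whole thing automatic: every advertiser can just bid their true value, and the market sets the price.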
And the other reason, as we mentioned, why this was so profitable is people who wanted to advertise would essentially, you know, bid in an auction, and these are automated and based on per click. And again, it's a very important change, but we should mention that essentially they were sued right after they launched this, because they stole this from somebody.
Sure, sure. Yeah, yeah, they took it from Overture.
Yeah, there's this great quote. So I read the book Surveillance Capitalism, by Shoshana Zuboff.
Just real quick.
I do like the idea, though, of somebody being like, you know what?
Intrusive surveillance capitalism is okay, but plagiarism is not.
Well, she, like, quotes this other book...
Be original, if you're going to snoop into every aspect of my life to sell me products and compromise my data and my privacy.
But sorry, yeah. In this other book, they quote someone as saying, like, and then Page and Brin saw Overture using ranked advertising, and they had the ingenious idea to take it and make it their own.
It's like, that's really stretching ingenious.
Yeah, well, while Brin was in the massage room,
he happened to see Overture's technology in between sessions.
But so what happened there is Overture does sue them for this, and this gets settled in 2004.
Yahoo buys Overture, and Yahoo grants Google a license, a perpetual license, for this patent that Overture
invented in exchange for about 2.7 million
shares of Google stock.
I just realized Sean says Yahoo the same way
you'd expect Christopher Walken to.
Yahoo.
But so
yeah, I mean they stole it but they were making
so much fucking money that they got away
with it by basically just giving Yahoo
1% of their company.
Yeah.
So, and then this kind of continues.
2003, they invent AdSense.
And AdSense is very similar to AdWords.
You might be familiar with it if you've invented a website or put up a website.
Basically, it's a system where, according to the Google Guys book, Google uses computer algorithms to analyze the data on a website and choose which ads people visiting the site were most likely to click on.
So if you happen to have put up a website
and you have traffic,
you can get Google AdSense on there
and you'll get a bit of money
just by letting Google
do their fucking surveillance capitalism on your site
and determine what people who are checking out your site
are also Googling
or Google Chroming or checking out on their Android device, whatever the case may be.
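The mechanism being described, an algorithm reading a page and picking the ad most likely to fit it, can be caricatured in a few lines. The matching here is a toy keyword overlap; Google's actual content analysis is obviously far more involved:

```python
def pick_ad(page_text, ad_keywords):
    """Return the ad whose keyword set overlaps most with the page's words.
    A toy stand-in for AdSense-style content analysis."""
    words = set(page_text.lower().split())

    def overlap(ad):
        return len(ad_keywords[ad] & words)

    return max(ad_keywords, key=overlap)

# Hypothetical ad inventory, keyed by the words each ad targets
ads = {
    "umbrella_ad": {"rain", "umbrella", "weather"},
    "podcast_ad":  {"podcast", "billionaires", "episode"},
}
print(pick_ad("a podcast about billionaires and their shady business", ads))  # podcast_ad
```

Swap the keyword overlap for a model of the visitor's whole search-and-browsing profile and you get the targeted version the hosts are describing.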
Google more like Big Brother.
Yeah.
Shoshana Zuboff changed the name to Big Other.
Oh.
What, really?
No, yeah.
Well, she goes into it.
Another concession to political correctness.
We don't want to gender the surveillance state.
There's plenty of women violating the surveillance and privacy of people.
So it's not just a brother.
It's a big other.
I'm just imagining Sheryl Sandberg's Lean In cover,
but on one of those big brothers watching posters.
Lean In.
And so AdSense kind of expands this, and now all sorts of websites that aren't even Google are doing the same kind of targeted advertising model that Google is doing. But, Andy, you wanted to talk about B.F. Skinner, just kind of the general mindset behind surveillance capitalism and this shift here.
Yeah, yeah. First I wanted to play this clip, basically on Google's business model, to kind of set the stage. It's from this town hall with the guy who hosts Planet Money, one of them two bootlickers, and then daddy David Harvey. And you can kind of just hear the tone of the bootlicking throughout.
And I thought I would sort of talk with the three of you
about what does the presence,
let's talk about that right now,
of such a huge, iconic company
choosing to make New York a major part of its home.
And Ed, I'm guessing that for you this is, your book is called The Triumph of the City.
This would be a triumph for New York City.
Is that fair?
Well, I think it's an indication of the enduring strength of cities.
And I think I couldn't think of a better example of that than Google.
And then he goes on to bootlick a bunch.
It did just occur to me that with Yogi's view of the Manhattan skyline,
they are probably watching us from the Google Chelsea office right now.
A lot of our stuff is stored on Google Drive.
Yeah, I was going to say,
we have to finish recording this episode before my Google Doc research is locked.
So then after a bunch more bootlicking, he goes over to Harvey.
And so I'm finding myself really wondering,
what are you going to say about this?
He introduces David Harvey.
How does this make...
Well, I'll tell you what I'm going to say.
The interesting thing about this building
is to think about the labor process that's involved
and the activities going on there.
What does it make?
What does it produce?
And it turns out with Google
that we, the public, actually do the labor.
We do it. They don't make anything. They just sit there in that building and what do they extract from us? They extract rents
from us. In other words, this is a totally parasitic form of economic activity.
That's what you would say.
All right.
And then he goes on to say
that it used
to be American cities would make things and like
Latin American cities would just extract rent
but now that's switched over.
New York City is one of the most
parasitic
economies in the universe.
He's listened to our podcast.
It's a form of new
industrial organization which is not about making anything.
As I said, it's really about extracting labor from everybody else who actually contributes all the information that Google then utilizes and sells to everybody else at a profit.
Here we go. Here we go. Here's the real bootlicking.
I have no interest in Google personally
and I'm not here as a representative of Google.
But I think that actually it's a narrow view
of what Google does to say that they don't produce anything.
Now...
So...
Get fucked.
He completely misunderstood what Harvey said, right? Which was that Google extracts all of its labor from people. And then this guy's like, well, no, you wouldn't say that they don't produce anything. And it's like, well... I mean, the font for the website was custom-made.
Yeah. So, well, let's get into... I wanted to get into sort of...
Oh, go ahead. You cut off the whole David Harvey quote. He goes, New York is one of the most parasitic economies in the world.
Four dipshits just record themselves
hanging out, and then they charge
$5 a month on Patreon.
And this goes on in thousands and thousands
of cases. It's very...
This is absolutely fucking stupid.
This is absolutely fucking stupid.
So.
But yeah,
and one more illustrative thing
from the Google Guys book.
Basically, this AdSense and AdWords,
again, revolutionizes internet advertising.
Facebook would do it again,
but from the book,
by the end of 2008,
Google had at the time
captured 75% of internet search
advertising, while Yahoo held on to about 20% and Microsoft just 4%. Google's revenue from
advertising came in at $5.5 billion in the third quarter of 2008. So again, this is wildly
successful and becomes the model for the industry that just really has become an arms race,
eviscerating people's privacy. Basically, to understand what the goal of surveillance
capitalism is or how it's meant to be revolutionary, you have to go back to essentially
the 30s or 40s, to a man named B.F. Skinner, which is short for Burrhus Frederic Skinner,
who Principal Skinner was named after.
Skinner!
Skinner!
We don't have that drop.
So,
he was a pioneer,
an early pioneer,
of the idea of behavioral psychology.
And a lot of you probably learned about this
in, like, Psychology 101 class. That's certainly where I learned about it.
Lame.
Yeah. The main idea behind his behavioral psychology was...
Famous for his steamed clams.
Operant conditioning.
Yeah, that is really how I get through my relationships.
Operant conditioning.
It's a good way to hide the aurora borealis.
Drug discovery.
So, operant conditioning. You're probably aware that there's positive reinforcement,
which is when you reward a good behavior.
I've used this to teach my cat to high five.
That's right, Andy.
Yeah, by saying high five and giving her treats when she does it.
There's also negative reinforcement, which is where...
That's where you criticize us on Twitter, and so we get Andy a keyboard that he can play drops with.
He just makes a million nonsensical sound effects.
No, that's punishment.
That's different from negative.
Yeah, Sean, you idiot.
One random thing I learned in my UW Psychology 101 class
that I remember is they said negative reinforcement
is also reinforcement.
Maybe that's kind of a misnomer.
I don't know, like negative conditioning or something.
If you're trying to discourage a behavior,
you don't do reinforcement essentially.
Right, right.
Well, I think the idea is, negative reinforcement is actually trying to encourage a behavior, and punishment is to discourage.
So negative reinforcement is like I play a bunch of drops and then when Sean starts saying something interesting,
I stop.
But you never stop playing drops.
What does that mean?
We're still waiting for the day.
And then of course,
punishment is exactly what we all know it to be.
We use that on the podcast
where we have this little cattle prod
and then when Sean starts to give his opinion on feminism, we give him this little zap. Not enough to hurt him, but enough that he doesn't want it to happen again.
And so B.F. Skinner...
That's what it is. Andy, shut up.
Skinner took these ideas, which he basically just got from, like, torturing lab rats or giving them pellets and stuff, and then built an entire worldview on it. And so he wrote a shitty novel called Walden Two,
where it's basically, I haven't read any of Ayn Rand's books, but it seems similar to them where
it just has a protagonist explaining this new world he's created where everyone is conditioned to do the right
thing based on something that's agreed upon by the community. Basically, he took his behaviorist
ideas and then said, okay, we could base society on this. When people do bad things, they get
punished. When they do good things, they are rewarded. And that way everyone does the right thing in society.
And a part of this idea was that there's no such thing as free will, as he sees it. Like, he took the idea that, because physicists have shown that all human processes are the result of physical properties, and thus, from a physical standpoint, they're all predetermined.
Then free will doesn't exist.
And Skinner then took that to mean that every behavior we take is a stimulus response.
And so everything we do is, you know, learned from essentially elaborate operant conditioning. And he then went in a weird direction with this and wrote a book. I think it was called Beyond Freedom and Dignity.
And if you ask me, this all sounds like a Brave New World. Am I right? I've read a book.
And Beyond Freedom and Dignity, also known as what happens when you log into Facebook.
We've evolved beyond those concepts.
It created Facebook.
He basically said that free will is a mystic idea. Of course, what he ignores is that you can't just say that there's no free will and then give yourself over to someone else to make you do things. Like, you can't escape free will, even if it's not real.
The human experience is not divorced from it. We're stuck inside of it.
And any decision that is made with regards to free will is a decision that's made by someone using their free will. It's either you make a decision or you have someone else make it for you,
but there's still someone making a decision,
even though ultimately it's an illusion.
I probably have lost all of our philosophy student listeners.
To be fair, they were never found.
They stopped listening last episode
after you played too many drops.
We used to be the 181st philosophy podcast in Finland,
and just because of that, we're down to 200.
I don't think it counts.
I don't think you can say we lost our philosophy listeners
if we just lost all of our listeners,
including the ones who studied philosophy.
Shouts out to Czech Republic and Finland,
where we are in the top 180 business podcasts.
Hell yeah.
151 in Finland, 155 in Czech Republic, 190 in the United States last October.
Oh, good news.
All right.
But Andy, so essentially how does this BF Skinner stuff tie into Google and the operating model?
Well, so essentially, the idea of operant conditioning, and controlling people's behavior
through that is...
You really like watching rats suffer.
Google, yeah, Google just put up videos of rats killing each other.
And they were trained to be more efficient at it.
And that really drove their...
So what it means is that Google realized they could make ads more effective the more information they were able to gain from people. Because by gaining information about how people would react to certain ads, they could then take that information and feed it back into the system, process it in a way where you can then use how people naturally react to different stimuli to try to control the reaction they have.
Okay.
So essentially you're saying we're all guinea pigs to the Google machine and
they've learned from us in ways that we didn't even realize and they're
manipulating how we do things by the information they got from us by using
their service initially.
Yes.
Yep, pretty smart.
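The loop being described here, serve something, record the reaction, and fold that back in to decide what to serve next, is at its simplest a multi-armed bandit problem. A toy epsilon-greedy sketch with invented click rates (nothing here resembles Google's or Facebook's actual systems):

```python
import random

def epsilon_greedy(ads, trials=10_000, epsilon=0.1, seed=0):
    """Serve ads, record clicks, and steadily favor whatever gets clicked most.
    `ads` maps ad name -> true click probability (unknown to the algorithm)."""
    rng = random.Random(seed)
    shows = {ad: 0 for ad in ads}
    clicks = {ad: 0 for ad in ads}
    for _ in range(trials):
        if rng.random() < epsilon:                  # explore: try a random ad
            ad = rng.choice(list(ads))
        else:                                       # exploit: best observed CTR
            ad = max(ads, key=lambda a: clicks[a] / shows[a] if shows[a] else 0.0)
        shows[ad] += 1
        if rng.random() < ads[ad]:                  # the user "reacts" (clicks)
            clicks[ad] += 1
    return max(shows, key=shows.get)                # the ad the system converged on

# Hypothetical true click-through rates for three ads
print(epsilon_greedy({"sad_ad": 0.02, "neutral_ad": 0.05, "outrage_ad": 0.11}))
```

The guinea-pig framing falls out naturally: every impression is both an ad and an experiment, and the system needs no model of why you click, only that you do.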
One of the more fucked up things, the other bit of research,
we watched the documentary The Creepy Line.
It's on Amazon Prime Video.
But it talks primarily about Google but also about Facebook
and these kinds of behavioral things.
And one of the things it mentions is that Facebook was essentially showing users with depression depressing content, as a way of testing essentially their ability to control people. And we talked a bit on the Facebook episode about the spike in teen suicide. And so it is entirely conceivable that Facebook killed some people.
Yeah. And it wasn't just people with depression. Facebook did sort of a shotgun approach, where they would just manipulate everyone's newsfeed in different ways, just to see how they would respond, based on almost random configurations of different ways of presenting information to people. And so they would see if something would lower people's moods, or if something would increase people's moods. But essentially, if you were using Facebook during the last few years, you were a guinea pig for their operant conditioning testing.
And part of the idea of getting an edge in behavioral modification is, because humans are so complex, being able to manipulate a human outside of, like, pushing a lever to get candy.
By the way, the most successful utilization
of these ideas is casinos and like slot machines.
But outside of like that condition,
you need to really get as many different dimensions
of human behavior and human experience as possible
And so, in order to do this, Google expanded its operations into just about everything that they could get their hands on.
Right.
And so you kind of see this explosion into, like, Google Maps, Gmail... Like, for instance, when Gmail came out, they would systematically read through the emails, not people, but programs would, to better target ads, of course. And they would give people profiles, so then they could use that to track people's movements. They would use cookies that would track which websites people went to, even when they weren't using Google. And then they would create the Chrome browser, so that they could track people even when they're not on a Google website.
Yeah, let's not forget about all the carbs from all those cookies. Android records what we're doing even when we are not online; as soon as you connect to the internet, Android uploads to Google a complete history of where you've been that day, among many other things.
And yeah, so this is like Andy was saying. You get off Google, they don't know where you are, so they build Google Chrome; now they always know where you are. You're Gmailing, so they can scan that to make ads. And now they have the Android operating system, so even when you're not on the internet, Android is keeping track of where you are and what you are doing, to an extent, the entire day, and it uploads that to Google as soon as you connect to the internet.
I mean, it's fucking terrifying.
And even if you've been offline for a long period of time, it collects where you've been in your history and updates your profile accordingly once you do have a connection.
Yes, Google has a profile of you.
Not to mention the Android phones, which also track and monitor your movements and stuff. And another point that's made in The Creepy Line documentary is, essentially, you will Google things that you wouldn't even tell, like, your closest friends and family. So the Google profile of you is probably the most detailed thing that exists in the world. I mean, it outpaces any sort of fucking surveillance-state kind of bullshit.
It's the new religion. Exactly.
Like, what the fucking CIA or FSB or MI6 has on you is going to be nothing compared to what Google does.
I don't know.
MI6 has some juicy pictures that I never put on Google.
I snail mail them to them
just because I'm such a big James Bond fan.
You don't know how many. You don't know how many
honeypot
operations I have been involved in.
So I will have you know that
MI6 has far more information
than Google. I write to them like, I'd like
to be the next James Bond. It's just a picture
of my asshole.
They write me back like,
we don't do, you can't be
a James Bond at MI6. Click the link.
I'm basically... you own me now. That means I also get to be your next agent.
How did Andy die? Well, MI6 killed him after they decided he was just being too annoying.
They didn't even try. They just showed up to his house like, we might as well do wet work on these people. These pictures will never stop coming.
So, yeah, actually the Android case is interesting, because it's part of Google's strategy. A lot of people will talk about how they think Google is getting into this market or that market because they're
trying to be competitive with Apple or what have you.
But that's not necessarily what Google is trying to do.
Oh?
Because when they get into these markets, they usually don't make a profit off of it.
It was the same thing with YouTube, where they paid like, you know, $2.1 billion for it.
In 2006.
In 2006. And it was $1.65 billion.
$1.65 billion. And quick fact: the negotiations for that deal happened at a Denny's, because they needed to meet someplace secret. So the deal for YouTube with Google happened over mozzarella sticks at a Denny's.
Sergey was like, my mistress needs a place to upload her web series. And they're like,
we want this to still be a popular website
even after you have the rights to it.
I know that this is a little odd,
but I know a couple of stories
with some of the Microsoft CEOs as well.
A lot of tech billionaires have Denny's stories.
I don't know why,
but you'd be surprised how many CEOs
that are billionaires have like,
and then we went to a Denny's. Really? Wi-Fi is so spotty.
Well, that's actually an issue with Google as well. Well, yeah, they're hiding out.
That's what they... Do you want me to talk about that now? No, no, we'll get to that later. Cool.
So yeah, so with YouTube, they basically, it wasn't necessarily that
they would make a profit immediately off YouTube.
What it was, was that they could test analytics, all kinds of different dimensions of analytics,
both from the videos people uploaded and, you know, from the way people would respond
to ads on YouTube.
And so they essentially were able to... that's why, after I uploaded my standup set, I started seeing all those suicide counseling hotline ads.
So, one thing they did early on, while doing all this: people understandably got really uncomfortable with the situation of Google tracking every single thing you do. And they went to great lengths to hide it. They very early on instituted a policy of secrecy inside Google, to keep people from being aware of the extent to which they can surveil things.
Like at one point their lobby had a ticker running of like up-to-date Google searches.
Right.
I remember that.
Yeah.
And Brin told them to get rid of that.
Right around when the news of his affair came out.
I was going to say they've come a long way since the door was open while people were fucking and the employees were watching.
They're like, oh, you can blackmail with that.
You can blackmail with that.
The first thing they did, put locks on the doors.
Yeah.
The massage door is now locked.
Yeah, and I do just want to mention one other thing, kind of loosely chronological. So AdWords, we mentioned, was 2002; then 2003 is AdSense; 2004 is their IPO. And the IPO, really... I mean, again, Sergey Brin and Larry Page control more than half of the voting shares, so they could change anything. But once you're a publicly traded company, responsible to shareholders and investors, you really do get into this generating-ad-revenue race, where they're just doing all of these insanely intrusive things. And it's like a beast spinning out of control. But I did just want to quote from the Value of Genius book,
pretty fascinating little anecdote about what happened after the IPO. They quote one former
Google employee who says, after the IPO, Google became more buttoned up, more metrics driven,
which was good for the company probably,
but it was not the culture that I was used to and that I enjoyed while I was there.
Another employee says they started sending people to Dale Carnegie classes,
you know, public speaking and this kind of stuff.
But the anecdote that I really wanted to share is a former employee,
Heather Cairns, she says,
Larry and Sergey used to hold their forks and knives in a fist, scooping. They used to scoop food into their mouths, which would be a couple inches from the plate. And I'd be like, I can't even watch this. I can't. I'm going to be sick. They had to be taught not to do that.
And then she also says...
And that's where behaviorism comes in.
And then she said after the IPO, nobody has super
bad, disgusting behavior anymore. It's really
depressing. The personality has been coached
out of all of them. So yes,
after the IPO, Larry and
Sergey learned how to eat food.
That's interesting too that they essentially took the behavioral modification techniques
and like folded them inside the company as well.
Because now everybody represents the brand and all that nonsense.
Right, right. Over time, they created a process for dealing with the public, and a process for gaining a hold on their power in the larger sphere of things. And so the process to normalize, as Zuboff says, surplus extraction is: one, incursion; two, habituation; three, adaptation; and four, redirection. What that means is, the first one is incursion.
They just don't ask permission for something; they put it in a terms of service agreement that no one's going to read.
And, of course, those are intentionally long.
Of course.
To make it so no one reads them. You would have to spend, I think, several weeks out of your year to read all the terms of service agreements you encounter in that given year.
So there's incursion where they introduce a thing, for instance, scanning Gmail, scanning everyone's Gmail to target ads.
Then there's habituation where people kind of get worn down and used to
the technology. And then there's adaptation, which is eventually people will, when they find out
about it, make a big thing about it. And so they'll do some superficial changes to say that they're
responding to the public outcry. And then there's redirection,
where they kind of regroup and change to appear compliant,
but they're really just doing the same old thing.
And people eventually just get worn down,
and they're so used to it.
Like, you know, the idea that...
This sounds like corporate conditioning for the masses.
It's like, how do we make sure
that we can get away with
shitty things efficiently and correctly, in a way where nobody realizes we're crooks.
Maybe this is an example of operant conditioning at work, but the CAPTCHAs that you keep filling out, where you have to identify where a stop sign is... I've been told that that's fed into a machine learning algorithm that helps them develop their driverless car.
Oh.
So you're basically helping their cars learn to discern where.
Oh, that's fascinating.
Yeah, yeah.
It goes back to that several times,
including from someone who used to work at Google.
You're saying when I fuck those up, I'm killing pedestrians.
Yeah, so if you fuck up a CAPTCHA, you're sort of culpable in a way.
So whatever you're supposed to solve, for a stop sign, click on the mother pushing a stroller.
It's also funny that, like, their AI can't recognize a big orange or big red octagon. Right, right.
Like, they have to have outside conditioning to be able to recognize that image.
But you all accept it. And they don't even... they never tell you about this, like, other machine learning aspect. They just say, like, use the CAPTCHA so we know you're not a robot to sign into this thing.
And, like, well, actually, robots can now enter CAPTCHAs correctly somewhat frequently.
Yeah, it's like that. It's not even about the security aspect anymore; it's just machine learning for their other products.
The thing that's so terrifying is that, by interacting with these things, we're literally making our lives more miserable tomorrow, today. Like, whether it's drones or self-driving cars, the ways to kill ourselves tomorrow will be more efficient because of our fucking idiocy today.
Well, here's the thing, though: by saying that we're doing it to ourselves... that's also an aspect of Google's conditioning.
That's fair.
They're essentially conditioning us to say that we're doing it to ourselves, when all these technological aspects, and the book Surveillance Capitalism is very good at pointing this out, all of the technology that comes with Google can exist independent of the surveillance-advertising behavioral analysis model. It's a conscious decision on their part to harvest that data and then reuse it to try to, like, drive what we're doing. And then they'll, I mean, this term is overused, but basically gaslight people to say it's their own fault.
Yeah, and that makes sense.
Yeah. And on top of that, there's also the issue where you can't really exist in our modern society without a smartphone. You can't, you know... maps are basically...
Not me, man.
I'm off the grid, baby.
You know, you can't do much of anything without email. You can't connect with people, because basically everyone kind of flocked to computers to escape the disaffection that's inherent in our current economic state, where everyone is atomized.
A quick question.
Do you think people could sue to be like, hey, we're basically helping you build your infrastructure.
I want money from that.
Yeah, people do that all the time.
There were some EU privacy lawsuits.
I don't know exactly how they all got resolved, but it was the right to be forgotten is one case.
Yeah, in Spain they sued for the right to be forgotten. But that's more privacy-based.
I'm saying like with the thing Stephen just mentioned about, like, you know, choosing the stop signs.
Oh, the surplus data?
Like, that's essentially, like, you know, that's work.
I mean, like, that's essentially.
Yeah.
So, like, why can't, I mean, not saying that I want to be compensated for this, but shouldn't Google be paying?
Like, okay, so, like, if I put a Google review out, right?
If I review a restaurant or some shit, they're making money off my review in the long run,
whether it's selling the data to the restaurants or whatever.
So in theory, shouldn't I be making money off contributing to Google success?
Not, no, because you signed a terms of service agreement.
Right, okay, that makes sense.
It is a very powerful combination.
It says once you log in that review, it's property.
Gotcha.
You didn't read the terms of service, Yogi?
When you clicked, I have read and understood the terms of service, you were lying.
Yogi tried to do a drop, but it's plugged into his computer.
Yeah, well, it is a very powerful combination between
like the terms of service between the army of lawyers and just the pure capture of the state
like google and silicon valley owns the u.s democratic party uh and to a
large degree the republican party as well where it's like government fucking everyone i mean you
look at the way mark zuckerberg was treated on capitol hill all this deferential like mr billionaire
you built so many billions so many jobs you know you did so great american dream all this nonsense
well it's like you know you're fucking making children kill themselves and like uh you know you did so great american dream all this nonsense well it's like you know you're fucking making children kill themselves and like uh you know spying on everybody uh absolutely
just exploiting and abusing people's trust and privacy selling their private messages to anybody
who wants them you know and it's like yeah spotify can read your private facebook message really oh
yeah yeah that came out recently but i mean it's just like you look at... What was especially funny about that was that came out just as Spotify was doing their regular
like subway advertising blitz. And so they had those ads that said, like, hey, we're real sorry to the person who made breakup playlists.
And it's like, you can read our messages.
Why are you advertising this?
Right, right.
Idiots.
But yeah, I mean, it is just something where it's like anybody who behaved as badly,
who didn't have just complete control
over the political process,
both here and to a large extent
in the European Union as well.
I know Google's been fined for various practices
in the European Union.
Well, it's also, like with the Zuckerberg thing, it's become an element of fear amongst elected officials, because in 2012 Facebook released a study. Now, the effectiveness of this could be a bit dubious, and it also ties into the 2008 election. But essentially, they have portrayed themselves as having, and to a degree they probably do have, a certain amount of political power. In 2012, Facebook did this experiment where they would try to get more people to vote, and they would measure it by people clicking on, like, a little thing to say I voted, to show that they voted. And so they would try a bunch of different techniques to see if they could influence people to vote.
And so one way, they put up a hot-or-not list of elected officials.
Well, one thing they did was, first, they would tell people, hey, you should go vote, and see what the results of that would be. Then they would say, like, here's a list of your friends, just kind of randomly selected from all the friends on your friends list who clicked I voted: they voted, you should vote. And then one final thing they did is they would use facial recognition software to find friends that are in the same picture as someone else. So, basically, friends that people know face to face,
and they would say,
this person voted,
you should vote.
And that, basically manipulating people's social relations, showed the strongest results for getting people out to the ballot box. And then, of course, the implication is that Facebook can manipulate people's behavior with regards to, you know, going out to vote.
Another political thing that they did: Eric Schmidt, while he was still running Google, went over to the Obama campaign in 2008. Right. And he headed up, basically, Obama's election campaign, and it was historic in that it was the first campaign that used big data, essentially, and all these data analytic tools that they had perfected at Google, and that Facebook was later learning to perfect, in an election. And people who had worked on it would say things like, oh, we knew who people were going to vote for before they did. Which, you know, a bit of that, I'm sure, is advertising.
Sure.
Like, Google and Facebook are also trying to make themselves look all-powerful, because that's how you get the ad revenues. But it also scares elected officials, because then you don't want to piss them off.
Yeah, of course. They have the ability to sway elections now. It's almost as if
it's the government's Big Brother.
Well, two points on that, though. First off, according to The Creepy Line documentary, large parts of the U.S. federal government use Google Docs, Google tools, Gmail.
So in addition, like, besides...
Yeah, the DSA.
I can't wait for the DSA platform that's like, we need to nationalize all of the companies except for big tech. I need my Google Docs.
But so, you know, again, the implication here, I don't know if we've explicitly said it, though I'm sure most people have figured it out: if Google Docs is free, if Gmail is free, all these programs that, say, Microsoft was charging you for, like Microsoft Word, if it's free... there's no free lunch. They are making money by profiling you, learning everything about you, and selling that to advertisers. And basically, as Yogi was saying earlier about getting compensated, essentially the way that they look at it is that they're compensating you by having you use the product. Right, it's free.
You're literally using Google Maps while we're having this conversation.
Yes, continue. Oh, and then just one other thing to mention.
In addition to Eric Schmidt being on Obama's, he was an economic advisor.
Eric Schmidt, Larry Page, and three other Google execs donated $25,000 each to fund a $150,000 party at Obama's inaugural.
And more than 90 percent of political donations by Google employees go to the Democratic Party. But Google, of course, is the largest lobbyist in the United States, and they spread their cash around the way Facebook or most major companies do, so that no matter who's in power, you have a foot in the door.
Are they the largest donor to the Democratic Party, do you think?
I mean, yes.
Like, I know Silicon Valley is kind of the major cash source, whereas the Republicans kind of rely more on, like, fossil fuels
and these kinds of things.
And Wall Street goes back and forth.
They're more with the Republicans now.
But Silicon Valley, California, is, like, a major cash source
for the establishment Democratic Party.
Gotcha.
Essentially, yeah. cash source for the establishment democratic party gotcha essentially yeah eric schmidt um
he had a a big role on obama's uh essentially his economic transition team and schmidt essentially
has said publicly that he doesn't think that uh technology should be regulated by the government
because it moves faster than the government,
which is kind of just a Trojan horse.
He also thinks his behavior shouldn't be regulated by his wife.
Yeah, he's like, throughout this book,
Eric Schmidt is kind of like this dark villain just lurking around.
Well, it's pretty interesting,
like just from the Creepy Line documentary again,
like they have lots of clips of him just explicitly lying to interviewers because Larry Page and Sergey Brin
are kind of like introverts
who more just talk to tech people.
Whereas when he was hired as CEO,
Eric Schmidt was like the public face of the company.
So he was the one answering
most of these privacy concern questions.
And he's like, I mean,
the documentary Creepy Line is named after his line
about essentially saying Google goes right up to the creepy line,
but we don't cross it.
And they just have all these other clips of him just explicitly lying and
saying like,
for us to get your data,
you have to opt in and all this other stuff.
And then, of course, what goes unsaid is that they'll make it so that, as a condition of using their products, you have to opt in. And, you know, that'll be hidden in the middle of a terms of service agreement.
Now, using Obama, they actually created all kinds of different avenues for political penetration. Like, Obama was basically a huge vector for this kind of thing. The book didn't say whether they let Eric Schmidt have his own slide in the White House.
But apparently, by April 2016, 197 government employees had gone to Google to become, like, executives, and 61 executives had gone from Google to the government. And of those, 22 were White House officials who went to Google, and 31 Google execs went to the White House. So essentially, they were just going back and forth.
What year was that?
This was from 2008 to 2016.
Wow.
Eight years of this shit?
Yeah.
Eight years of just circle-jerking one another with employees?
Yeah, and those were the biggest growth years for Google. Yeah, certainly.
And you can also honestly see
like because Obama
saw all that success in his 2008
campaign from essentially
surveillance techniques, like you
can see how easily that's going to translate into
XKeyscore.
Like, you were looking at that, Stephen.
Oh, yeah.
Google was one of the contractors on XKeyscore, the data collection and manipulation software that went into the CIA... sorry, the NSA. The NSA program called PRISM.
Right, right.
Oh, that's what Snowden leaked?
Yeah, the Snowden leaks.
Yeah.
So, like, that's interesting, that you have this revolving door policy between Google and the White House. It's a well-recognized, well-documented relationship dynamic between Wall Street and the White House and Congress. But at the very same time, with the Obama administration, it was also going on with the tech world. Right. And that's, I mean, it's interesting, where, again, the book I read
was written in 2011, and you see the ideology all the way through it. But there was a time in this
country where we really thought tech was going to save us. We thought, you know, Steve Jobs is the genius. We just need Elon Musk to be
the next Steve Jobs. We just need enough people who learn how to code. And all these Silicon Valley
people like kind of transcend the dirty capitalist model because these are idealists. These are
people building our future. And of course, when you have this kind of revolving door between the White House and big tech, the entire government ideology is going to reflect that. But in actuality, of course, more people are becoming aware of just how vicious the level of surveillance is, and the information that Facebook and Google have on you dwarfs anything, you know, a Stalinist state would have been able to have in the past.
And it's
what's especially
interesting about this is
it's created this whole
basically technocrat mindset
that
is most
quickly embodied in this guy Sandy Pentland, who has...
That's his name?
Sandy Penthouse? He's named after Eric Schmidt's favorite place to stay.
Not Penthouse: Alex, quote, Sandy, Pentland. His letters are much more boring. He is like this kind of floppy TED Talk teddy bear who essentially pioneered at MIT
what he calls social physics, which is basically the manipulation of people's social interactions to create desired outcomes. And he says it like you're manipulating social interactions from a view from above; you know, you're looking down on people like they're in an anthill. And if you can trace as much as you can about them, you can then use that to manipulate them. And he had a hand in that Facebook election experiment, in how people interacted with each other, and using that to manipulate people.
But he doesn't necessarily see it that way. So like, here's the mindset that he kind of goes in
this with. We could solve global warming tomorrow if we all knew how to sort of talk about it
rationally, come to a good decision and then carry that through. And the fact that that sounds like ludicrous fantasy,
everybody agreeing, sure, not in our lifetime,
tells you just how profound the problem is.
And that's why I think one of the most important things
that's happened in the last decade,
something that you've all been part of,
is this era of big data, which is not about big at all.
It's about personal data.
That's him talking to Google employees. They all seem to have this, like, yeah, tech-will-save-us mindset.
I know, I know, I only ever think about the Nazis, but just in that clip you can hear, like, the silverware clattering in the background, and it kind of reminds me of that Heinrich Himmler secret speech to the SS about how what they're doing is necessary and will save the world, but they can never speak of it.
Right, right.
Yeah, yeah.
Even in that clip, they're talking specifically about climate change
and getting everyone to understand and agree.
And I'm like, specifically in our present 2019 context, Nancy Pelosi clearly does understand that climate change is real and is unwilling to do anything about it.
Yeah. And part of it is, like, they just have no concept of power dynamics, or they're not willing to acknowledge them. Like, one of the things he talks about in this talk is how they used a system somewhere in, like, Norway to get people to use less electricity, where they were like, if you use less electricity, a friend of yours will get a reward. I mean, basically, his techniques just come down to peer pressure. Right. Basically, using peer pressure, they got people to drop their usage by 17%. Great, but individuals using technology isn't what's driving global warming. As we all know, it's driven by industrial interests and, you know, profit motives; basically, an economic system that is built around constant growth. And even in this talk, he talks about how he can use these systems to increase GDP in various areas. And one thing that he doesn't talk about is the question of who gets to make the decisions of how these tools are used. One of the linchpins of all of this surveillance stuff is that it does not work if people know they're being surveilled. If people are able to see that it's happening to them, it just doesn't have the same effect. People have to be unaware that they're being used as a tool.
And the Truman Show effect.
Yeah, yeah. And in the book about surveillance capitalism,
Zuboff makes the observation that it's not totalitarianism,
but instrumentarianism.
Because totalitarianism tries to basically convert someone
to this idea of body and soul.
To commit to a state or whatever throughout themselves.
But instrumentarianism tries to work invisibly.
It's not trying to get people to commit to this idea.
It's just trying to use them towards your means
from basically a little man behind the curtain kind of thing.
What's behind all of this, then, is that these people who are creating this technology, and you also certainly saw it in the Obama administration, have this idea that we know best, and we know how to use this best. But of course, with Google, what that really amounts to is just, like, we know how to get you to buy a boner pill. Right. It's just this massive infrastructure where ultimately 89% of their revenue is advertising. And so ultimately, it's this big instrumentarian system that studies and analyzes everything we do, just to get us to buy whatever shit the people who buy ads want to sell us.
Well, and going back to the Nazis, the entire learn to code ideology really feeds into that
where it's like you have this untermensch or whatever, which these are the poor masses.
These are the people who don't know how to code.
These are not the Silicon Valley geniuses. And the idea is anyone can become part of this coding master race by just learning how to code and becoming these Silicon Valley millionaires and people making six figures at Google or whatever.
But, of course, the other part of that is, of course, these people are superior to us.
They know how to code.
They know what we should be doing.
So, of course, they can be trusted with our data.
Of course, they have the right to know every single thing about our lives because they are superior to us.
And we are the Untermensch who cannot code.
And, you know, again, the horrifying thing is, of course, Google and Facebook and all these companies already have all this data. So that's scary enough: just because they are supposedly superior to us, we have to trust them that they won't abuse this control they have over us. But at any point, you know, they could be hacked, our data could be leaked; or they could just comply with a government subpoena, our data could be leaked; or they could just start working with the government, which of course they do in several cases, and our data could be leaked. So the very existence of these profiles is horrifying. And these people that we just have to trust, because they are superior to us... even if they were, which, of course, they are not.
You know, fucking Eric Schmidt is in the cookie jar all day.
Yeah. You know, so like why not?
What if he gets blackmailed?
I mean, in his case, it's public information.
But say one of them gets blackmailed for something.
It's like, well, of course,
they'll just compromise our privacy
to get rid of whatever.
But even if they were these perfect angels,
the very existence of these data profiles
is fucking horrifying.
Yeah, and it's completely undemocratic
because it's sort of this one-way mirror
of control of information.
I think that also means, I think, we as a pod go a bit further and say that billionaires are an affront to democracy.
Oh, yeah. Like, their very existence is a threat to democracy, which I think is one of our themes as a pod.
Well, look, I mean, of course they would build this massive administrative system that only they themselves have the expertise to really manage and are trusted with. And it's basically built to keep us, you know, buying things, unaware that we're being constantly influenced, to keep the system that keeps them rich running.
Do you all think that the system
that you're describing just now,
it was with foresight intentionally built
or almost stumbled upon?
It was, at first it was stumbled upon.
Yeah, I don't think I give them the credit
of they designed it this way
because it's more, I mean, honestly comical
that it's like, hey, we figured out
we can fuck over people like this.
It's like, yeah. I mean, it's great that they stumbled upon it from Who Wants to Be a Millionaire. You know, what a brilliantly odd question about Carol Brady, blah blah, to make them go, what the fuck, why are people looking this up? You know?
Yeah, yeah. And when, in The Creepy Line, they're talking about Google's ability to influence elections,
I honestly think like part of the bias that he uncovered was like Google didn't even intend for that at all to happen.
Right.
But they just sort of stumbled upon it, like you said.
And then now they completely embrace it.
Yeah, like part of it is just inoperable. You wouldn't be able to really shift it into a way that you could influence, say, a gubernatorial race, in the same way that you could with, quote-unquote, low-tech methods of just outright racism or something, where you bar people from being able to sign up to vote.