The Joe Rogan Experience - #1768 - Dr. Robert Epstein
Episode Date: January 21, 2022
Dr. Robert Epstein is an author, professor, and Senior Research Psychologist at the American Institute for Behavioral Research and Technology: a non-profit, non-partisan organization that offers data regarding the power of Google and other Big Tech companies to censor dissenting opinions online and sway the outcome of elections.
Transcript
The Joe Rogan Experience.
First of all, thank you for coming.
Really appreciate it.
This is a very interesting subject because I think search engine results have always
been a thing that people kind of take for granted, that the search engine is going to give you the most significant results at the top.
And they don't really think about the fact that this is kind of curated.
And, you know, we found it many times because we use two different search engines.
We'll use Google and then we'll say, well, if we can't find it on Google, use DuckDuckGo.
And oftentimes when you're looking for something very specific, you'll find that you can't find it on Google.
If it's in there, it's deep, deep, deep, many pages in, whereas DuckDuckGo will give you the relevant search results very quickly.
So something's going on with search engines.
And from your research, what you found is that it can significantly affect the results of elections.
Well, not just that.
It can affect how people think.
It can affect your opinions, attitudes, purchases that you make.
Pretty much it's a mind control machine.
It's the most powerful mind control machine
that's ever been invented.
And by the way,
you should never use the Google search engine.
Never.
Never?
Never.
Why is that?
Because this is what I call an S&M platform.
Now, I'm not sure what S&M means to you.
I don't want to pry into your personal life,
but the point is that what I mean by S&M is
this is a surveillance and manipulation platform.
There are always two levels to everything with Google.
On the surface, it's like a free public library kind of thing, right?
Yes.
That's always on the surface.
Beneath the surface, it's something different.
From a business perspective, it's an S&M platform.
It exists for two purposes only, and that is to trick people into giving up lots and lots of personal information.
Notice your public librarian doesn't do that.
And then it's also used for manipulation because they discovered quite a few years ago
that if they control the ranking of the search results, they can control people's opinions, purchases, votes.
Now, they can't control everyone's opinions because a lot of people already have strong
opinions. So the people they're going after are the people who are undecided, the people who are
vulnerable, and they know exactly who those people are. And they literally, your mind is putty in their hands.
So you should never, ever use Google or any other S&M product,
like Amazon Alexa is an S&M product,
or the Google Home device or Android phones.
Android phones are bad.
An Android phone is an S&M device.
It's always listening.
It's always recording.
Android phones are always recording you?
Are you serious?
Yeah.
I mean, I'm questioning this.
I mean, I believe you, but I just want you to elaborate.
Oh, yeah. There have been court cases in which the recordings have been subpoenaed from whoever is controlling that, you know, that so-called personal assistant or that device.
And courts have recovered recordings and transcripts when people are not even aware that they're being monitored.
I know that's the case with Alexa, right?
Yes.
But that's the case with Android phones as well?
Yes. In fact, Android phones, the equipment to prove this, which I didn't bring, but
is so cheap now that literally anyone can confirm this, Android phones, even if they're disconnected from your mobile service provider,
even if you pull out the SIM card, okay, as long as the power's on,
it is recording, tracking every single thing that you do.
So if you use it to read things, if you use it to listen to music, if you use it to shop, whatever it is,
and of course your location is always tracked, then when you go back online, the moment you're
reconnected, it uploads all that information. So some people wonder why their batteries run down
sometimes even when you're not really doing anything with your phone.
That's because with Android phones, I think it's 50 to 60 times per hour it's uploading.
It's uploading about 12 megabytes of data per hour. So that's a lot of data, and moving it requires energy. So, I mean, the kind of phone I have is completely different.
It doesn't do anything like that.
What do you have, like a no agenda type phone?
Do you know that show No Agenda?
No.
It's my friend Adam Curry, who's the original podfather.
He's the guy who invented podcasting.
And his company develops these de-Googled phones where they take out
all the tracking stuff, everything, and it's basically running on a different operating
system.
Right.
So I have a phone that runs on a different operating system.
It's completely de-Googled.
What do you got?
Can you show it to me?
Yeah, I can show it to you.
I'm just interested.
It just looks like any old regular phone.
Right.
But it's not.
Is it running Linux?
What's it running?
No, it's a different operating system.
Can you not tell me?
It seems you're very quick.
It seems like you're trying to hide this, Robert.
Well, the point is I...
Look.
If you go to a website, it's myprivacytips.com.
Okay.
That's an article.
You'll get to an article of mine.
And that article begins, I have not received a targeted ad on my mobile phone or my computer since 2014.
Wow.
So there is a different way to use all the technology that's out there so that you are not the product. Okay. So you're actually, you know, a user making use
of services, but you're not the product. And it can be done. And yeah, is there a little inconvenience involved?
Yes, very little.
Is there some expense involved?
Very, very little.
All these services that you get for free, quote unquote,
they're not free.
You pay for them with your freedom.
If you want to get them in a paid form
so that you're not being tracked,
we're talking $10 to $15 a month.
Literally all of those so-called free services that are really, again, these S&M services,
all of them together are worth $10 or $15 a month.
And how do you use your phone, though, if you want to have a search engine? Are you using a different search engine?
Like, what are you using?
Well, that's changed for me over time.
But right now I'm using the Brave browser.
I use that.
Okay.
That's good.
That's really the best one right now. And then Brave introduced a Brave search engine, which now, fortunately, very recently, you can make the default search engine on Brave.
So Brave doesn't track at all.
Brave works faster than Chrome.
Chrome is Google's surveillance browser.
And Brave works even faster.
They're both built on what's called Chromium, so they're built on the same tech, except that Brave suppresses all ads.
So it works much faster than Chrome does.
And, you know, now, again, you can make the default search engine on Brave literally the Brave search engine.
Do you ever run into websites where they don't work properly because it's trying to upload ads or something and maybe there's a glitch?
Very, very rarely.
And then I will go over occasionally I'll go over to Firefox because Firefox was actually developed by a guy named Brendan Eich, who might be really interesting for you to talk to, by the way. And then he left Mozilla, which was the nonprofit organization
that developed Firefox. Oh, by the way, the connection between Firefox and Google, don't
even get me started. It's disgusting. But the point is, Brendan got sick of that situation, and he founded his own company, and he developed Brave.
So the same guy who developed Firefox developed Brave, very much into privacy, you know, really a forward thinker.
He's an amazing guy.
So when did you first become interested in digital surveillance and privacy and, like, what you're giving up by using these free services like Google?
I wasn't interested at all.
I've been a researcher for 40 years, and I had a lot of research underway.
I've done research on teenagers and creativity and stress management, all kinds of things.
I'm still doing all that stuff. But on January 1st of the year 2012, I got, I don't know, eight or nine messages from Google
telling me that my website had been hacked and that they were blocking access.
So I thought, the first thing I thought was, why am I getting these notices from Google? Who made Google the sheriff of the internet? Why isn't this coming from the government? Why isn't it coming from some nonprofit organization?
I've been a programmer since I was a teenager.
And then I started wondering, wait a minute.
Okay, they're blocking me on the Google search engine.
I get that.
That's them, right?
So they have these crawlers that look at all the websites every day.
And their crawler found some malware on my website.
That happens all the time, too.
Everyone gets hacked.
I'm sure you've been hacked, and Google itself has been hacked.
So I get that they're blocking me on Google, Google.com search engine.
I get it.
Okay.
But I notice they're also blocking me on Firefox, which is owned by a nonprofit.
They're blocking me on Safari, which is owned by Apple. I thought, how could that be?
These are completely separate
companies. Took me a while. Took me a while to figure this out. I finally published a piece in
U.S. News and World Report, an investigative piece called The New Censorship. This was 2016, so this was a while ago. In detail, I described nine of Google's blacklists. I explained how the blacklists work. I explained how Google can literally block access to multiple websites. And I noted that Google at one point in time, 2009, I think it was, I don't know, I might get the date wrong.
Let's just say January, whatever, 30th.
Google blocked access to the entire internet for 40 minutes.
Google, Google, anyway, in this article...
When you say that, with all browsers?
When you say blocked access to the entire Internet,
it's like if you use the Brave browser back then.
Did it even exist back then?
Probably didn't exist then.
Brave didn't exist, but no, there were...
Well, there were lots of search engines.
Google was not the first search engine.
It was the 21st search engine.
So, but what I'm saying is, with all web browsers, it blocked access to virtually the entire internet for virtually everyone in the entire world for 40 minutes.
What?
And this, this was reported in the news.
So, what's happening with their system: because so many people are searching for things, because they're monitoring so many different things to add to their search engine, do they have some sort of ultimate control over the internet in some weird way?
Here it is right here.
Google blacklists entire internet.
Glitch causes world's most popular search engine to classify all web pages as dangerous.
Wow.
Google placed the internet on a blacklist today after a mistake caused every site in the search engine's result page to be marked as potentially harmful and dangerous.
Holy shit.
The fact that they can even do this, I like how it gives you at the top, this article
is more than 12 years old.
Okay.
Imagine that, like, 12 years means, like, it's ancient. Like, they wrote it on stone tablets 12 years ago.
Yeah, but, you know, this is nonsense.
This report is nonsense.
Is it?
Of course.
They, this is, Google is full of geeks.
Okay, I'm part geek, so I can relate.
I can speak geek.
And geeks for fun, okay, sometimes for profit, but most of the time it's just for fun, just to be cool and get their kicks and show how powerful they are.
To be leet.
Yeah, so they do crazy things.
So they shut down the internet. I guarantee you it was a geek thing because you know why I figured that out?
Because I kept wondering, why did they shut it down on this super early morning on a Saturday?
Why?
What's so special about that little period of time?
And it took a while and I figured it out. It's because that is one of the
only intervals of time in the entire week when every single stock market in the world is closed.
So they did it to show that they could do it and have their fun, but they didn't want to get attention.
And if they had interfered with financial transactions, they would have gotten a lot of attention.
So no one was ever caught.
No one was ever caught, but they never denied that this happened either.
So this was done through Google for sure.
They know this how?
It was reported in the news, and Google was queried, and Google said, yeah, that did happen. Yeah, we fixed it.
So how does Google have the ability to even do something like that? How can that even be done?
Well, that's what I explained in that article. They have blacklists. Let me, let me jump ahead and then I'll... Okay.
But let me just jump ahead for a second, because you gotta, you gotta see really how sinister this whole thing is. It's just, seriously, if you knew, if you knew half of what I know about all this dark tech stuff, you would just say the hell with it and just give up.
You'd say, I don't want to bring up kids in this kind of world.
This is too crazy.
Anyway, blacklist.
I feel like we need to stop you there and make you elaborate.
What are you saying?
Well, what I ended up doing,
which I think we should get to later in some detail if you're still interested.
Yes.
What I ended up doing was I started doing randomized controlled experiments to see what kind of shenanigans are out there, to see what kind of power these companies have, especially Google. And I am still, almost month by month, making more discoveries, running more experiments, getting very disturbing
data. I mean, so disturbing. We just figured out, I think within the last month, that a single
question and answer interaction on Alexa. So you ask Alexa a question, and let's say it's
about, I don't know, some political issue or political candidate, something you're undecided
about. So you ask Alexa, and Alexa gives you back an answer. And the answer, let's say,
has a bias. In other words, it favors, you know, one candidate, favors one party, favors one cause, right? A single question and answer interaction on Alexa in a group of, say, 100 undecided people can shift opinions by 40% or more, one interaction. If there are multiple questions asked on that topic over time, you can get shifts of 65% or more with no one having the slightest idea that they have been manipulated.
But are they doing it to manipulate you? Or is it just the fact that they distribute this
information based on their algorithm? It's manipulating you just by default, because the higher or more likely you find this information from the search engine, that's what's going to influence your opinion. But are they doing it to influence your opinion, or is that just the best answer? Like, if you have a question: who is Dr. Robert Epstein?
Who is he?
Yes, exactly.
That's you.
So if I ask that to Alexa
and then it pulls up these results,
it's going to pull up supposedly
the most relevant result.
Now, are they,
they have to have some,
if you have something like Alexa
where you're asking a question
and it's just reading it back to you,
there has to be like some sort of curation of that information, right?
Confession.
Okay.
Okay.
I have not followed Joe Rogan over the years.
Okay.
I have five kids. My two eldest sons are like your biggest fans in the universe. My eldest son is technically a bigger fan than the other son because he's recently gained 60 pounds because of COVID. So he's definitely the bigger of the two fans. This is Julian and Justin. You get it. So anyway, but I don't follow Joe Rogan, right?
Okay.
So now I've had to bone up and actually had to listen.
I was forced.
I had to listen to some of your shows, and I'm thinking, wow, this is interesting.
This guy is genuinely curious about things.
You really are genuinely curious. It's crazy.
Well, what's crazy is that that's crazy. It's not crazy to be curious. Most people are curious, aren't they?
No, not like you. Because you, you actually dig in and you really want to know. And now, now I'm going to say something that's not so nice, which is, on this issue, by the questions you're asking me, I can tell you have no idea what's going on.
Well, I kind of do, but you have to understand the way I do a show.
Okay.
One of the things that I do when I want you to elaborate on information, it's like maybe I know something, but I want you to elaborate to everybody else that's listening.
So you pretend you don't know.
I don't pretend I don't know. I just ask you questions.
I don't play stupid, but I do ask questions like,
please tell me more or elaborate
or where did you come up with this
or how do you know this for sure?
Maybe I know how you know it for sure,
but I want you to tell everybody.
So that's what I say.
The question you asked was-
The dark stuff.
Well, you're saying all this stuff that looks, maybe it's biased, maybe it might influence
people. Where is it coming from? Maybe it's just the algorithm. Well, let's say it's just
the algorithm. Well, the algorithm was programmed by human beings. Okay. And those human beings have biases.
They have beliefs.
And there's a lot of research now showing that that bias, whether it's conscious or unconscious, gets programmed into the algorithms.
So the algorithms all by themselves have biases built into them.
There's one way it can go.
Second way it can go,
the Marius Milner effect. You ever hear of Marius Milner? No. Okay.
Oh, this is great. This is great. Okay. Marius Milner. Okay. A few years ago, you probably heard
that Google got busted because their Street View vehicles were driving up and down streets. They're still driving up and down streets all over the world, but they had been driving up and down streets all over the world,
more than 30 countries, for more than four years,
and they weren't just taking pictures of our houses and our businesses.
They were also sucking up Wi-Fi data, massive amounts.
I mean, we're talking terabytes of Wi-Fi data, passwords, everything,
including a lot of very deeply personal stuff. So someone just like me, a professor type,
figured this out, reported them to the government. The government went after them.
And so this is called the Google Street View scandal. And so they got a fine, $25,000
for interfering with the investigation. And they blamed the entire thing, Google blamed the entire
operation on one software engineer. His name is Marius Milner. So they fired him. Oh, no, no, that's not true. He's a hero at Google. He's
still working there. If you look him up on LinkedIn, his profession is hacker. He's a hero
at Google. They didn't fire him. Okay, they love this kind of stuff. So another possibility, besides the algorithm itself, is a single rogue programmer at the company who can fiddle with things in a way that shifts thinking and opinions and behavior and purchases and votes.
A single rogue programmer can do it.
And then, of course, there's the executive level.
The executives can pass along a mandate, a policy.
Does that ever happen? Oh, yeah. One of the links,
leaks rather, one of the leaks from Google that you may have seen, and I know the guy
who leaked it, it's Zach Voorhees, who's also a good person for you to talk to because he was a
senior software engineer at Google for eight years, and then he just couldn't stand it anymore
and he quit. But
unlike most of these people who've walked away, he brought with him 950 pages of documents and a
video. The video is two minutes long, and it shows the CEO of YouTube, which is owned by Google.
Her name is Susan Wojcicki, and she's talking to her staff, and she's explaining, this is 2017 after the horrible election results of 2016, and she's explaining how they're altering the up next algorithm in YouTube to push up content that they think is legitimate and to suppress content that they think is not legitimate.
So if it's happening at that level, the executive level, again, it still has the same effect. Any of these possibilities,
and there are others as well, ends up giving us content that impacts us and our kids especially, in ways that people are entirely unaware of.
So the way I like to put it is this.
You don't know what they don't show.
Now, I'm still confused as to how Google can blacklist websites
and how they can shut down the entire internet for 40 minutes
because, do they have a switch? I mean, like, is there a connection that all websites go through Google? Like, how is that possible?
About three years ago, they shut down all of Japan accidentally.
Uh, well, that's, you know, that would take a whistleblower to figure that one out.
It was in the news at one point that the guy who was in charge of making these decisions, he actually has left Google.
He once shut down an entire domain name, which had 11 million websites on it, because he thought it was
kind of poor quality. Poor quality? Yes. Poor quality, like how so? I don't know.
This is just his take that it was poor quality? I have a copy of the internal manual. I'm happy
to send it to you from Google that's showing the criteria they use in deciding which content to suppress.
And some of the criteria are pretty straightforward, having to do with pornography and things like that.
And then there's this wide open area that says, or anything else.
Or anything else.
Pretty much, yeah.
So it's up to the discretion of the engineer.
There's a lot of discretion involved in making these decisions. And a lot of the decisions that
get made in very recent years, since Trump was elected, they happen to be decisions for the
most part that suppress conservative content, but not always, not always. Now I'm going to circle back.
Can you please explain again?
Sure.
I still don't know.
How do they shut down the internet?
How does Google have that ability?
Let's see.
I can answer the question, but it's not a simple answer. It's not like they have a switch, okay?
Okay.
But I'll give you a couple of clues here.
Okay.
Okay. First of all, what's the most popular browser right now? It's Chrome, by far. Well, Chrome is their browser. So obviously anyone using their browser, it's a simple matter for them to block anything, to block access to anything through Chrome. So that one's easy, right?
Okay. They can block access to anything through their search engine, which is used for 92% of all search around the world. So that takes care of a lot right there.
Then we get to, let's say, Siri. Do you use an
iPhone or Apple?
I use both. I mean, iPhone or Android, you mean. Yeah, I use both.
Yeah. So Siri, where does Siri get all her answers from?
Google? Google. Oh, good guess. Nice. Yes.
So, okay.
Let's take, oh, let's take Firefox.
Okay, Firefox.
Before Firefox takes you to the website that you just typed in, guess what?
They have to make sure it's safe.
So how do they make sure it's safe?
I don't know.
They check?
Well, they check Google's blacklist.
This is what happened when you would, that day,
or during that time period,
when you searched something on Google and you clicked it,
you would get this.
Warning, visiting this website may harm your computer. I think maybe you could
continue through like you can, because it happens from time to time now for strange reasons. I don't
know. Do not proceed to internet. Yeah, I don't know what happened then. What about if you go
through Safari or what if you go through Apple's browser? Safari, same thing. Safari, before they
take you anywhere, they've got to check Google's blacklist.
So not only is Google getting information about your search on Safari,
the fact is if Google wants to block you from going there through Safari,
they just add it to their blacklist.
In other words, if they put everything on their blacklist, then no one can reach anything.
Really?
Yeah, really.
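What's being described here as the quarantine list is exposed publicly as Google's Safe Browsing service, which Firefox and Safari do consult before loading pages. Below is a minimal sketch of checking one URL against that list via the v4 Lookup API; the API key is a placeholder, and note that real browsers typically use the Update API variant instead, which syncs hashed URL prefixes locally rather than sending each visited URL to Google.

```python
# Minimal sketch: check a URL against Google's Safe Browsing list
# (the "quarantine list" described above) via the public v4 Lookup API.
# YOUR_API_KEY is a placeholder. Real browsers mostly use the Update API,
# which syncs hashed URL prefixes locally instead of sending URLs out.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

def is_blacklisted(url: str) -> bool:
    body = {
        "client": {"clientId": "example-app", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url}],
        },
    }
    resp = requests.post(ENDPOINT, json=body, timeout=10)
    resp.raise_for_status()
    # An empty JSON object means no match; a "matches" key means the URL is listed.
    return "matches" in resp.json()

print(is_blacklisted("http://example.com/"))
```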
So all browsers go through Google, except Brave, right?
Except Brave, yeah.
That's the only one?
You know, there are small browsers out there no one's ever heard of, but I mean, Google's
influence on the internet is beyond monopoly.
They're really in charge.
Outside of China and North Korea, they're in charge of pretty much everything that happens on the Internet.
Yahoo.
Let's take Yahoo.
Yahoo used to be one of the big search engines.
And some people still use it, except Yahoo stopped crawling the internet
about five years ago or more.
They don't crawl the internet anymore.
They get their content from Google.
Really?
Yeah.
So Yahoo isn't really a search engine.
It just searches Google.
And your second favorite, DuckDuckGo,
is also not a search engine.
God damn it.
What is it?
They have a crawler.
They do have a crawler.
And they do a little crawling.
But actually what DuckDuckGo does is it's a database aggregator.
They're checking databases.
And what is the difference there?
Oh, night and day. In other words, Google is literally looking at billions of websites every day,
and it's looking for updates and changes and new websites and this and that.
So it's crawling and it's extracting information, especially looking for links,
because that's how it gets you good information.
It looks for what's linking to what.
But DuckDuckGo doesn't do that.
DuckDuckGo is looking at databases of information.
And it's trying to answer your question based on information that is in databases.
Lots of different databases.
That's not what Google does.
Google is really looking at the whole internet.
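To make the crawling-versus-lookup distinction concrete, here is a toy sketch of the crawl loop just described: fetch a page, record its outbound links (the what-links-to-what signal), and queue newly discovered URLs. A database aggregator, by contrast, answers queries from existing indexes without fetching pages itself. This is illustrative only, with none of the robots.txt handling, politeness, or scale of a real crawler.

```python
# Toy sketch of the crawl loop described above: fetch a page, record which
# pages it links to (the signal a ranking engine feeds on), queue new URLs.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def crawl(seed: str, max_pages: int = 50) -> dict[str, list[str]]:
    link_graph: dict[str, list[str]] = {}  # page -> pages it links to
    queue, seen = deque([seed]), {seed}
    while queue and len(link_graph) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue  # skip pages that fail to load
        links = [urljoin(url, a["href"])
                 for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)]
        link_graph[url] = links
        for link in links:
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)
    return link_graph

graph = crawl("https://example.com/")
print(len(graph), "pages crawled")
```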
And the Brave search engine, what does it do?
The Brave search engine is crawling. So it is crawling. It can't do it
at the same level that Google can. But obviously, this guy, you know, Brendan Eich, is very ambitious. So he's, you know, he wants to do it at that level. So, no, Brave is trying to do what Google does, except preserving privacy and suppressing ads.
And it seems like what happened with Google before anyone even understood that the data is so valuable.
Before anyone – it was too late.
It was too late. It was already an inextricable part of day-to-day life that people were using that and that people were using Gmail and using all these services and just giving up their data.
Yeah.
Yeah.
Well, that's – So there's no consideration – like there's no regulation.
No, there's no regulation. There are no laws.
And in fact, the courts have ruled over and over again, when someone has gone after Google, that Google can do whatever they want. So I'll give you an example, a case involving a company called eVentures. It might have been that same guy that I mentioned earlier.
I think his name was Matt Cutts or something like that.
They all of a sudden shut down hundreds of URLs that eVentures was using for its business, saying they were not good quality.
Okay.
I mean, for you to get that much information out of Google is like pulling teeth because normally they just don't tell you anything. But anyway, so they shut them down, nearly shut down the company. The company decided to sue.
It dragged on a couple years because they wouldn't provide any information through discovery. And Google always does that. They just stonewall you, even on discovery, which is like preliminary
stuff before a lawsuit. Anyway, so eVentures keeps pushing, pushing, pushing, pushing,
finally goes to court, and eVentures loses. And they're slaughtered. Literally, the decision of
the judge in the case was, Google is a private company. It can do what it wants. It can demote
you in search rankings. It can delete you. It can block access to your websites. It can do anything it wants.
Literally, that was the decision. So let's say if Donald Trump runs again in 2024,
and they have a Trump campaign website, Google can decide that that website is poor quality
and deny people access to it so that when people go to Google Donald Trump,
they will never see his website.
Correct.
That's wild.
Well, they block access every day to several million websites.
So this is not a rare thing that they do.
And they block access based on their own decisions.
They're internal.
They don't have to justify them.
They don't have to have criteria to establish that they're doing the right thing. They just do it.
And in the United States, there are no relevant laws or regulations in place to stop them.
Do our regulators and do our elected officials even understand this? Is this something that is of concern to them? Has this been discussed? There are a couple of them who understand.
And there are a couple of the attorneys general whom I know who understand. Doug Peterson from Nebraska, he totally understands. Ted Cruz, he was behind my invitation to testify before Congress.
A couple months later, he invited me to D.C. We sat down, had a four-hour
dinner. Fabulous. We never stopped talking. And we never talked politics. We did not talk politics
the whole time. We just talked tech. Cruz totally understands. But he's hamstrung. He can't.
How do you fight the most effective mind control machine that's ever been developed, which also is very rich, has $150 billion in the bank right now in cash, makes huge donations to political candidates, and then can shift votes, millions of votes nationwide without anyone knowing that they're doing so.
How do you fight that?
And it's not something that they set out to do when they first created the search engine.
It seems like because of the fact that this is something that was, you know, it was initially so you could search websites.
It was that's what it was. Right.
Did they did they know when they first made this that they were going to be able to have the kind of power that they have today?
Or is this something that we all have sort of awoken to?
OK, I don't know.
Sergey Brin, Larry Page, the founders, I don't know them.
I've lectured at Stanford in the same building where they invented Google, which is kind of cool.
But I don't know them.
But I think these guys were and probably still are utopians.
I think they had the best intentions in mind.
The first top executive ever to leave Google is a guy named James Whitaker, who's gone completely silent,
by the way, in recent years, completely silent. But he was the first real executive to leave
Google. He finally issued a statement. He was under pressure. You know, why did you leave?
Why did you leave? He issued a statement, which you can find online. It's fascinating to see this.
And he says, look, when I first joined Google,
which was practically in the beginning, he said it was just a cool place and we were doing
cool things. And that was it, he said. And then he said, a few years later, he said,
we turned into an advertising company. He said, and it was no more fun. It was brutal. This is brutal, profit-driven ad company.
Now, if you don't think of Google as an ad company, then, again, you're not getting it.
They are the largest advertising company by a factor of 20.
I think the next largest one is based in London.
But Google is, what it's doing is tricking you, tricking all of us. Well, not me
personally, but it's tricking you into giving up personal information 24 hours a day, even when you
don't know you're giving up personal information. And then it's monetizing the information mainly by
connecting up vendors with potential buyers. It's an advertising company. And so Whitaker actually quit because the nature of the business changed.
And then, of course, everyone knows about Google's slogan, right? Don't be evil.
Yeah.
But no one seems to know that they dropped that slogan in 2015.
Didn't they just add it to a part of a larger slogan?
Didn't we go over that, Jamie?
There was like a thing.
I'm trying to remember what they exactly did because we were sort of like, oh, my God.
They said don't be evil, and now they don't say it anymore.
Maybe they're evil.
But I think they had added something and made it longer. And so it
wasn't that it's not their slogan anymore. It's just their slogan sort of morphed. Right? Was that
it? Jamie will find it in a moment. Can I go back to something? Please do. Okay. I just want to go
back to blacklists. Yes. Because I wrote this big piece
on nine of Google's blacklists. Their biggest one is called the quarantine list. That's that list
that Safari has to check. And that list that Firefox has to check. Everyone has to check that
list before they take you to a website. So that's a simple way, simple tool that Google uses to
block access to websites.
Because we go to websites through browsers, right? Okay, there we go. I had never seen any of those nine blacklists, but I knew they existed as a programmer, and I talked about each one in detail.
2019, I'm invited to testify about my research on my experiments on manipulation and, you know, how I monitor
elections now and all that stuff. So who testifies before me? A top executive, a vice president from
Google. He's under oath. You know, he's sworn in. The senators are asking him some really tough questions. And he's asked, point blank, does Google have blacklists? I think the full question might have been, does Google have white lists and blacklists? And his reply was, no, Senator, we do not.
No, Senator, we do not.
So that was July of 2016.
2019, rather, 2019.
Okay.
In August, literally three weeks later,
Zach Voorhees, who I mentioned earlier,
that's when he leaves Google.
Google sends like a, what's that called when the police,
a SWAT team.
Google sends a SWAT team after him.
I kid you not. Yep. So they were all very unhappy because he stole all this stuff,
okay? And he sent it, he put it all in a box and sent it to the Attorney General of the United States, okay? This is 2019, August. This is only less than a month after this hearing.
So he's got 950 pages of documents,
all kinds of documents, and three of them are Google blacklists, which are actually labeled blacklists. Now, if I were putting together blacklists at my company, I would call them shopping lists. I would call them, you know, I don't know, makeup lists, you know, lists for my kids' birthday presents, or I wouldn't call them blacklists. So there are actually three of them he walked out with.
So you can look at the list. You can see who's on the list. You can see these are almost all, or many of them, prominent conservative organizations. There are no left-wing organizations on those lists.
Why would they put those people on a blacklist? Is it maybe some, I mean, it's obviously speculation, but is it maybe some sort of a deal that they've made
with certain politicians? Is it something they've decided on their own because this is the right
thing to do to suppress the bad people that put Donald Trump into office, like why are they doing that?
What you just did was amazing.
What did I do?
Because you got at almost all of it.
You just came up with it hypothetically, but you left out one area.
So the two areas you just nailed, one is to make money.
Right.
So they have three motives.
One is to make money, and that they do extremely well.
And no one who's tried to tangle with them has stopped that process. In other words,
the rate at which they're making money continues to increase every year. So a few years ago,
when I was first looking at them, they were bringing in $100 billion a year. Now they're bringing in $150 billion a year. Money. That's number one.
Number two, values. Okay. I could talk for hours on this issue because of recent leaks of videos,
PowerPoint presentations, documents, and of course what whistleblowers have been revealing.
They have very strong values there because the founders had very strong values and they hired people who had similar values and they have really strong values.
And they want the world to have those values.
They really think that their values are more valuable than other people's values, which means they don't understand what values are.
And so their values are, they're pretty much all of tech is very left-leaning.
Very left-leaning. So 94, 96% of all donations out of Google go to Democrats,
which I sympathize with. I'm from a family of Democrats. I lean left. So I say, yeah, fine.
That's fine. That's fine. But it's not fine. It's not fine. Because they have the power to impose
their thinking on other people around the world in a way no one has ever had such power, ever.
So values is second. And this is serious. One of the leaks from Google was an
eight-minute video, which you should definitely watch. It's so creepy. And it's called The Selfish
Ledger. And it's eight minutes. And it was put together by their advanced, their super secret
advanced products division. It was never meant to leak out of that company. And I have a transcript
of it too, which I've published,
so I can get you all that stuff.
But the point is, what is this about?
This is about the ability that Google has to re-engineer humanity
according to company values.
Re-engineer humanity according to company values.
Yes.
And this is a directive?
Like this is something they're doing purposely?
Well, in the video, they're presenting this as an ability that we have.
This is an ability that we have.
So that's the second area. You nailed it. Third one you didn't mention. The third one is
intelligence, because they had some support, Page and Brin, right in the very beginning at Stanford,
they had some support and had to be in regular touch with representatives from the NSA, the CIA, and another intelligence agency.
The intelligence agencies were doing their job.
They realized that the Internet was growing.
This is the 1990s.
So they realized that the Internet is growing.
And they were thinking, hey, these are people building indexes, indices to the content. So sooner rather than later, we're going to be able to find threats to national security by looking at what people are looking up.
If someone is going online, they're using a search engine to find out instructions for building bombs, for example, okay, that's a potential threat to national security.
We want to know who those people are. And right from the very, very beginning, the Google search engine was set up
to track and preserve search history. So in other words, to keep track of who's doing the search
and where did they search. That is very, very important to this day for intelligence agencies.
So Google, to this day, works very closely with intelligence agencies, not just in the U.S., but other agencies around the world.
So those are the three areas, money, values, intelligence.
And the intelligence stuff is legit.
I mean, it's legit.
You know, it is an obvious place. If you're in law enforcement,
that's an obvious place to go to find bad guys and girls.
Yeah. So Google has this ability, that they've proclaimed, that they can sort of shift culture and direct the opinion of things and direct public consciousness. What percentage, like how much of a percentage do you think they have in shifting? Do they have, like, a 30% swing? Like what?
Well, see, this is what I do, this is what I do. Now you're getting close to what I actually
do, what I've been doing for now for over nine years. I quantify. This is exactly what I do
every single day. That's what I do. That's my team, my staff. That's what we do. And it's cool.
Talk about cool. We're doing the cool stuff now. Okay, Google is not. We're doing the cool stuff. Because we have discovered a number of these tools, and we measure the ability that these tools have to shift thinking and behavior. Percentages, proportions. We can make predictions in an election about how many votes can be shifted
if they're using this technique or these three techniques. And so we, yeah, that's what we do.
So we started with the search engine, and it took years, years of work, but we really, I think at this point, have a good understanding of what the search engine can do.
But then along the way, we discovered other tools that they have and which they are definitely using.
And how do we know they're using these tools?
Well, we can get to that.
What are the tools?
Well, the first one we called SEME, the Search Engine Manipulation Effect.
And that means they're either allowing, you know, one candidate or one party to rise to the top, you know, in search rankings, or they're making it happen.
And you don't know for sure whether, you know, which is occurring unless there's a whistleblower or there's a leak.
Okay, but the fact that it's occurring at all, that's important. I mean, in a way, we don't care. Because if it's just the
algorithm that's doing it, well, that's horrible. That means literally a computer program is
deciding who's going to be the next president, who's going to be the next senator? Do we want that decision made by an algorithm? So anyway, we spent a lot
of time on that. We're still studying SEME. Then we learned about SSE, which is search suggestion
effect. When you start to type, oh, in fact, if you have your phone handy, this will be fun. If you start to type a search term into the box, a search box, there are suggestions flashed at you.
As fast as you're typing, that's how fast those suggestions come.
Well, guess what?
By manipulating the suggestions that are being flashed at people, we could turn a 50-50 split in a group of undecided voters into nearly a 90-10 split.
Wow. Without anyone having the slightest idea that they're being manipulated.
That's just by manipulating search suggestions.
Just by suggesting.
Yes.
And the reason why we started that work is because in June of 2016, a news organization, a small news organization, released a video which went viral on YouTube and then got blocked on YouTube. Frozen. Still frozen.
But then it continued to go viral on Facebook, so 25 million
views. In this little video, this news organization is saying, we've made a discovery.
When you go to google.com and you look for information about Hillary Clinton, you can't get any negative search suggestions.
Really?
Really. And they showed this in their little-
What if you Google Clinton body count?
You could not get negatives.
Really?
Yeah. It would give you nothing, probably, for Clinton body count. But as you're typing, you go Clinton B.
It would go, you know, Clinton buys the best clothes.
I don't know.
It would give you something like that.
It would not give you something negative.
So, for example, Hillary Clinton is, you do it on, and they showed this.
You do it on Yahoo.
You do it on Bing. And you get Hillary Clinton is the devil.
Hillary Clinton is evil.
Hillary Clinton is poison. And literally, they're showing you eight or ten items that are extremely negative.
You can check on Google Trends.
That's, in fact, what people are really searching for.
So Bing is showing you what people are searching for.
So Bing is showing you what people are searching for. 'Hillary Clinton is' on Google at that time gives you, guess what? Hillary Clinton is awesome. Hillary Clinton is winning. That's it.
Two suggestions. So that's why we started doing this research on search suggestions because I kept thinking, why? Why would they do that?
Why would they suppress negatives for a candidate they presumably support? And we figured it out.
It's because, did you ever hear of negativity bias? Yes. Okay. So this is also called the
cockroach in the salad phenomenon.
So you've got this big, beautiful salad.
You see a cockroach in the middle.
It ruins the whole salad.
We are drawn, our attention is drawn to negatives.
Negatives.
Right.
And that's good for evolutionary purposes.
Good for survival.
So it ruins the whole salad.
The opposite doesn't work.
If you have a plate of sewage and you put a nice piece of Hershey's chocolate in the middle, it does not make the sewage look any more appetizing. So, we're drawn to negatives.
If we allow one negative to pop up in a list and the rest are neutral or positive suggestions,
that one negative for certain demographic groups can draw 10 to 15 times as many clicks as the other suggestions.
So one of the simplest ways to support a candidate or a cause is for your candidate or cause, okay, you suppress the
negatives. It's a simple lookup. You're looking up what's called the linguistic valence of the term.
It's a simple lookup. It takes, you know, a nanosecond. And if it's your cause, your candidate,
okay, you delete it. It's gone. People don't see it. But you let the negatives pop up in the search suggestions for the other candidate or the other cause. And what that does is it draws people who are looking up, let's say, Donald Trump.
to websites to, well, first of all, it generates search results that make that person look bad because you just clicked on Donald Trump is evil. And so you clicked on that, caught your attention,
boom, you get a bunch of search results that support that. You click on any of them and
now you're at a website that makes him look terrible.
Very, very simple kind of manipulation. So subtle. All you do is suppress negative suggestions for the candidate or the cause that you support. And as I say, so we did a series of experiments. We
figured out, okay, to what extent can we mess with a group of 100 people or 1,000 people?
And, yeah, we can turn a 50-50 split among undecided voters into nearly a 90-10 split.
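A minimal sketch of the asymmetric filter being described, look up each suggestion's valence and suppress negatives only for the favored side, might look like the following. The valence lexicon and candidate names are invented for illustration; this reconstructs the technique as described in the conversation, not any actual production system.

```python
# Sketch of the asymmetric suggestion filter described above: look up each
# suggestion's "linguistic valence" and drop negatives only when they
# concern the favored entity. Lexicon and names are invented placeholders.
NEGATIVE_TERMS = {"evil", "devil", "poison", "liar", "scandal"}  # toy lexicon

def valence(suggestion: str) -> int:
    """Crude valence lookup: -1 if any negative term appears, else 0."""
    return -1 if set(suggestion.lower().split()) & NEGATIVE_TERMS else 0

def filter_suggestions(suggestions: list[str], favored: str) -> list[str]:
    kept = []
    for s in suggestions:
        # Suppress negatives only for the favored candidate/cause;
        # negatives about anyone else pass through untouched.
        if favored.lower() in s.lower() and valence(s) < 0:
            continue
        kept.append(s)
    return kept

raw = ["candidate a is evil", "candidate a is winning",
       "candidate b is a liar", "candidate b rally"]
print(filter_suggestions(raw, favored="candidate a"))
# -> ['candidate a is winning', 'candidate b is a liar', 'candidate b rally']
```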
When did they first start implementing this sort of search engine manipulation?
When did they implement the suggestion manipulation?
Well, we were able to estimate that to some extent. And by the way,
the, this, this landscape keeps changing. So I'll give you an example. When they first came
up with search suggestions, actually one engineer there came up with this thing, and it was cool. And it was an
opt-in thing when it first came out. I think it was 2009. And it was cool. And it was helpful,
because that was the idea initially. So then over time, I think, with a lot of these services,
a lot of these little gizmos, people figured out that, wait a minute, we can do things that maybe we didn't intend to in the beginning, but we can use these for specific purposes.
So anyway, so at some point or other, a couple of years later, it was no longer opt-in.
In fact, it was automatic and you can't opt out.
That's the first thing that happened.
And then you may remember there were always 10 items in the list initially.
But then 2010 or so, suddenly they dropped to four items.
So in our experiments, we actually figured out why they were showing four items, and we went public with that information in 2017,
and three weeks later, Google went back to 10 items.
Why do you think they went to four?
Because four is exactly, we know from the research, is exactly the number of search suggestions that allows you to maximize your control over people's searches.
Because, look, if the list is too long and you've got a negative in there, they're not going to see it.
I mean, imagine if you had 100 search suggestions and you had one negative, right?
So it has to be short enough so that the negative pops out, right?
But it can't be too short.
If it's too short, then the likelihood that they type in their own damn search term and ignore your suggestions goes up.
So there has to be this optimal number.
It turns out the optimal number to maximize your control over search is four.
Wow.
And we also learned that you are being manipulated on Google from the very first character you type into the search box.
If you have a phone handy, I can prove it.
Okay.
So I'll Google. And this is going to be,
by the way, the last time. So you're all, those of you who are watching or listening,
you're all witnesses. This is the last time that Joe Rogan ever uses Google.
Really? Well, watch. Okay. Okay.
So you got Google up there, right?
Yes.
And you're in the search box?
Yes.
Type A.
What's it suggesting?
Amazon.
Yeah.
Well, it's doing more than one suggestion.
What are the suggestions?
Amazon, Academy Sports and Outdoors,
Amazon Prime, Houston Astros, and then a bunch of other people.
Alamo Drafthouse, American Airlines.
So your first and third suggestions, notably the first position is the most important, are Amazon.
Yes.
Well, it turns out everywhere in the world where Amazon does business,
if you try to search for anything beginning with the letter A and you type A, Google suggests Amazon.
Why is that?
Well, it turns out Amazon is Google's largest advertiser and Google is Amazon's largest single source of traffic.
It's a business relationship.
Get it?
If you type T, you're going to get Target and so on.
But what's interesting is when you type G.
Just type G.
All right.
What do you think I would get?
Well, tell us.
Tell us what you got.
Grand Seiko.
Nothing interesting on there at all?
No. Gastronomical. And then number four is Google Translate. Number five is Gmail. Number six is Google.
Okay. Oh, I'm starting to see a pattern here.
Yeah, but I mean, like, the first ones are all, like, something that I would look up.
Well, they know your history, right?
So they know who you're –
So the first ones with G, they'll allow you to have a little –
like they'll allow you to actually look up the things you're interested in or suggest things you're interested in?
First of all, you're Joe Rogan, okay?
So they may allow you to do all kinds of things.
Do you like have specific allows for people like me?
Yeah.
Everything's personalized.
But I mean,
is it personalized on purpose or personalized through the algorithm that sort of represents what you normally search for?
Yeah.
That's called on purpose.
Yeah.
No,
but I mean,
like it's,
they're not doing it specifically because it's me.
Like, if I was any other person that was maybe anonymous, but I also looked up those things.
For most people, to answer your question, for most people and folks out there, literally, pick up your phones, go to Google.com, which, by the way, this is the last time you're ever going to use google.com. But just type in G and see what you see. Most people, if they're
getting five suggestions, four out of the five will be for Google. So the lesson there is if
you're starting a new company, don't start the first- Don't name it with a G.
Don't name it with a G, right.
Yeah.
No G, because...
So what they're showing, and the point is,
has to do with their agenda, their motives, okay?
Every single thing that they're doing
has to do with their motives,
which have to do with money, values, and intelligence.
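Anyone can repeat the letter-by-letter demo from earlier outside the browser UI. The sketch below pulls raw autocomplete suggestions from a long-standing unofficial Google endpoint (the client=firefox variant returns plain JSON); it is undocumented, unauthenticated, and could change or vanish at any time, so treat it purely as an illustration.

```python
# Sketch: fetch Google's raw autocomplete suggestions for a prefix, the same
# lists read off the phone in the demo above. Uses an unofficial endpoint.
import requests

def google_suggestions(prefix: str) -> list[str]:
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": prefix},
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape: [prefix, [suggestion1, suggestion2, ...]]
    return resp.json()[1]

for letter in ("a", "t", "g"):
    print(letter, "->", google_suggestions(letter)[:5])
```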
And a public library does not do that. You go, you borrow some books,
you ask some questions, you get some answers. That's that. That's the way the internet was
meant to be. It wasn't supposed to be this: the Internet around the world controlled mainly by two huge monopolies and, to a lesser extent, by some smaller monopolies like Twitter.
It wasn't supposed to be that way.
It was supposed to be like the public library.
Right.
And it is possible, you see, you can set up a company like Brave that doesn't play these stupid games and doesn't fool you and it's
not deceptive. This is the business model that Google invented. It's called the surveillance
business model. It's fundamentally deceptive because up here at the level that you're
interacting with it, it looks like a public library, free.
Cool.
And down here underneath, it's something completely different.
There's no reason for that.
Tim Cook, who's still the CEO of Apple, has publicly said, this is pretty recent, publicly said that this is a creepy business model and it should not be allowed.
Well, that is one area where Apple deserves credit, right? That Apple has not taken up that same sort of net-like surveillance where they just kind of cast a net over everything you do
and then sell it to advertisers. And you can opt out of certain things in terms of like allowing apps to track purchases or allowing apps to track your,
your use on other devices or on other applications rather.
I wish I could agree with you,
but I can't because the fact is Apple is still collecting all this information.
Apple is still listening.
It's,
they're doing the same things.
It's just that at the moment, so far, you know, under
the leadership they have right now, okay, but that can change in a heartbeat. Let's talk about
Microsoft. Okay. So you probably know that Microsoft was Google's enemy number one.
Microsoft sued Google in practically every courtroom in the world.
Microsoft was submitting regulatory complaints.
Microsoft was funding organizations that existed to do nothing else but fight Google for a long, long time.
Early 2016, Google and Microsoft signed a secret pact.
So the fact that the pact was signed, that somehow leaked.
But to this day, no one knows the details of what's in it, except here's what happened.
Simultaneously, both companies around the world dropped all complaints against each other.
Microsoft withdrew all of its funding from all the organizations it had been supporting. And there are some
people who believe because of Bing, Microsoft's search engine, which draws about 2% of search, by the way. It's no Google.
It had been bleeding money for Microsoft for years,
and some people believe that Bing, as part of this deal,
started drawing search results from Google.
We don't know, but we do know this,
that Windows 10 is a tracking tool.
Windows 11 is a tracking tool.
These new operating systems are so aggressive in tracking that it's very, even if you're a tech geek like me, it's very, very hard to get rid of all the tracking. So I'm still using Windows 8.1,
believe it or not, or Windows 7. Why didn't you switch to Linux or Unix or something like that?
Well, we use that for certain purposes as well. But I mean, for general stuff that you do,
you kind of, you know, if you're using desktops and laptops, Windows is still the way to go.
Except the company shifted.
It has been shifting towards the surveillance business model, as thousands of other companies have, including Verizon.
Just because it's so profitable.
It's so easy.
See, you're getting all the information anyway.
All you're going to do now is start to monetize it.
You're just building new parts of the company that no one even sees.
Right.
And the real issue here seems to be that this wasn't a thing 20 years ago.
It's a thing now, and it's the most dominant thing in terms of the way people access information, the way people get data, the way people find answers.
What is it going to be in 20 years from now? I mean, it seems like there's so much potential
for control and so much potential for manipulation and that it could only just get worse. If there's
no regulation put in place and there's no way to stop use of algorithms, use of curated data,
There's no way to stop use of algorithms, use of curated data.
Like, what is this going to be like?
Have you sort of extrapolated?
Have you looked at the future?
Yeah, that's what I do.
That's what I do every day.
It's depressing.
What do you think is happening?
What do you think, like, when we're looking at 20 years from now, what's going to happen?
Well, you might not believe my answer, but 20 years from now already happened.
How so?
It's now. It's here now.
Okay. Eisenhower. You're not as old as I am, but you probably remember.
I know the speech.
Ah, famous speech.
Yeah.
And everyone always points to certain language from his speech. This is his retirement speech,
his last speech just a few days before John F. Kennedy became president.
And it was a very shocking speech because this is a guy who was head of allied forces in World
War II. This is a, you know, I don't know, four-star general.
I mean, he's a, you know, he's an insider.
And in this speech, he says, you know what?
This terrible kind of entity has begun to emerge, you know, and I've watched it.
And he called it the military-industrial complex.
And you probably remember hippies,
like, you know, with signs and screaming, no military industrial complex. And Eisenhower
actually warned about the growth of this military industrial complex and how it's
taking over businesses and it's affecting the government and blah, blah, blah. What people
failed to note is that he also warned in the same speech about the rise of a technological elite that could control public policy without anyone knowing.
This was 1961.
Really?
Technological elite. Same speech.
What technological capabilities were even available back then, other than the media, other than, you know, broadcast television and radio?
Well, but it means that whatever he was seeing behind the scenes...
Oh, Jesus.
...was scaring him. And what I have to tell you is, you're worried about 20 years from now? The technological elite are now in control.
So Apple, Google, Facebook, and to a lesser extent, the other social media platforms.
Correct. And Google is by far the most aggressive, the most dangerous.
You know, Facebook, there's chaos within Facebook. But, you know, we had this amazing leak from Frances Haugen just recently of documents showing that people at Facebook are very much aware that their social platform creates turmoil, terrible turmoil on a massive scale, and that they like that.
They encourage that because the more turmoil, the more traffic, the more traffic, the more money.
But knowing that you're creating turmoil is – here's my thought on that.
It's like is it just human nature?
Because you were saying before about the negativity bias, that people gravitate towards things that are negative. And that's one
of the things that you'll find if you use YouTube. When you go on YouTube, if you're a person who
likes to get upset at things and you're a person who likes to look for things that are disturbing
or upsetting, or political arguments, whatever, you'll get those in your suggestions over and over and over again. But if you're not interested in that, if you're only interested in airplanes, and you start googling airplanes or cars or watches, that's what it'll suggest to you. It doesn't have to suggest negativity to you; you gravitate towards that naturally.
And so the algorithm represents what you're actually interested in.
So is it Facebook's fault that everyone, not everyone, most people generally interact more with things that are negative or things that upset them?
That's not their fault, but it is their fault that they take advantage of that to manipulate people.
That's entirely their fault.
But if their business model is to engage with people and to keep people engaged by giving
them content that makes them stay engaged and click on links and read more and spend
more time on the platform, and the only thing that it's doing is highlighting what you're actually interested in, then what are they supposed to do? Are they supposed to make less money, and have no suggestions and no algorithm, and just leave it all up to chance? Just leave it all up to you: go find what you're interested in, and then keep finding what you're interested in through a direct search, like through you trying to find these things directly with no suggestion whatsoever, because that's better for the human race?
For the past year or so, we have been doing controlled experiments on YouTube.
We have a YouTube simulator. It's a perfect YouTube
simulator. And we have control. We're using real content from YouTube, real videos from YouTube,
all the titles, everything comes from YouTube, except we have control over the ordering, and we have control over the up next algorithm.
That's where the power lies, the up next algorithm.
So one of the things we learned recently,
not from Frances Haugen,
but it was someone else who left Facebook,
is that 70% of the videos
that people watch on YouTube now around the world
are suggested by
YouTube's Up Next algorithm. 70 percent. 70 percent. Yeah. And that's their algorithm. And
just like us in our lab, OK, we have control over what the up next algorithm suggests.
And guess what we can do with our up next algorithm?
What?
Well, it should be obvious.
You can manipulate people.
Yeah, we manipulate people.
We randomly assign them to this group or that group, and we just push people any old way we want to push them.
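A minimal sketch of the experimental logic being described here, in Python; the names are hypothetical, not the lab's actual software, and the survey step is stubbed out:

import random

def watch_and_survey(participant, ordering):
    # Stub: the real experiment would play the ordered videos and
    # then administer an opinion survey; here we just simulate one.
    return random.random()

def run_experiment(participants, videos, favored_slant):
    # Random assignment: same videos for everyone, but the treatment
    # group's up-next list puts one side's videos on top.
    results = {"biased": [], "control": []}
    for p in participants:
        group = random.choice(["biased", "control"])
        if group == "biased":
            ordering = sorted(videos, key=lambda v: v["slant"] != favored_slant)
        else:
            ordering = random.sample(videos, len(videos))
        results[group].append(watch_and_survey(p, ordering))
    return results

Comparing the two groups' survey scores then measures how much the ordering alone shifted opinions.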
And when you're doing these tests and studies, like, how are you doing this? Like,
are you doing, how many people are involved in this? Are they students? Like, how are you doing
this? Okay, we never do the, you know, subject pool at the university where you get, you know,
50 students from your college to, you know, be your research subjects. We never do that. So we're always reaching out to the community, or we're doing things online. So we do big studies online.
And we are getting very diverse groups of people. We're getting, literally, we're getting people
from lists of registered voters. So we're getting people, you know, who look like the American population.
And we can mess with them.
Can I say we can fuck with them?
You just did.
Oh, I guess I just did.
Oh, this is definitely not Fox.
No, we're on the Internet.
This is not Fox News.
Yeah, but the Internet, though, because there are no regulations and rules, it does allow for some pretty evil things to take place.
And the fact is, in our experiments, we do these, usually our experiments have hundreds of people in them.
Sometimes they have thousands of people.
And we can fuck with people, and they have absolutely no idea. Let me just, I'll tell
you about something new, okay? Okay. Something new, brand new. Okay. And this is, thank God I'm
not talking about Google this time. I'm just talking about something else that's happening.
There are websites that will help you make up your mind about something.
So, for example, there's a whole bunch of them right now that'll help you decide whether you're
really a Democrat or you're really a Republican. And the way they do that is they give you a quiz,
and based on your answers to how you feel about abortion and immigration and this and that,
at the end of the quiz, they say, oh, you are definitely a Republican. Sign up here if you want to join
the Republican Party. And this is called opinion matching. And the research we do on this is called
OME, the opinion matching effect. And there are hundreds of websites like this. And when you get
near an election, a lot more of them turn up
because the Washington Post will give you a quiz
and help you decide who to vote for.
And Tinder, Tinder, okay, which is used for sexual hookups.
How about romantic, sir?
Oh.
Not just sex. Sorry.
My mistake.
So Tinder actually set up a swipe the vote option on Tinder during the 2016 election. You swipe left if you think this, you swipe right if you think that. And then at the end of it, they say, oh, you should be voting for Hillary Clinton. But how do you know when one of these websites is helping you make up your mind,
how do you know whether the algorithm is paying any attention to your answers at all?
How do you know?
You don't. So what we've done is, we've done two things. One is we've gone to a lot of these
websites and we've been typing in random answers, or we've actually
set up algorithms that do it for us. If we want to do it 500 times, we set up an algorithm that
does it. And then we look and see what the recommendations were. And guess what? Guess what?
What?
Sometimes these websites are not paying any attention to your answers.
They're just telling you what they want to tell you.
And they're using this quiz to suck you in.
And then they add in, oh, this we love.
They add in a timer.
So in other words, after you finish the quiz, it'll go tick, tick, tick, tick, tick, computing, computing, computing.
And there's this delay creating the impression that they're really thinking hard.
And then they give you your answer.
So all that is for credibility to manipulate you.
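Probing a quiz this way is easy to automate. A sketch, assuming a hypothetical submit_quiz function that posts one set of answers to a site and returns its recommendation:

import random
from collections import Counter

def probe_site(submit_quiz, num_questions, trials=500):
    # Submit random answers many times. If the recommendation barely
    # varies, the site is likely ignoring the answers altogether.
    outcomes = Counter()
    for _ in range(trials):
        answers = [random.choice(["agree", "disagree"])
                   for _ in range(num_questions)]
        outcomes[submit_quiz(answers)] += 1
    return outcomes

With truly random input you would expect a roughly even split of recommendations; a result like {'Republican': 496, 'Democrat': 4} suggests the quiz is theater.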
Now, so over here, we're going to websites and we're typing in random answers.
On the other side, we're doing experiments in which we are giving people quizzes
and then we are giving people recommendations
and then we are measuring to see whether we can change anyone's mind.
And we're getting shifts of 70% to 90%
with not a single person recognizing that they're being manipulated. Not even one person
recognizing that there's bias in the results we're giving them. Not one. Because how could you
see the bias? How could you see the manipulation? You've just taken a quiz.
You're trying to make up your mind. The thing that's so scary about this kind of manipulation is it attracts exactly the right people, exactly the right people who can be manipulated.
Right. Because who's taking quizzes?
The people who are trying to make up their minds. They're unsure. Right.
They're vulnerable.
Yeah.
And when you, so you spoke to Congress about this?
You spoke in front of Congress?
Mm-hmm.
Yeah.
And when you did, was there any sort of urgency that was, did anybody understand like what the implications of this
are? Did anybody understand like this is, we were literally looking at these massive technologies
that are used throughout the world that can completely change the way policy is directed,
the way people are elected, who's in charge, what the public narrative is on a variety of subjects?
Well, there are some people.
There's a guy named Blumenthal.
He's a senator from Connecticut.
He gets it.
He understands.
He's kind of disgusted, I would say, with all this stuff.
But, you know, I said to Cruz, I said, why don't you work with Blumenthal? And he said, well, no, I don't think that'll work out.
Because he's a Democrat?
Because he's a Democrat. So they can't. So the Democrats, even the ones who understand this stuff, they won't do anything. Because why? Because these tech companies support Democrats very lavishly with donations.
So, for example, Google was Hillary Clinton's biggest donor in 2016.
And they're supporting them in these more subtle ways as well.
So the Democrats will do nothing. If they even say something, like if they rattle their swords, they don't actually do anything.
And the Republicans hate regulation.
This is a perfect storm for these companies to do what they have done, which is they have already
taken over. You're thinking 20 years from now? No, they've already done it.
Well, I'm not thinking that they haven't already taken over, but I'm thinking like how much more control can they have in 20 years?
If 20 years ago they didn't have any.
Like as technology advances, do you think that this is going to be a deeper and deeper part of our world?
Well, look at Zuckerberg.
Zuckerberg's trying to get us all into the metaverse.
So, yeah, you have even more control if you get people into virtual realities.
Exactly.
Yes, you have more control.
Every single thing they're doing is moving us farther and farther and farther down the rabbit hole.
Well, not just that. I'm thinking, like, there was a time where Zuckerberg at least was publicly considering cryptocurrencies, right? Like some sort of a Facebook cryptocurrency. Imagine if Facebook cryptocurrency became the number one currency worldwide. Maybe it was the number one crypto, like Bitcoin is today.
Sure.
What the fuck, you know? So you're in the metaverse; in order to exist and compete and buy things and prosper, you need Zuck bucks.
Yeah.
Right?
Yeah.
Oh, I published a few years ago an essay calling for his resignation.
Roger McNamee, who was one of the first supporters financially of both Google and Facebook,
he actually published a book about two years ago called Zucked, about how Zuckerberg has taken over the world. And he said in that book straight out that
if he had known what these companies were going to turn into, Google and Facebook, he would never,
never have backed them in those early days.
Jamie, did we ever find out what Facebook or Google rather changed their...
It got moved to the bottom of the code of conduct.
But it's still in there, right?
It's on the screen.
Okay, that's right.
And remember, don't be evil.
And if you see something that you think isn't right,
speak up.
So it's still in there.
Sort of.
I think it's really...
Google aspires to be a different kind of company.
It's impossible to spell out every possible ethical scenario we might face.
Instead, we rely on one another's good judgment to uphold a high standard of integrity for ourselves and our company.
We expect all Googlers to be guided by both the letter and the spirit of this code.
Sometimes identifying the right thing to do isn't an easy call.
If you aren't sure, don't be afraid to ask questions of your manager, legal, or ethics
and compliance.
And remember, don't be evil.
That was updated September 25th, 2020.
Mm-hmm.
Right.
So-
Can you hold this?
Because I have to pee.
Sure.
Unfortunately, I have to pee. I drank way too much coffee today. So we'll be right back.
Ladies and gentlemen, hold those thoughts.
This don't be evil thing.
This is where it gets interesting to me because the company is notoriously woke, right?
They've adopted these woke ethics.
And, you know, you hear about meetings that they have.
And there was the one gentleman. Jamie, what is his name? The one who was fired from Google because he...
James Damore.
James Damore. Thank you.
We actually had him on the podcast at one point in time. They asked him questions about why don't women have more prominent roles in tech?
And is there some sort of gender bias?
Is it natural?
And he wrote a whole paper about choices and why people choose one thing or another.
And they decided he was a sexist piece of shit and they fired him.
And it was really wild.
Yeah.
Because if you read the actual paper, and the paper was available online, there was nothing that he said that was sexist at all.
Yeah, I read it, yeah.
Yeah.
So when a company has very clear motives, like they're in it to make money, like that's what they're doing.
How does that wokeness play into that? Is that just a natural artifact of people coming from universities and then eventually working for Google? Or is this, like, sort of a
strategy that that's encouraged because it's more profitable that way?
Well, first of all, you have to understand that it's not that simple because Google has had demonstrators.
They've had literally their own employees holding demonstrations.
Not everyone is happy about the policies at Google.
People, for example, who have conservative leanings, which I don't, but the people who have conservative leanings there, they're miserable, because the values agenda is so strong there that it dominates everything.
Isn't it interesting that you feel like you have to announce that you don't have conservative leanings? It's interesting because you've done it a couple of times so far.
Three.
Yeah. And you,
but you do it because people want to say, oh, alt-right psychologist, Dr. Robert Epstein.
Yeah. They would like to do that, right? Oh, I've been, ever since I did the testimony,
I mean, a bunch of things happened, one of which is very sad.
But, yeah, I've gotten branded.
I've gotten branded as some sort of conservative right-wing nutcase. And I don't have a conservative bone in my whole body.
So it really bothers me. I'm doing what I do because I put humanity, democracy, America, you know, ahead of any particular party, any particular candidate. There are bigger issues here.
It should be obvious, what you're saying. I mean, what you're saying should concern people. The idea that you would
just be labeled as a part of a disparaged political
party because it's an easy way to defame you and to discredit you. That should be obvious too.
You said that one of the things that happened was sad. What was that?
Well, in 2019, one of the things I did around the same time I did the testimony is I did a private briefing for state attorneys general.
And so I did my thing, and I can scare people pretty well with my data.
We haven't got to my monitoring projects yet, but we will.
So I did my thing.
And then I went out
into the, uh, kind of the waiting room there and just waited because I was done and they started
filing out and one of them came up to me. I know exactly who it was. I know what state he was from.
And he says, uh, Dr. Epstein, I hate to tell you this, but he said, I think you're going to die
in an accident, uh, within the next few months.
And then he walked away.
Now, I did not die in an accident in the next few months, but my wife did.
Really?
Yeah.
So when this person said that to you, what does this person do?
He's an attorney general of a state.
And why did he say that to you?
Because he was concerned.
He thought I was pissing people off who had a lot of power and that they wouldn't like that.
And how did your wife die in an accident?
What were the circumstances?
She lost control of her little pickup truck that I had bought her
and got broadsided by a massive truck that was towing two loads of cement.
But her pickup truck was never examined forensically,
and it disappeared.
I was told that it had been sold to someone in Mexico,
and it just disappeared.
Sold to someone in Mexico.
Obviously, it was totaled?
It was totaled, and the wreck,
which I suppose was technically my property, disappeared.
It was never examined and disappeared and went to Mexico.
Now, was this an older truck or was it a newer truck?
It was an older truck, but, you know.
Older as in, like, how old?
Like 2002, but we kept it in very good shape. It had low mileage, new tires.
The reason why I ask is, like... there was a journalist who wrote a story about a general. During the time of Obama's administration, there was a volcano that erupted in Iceland.
And he was stuck overseas.
I believe it was Afghanistan or Iraq. I think it was Afghanistan. So he was over there writing a story for Rolling Stone. And because he was over there for so long,
because he was trapped, because no flights were going, because the air cover was so bad,
because of this volcano, they got real comfortable with him. And these soldiers started saying things, not even
thinking this guy is, like, you know... he's not one of them. He is a journalist, and he's going to write all these things about them. So he wrote this very damning article. The general in question
got fired. And then this guy, Michael Hastings, started talking about how he was fearing for his own life. And cut to sometime in the future, he sped up.
There's actually a video of it.
Sped up on Sunset Boulevard towards the west side and slammed into a tree going like 120 miles an hour.
There was an explosion.
The car's engine was, you know, many yards from the car itself.
And there was a lot of speculation that not only did the government have the ability to manipulate,
that intelligence agencies had the ability to manipulate people's cars, but it's something they've actively done.
And people were very concerned that this guy was murdered because of what he had done.
Because that general wound up getting fired. Obama wound up firing him because it made Obama look bad. He was a very beloved
general. That kind of shit scares the fuck out of people. Well, there's a very good book on this
subject. It's called Future Crimes. And it starts out saying the kind of things that I've been
saying to you, which is the future crimes, they're actually here now. And this is an ex-FBI guy who wrote the book. And he's talking about how tech is being
used now to not only commit crimes, but to assassinate people. One of the simplest ways
to do it is you hack into a hospital computer and you change dosages on medication. You know, if the person you're going after has been hospitalized,
that's a really simple way to just knock them off and have it look like, you know,
just some silly little, you know, glitch or something. So yeah, there's a lot of ways now
that you can commit crimes that have never existed before. And as far as I'm concerned,
I mean, the kinds of things I study, in my opinion, should be considered crimes.
And I don't think we should ever be complacent and just say, oh, it's the algorithm.
Algorithms are written by people. Algorithms are modified. Google modifies its algorithm, its basic search algorithm, 3,000 times a year. That's human intervention.
Do you think that that's what happened to your wife? Or do you speculate, or do you just not know and just leave it at that? How do you feel about that?
It depends on the day. I think about Misty. She's from Texas originally, and I think about her pretty much nonstop.
I'm still wearing my wedding band, even though the accident was two years ago.
I don't know.
I know that the accident made news, not just here, but in Europe, because, you know, some people thought it was suspicious
that, you know, this, my beautiful wife, you know, we've been together for eight years, and
my beautiful wife was killed in this horrendous fashion. And, you know, obviously, I have,
I have pissed off some people at some big companies.
And I have work coming out.
I mean, the work that I have coming out,
I have right now 12 scientific papers under review and four that are in press, in other words, that have been accepted.
So I have stuff coming out that, over and over again, like a sledgehammer, is going to make certain companies look, well, very evil, I would say.
Do you think that they have the ability to suppress the kind of
coverage of the data that you're putting out to the point where it's not going to impact them?
How much has it impacted them currently? I mean, we're talking about committing murder or potentially committing murder. Like, how much have you impacted them
if they're still in complete and total control and they're still utilizing all these algorithms
and making massive amounts of profit? You haven't put a crimp in that.
Well, I have. I have put a crimp in it, yes. And so I do want to talk to you about this, about the monitoring stuff, because there is a way. There's more than one way, but there's one very practical way to literally just push these companies out of our personal lives and out of our elections.
And I've been working on that project since 2016. That project started because of a phone call I received from a state attorney general, Jim Hood. He was attorney general of Mississippi at the time.
He called me in 2015, and he said, could Google mess with my re-election as attorney
general? Because in that state, they elect them. And I said, oh yeah, very easily. And he said,
well, how would they do it? And I explained how they do it, et cetera, et cetera. And he was very,
very concerned. And he said, but how would you know that they're doing it? And my mind just
started to spin. I was thinking, gee, I don't know... well, a whistleblower, you know, a warrant, something. And I became obsessed
with trying to figure out how to know what these companies are actually showing real people.
Now, here and there, there's some researchers at Columbia who should be ashamed
of themselves. There's some reporters at The Economist who should be ashamed of themselves.
Here and there, people have set up a computer that they anonymize, okay? And they type in lots
of search terms and they get back all these searches and they conclude that there's no bias.
But that doesn't tell you anything, because Google's algorithm can easily spot a bot,
can easily spot an anonymized computer.
They know it's not a real person.
How do they know that?
Because it doesn't have a profile.
You have a profile. You know, how long have you been using the internet? Let's put it that way.
A long time.
Well, roughly.
25, 30 years, whatever it's been. Was it '94 I first got on?
Wow.
Yeah, so almost 30 years. Okay, so Google has a profile on you that has the equivalent of more than three million pages of content.
Hmm.
Now, you're probably thinking, well, how could I generate that?
Because everything you do goes into that profile.
So, yeah, it's a lot of content. But the point is, they know
the difference between you, because you have a big old profile, and an anonymized computer
or a bot because there's no profile.
Right.
So it turns out, this is the simplest thing in the world to do, is that when they see
a bot, okay, they just send out unbiased content.
And we've shown this ourselves.
There's nothing to it.
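To be clear, nobody outside Google knows how such gating would actually be implemented; the following is only a toy illustration of the logic being alleged, with entirely made-up names:

MIN_HISTORY_PAGES = 100  # arbitrary threshold for this illustration

def select_results(profile, query, personalized_ranker, neutral_ranker):
    # Alleged behavior: an account with little or no history looks
    # like a bot or an anonymized research machine, so it is served
    # the neutral ranking; established profiles get the personalized one.
    if profile is None or profile.page_count < MIN_HISTORY_PAGES:
        return neutral_ranker(query)
    return personalized_ranker(query)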
But that's not the challenge that General Hood was basically giving me.
He was saying, how would you find out what real people are seeing?
So 2016, I got some funds.
I don't even know where they came from, but anyway.
And we started recruiting people.
We call them field agents.
This is exactly what that company does, Nielsen, that does the Nielsen ratings.
They've been doing it since 1950.
They're now in 47 countries.
And they recruit families, and they keep their identities
very secret. And they equip the families with special gizmos so they can keep an eye on what
television shows they're watching. And that's where the Nielsen ratings come from, which are
very important because they determine how much those shows can charge for advertising. They
determine whether or not a show stays on the air. So it's important.
So we started recruiting field agents, and we developed custom software literally from the ground up. And when we screen a field agent and we say, okay, you want to join us? We install on
their computer special software, which allows us, in effect, to look over their shoulders.
This is with their permission, obviously. Look over their shoulders, and we can take snapshots.
So when we sign these people up, we're taking lots of snapshots all day long,
and then information's coming in and it's being aggregated. So we can look at what real voters are being sent by Google,
Facebook, YouTube, anybody. And we take all kinds of precautions to make sure these people cannot
be identified. We deliberately had a small group of people, Gmail users, to make it easy for Google to identify those people.
Guess what?
They got unbiased content.
But everyone else was getting highly biased content.
Why did the Gmail people get unbiased content?
Because Google knew they were our field agents.
So Google was aware of your study?
So Google was aware of your study?
I probably can't even sneeze without Google being aware.
So you think Google manipulated the results of the people that were Gmail users?
No, I think they un-manipulated, to show that they didn't have bias.
Yeah, that's what I'm saying.
Yeah. I mean, manipulated in the sense that they didn't apply the algorithm to those people.
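Mechanically, the "looking over their shoulders" he described amounts to periodic, consented capture of what the browser is showing, with identity stripped before anything is aggregated. A rough sketch, with hypothetical field names:

import hashlib
import time

def snapshot(agent_id, url, rendered_html):
    # Capture one ephemeral experience. The agent's identity is
    # replaced by a one-way hash before the record leaves the machine,
    # so the aggregated data cannot be traced back to a person.
    return {
        "agent": hashlib.sha256(agent_id.encode()).hexdigest(),
        "time": time.time(),
        "url": url,
        "content": rendered_html,  # the page exactly as the user saw it
    }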
I had a reporter from DC, I'm not going to name him. He was doing a piece on my work. Then he contacts me a couple of days later, and he said that he called up a woman who he believed was the head of Google's PR department. He said, I asked her questions about your work, and she started screaming at me. He said, that's very unprofessional. I've never had that happen before. He said, I'm going to tell you two things. Number one, you have their attention. And number two, if I were you, I would take precautions.
Jesus.
So monitoring.
2016, we recruited 95 field agents in 24 states.
We preserved 13,000 election-related search results on Google, Bing, and Yahoo. Each one has 10 results in it, so it's 130,000 links. And we also preserved the web pages, so we had 98,000 unique web pages. And then we analyzed it. We found extreme pro-Hillary Clinton bias in Google search results, but not on Bing or Yahoo.
Now, here's number four, disclaimer number four. I supported Hillary Clinton.
But still, I was very disturbed by this, extremely disturbed,
because we knew from the experiments we had run that that was enough bias to have shifted over a
period of time among undecided voters somewhere between 2.6 and 10.4 million votes without anyone
having the slightest idea that this had occurred. That's 2016. 2018, we monitored the
midterms. We preserved 47,000 searches. So we were expanding. We're getting bigger. 47,000. And we
found enough bias on Google, but not Bing or Yahoo, to have shifted 78 million votes. That's spread across
hundreds of elections, though, okay, with no one knowing. 2020, we went all out. We had more money.
We went all out. And we recruited 1,735 field agents just in swing counties, just in swing states, because we knew that's where the action was going to be.
We preserved 1.5 million ephemeral experiences, and I'll define that if you want, on Google, Bing, Yahoo, YouTube, Google's homepage, Facebook.
We, at this point, know how to preserve pretty much anything. We preserve 3 million web pages.
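The transcript gives only the resulting ranges, such as 2.6 to 10.4 million votes in 2016. A back-of-the-envelope reconstruction of how that kind of estimate is built; the inputs below are illustrative stand-ins, not figures from his papers:

def estimated_shift(undecided_voters, exposure_rate, rate_low, rate_high):
    # Votes moved = undecided voters who saw the biased results,
    # times the per-exposure shift rate measured in the experiments.
    exposed = undecided_voters * exposure_rate
    return exposed * rate_low, exposed * rate_high

# Illustrative: 20M undecided voters, 65% of them using Google,
# 20% to 80% of those shiftable over repeated exposure.
low, high = estimated_shift(20_000_000, 0.65, 0.20, 0.80)
print(f"{low:,.0f} to {high:,.0f}")  # 2,600,000 to 10,400,000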
And we're getting to the climax here. Okay. We decided, which we hadn't done in the past,
on October 30th, 2020, before the election, a few days before the election, we decided to go public with some of our initial findings.
And we did. And then some senators sent a very threatening letter to the CEO of Google, just summarizing all my work, my preliminary stuff. And guess what happened then in Georgia? We had over a thousand field agents in Georgia. Google turned off the bias like that. Google stopped with their homepage go-vote reminders.
They stayed out of Georgia. What does this say? This tells you that if you monitor,
if you do to them what they do to us 24 hours a day, you do that to them and you look for any kind of manipulation, any
kind of bias, any kind of shenanigan, and you make that public, you expose it, they
back down.
They back down.
They have to back down.
So doesn't this highlight that, if our government is concerned about legitimate threats to democracy, and legitimate threats to the way information is distributed, and free speech and manipulation, they should be monitoring Google?
But is the problem money?
Because of the amount of money that they give to campaigns,
the amount of money they give to support causes that these politicians back.
And the votes.
Don't forget the vote shifting because some of these politicians understand that.
Yes.
The government.
Forget the government.
Forget the government.
The government is not going to do this.
And would we even trust the government to do it?
So who should be doing it?
This should be done probably by a consortium of bipartisan or nonpartisan nonprofit organizations.
And, you know, we should have hearings.
We should have, you know, everything should be transparent. We should have wide representation of people serving on the boards and all that, kind of like, well, the UN, but this is a narrow kind of task. Here's what we
need. We need to set up now, because now we know how to do it. We need to
set up a permanent large-scale monitoring system in all 50 states in the United States. That's how
we start. Eventually, we have to help people in other countries set up similar systems.
See, that's the real answer to your future question. That is how, now and in the future, we can get control over emerging technologies.
Not just Google, but the next Google and the Google after that. There is no way to know what these companies are doing unless you are monitoring.
One of the simulators we have now that we developed actually within the past year,
which is fabulous, I'm so proud of my staff,
we have an Alexa simulator.
I mean, it just works just like Alexa.
And it talks. It's fantastic.
Except we control what it's going to say.
And sure enough, can we shift people's opinions? Oh, yeah. Easy peasy. Nothing to it.
But what that tells you is that's one of the things we have to monitor.
We have to monitor the answers that these so-called personal assistants are giving people.
Because if they give biased answers, that shifts thinking and behavior.
And, you know, what if all of these companies
all favor the same party, which they do?
What if all of these companies
all favor the same candidate, which they do?
You add up these manipulations
and basically what Eisenhower predicted,
it's here now.
It's just that you can't see it.
You cannot.
First of all, I'll give you an example.
Okay.
2016. And I bet you Mark Zuckerberg, with one click, if he had sent out go-vote reminders just to Democrats that day, because, you know, they know who the Democrats are, right? He could have generated that day 450,000 more votes for Hillary Clinton than she got.
How do we know that?
From Facebook's own published data.
They published a study in 2012 showing how they could get more people to vote in 2010 by sending out vote reminders.
If you just take the data that they published and move it over to 2016 and say, OK, Mark, press the button.
Hillary would have absolutely won the election.
He, I'm sure to this day, is kicking himself because he didn't
do it. But how would you know? See, on any given day, any given election, how would you know whether
that kind of reminder is going out, number one? And number two, who it's going to? Is it going to
everybody? Or is it going just to a select group? Is it targeted? There's no way
to know that unless you have monitoring systems in place. With a monitoring system, you would know
within seconds or minutes if a targeted message like that was being sent out.
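With a panel whose members' party registration is known, that detection is a simple comparison. A sketch, assuming each agent's software reports whether the reminder appeared:

def reach_by_party(sightings, party_of):
    # sightings: {agent_id: bool, did this agent see the banner}
    # party_of: {agent_id: "D" or "R"}
    seen, total = {"D": 0, "R": 0}, {"D": 0, "R": 0}
    for agent, saw_it in sightings.items():
        party = party_of[agent]
        total[party] += 1
        seen[party] += saw_it
    return {p: seen[p] / total[p] for p in total if total[p]}

# A result like {'D': 0.95, 'R': 0.05} within minutes of the banner
# appearing would be strong evidence of a targeted send.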
But if you had a targeted message like that, is that, that's not
illegal, right? Which is part of the problem. Like even if they did it publicly and you said,
all we're doing is encouraging people to vote. Yeah. But what if it's going just to members of
one party? Oh, I get it. But I mean, would they be obligated to send that to everybody?
Or maybe they could use the excuse that it's only the people that are politically inclined.
Here's what I'm – this is what I believe.
Okay.
Based on the experience that we just had a few months ago where we got Google to stay out of Georgia.
And, by the way, we positively got them to stay out of Georgia because we had over 1,000 field agents in Georgia.
And we were collecting a massive amount of – we collected more than a million ephemeral experiences.
I guess I'm going to have to define that.
In Georgia, I'm telling you, Google – we have never seen so little bias in Google search results ever since we started monitoring in 2016.
What's an ephemeral experience?
Okay.
2018, a leak to the Wall Street Journal from Google.
Bunch of emails.
One Googler is saying to others,
how can we use ephemeral experiences to change people's views about Trump's travel ban.
In other words, I didn't make up this term. This comes from inside Google; this is the kind of lingo that they use. What's an ephemeral experience, and why would they want to use ephemeral experiences
to change people's minds? Because an ephemeral experience is, well, most of the kinds of
interactions we have online involve ephemeral experiences.
Like, you type a search term, you see a bunch of search results, it has an impact on you, you click on something, it disappears, it's not stored anywhere, and it's gone forever.
Those kinds of ephemeral experiences, like a news feed, a list of search suggestions, an answer box, affect users, disappear, and are stored nowhere. Authorities cannot go back in time and figure out what people were being shown.
That's why internally at Google, they want to use ephemeral experiences to impact people. Because unless someone like me, and I'm the only one doing this, unless some crazy guy like me is setting up monitoring systems and keeping everything secret while it's running, no one will ever know that you just flipped an election.
No one will ever know.
As I say, the most powerful mind control machine ever invented,
and it relies for the most part on ephemeral experiences, meaning no one knows.
You can't track it.
You can't track it.
You can't go back in time.
The only way to do it is you'd have to be looking over the shoulders of real users.
You have to look over their shoulders, and you have to grab it as it's occurring, and then you have to aggregate it, analyze it quickly.
And that's not really possible.
Well, no, that's what we do.
But, I mean, that's not possible for the entire country.
Yeah, well, that's why we have to take what we've done...
Do you see the irony in that, though? It's almost like the only way to prevent this manipulation is by massive surveillance of everyone.
No, no, no. All you need is a representative sample.
You do what the Nielsen company does.
Same thing.
So we have a panel, it's called.
We have a panel of field agents around the country.
In a state like California, we'd have to have a lot
because they have a lot of people.
Idaho, we don't need so many.
So you just take representative samples from each state.
Like a Nielsen thing.
Exactly like Nielsen.
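Allocating such a panel is ordinary proportional sampling. A minimal sketch, with the floor value an arbitrary assumption:

def allocate_panel(state_populations, panel_size, floor=5):
    # Spread panel seats across states in proportion to population,
    # with a small floor so low-population states are still covered.
    total = sum(state_populations.values())
    return {state: max(floor, round(panel_size * pop / total))
            for state, pop in state_populations.items()}

# e.g. allocate_panel({"California": 39_500_000, "Idaho": 1_900_000}, 2000)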
But would they be aware of who the Nielsen families are, or the people that you're surveilling, your Nielsens? Would they be able to just have them receive unbiased data?
Well, that's the whole point. The point of Nielsen is they have to keep the identities of those families secret, because otherwise people would mess with them.
Right. But if they have the amount of surveillance capabilities that we're talking about here,
wouldn't they be able to know who these field agents are?
Well, that's why we're very, very careful about how we do the recruiting.
So, you know, it's expensive. Nielsen has to take precautions in the way they do recruiting
and equipping and training. We have learned from that. We take tremendous precautions.
And so, you know, you're asking, can this really be done? I'm saying,
yeah, I've done it four times, so I know it can be done. But it takes effort. There's a lot of
security involved. If someone is suspicious, we dump them. Nielsen does the same thing.
So how do you find out if someone's suspicious?
Well, how do we do that?
Yeah.
For example, let's say we're aggregating information that they're getting on the search engines, let's say.
So it's coming in. Our software is set up so that if the information we're getting from any particular field agent doesn't look right, okay, then it goes over to human review.
So what could that mean? That could mean, for example, that they are using an algorithm.
They're trying to tilt things in a particular direction. So they're not actually typing in anything. They're not using the computer the normal way they would use it,
which is what they're supposed to do. It means they've now developed or been equipped with an
algorithm to, boom, just start generating a lot of stuff, which would mess up our numbers, right?
Well, those people immediately are flagged. And when that happens and we can't
exactly figure out what's going on, we dump them. And we dump their data. If their information is
coming in faster than a person can type, we dump them. But there are other indications too. I mean,
I can't reveal all that, but we're taking precautions exactly like Nielsen has been doing all the way since 1950. It can be done.
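The faster-than-typing check he mentions is a simple rate test. A sketch, with the threshold an assumption for illustration:

def is_suspicious(submission_times, min_gap_seconds=2.0):
    # A human cannot plausibly run several searches per second. If a
    # large share of the gaps between an agent's submissions are
    # superhuman, flag the agent for human review.
    gaps = [b - a for a, b in zip(submission_times, submission_times[1:])]
    if not gaps:
        return False
    too_fast = sum(1 for g in gaps if g < min_gap_seconds)
    return too_fast > len(gaps) // 2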
Do you want to just inform and educate the public as to what's happening and how divisive and how interconnected all this stuff is?
It's hard to answer that question because as I keep learning more, and believe me, what we've learned in the last year easily eclipses what we've learned in the previous eight years.
We're learning so much.
The team is growing.
Our capabilities are growing.
So, you know, I'll say at one point in time, what I was concerned about was how can we get Google under control?
So I published an article in Bloomberg Businessweek.
There's a great backstory there because it was scheduled to come out, and then someone or other made a phone call to someone else, and then boom, the piece got pulled.
And this was a solution to the Google problem, literally. The editor-in-chief is literally having arguments
with the higher-ups, the publishers, because they pulled my piece on how to get Google
under control, how to solve the Google problem. I was scheduled to testify before Congress
the following Tuesday. The article had been pulled. The editor-in-chief was determined to get this piece out. He got it
published in their online version on Monday, the day before the hearing.
So what is this about? This is very simple, very light-touch regulation. The way to completely disarm Google is to make their index, which is the database they use to generate search results, public.
And there's precedent for that.
The government has done that before.
It's very, very light-touch regulation.
And Google could still sell it, you know, when people like Bing want to use
a lot of it, a lot of data from the database, they could still make money.
But what would happen in that case, though, is that hundreds of other search engines would now
be set up, and then thousands, all pulling really good data from Google's database.
And then they would go after niche audiences.
And they'd all be giving great search results,
but they're going after Lithuanians.
They're going after women.
They're going after gays.
And so you'd end up with a competitive search environment
like there used to be when Google started out.
And more importantly, you'd end up with innovation in search. There's been no innovation
in search now for 20 years. I mean, look at Google's homepage. It's the same and the methodology
is the same. So you'd end up with innovation, you'd end up with competition, all with one very
simple regulatory intervention. And this was done with AT&T back in the 1950s.
Is there any consideration to adopt this? Have you had conversations where you think that this
could actually become a real thing?
Positively, this could happen because there are members of Congress who get it.
And they recognize that this approach is light touch compared to a million other things like the breakups.
You know, they're going to do breakups.
That's just nonsense.
So, yeah, it could happen, but it doesn't need to happen here.
It could happen in the EU because Google has, I think, 18 data centers.
I think only half of them are in the United States.
I think nine of them are in the
U.S. and five of them are in Europe. And Brussels, they can't stand Google. Of course, they've fined
them these billion euro fines. They've hit them up so far with three massive fines,
totaling more than 10 billion euros since, I think, 2017.
And what have they fined them for?
Well, bias in search results. How about that? That's the first big fine.
And this is Brussels.
Yeah.
Why doesn't the United States implement some sort of a similar punishment?
Because Google owns the United States.
I mean, there's an antitrust action right now in progress against Google, and it's the attorney
generals, I believe, from every single state in the United States except California.
Because the attorney general of California, his main supporter is Google.
Google's based in California. So it's so crazy that they have this massive antitrust action
in progress, and the AG of California is staying out of it. His name is Becerra, I think.
But the point is that we're talking about light touch regulation.
That could actually be done.
It could be enacted by the European Union.
I've spoken in Brussels.
I've talked to people there.
That's the kind of thing they could do.
And if they did it,
it would affect Google worldwide.
And you would end up with
thousands of very good search engines,
but each aiming at niche audiences.
And doesn't that sound like the world of media, the world you're in?
Yeah.
And it doesn't seem like this would bankrupt Google.
Oh, no, no, not at all.
They'd still make massive amounts of profit.
Absolutely.
And it would fall in line with don't be evil.
Well, the fact is that depending on who the leadership is at any point in time at Google, they might look at that idea and say, hey, look, this would be great for us.
Really?
Sure.
But don't you think that any self-regulatory move like that would set up possible future regulatory moves?
Like, wouldn't they want to resist any kind of regulations
for as long as they possibly can?
But if they thought that they were going to be attacked in some worse way
and that this is a way out, you know, they're numbers people.
They're just numbers people.
I'll give you an example of Google just looking at numbers.
Okay.
2018, Election Day.
So, because I already told you we got them to stop doing this in Georgia,
but now I'm going back in time.
2018, Election Day.
Google on its homepage posts this,
Go vote. Everyone go vote.
So Google does this. Now the question is,
were they sending it to everyone? I don't know. But let's assume they were sending it to everyone.
Okay, the first thing that happened that day was all the major news services praised Google, praised Google for doing this amazingly good public service,
right? And I looked at it and I immediately said, that's not a public service. That's a vote
manipulation. So I sat down and I did exactly what a data analyst or data scientist at Google would
do. And I just ran the numbers.
And by the way, this is something we're also studying. It's called DDE, the Differential
Demographics Effect. The fact is, Google has more Democrat users than Republican users. So if they sent that to everybody that day, it would give, you know, more votes to everybody, but it would give 800,000 more votes to Democrats than to Republicans. This is spread across, again, midterms, so it's hundreds of races. So if they send it to everyone, they win.
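The differential-demographics arithmetic is simple. The numbers below are illustrative stand-ins chosen only to show the shape of the calculation; the transcript supplies just the 800,000 net figure:

def net_partisan_votes(total_users, dem_share, rep_share, turnout_lift):
    # A reminder sent to everyone still helps one side more if that
    # side makes up a larger share of the platform's user base.
    extra_dem = total_users * dem_share * turnout_lift
    extra_rep = total_users * rep_share * turnout_lift
    return extra_dem - extra_rep

# Illustrative: 200M U.S. users, 46% leaning Democrat vs. 38%
# Republican, with the reminder lifting recipients' turnout 5 points.
print(net_partisan_votes(200_000_000, 0.46, 0.38, 0.05))  # roughly 800,000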
So if Google is biased towards Democrats in terms of users,
what are the Republicans using?
What kind of tech are they using?
I mean, if you're saying that Google's sending out these messages, right,
and that most of their users, or the majority of their users, are Democrats. Right. So what's the majority of Republicans?
I'm not sure what you're asking there.
You're saying Google's sending out this message, go vote.
Yeah.
And through that message, because of the bias, because of the difference in the numbers...
Yeah.
More Democrats are getting it because more Democrats use Google, right?
So what do Republicans use?
Well, they're still using Google.
Right, but I mean, there's not more Democrats in the country.
Is it less Republicans are online?
Like, what's the bias there?
Like, what is the difference?
I believe there are fewer Republicans online. That could be a factor.
Last time I looked at the numbers, it looked like there were a few more Democrats,
you know, people registered as Democrats than people registered as Republicans.
So, you know, it's a combination of factors. As you know, in recent years,
Republicans and conservatives, they have set up, tried to set up a number of platforms of their own. Parler is one.
Yeah. So they're, you know, they're social media platforms.
Yeah. They're trying to carve out their own their own world, their own niche on the Internet.
So they don't have to use these inherently biased platforms. Now all this search engine
stuff and the manipulation, how much does this apply to social media as well? And
is there cross-contamination? Well social media is more complicated
because social media, we're the ones who are posting all the stuff.
So we're providing all the-
But not necessarily.
Because if you pay attention to manipulation, there's a lot of manipulation that's coming from overseas, allegedly.
That's right.
My position has always been, like, who's funding these troll farms in Macedonia?
Right.
And how do we know that it's not someone in the middle of Kentucky?
Okay, you're absolutely right.
There's a tremendous amount of content that is posted by bots that is coming from organizations in other countries.
You're absolutely right.
I mean, I think Facebook just in the first quarter of last year, took down 2 billion profiles.
Facebook's top 20 Christian sites, 19 of them are run by troll farms.
Right. So there's a lot of junk out there. That's true.
So when you get to social media, the picture gets very complicated.
However, here's what you got to know. It's algorithms that determine
what goes viral. Everyone believes this crazy myth. Everyone believes this. Everyone I know
believes this. My kids believe this. Everyone believes that virality is mysterious. It's like winning the lottery. And that's not true. Because if I control the
algorithms, okay, I determine what's going to go viral and what is not. Now, that's, again,
a tremendous source of power. And of course, they do want a bunch of stuff to go viral, even crazy negative stuff, because more traffic, more money.
But the bottom line is they control the algorithms that determine what goes viral. That's where a
lot of the power lies in the world of social media. That's where, you know, the Frances Haugen
revelations are extremely important.
And just having that underbelly, that ugly underbelly of that company exposed.
So no matter how you look at this, for us to sit by,
Eisenhower's speech actually says that we have to be vigilant.
He uses the word vigilant.
We have to be vigilant so that we don't let these kinds of powers
take over our government, our democracy, our nation.
And we have not been vigilant, and we're not being vigilant now.
And the research that we do in the monitoring systems,
both the research is over here and the
monitoring stuff's over here, that reminds me every single day. I mean, I'm looking at numbers
every single day. You're keeping me away from my data and my research, by the way. But
I'm reminded every single day of just how serious this stuff is.
This is deadly serious for the future of not just our country, but all of humanity.
And the fact that people don't know it, or that they... I've given speeches, and sometimes people say, I don't care. I have nothing to hide.
I've heard that.
That infuriates me.
Yeah.
I've heard that about government surveillance too.
Yeah.
Yeah.
Well, look at the Chinese.
The lives of the Chinese are strictly controlled by the government, and more and more they're using high tech.
And Google has worked with the government of China to improve that technology.
And to limit access to search results.
That is correct. So Google does what's good for Google. They're not the sweet little old lady
running the library that people think. That's not what they are. They do what's good for Google.
I had a friend who worked at Google during the time they were working and having negotiations with China.
And her position was that China was just going to copy Google's tech if they didn't do that.
Yeah.
I've heard that, yeah.
Yeah.
So, like, they were in this position where, you know, like Tiananmen Square.
Like, you cannot bring up, like, Tiananmen Square is not searchable.
You can't find that in China.
The results of it, like the guy standing in front of the tank, like there's a lot of information from Tiananmen Square that would look terrible.
That's right.
So you can't find it.
No.
It's suppressed.
And, you know, Google's an expert at doing that kind of suppression.
They're the biggest censors in the history of humankind.
But still, look, I know I'm a very idealistic person.
I've handed out tests of idealism in my classes.
These are young people in their 20s, and I outscore them.
I've always outscored all of my students.
I'm very idealistic. I believe in truth, justice, the American way, like Superman and you know,
all that crazy stuff. But I, I'm going to do my best to get people to wake up. That's why I said yes.
Okay, I'll give up a day of looking at my numbers.
I'm going to come and talk to you because I am trying to get people to listen.
I'm trying to figure out how to get people to listen.
People must listen.
Let me put it another way.
That monitoring system I keep talking about,
that's not optional. Okay. That's not optional. That must be set up. If we don't set that up,
we will have no clue. We will not understand not only why this person or that person won an election, we will not understand what's happening with our kids.
I have five kids.
When my daughter Janelle was about 12, and I'm sure you've done this,
I think you have kids roughly that age.
So I did the thing a dad does sometimes.
I went into her bedroom just to check on her.
And I noticed one of her little electronic devices, the old iPod or whatever it was,
is sitting next to her pillow.
And then I looked a little closer and I went, what?
There were five electronic devices encircling her pillow.
It's our kids that we need to be thinking about here.
Okay, it's not just their future,
but literally how are they being impacted right now?
What kind of content are they being shown?
Is it pornographic?
Is it violent?
Is it, I don't know. Are they being pushed one way or another politically?
We are in the process right now
of trying to expand our research
to look at kids and to see what content these kids are being shown. Because it doesn't matter how
vigilant you are as a parent, okay, the fact is 99% of what your kids are seeing online or experiencing online, you're unaware of.
And that's why, as I say, solving these problems is not optional.
We must solve these problems.
We must set up monitoring systems.
And it's relatively cheap, by the way, because now that we've done it repeatedly, we know how to do it.
And if we don't, we are essentially being controlled by big tech forever.
We've turned over our democracy. We've turned over our children. We've turned over literally our minds. We've turned them over to tech companies and algorithms. I think that's insane.
It is insane. And where does it go? How bad can this get?
Yeah, but look, we got Google, with the help of some senators, we got Google to stay out of Georgia.
Yeah.
To me, that's a wake-up call that says, wait a minute, we know not only how to track these companies, but we can stop them.
Have you ever had a conversation with anybody from Google?
Well, Ray Kurzweil's an old friend of mine.
His wife, Sonia, I was on the board of her school for autistic kids for 15 years. I mean, I went to their daughter's bat mitzvah. They came to my son's bar mitzvah, etc., etc. But he won't talk to me now.
He won't talk to you?
He's head of engineering at Google.
So he won't talk to you now because he's not allowed to? Do you know why he won't talk to you?
I don't know. And even Sonia won't talk to me now.
And I've never had any conflict with either ever, ever, ever going back, I don't know, 20 years.
Never, never. I mean, they're lovely people. They're very nice people. I know their kids and,
you know, neither of them now will talk to me. Just because he is an executive at Google.
He's an executive at Google. I was supposed to be on a panel with another top executive at Google who used to be a professor at Stanford or some big school.
I was supposed to be on a panel with him in Germany.
And when he found out what it is I do, he pulled out.
He did not show up.
There were 1,000 people in that audience
who came to see him, the Google guy, not me.
He didn't show up.
Wow.
They, I believe, I'm pretty darn sure,
and this upset my wife, Misty, at the time,
they sent a private investigator to our house.
For what?
Posing as someone who wanted to be a research intern.
What did he do when he was in the house?
I don't know.
Did he leave bugs?
I don't know.
I have no idea what he did.
But I was sitting there with a staff person.
We're asking the guy questions like we do for anyone who applies to work with us.
And the guy, first of all, he's wearing like a white shirt and a tie, which no one does in San Diego.
But we were asking him questions, and his answers didn't make any sense at all.
How so?
Well, you know, I said, so you're interested at some point
in going to graduate school in psychology?
And he goes, graduate school?
Psychology?
I don't know.
So none of this made sense.
We looked the guy up afterwards.
He was supposed to get back to us.
He didn't.
We looked him up.
He worked for a private investigation firm. Now, why do I think Google sent him? Because I had written to that
executive at Google who was supposed to be on that panel in Germany. And, you know, just telling
him about my work, giving him links and so on because he's a former professor.
Okay.
It was only a few days after that that this guy showed up at our house.
And then it was a few days after that that the Google executive pulled out of that conference.
Jesus.
And so they're not interested in communicating with you.
They've obviously either told people not to communicate with you, or the people that you would like to talk to are aware of your work and they feel that it would negatively impact their job or their career?
I'm telling you, this has just been, for me, in many ways,
a nightmare, an absolute nightmare,
because there are people who won't help us,
who won't serve on our board,
who won't do this, who won't do that.
We had an intern lined up who was very, very good.
We get some really sharp people.
They come from all over the world, actually. And we had this person all signed up, and her start date was
set up. And she called up, and she said, I can't do the internship. I said, why not?
My grandmother. My grandmother looked you up online, and she thinks that you're like some sort of Trump supporter. And she said, she'll cut me off
if I do this internship. So that's one of the reasons I keep repeating it. I've done it four times now: you know what? I lean left. Yeah.
It doesn't help.
They don't care.
Well, ever since I testified, terrible things have happened. One of my board members said to me, look, in a way you should be grateful, and pleased, that they left you alone for so many years. He said, but that for them was, you know, that was it. That was the final straw.
And, you know, what happened after that hearing was Trump tweeted about my testimony.
Hillary Clinton, whom I've been supporting forever, replies to Trump on Twitter and says, this man's work has been completely discredited. It's all based on data from 21 undecided voters.
What?
She said that?
Yeah.
Can you sue her?
I could have, but it would take me away from the research.
It would cost a fortune.
Yes, I could have and probably could still sue her, yes.
Because that's a factual statement, which is false and defamatory.
But I have to try to stay focused.
I really do. And I keep
getting pulled away. And believe me, what's happening right now in our work is tremendously
exciting. And everyone loves what we're doing. We love each other. We love the whole thing that
we're doing. We love the discoveries. We're blown away over and over again by the numbers. And we have very ambitious plans moving forward.
As long as I can still function, I'm going to keep doing this. I mean, it's important. It's important.
I have five kids, okay? Someday I hope I'm going to have grandkids. And, you know, it's important for the world right now. It's important for our democracy, which as far as I'm concerned is an illusion. It's an illusion. When you look at the numbers, you realize, no, there's a player in here that you don't see,
that doesn't leave a paper trail, and that can shift millions of votes.
And if it didn't exist and someone introduced it, nefarious political parties or nefarious people would 100% be excited about it. Like, look what we have now.
Yeah, yeah.
And then if the Democrats found out that Donald Trump had implemented some sort of a system like you're talking about, people would be furious. They would say he is a threat to democracy. He should be locked up. He should be in prison for treason. Does it concern you that you're the only one?
Well, I don't understand it, because this is really good science. I mean, in other words, the work I do has been published in top journals. That initial SEME paper, on the search engine manipulation effect, was published in the Proceedings of the National Academy of Sciences.
It has since been downloaded or accessed from the National Academy of Sciences more than 100,000 times for a very technical scientific paper.
That's practically unheard of.
I've never had that happen before.
And the papers that I have coming out, they're in top journals.
We're submitting more in top journals.
This is good science.
So why aren't 20 universities doing this stuff?
You know why? Because they're getting funding from Google or they're terrified
of Google. The head of Europe's largest publishing conglomerate, his name is Döpfner, published a piece a few years ago that was actually called Fear of Google. It's a superb piece.
It's about how in a lot of industries right now, you cannot make a move without taking into account how Google's going to react.
I want to Google fear of Google.
Yeah, Google fear of Google.
Let's see what happens here.
Yeah.
See if you get Döpfner.
Fear of, what's it suggest?
Fear of rain, fear of God, G-O-O, fear of Google.
Here we go.
Yeah.
What's it say?
Who's afraid of Google?
Everyone.
Wired Magazine.
Why we fear Google.
That's some other website that I don't know.
What Americans fear most according to their Google searches.
Why are people afraid of Google?
Quora.
Well, don't forget, whatever you're getting has to do with your history.
So someone else is going to get a different list.
And that's scary because,
you know, ask a con artist. I don't know, do you know any con artists? Because I've met one or two.
You ask a con artist and they will tell you straight out, if you want to do a con,
the more you know about someone, the easier it is. For sure. So that's the problem there is that everything is personalized
and everything you're seeing there
is based on you
and your 20 plus year history
and the 3 million pages of information
they have about you.
They build digital models of all of us.
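To make that point concrete, here is a minimal sketch of profile-conditioned ranking, the general mechanism being described: the same query and the same candidate pages, but an ordering that depends on a stored model of the user. Every field, weight, and page name below is hypothetical, invented for illustration; this is not Google's actual algorithm.

```python
# Hypothetical sketch: a profile-conditioned ranker handing two users
# different orderings for the same query. Weights and fields are invented.
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    relevance: float     # query-document match score
    topic: str           # coarse topic label for the page

@dataclass
class UserProfile:
    user_id: str
    topic_weights: dict  # inferred interests, e.g. {"criticism": 0.8}

def personalized_score(page: Page, profile: UserProfile) -> float:
    # Base relevance plus a boost from the user's inferred interests.
    boost = profile.topic_weights.get(page.topic, 0.0)
    return page.relevance + 0.5 * boost

def rank(pages, profile):
    return sorted(pages, key=lambda p: personalized_score(p, profile), reverse=True)

pages = [
    Page("critical-coverage.example", 0.70, "criticism"),
    Page("official-blog.example", 0.68, "corporate"),
]
skeptic = UserProfile("a", {"criticism": 0.8})
fan = UserProfile("b", {"corporate": 0.8})

# Same query, same candidate pages, two different "page one" orderings.
print([p.url for p in rank(pages, skeptic)])
print([p.url for p in rank(pages, fan)])
```

With these invented weights, the skeptic's page one leads with the critical coverage and the fan's leads with the corporate blog, which is why two people searching the same thing can come away with different impressions.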
Do you use social media?
I've been trying to close my Facebook page for, I think, at least three years now.
They won't let me close it.
Uh, they won't let me change it.
It's still up there.
Um, I didn't even set it up originally.
I think it was Misty, my wife, who set it up.
But they won't let me touch it.
And they won't let me close it.
Speaking of which, okay, um, I'm sitting next to a guy on an airplane the other day.
And he's saying how he's very proud that he doesn't use any social media.
I said, so, wait, you mean you don't have a Facebook page? And he goes, oh, no, positively, I do not have a Facebook page.
I said, you have a Facebook page.
He goes, no, what are you telling me?
He says, I know, I know, I don't have a Facebook.
I would know if I had a Facebook.
I said, no, you don't understand.
Every time someone mentions you on Facebook or posts a photo in which you appear,
okay, that goes into your Facebook profile.
You have a big, an enormous Facebook profile, except that you can't see it.
And what do you say to that?
Well, I said it in a way, I guess, that was pretty convincing, and he was upset.
He didn't like the concept that he might have a Facebook profile that he doesn't know about.
That you can't opt out.
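What's being described is often called a shadow profile. Here is a minimal sketch of the idea, with an entirely hypothetical schema, since the real data structures are not public:

```python
# Hypothetical sketch of a "shadow profile": data keyed to a person
# accumulates from other users' activity, account or no account.
from collections import defaultdict

profiles = defaultdict(list)   # name -> observed events
registered_users = {"alice"}   # bob never signed up

def record_mention(author, mentioned, text):
    profiles[mentioned].append(("mention", author, text))

def record_photo_tag(uploader, tagged_person, photo_id):
    profiles[tagged_person].append(("photo_tag", uploader, photo_id))

record_mention("alice", "bob", "Great hike with Bob today!")
record_photo_tag("alice", "bob", "IMG_1042")

# Bob has no account, yet the platform holds a dossier on him,
# and only account holders even get a partial view of their own data.
print("bob" in registered_users)   # False
print(profiles["bob"])             # two events about bob
```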
Well, it's not only that, but even when you think you're... I mean, Google,
God, do they ever say anything truthful publicly? That's a big question. But I mean, Google claims, for example, you can delete your Google data.
You can go through the motions of saying, I want to delete my Google data,
and then from that point on, you can't see your Google data, but they don't delete it.
They never delete it.
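The behavior he's alleging matches a common engineering pattern known as a soft delete, where "delete" flips a visibility flag rather than erasing the record. Whether Google's systems work this way is his claim, not something verifiable from the outside; the sketch below just illustrates the pattern:

```python
# Hypothetical "soft delete": the user-facing view drops the record,
# but the stored row is only flagged, and snapshots keep every version.
import copy

records = {
    "evt1": {"data": "search: back pain remedies", "deleted": False},
}
backups = [copy.deepcopy(records)]  # snapshot taken before the "delete"

def user_delete(record_id):
    records[record_id]["deleted"] = True  # flag it; nothing is erased

def user_view():
    return {k: v for k, v in records.items() if not v["deleted"]}

user_delete("evt1")
print(user_view())          # {}  gone, as far as the user can tell
print(records["evt1"])      # still stored, just flagged
print(backups[0]["evt1"])   # the snapshot still holds the unflagged copy
```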
Even if they deleted it on one server, it's sitting there in backup after backup after backup. And not only that, if you read, and I think I'm the only one who reads these things, if you read Google's terms of service and Google's privacy policy, it says right in there: we reserve the right to hold on to your data as we might be required by law or in any other way that protects Google.
Now what about Twitter and Instagram, things like that?
Well, Facebook and Instagram are the same entity, right? Yeah, Instagram is part of Facebook. So here, my main concern, again, is this power to suppress. So I don't know what your opinion is. In fact, I'd love to know your opinion of what happened in early 2021, when both Facebook and Twitter shut down Donald Trump.
What do you think of that?
I don't think they should be shutting down people at all.
And by the way, he was still president.
Yeah, I think that what these things are,
I think we're at a time in history
where you can't look at them as just private companies because the
ability to express yourself is severely limited if you're not in those platforms. I think they
should be looked at like utilities. And I think they should be subject to the freedoms that are
in our Constitution and our Bill of Rights. And I think the way the First Amendment protects free speech,
it should be protected on social media platforms
because I think as long as you're not threatening someone
or doxing someone or putting someone in harm or lying about them,
I think your ability to express yourself
is a gigantic part of us trying to figure out the truth.
Like when it comes to what are people's honest opinions about things, do we know?
We don't know if honest opinions are suppressed because they don't match up to someone's ideology.
I think that's a critical aspect of what it means to be American,
to be able to express yourself freely and to find out how
other people think is educational. If you only exist in an echo chamber and you only hear the
opinions expressed of people that align with a certain ideology, that's not free speech.
I think free speech is critical. And I think the answer to bad speech, and this is not my thought, many brilliant people believe this, is better speech: more thought, more convincing arguments, more logical, sustained reasoning and debate and discussion. And I think as soon as they start suppressing ideas, as soon as they start suppressing
and deleting YouTube videos
and banning people from Twitter
for things that have now been proven to be true, right?
There's a lot of people that were banned
because they questioned the lab leak theory.
You know, their videos were pulled down.
They were, you know, they were suspended from Twitter. Now that's on the cover of Newsweek. It's constantly being discussed.
Sure.
It's discussed in the Senate.
Well, this is a very old idea. The way Voltaire said it, I'm paraphrasing, is, you know, I may not agree with what you say, but I will defend to the death your right to say it.
Yeah.
And I think it was dead wrong. I mean, I was happy, of course,
that this happened, but I think it was dead wrong for Twitter and Facebook to literally cut off communication between the current president of the United States, who's still in office,
and his supporters. Yeah. And the real question too is how much manipulation was being done by
federal agents in the January 6th event? Did they engineer people going into the Capitol?
Did they encourage them? And you saw that Ted Cruz conversation with the woman from FBI where she said, I can't answer that.
Did the FBI incite violence?
I can't answer that.
You can't answer that.
That should be never.
Would they incite violence?
Would the FBI manipulate people to do something illegal that they would not have done otherwise? Look, if you pay attention to those people, like if you watch, there's a great documentary on HBO, this QAnon documentary. It's called Into the Storm. Have you seen it?
No.
It's worth watching. It's four parts, five, something like that, multiple parts. And it's great.
And you realize like how easily manipulated some of these poor folks are.
They get involved in these movements.
Now, if somebody wanted to disparage a political party or to maybe have some sort of a justification for getting some influential person like Donald Trump offline, that would be the way they would do it.
That's, yeah.
Yeah. So you look at the narrative: he's responsible for violence. This is as bad as Pearl Harbor. This is as bad as D-Day.
But the bottom line here really goes back to George Orwell, which is, you know, if you control information,
you control everything. And what we've done is we've lost that control. Authorities, gatekeepers, well-trained journalists, let's say: we've lost control over information. Information is now largely in the hands of algorithms, which are controlled by executives who are not accountable to the American public. They're accountable just to their shareholders.
So we're in a terrible position. You know, you asked me this before, but I'm going to continue. I'm going to do my research. I'm going to keep digging. I'm going to do my monitoring.
I'm going to try to set up, I hope this year, this nationwide system. And now I'm going to point to my hat. It says tamebigtech.com.
Okay, good. That's your website? Could you say it again?
Tamebigtech.com.
Well done, well done.
Thank you.
Yes, because I need help from people.
I need people to provide funds, but also to help us find funds.
This is the year where I think we should set up this first large-scale nationwide monitoring system, which could be used not only to keep an eye on these midterm elections, but we could finally start to look at our kids.
That's become my main concern now, is our kids.
Because we don't know, we don't understand what the hell they're doing.
We don't know what they're looking at or what they're listening to. These companies are going to do what's best for them, what makes them the most money, what spreads their values, and of course, sometimes what's good for intelligence purposes.
They're going to do those things,
and we have no idea what they're doing
unless we track them.
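For readers wondering what such a monitoring system could look like, here is a minimal sketch under assumptions of mine: a panel of consenting users whose software logs the ephemeral content they are shown, so it leaves a paper trail that can be aggregated and audited. The capture step is stubbed out with illustrative records, and the aggregate metric is my own invention, not the actual system described on the show.

```python
# Hypothetical monitoring pipeline: consenting panelists' devices log the
# ephemeral content they are shown; an aggregator then looks for skew.
from collections import Counter
from datetime import datetime, timezone

def capture(panelist_id, query, ranked_urls):
    """One observation: what one panelist was shown, preserved with a timestamp."""
    return {
        "ts": datetime.now(timezone.utc).isoformat(),
        "panelist": panelist_id,
        "query": query,
        "results": ranked_urls,
    }

def top_result_share(observations):
    """Across the panel, how often does each source hold the #1 slot?"""
    counts = Counter(obs["results"][0] for obs in observations if obs["results"])
    total = sum(counts.values())
    return {source: n / total for source, n in counts.items()}

log = [
    capture("p1", "candidate x", ["outlet-a.example", "outlet-b.example"]),
    capture("p2", "candidate x", ["outlet-a.example", "outlet-c.example"]),
    capture("p3", "candidate x", ["outlet-b.example", "outlet-a.example"]),
]
print(top_result_share(log))  # e.g. {'outlet-a.example': 0.67, 'outlet-b.example': 0.33}
```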
So anyway, that's my fantasy.
This is the year where we're going to get this thing
running in a way that it would be self-maintaining. So it would continue year after year after year.
Not optional. I've said that before. It's not optional.
So if people go to tamebigtech.com,
they can get more information.
I actually created a special booklet that we're going to give out for free.
I had a copy to bring you, and I left it in my car.
But we have this booklet.
I took my congressional testimony.
I updated it.
I expanded it.
And I turned it into an essay, which is called Google's Triple Threat to Democracy, Our Children, and Our Minds. And it says right on it, prepared for Joe
Rogan's whatever. And I cannot, you know, I am
doing this with the help of all my wonderful
teammates. I am
so far still the only
one. And that's
disgusting. That's horrible.
There's something
fundamentally wrong with that picture.
And...
Imagine if you didn't exist, like if you never had started this. Would we be completely in the dark about this stuff?
You would be completely in the dark, because there's no one doing these kinds of experiments and there's no one collecting all that data.
But when you think about the Internet, and how many people on the Internet are interested in politics and in the influence and dangers of big tech, and when they talk about psychological dangers, like Jonathan Haidt's work on young girls and self-harm and suicide and the rise of depression among young people, you would think that this would also be something that people would investigate and dig into. The fact that you're the only one, that's very strange.
It's a tremendous responsibility.
It's horrible.
I don't like the responsibility.
I'm away at the moment. We have two cats in our office, and I'm the poop cleaner.
So when I'm gone, that means someone else has to clean the poop.
So I said to my associate director, I told her last night, I said,
just remember that the more credentials you get, the more responsibilities you get,
the more poop you're going to have to clean.
And that's the truth.
So it's very tough.
I don't like being in this position, and I do wonder about Misty.
And I'll probably always wonder about Misty, and I'll never know, because, again, her truck disappeared.
So.
Well, listen, man, thank you very much for coming here and thanks for taking your time.
And I know you're very busy and I'm glad we could get this out there.
Uh, I don't, I don't know what to say.
I mean, you, you, you've given me a great opportunity
I hope this was, you know, interesting for you.
It was. Okay, scary.
Good. Then if I scared you, I'm doing my job.
Yeah, you scared the shit out of me.
Good.
All right. Thank you, Robert.
Thank you, Joe. Bye, everybody.