Cognitive Dissonance - Episode 446: Social Media Propaganda
Episode Date: December 10, 2018
Transcript
This episode of Cognitive Dissonance is brought to you by our patrons. You fucking rock.
Be advised that this show is not for children, the faint of heart, or the easily offended.
The explicit tag is there for a reason. Recording live from Glory Hole Studios in Chicago, this is Cognitive Dissonance. Every episode we blast anyone who gets in our way. We bring critical thinking, skepticism, and irreverence to any topic that makes the news, makes it big, or makes us mad. It's skeptical, it's political, and there is no welcome mat. This is episode 446 of Cognitive Dissonance.
And this is going to be a little different.
See, so instead of the usual plethora of wacky, religious, hilarious, goofy stories, it's only a little bit of wacky.
So we're going to talk a little bit.
We're going to talk long form.
It's going to be, if you're looking for, you know, dick jokes about the prophetess Kat Kerr, re-listen to episode 444.
Yeah.
I think you'll really like that one.
Again, just go ahead and put that one back on.
But we're going to be talking about slightly more serious topics today or one topic in particular.
Mainly one topic.
Yeah.
We're going to be talking quite a bit today about social media, social media and kind of their social and political responsibilities
and kind of what they've done with those responsibilities, what they haven't done
with those responsibilities and the impact that that's had on our elections. And then I think
also just on America in general and sort of how we think. So Cecil, you found a whole bunch of
material that we're going to talk about today. Yeah. I think we start with a little bit of
background on sort of what we're going to be talking about.
We're going to be talking about two podcasts that we listen to.
They're both of them were from The Daily.
We'll put links in the show notes if you want to listen to those podcasts exclusively.
One of them happens to be on an internet group that was in Pennsylvania. It wound up being a man and wife who ran a Facebook
page and they talk about sort of what kind of stories they had. We're going to get into that
in a little bit. And then we also are going to be talking about, uh, specifically the,
the drama that has unfolded at Facebook, um, since sort of the beginning of all of this,
we're talking about before the 2016 elections, when people were
saying, hey, some shit's going on down there. The Times did an exposé on this, a big, long article
about this. And they talk about sort of what has been happening, you know, in a timeline fashion
since the beginning of all this. And really it comes down to, it starts out with the idea that Facebook
really was sort of suppressing this for a very long time. They had found out,
Stamos is one of the guys who was their senior-
Chief security officer.
Yeah, he was the chief security officer of that company. And he found out a long time ago
that there were some real issues with fake news, with Russian bots and that
sort of thing. And they ignored it for a very long time. They pushed it under the rug. Well, and just as much as they ignored it, when he brought it to light,
when he brought it to the board and bypassed Zuckerberg and Sandberg,
he got in all kinds of shit for doing that. They yelled at him for it.
They were upset.
It was a big problem.
You know, what's interesting is the story that Facebook has told publicly does not match the reality, right?
So let's talk about that a little bit. The story was, the narrative as we kind of were led to believe was that the Russians had created, let's go all the way back.
The Russians, we know, what do we know? What do we know is true? What we know is true is that
for a significant amount of time, although nobody really knows exactly when it began or what the
first sites were, the Russian government has been using agents to spread a campaign of disinformation in the United States,
aimed very specifically at creating a divided electorate. They did this by creating both fake
news stories and distorting existing news stories and amplifying the message of both fake news and
distorted or one-sided, particularly conservative news often,
and then using social media to amplify that message and to create engagement through
divisiveness in order to sway not just what we think, but how we think,
and in order to change the election cycle. And they did this in the Ukraine
too. So they have a test case. They've already done it. And it was successful there; maybe it was successful here, but we don't know how successful here. Right. And that's the problem. We don't know how successful, because the social
media companies will never tell us how successful or unsuccessful it was. But we know that it was successful in the Ukraine.
Well, you know, and that's part of the problem, right? Is that how do you measure how many minds
got changed, or, you know, how many people showed up to vote or how many people
got influenced in ways that they don't even know they were influenced? Because that is what
propaganda does, right? This is a propaganda campaign.
If you're being propagandized well
and you're being fed disinformation well,
you don't ever realize that it was disinformation.
You are the subject of disinformation.
It is only valuable if you believed that it was information.
They do know that one number,
and they mention it in one of the things that we watched.
They say 146 million
people on Facebook were touched
by these messages. Tens of millions
on Twitter were touched by these messages.
That's not
quantifiable. We don't know
how deep they were touched. We don't know
how much
a mind was changed, like you were saying, but we know the
number. And to put that into perspective, 146 million people were touched by these messages. 137
million people voted. So in terms of the amount of reach, it is a deeper reach into the mind and
vision of the American population than the number of people who turned out to actually vote.
The population is 320 million.
Remember, not all of that 320 million
are eligible to vote.
Many of those people are under the age of 18, et cetera.
So you've got, and no matter how you cut this,
you have just an incredible amount of impact.
A massive, massive, massive impact on what we think
and like the information that we get and the way that we engage that information.
Yeah.
And this is known.
Facebook knew about this before we knew about this.
Yeah, absolutely.
They knew about this and they did not tell us.
And more than they didn't tell us, Zuckerberg went on and said, it's a crazy idea.
This isn't happening.
He said it was crazy months after he had already found out that it was happening.
Yeah, this is after this was.
So, like, he knew that his billion-plus-dollar company was being leveraged by a foreign government to influence the way that you think, the way that you vote, to subvert our fucking democracy.
He knew it.
Yeah.
And he went on and he said, that's crazy.
That's not happening.
Also, in the back of his mind, I fucking know that's happening.
Yeah, he knew it was happening the whole time.
I know that that's happening.
And I will say, like, we also watched an interesting documentary.
What was the name of the documentary we just watched?
It's a half hour NBC special.
It's on NBC.
I'm going to post the link to it.
It's Factory of Lies, Democracy Under Attack
is what it's called.
Very dramatic name, by the way.
Very dramatic documentary.
Very dramatic documentary.
But there is some interesting takeaways.
It's about a half an hour documentary.
I'll post it on this week's show notes.
You know, like, I was thinking about this
and it's like, it's not
a surprise that Zuckerberg,
that Facebook would sweep
this under the rug, right? Because not only
is it damaging to their business model
that this happened, but you know, like
the more that we are engaged,
the more times we're
on Facebook, which means
the more of our personal information we're giving to Facebook, which is what they gather and sell, right? And then the more ads we see, because if I'm
engaged in an argument on Facebook and I'm going to go and I'm going to check it, or do they reply?
Do they reply? I'm going to reply. The more divisive the news that I get, the better it is for Facebook's business model. This is good for Facebook because, remember, all Facebook wants you to do
is spend time there.
There's an interesting confluence
of a lot of different things that happen
to make Facebook the powerhouse that it is.
And I want to go through a few of these things
to talk about them.
Facebook, before they became public,
partnered with big data.
And what we don't understand is that, like, Facebook isn't the only place that gathers your data. Of course not. There's a lot of places that gather your
data. There's a ton of places out there that have a shit ton of information on you. And I found a
ProPublica article that I want to read a little bit of, just to sort of put this into perspective. I'll post the ProPublica link as well in the show notes.
They started with
the basics like names, addresses, contact information, and add demographics like age,
race, occupation, and educational level. They also include income levels and things like that, too.
The companies collect lists of people experiencing, quote, life event triggers,
end quote, like getting married, buying a home, sending a kid to college, or even getting divorced. Credit reporting giant Experian has a separate marketing
services division, which sells lists of names of expectant parents and families with newborns that
are updated weekly. The companies also collect data about your hobbies and many of the purchases
you make. Want to buy a list of people who read
romance novels? Experian will sell that to you. Oh no, pardon me. Epsilon can sell that to you.
As well as a list of people who donated to international charities. A subsidiary of credit reporting company Equifax even collects detailed salary and pay stub information for roughly 38% of employed Americans, as NBC News has reported. As part of handling employee verification requests,
the company gets the information directly from employers; that's how they get it. Here's another
thing. Two companies actually responded with details of how volunteers' information has been
shared. Upscale furniture store Restoration Hardware said that it sent your name, address, and what you purchased to seven other companies, including data cooperatives
that allow retailers to pull data about customer transactions. Walt Disney also responded and
described sharing even more information, not just the person's name and address and what
they purchased, but their age, their occupation, and the number, ages, and gender of their children. It listed companies that received
the data, among them companies owned by Disney, like ABC and ESPN, as well as others. In September 2013,
Acxiom, I guess, I don't know if I'm pronouncing that correctly, debuted aboutthedata.com,
which allows you to review and edit some of the company's marketing
details about you by entering your name, address, birth date, and the last four digits of your Social Security number. I guess you can go on this aboutthedata.com and change the data or
delete some of the data, things like that. I don't know, I haven't done it, so I don't know
what you can do, but it seems like a pretty interesting thing that they've put together.
Now understand that once they partnered, they had this really cool thing, right?
So you have all this data out there that's floating around about Tom, right?
So there's a million things about Tom that tell you all these different life events that
have happened, all these other things.
But Facebook allows this really cool thing to happen, which is, and I don't know, I say cool, I don't mean cool, but an interesting thing to happen, a powerful thing to happen, which is suddenly I get access to Tom when I know all this other stuff about him. Way better than you can target with a TV ad. Way better than you can target with a radio ad.
It's personalized. It's personalized.
It's just like Minority Report.
If you remember the movie Minority Report,
they're walking by the ads and they see the eye
and they're like, oh, hey, Bill, here's a thing for you.
It's the same thing.
And that dystopian movie?
Yeah, and that dystopian movie about Facebook.
Yeah.
But in a way, that's a very incredible partnership
that they have.
They have all these real-world things that you're fiddling with, and then they get to put that in front of you in cyberspace.
So the other thing that happens is, is that Facebook users are retreating slowly into echo chambers.
Right. And the reason why they're retreating into the echo chambers is because it's safe there.
Right. I can go to my echo chambers.
There's one of the specific things that we listen to, and you can listen to it
yourself. It's The Daily podcast. The guy who created this website that went on to become a
multimillion dollar website that made a ton of money off of fake news, quote unquote fake news,
he went there and started it because he was pushed out of the liberal groups for having a dissenting
opinion. He went there because he wanted to go find somebody who was like him, and he couldn't find people who were like him.
So he created something that was like him.
You want to go and have not just these contentious arguments; you also want to connect and share and kibitz with all the other people who share your opinion.
You know what?
I will say, like, it reminds me of episode 444. We were joking about the witches. They talk about community, right?
This is what they mean by community. Yeah, exactly. What they really mean by community is
an echo chamber of people that are like-minded. Yes, exactly. Now, the creators of Facebook are
using this platform for personal gain, right? They want to make money off this. That's why they went
public with it in the first place.
They want to make a lot of money off of it,
but they're bending the truth
and sort of abetting dissension in the rest of us
because it makes you, like we said, go back.
You're going back again and again and again and again.
And they want your eyes on Facebook.
They want your eyes on Twitter.
Then you get these other creators that start trolling, right?
And this isn't necessarily a bad thing, right? It's just people going, we're just kidding around. What? I'm just trolling. I'm just kidding around. I'm just saying some crazy shit. Well, now people are starting to amplify that crazy shit. So the currency of truth stops being factual. Facebook suddenly becomes this giant argument from popularity, right?
The more likes something has, the more validity it has.
That's not true.
That's just not a true statement.
That's a logical fallacy.
It doesn't matter how many likes something has,
it doesn't necessarily mean it's true.
But people start mistaking how many likes something has
or a blue check mark on Twitter for something that is real, that is true. That doesn't mean it's true. It just means something
popular happened to it. That can happen all the time, but people aren't using their reason to try
to decide whether something's true. They're just relying on others. They're pulling in the audience in the biggest way you possibly can. It's the world. It's billions of users on Twitter.
Then you start looking into the Russian disinformation campaign.
They're actually using it for nefarious purposes,
but they're starting to look to see these trends
of people retreating to the echo chambers.
Well, they infiltrate those echo chambers,
and then they start posting stuff that might not be true.
They create them.
Absolutely, yeah.
And then finally, at the very end, we talk about the data leaks with Cambridge Analytica.
There's a huge data leak that happens with Cambridge Analytica.
Cambridge Analytica was found to have not just the data that they paid for, but data that they never paid for. Data that people willfully put on the internet, but didn't know was going to be used for this particular purpose.
And that's when people suddenly just start changing all their data profiles on Facebook, which is a good thing, right? Yeah. Cambridge Analytica coming to light is a great thing because it woke people up to the fact that your fucking data is important and people can use that data against you. And you've got to be vigilant to make sure that that doesn't happen. And then Facebook at
the very end of this starts hiring a PR firm, and they hire a PR firm that's dealt in conservative PR for a long time, one that basically just looks at how to attack the other people without actually clearing your name. Make it about Apple. Don't make it about you. Make it about other
people. What about Apple? What are they doing? What's Google doing? Muddy the water until nobody trusts the water anymore.
And so that's sort of this confluence of all these things that are happening. And, you know,
it's like a forest fire. We didn't rake well enough. You know what I mean? Like we didn't
rake the fucking leaves up well enough. And a lot of these things are, you know, it's bad on bad.
It's bad stacking on bad. Well, I think, you know, I think that a lot of people, not everybody,
I think a lot of people, if they ever stopped once to consider the business model of something
like Facebook, which I think, I think a lot of people never bother to stop to think about the
business model of the things they engage in. It's like, oh, I can go on here. I can share some
pictures. I can connect with my buddies. It doesn't cost anything. That sounds great. And if they ever
stopped to think about like the business model,
like, oh, it's ad supported.
And that's probably about as far as most people would go
because why do they care?
But it's not just ad supported.
It's much, much more than ad supported.
They're taking the data
and they are feeding you personalized ads.
And if that's where it stopped,
I find that a little weird,
but I know a lot of people don't.
A lot of people even like it. Like they like to see ads that mean something more to them. That's fine. But then they also
aggregate your data. And then, as a separate thing, they sell that data as its own
thing. Because really, part of that model isn't just to show you an ad, so that you click on the
ad to go buy a dress. The other thing that they want to do is to have you constantly feed them exactly who you are,
aggregate that data, take that data,
much the same way that mortgages were bundled and sold in tranches, sold in giant groups.
They take that data and then they sell that.
If you take that-
Well, Facebook doesn't do that.
That's the third party people that do that.
Facebook sells data.
Facebook does not sell data.
Facebook, that's absolutely not true.
Facebook keeps the data, but they do not sell the data.
Facebook allows you, like other apps, to collect data on you.
Okay.
But you have to sign up for that.
Facebook does not sell your data.
They won't go to these big data companies and say, I have a bunch of shit on Tom.
Here's a bunch of shit that he's liked.
But if you opt in based on some of these apps and things,
which is exactly how Cambridge Analytica collected it,
right?
What they did is they had a little personality app or something,
some dumb shit that you looked at on your phone.
It was like,
what fucking hat,
what fucking battleship am I or whatever?
You know what I mean?
Like it doesn't matter.
But they had that thing.
And so you went in and now they can then scrape your data so they can sell access to you.
I guess.
And so let me walk that back.
There is a distinction.
And that's an important distinction.
I think that from a practical perspective in terms of how individuals interact with Facebook, it's all in the same house for people. And I think that that's an important piece too, right? It's like, yeah, I didn't break
into your house, but what I did do is I left all your doors and windows open last time I was there
and then somebody else, but I didn't, you know, like, it's still the responsibility I think of
the platform. But regardless, the larger point that I was going to make is like data only has value if the data can be used to change your behaviors.
Right.
Data has no value to anybody.
Nobody would buy it.
Sure.
If it could not be leveraged to change people's behaviors.
And it's one thing to leverage that to change your behavior to get you to buy a
dress, right? Who gives a shit. Like maybe you were going to buy a dress anyway. Maybe you
bought a dress you particularly liked because of it. But maybe now you're thinking about Black Lives
Matters differently. Maybe now you've got shit in your feed that you didn't want in your feed.
Maybe now you're being marketed to by Russian bots
differently. You know what I mean? Sure. Data is valuable and we don't treat data as a monetarily
valuable resource. What we do now, which I think is insane, is we engage in behaviors,
which are at this point, socially impossible not to engage in, right? Like, I have a cell phone, right?
My cell phone gathers a ton of data about me.
Sure.
All of which I have to agree to in order to use the phone.
Yeah.
And I have to use the phone because I have to use the phone for work.
I can't not have this cell phone.
I'm required to have this for work.
And you're required, really, to be part of modern society, to have certain basic social tools at your disposal.
And you have to opt into all this shit, which is pages and pages and pages long, which you're never going to really read.
And then all this information is gathered and then you give it away for free, even though it's obviously valuable because companies buy it.
So if it wasn't valuable, who would buy it?
And it's valuable because it changes your behaviors. Sure. We need to look at data as a monetarily valuable resource
that is inextricably linked to our right to privacy. Those things are one in the same.
I really do think that when you look at these things and we think about
all the valuable, monetarily valuable information we give away for free,
we can't gather together and sue, right? We can't have a class action lawsuit and say, my data was collected and used without my knowledge, because we don't have damages.
We don't have damages because we've not collectively agreed that our data is monetarily valuable as a consumer. We know it's monetarily valuable
for companies to buy it and sell it. But we have not agreed yet, collectively, socially,
that that data that I give you for free, it's crazy. You buy a phone, you spend money on it.
You spend money on the plan.
Then you take that phone and you hand somebody something that they're going to turn around and sell and you give it to them for free. And then when you're harmed by it, and we are harmed by
this, our democracy is harmed by this. We don't have any fucking recourse because we can't show
monetary damages. We need to get to a point where we recognize that data and money are the same,
that data and privacy are the same. You know, what's really interesting is, especially on the
podcast that we listen to, there's a group, it's a website, and I don't remember, I'm not recalling
the name of the website. It's something news, world news, something. It was a website that was
created, a Facebook page that was created specifically pulling articles off the internet. And the guy even says, if I can make...
Mad World News, right?
Mad World News. That's it. The guy who they interview in this Times piece says,
if I can make a non-story into a real story, you know, if I can make it into a bad story,
that's a good thing.
That's a viral story is what he says.
Yeah.
And, you know, I think-
He was in the business of creating viral stories.
Absolutely.
Viral stories that were,
and the headlines on these stories,
when they read them aloud,
were just ridiculous.
"Hillary spits in the face of voters" and, you know, "Trump loves..." I mean, I don't remember them all. I remember that one specifically, "Hillary spits in the face of American voters," or things like that.
But there's a bunch that they read off, and you listen to them and you're like, I would just scroll past this garbage. But people don't. People use it.
I feel like you don't even have to know about data.
You don't even have to dig into data to do the kind of things that the russians are doing and to do the kind of things that
this company which profited greatly off of changing news just enough to make it fake news and viral
right? You don't even have to know anything about anybody, because we'll do it for you. We will find these things that we agree with and share them. We will find these things that we hate. We'll share them. We'll be
like, fuck this guy. I can't believe this. We'll find that stuff because on social media, that
rewards that behavior. Suddenly you'll get a bunch of likes or angry faces or whatever it is
that makes you feel like I contributed to this community in some way.
And so we don't even need it. I mean, the data is a bad thing and you're right. It
absolutely can cause people to, you know, target you and do all kinds of things, but you don't
even need it. Like you don't even need it to cause damage, which is what happened for many,
many years before anybody even caught this.
So one of the things I want to talk about, because it's one of the notes I took from the documentary, is, you know, the idea here is to create an audience, and this is kind of new.
I was trying to think about what's new about social media. Like, what are some of the things that make this different than broadcast media? Because this isn't really broadcast media, right? In the same way that I turn on Channel
2 News, you turn on Channel 2 News, it's identical, right? It's always the same news. You turn it on
at 7, it's only on at 7. It's only on at 7 for me. It shows you the same image it shows me.
So that's broadcast. That's passive to the audience, right? What's different about social media is that the focus is to create an audience first, right?
Not to create a message or content for the audience.
So first we create an audience.
Then we create content for that audience.
And that's backward.
That's different than everything we've done before.
Everything before, yeah.
Right?
And I think that that's really important because what that does is it says,
okay, it doesn't really matter
what the content is.
We're not trying to get this content
to an audience.
What we're trying to do
is get an audience to content.
So we will create the content
based on who our audience is.
And we can create
in this hyper-personalized way.
And that's really incredibly unique.
It's like, I sat in on a meeting the other day, and this was a meeting of some pretty fucking smart folks, and we were having a conversation about, you know, our website. And one of the guys said, well, let's Google it now and see where it comes up. And I had to stop and say, I'm not going to use his name, but I just stopped and said, hey, you know that when you Google and I Google, we don't get the same result, right? Like, you know that, right? And he paused and then he said, oh, you know,
I guess I do know that. But he didn't know it first. Yeah. He knew it only after he stopped
to think about knowing it. And that's how most of our interactions, I think, with media are.
Most of our interactions with media are not considered interactions.
They are intuitive interactions because they're so casual and they're so easy.
And so there's this list of things we know intentionally.
And then there's the list of ways we behave unintentionally.
And they don't match each other.
And I think that what we see is that because we are not careful consumers of media, and we're not generally, and we know this, like study after study shows that we're not good at differentiating messages.
It's very easy to manipulate people.
It's very easy to get people to think that, man, this is what the world looks like because this is what the world looks like when I see it.
Sure.
And that step back, that constant vigilance isn't natural.
It's an unnatural thing because it causes us to doubt our community. To do that, you have to doubt
that the community that you are existing within, your social online community, you have to doubt
its validity. We're not good at that. That's not something like we're naturally tuned to do.
And that is being intentionally manipulated by bad actors who want to get you to behave differently. And I think that story after
story shows that it's working. It's causing us to behave differently. And that's incredibly
distressing to me. It's interesting because one of the things that they were talking about in this video that we watched was Facebook ads. And they did this really innocuous piece on like
one of those, the daytime TV shows, like daytime news shows, early daytime news shows,
this day or this week or whatever it is. I don't even know what the fuck those things are even
called to be perfectly honest. But you know, like the morning show, it starts at nine. It's a news show, but it's also kind of a talk show.
And they did a fluff piece on Facebook ads.
And this woman walked away from the piece thinking, wow, you can target people by their
demographic.
You can target people by what they like.
You can target people by this and this and this on all these like little, very small
slivers.
You can, like you said, send something to your audience. I know what my audience is. All I have
to do is just send the content to them, right? We create the audience first. And so she sees this
and she thinks, wow, this could maybe use, be used for some nefarious things. And then a couple
years later, here we see it's used for some nefarious shit. And it's funny because it's just like somebody who's mining in a mine
and they have TNT
and they're like,
man, this TNT could probably
do some real damage.
Huh?
You know, if you put a bunch
of this shit in a truck,
you might be able to blow up
the World Trade Center
or something, you know,
but they don't,
we just didn't add
two and two together
or we did
and we just didn't care.
Yeah.
Well, I think,
I think it's like,
I think social media is like anything else, right?
Is that it's like driving while texting
or it's like drinking or smoking or it feels good.
We have an incredible amount of data that shows
that like maybe this isn't really good for us.
We have an incredible amount of data that shows that.
But it feels really good and we like it.
And so it's like, well, I'm still probably going to eat that steak, you know? Like it's maybe not
good for me. And I recognize it's not good for me, but I like it more than I care about how bad it
might be for me. So that's just the truth, right? And it's okay that that's
the truth. But then the question is like, what are we going to do about it? Cause the answer is not
walk away from social media. The world is not going to do that. So we're not going to do that. We have to do something. We
have to make changes. And like these social media companies have been very disingenuous in the way
that they've said, we create the platform, not the content. Yeah. That's really been kind of a
message. That's been their message. We create the platform, not the content. We have terms
of service that nobody fucking
reads. And if you violate them
and you've already violated them and somebody
reports you, then we'll have this discretionary
decision about whether or not we take down your post.
We probably will, but whatever. Maybe not.
Unless you're Trump, in which case there's a financial
disadvantage. Well, there's some things
that they choose to ignore and there's things
that they choose not to ignore. You know what I mean?
They do it all the time. Right.
I really think that what we have to recognize is that a couple of things are different about this world now that were not true before. One, I think that
these social media platforms are the new town square. We have to just say they are the new
town square. So if they are the de facto new town square, then we probably need to regulate speech.
Governmentally, we have to regulate free speech in these town square places in the same way
that we regulate speech in other town squares, right?
Because right now, the Nazis can have a page.
I think that that's got to be, I think we have to say like, because your other option is
then we continue to allow private companies,
no matter how big
and no matter how influential,
to have complete discretion
without any oversight.
I think, go ahead.
And that strikes me as that's not working.
I think we already know
that that's not working right now.
I want to point something out here.
We know that Mark Zuckerberg lied to us.
We know.
It is a fact.
It is a 100% certainty fact that he lied to us, right?
He knew about this stuff.
Stamos has come out and said, I told them about this stuff in early 2016.
Zuckerberg said, a few days before the election or a few days after the election, I don't remember which, this is crazy.
That is crazy and it's not happening.
That is 100% a lie.
Yes.
I don't want to trust him or his company
to decide what's true.
Right.
And that's what we're going to have to do
if we decide to say,
okay, Facebook,
we're going to let you crack down on what's true and what's not.
We're going to allow you to filter our news for us and say, this is fake news and will not get reach; this is not fake news and will get reach. Because I'll be perfectly frank,
I don't want a liberal bubble that is Facebook. But make no mistake that if there is no regulation about this,
then all of this is entirely at the discretion of private corporations whose primary investment
requirement is to keep you engaged, and disinformation will keep you engaged,
divisiveness will keep you engaged.
We know that that's true.
Yeah.
It's been studied.
It's not a guess.
Like, we know it's true.
Sure.
Their financial incentive is to lie to you.
And we cannot hold them accountable because they are not accountable to you yet.
They are only accountable to their bottom line.
They're a private company.
We don't have any reach into them.
They don't owe you shit yet.
They don't owe you anything.
Yeah.
So I think we need to decide, like, they are the new town square.
Yeah.
That's just, and then I think we need to treat them like a public utility.
So in many-
Break them up?
Well, no, I mean, maybe.
I don't think so.
But I think that, like, I mean, because you're looking at, right now we're looking at antitrust. If you're talking about antitrust stuff, Facebook, Apple, Google, Amazon, these are trillion dollar companies.
At least two of them are.
Trillion dollar companies.
The U.S. budget is $3.8 trillion.
These are trillion dollar companies, right?
I just named two.
Facebook and Google are both in the hundreds of billion dollar range.
I don't know that they've reached a trillion dollars yet, but we're talking about billions, many billions, hundreds of billions of dollar companies.
Yeah. So, you know, that's bigger than I think the robber barons were.
Do you break them up in the same way that you do trust busting? Maybe you do. I don't know.
I do know that these need to be thought of as, I think we need to create a new category is what I'm saying: a new category of public utility, a speech-based public utility, should be created the same way. Like, we have gas companies, and phone companies are public utilities, but they're private companies. I buy my electricity from Commonwealth Edison. Other people buy it from Exelon. Other people buy it from... so they're private companies, but they're public utilities. Yeah.
So there is a... And I think that what we have now is we have these giant social media platforms
like Twitter and like Facebook, and they're big enough where we should say this is a public
utility related to free speech. And we need to have some rules
and laws and there needs to be some hooks and some transparency into how these companies operate
because their impact is so huge. Their impact, if you can't engage it, is massive. And their
impact, if you do engage it, is massive. And it should strike anybody as odd when you say, like, Facebook is a company worth, you know, many billions of dollars.
Oh, but isn't it?
It's free.
How are you worth billions of dollars if it's free?
How's Google worth billions of dollars?
Because I'm giving you something.
Yeah.
I'm giving you something of tremendous value.
At least, at least with Apple and Amazon, you know, you're buying something.
Exactly.
Yeah.
You know what I mean?
At least with those companies, you know you're paying for something.
There's no transparency to what you're paying for.
The other two are advertising companies.
Make no mistake.
Facebook and Google are advertising companies that happen to have a service that you like.
But they are advertising companies, bottom line.
And so I think we need to say, okay, if we are going to decide that these things are
important, and I think we would agree that these are important. I was reading a little bit to
prepare for this. The average user that has the Facebook app installed, the average user
checks Facebook 14 times a day. That's the average number of times. Most people check Facebook within the
first 10 minutes of waking up. Wow.
Most people are engaged on Facebook between six and eight minutes per check.
If you're on Facebook 14 times at six to eight minutes a check, and you're awake for 16 hours, you're spending roughly 9 to 12 percent, call it an eighth, of your waking life on Facebook. You can't pretend it's not important.
It's one of the most important things you do
because you engage in it one-eighth of your waking life
if you are the average user.
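For reference, a quick sketch of that arithmetic in Python, using the figures quoted above (14 checks a day, six to eight minutes per check, a 16-hour waking day):

```python
# Figures quoted in the episode: 14 checks a day, six to eight minutes
# per check, measured against a 16-hour (960-minute) waking day.
checks_per_day = 14
waking_minutes = 16 * 60  # 960

for minutes_per_check in (6, 8):
    daily_minutes = checks_per_day * minutes_per_check
    percent = 100 * daily_minutes / waking_minutes
    print(f"{minutes_per_check} min/check: {daily_minutes} min/day, "
          f"roughly {percent:.1f}% of waking life")

# 6 min/check: 84 min/day, roughly 8.8% of waking life
# 8 min/check: 112 min/day, roughly 11.7% of waking life
```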
Yeah, yeah, yeah.
That's a massively important, massively powerful tool
that we have no transparency into.
We have no oversight and regulation of at all.
You know, it's interesting
because I was thinking while you were talking,
we know when we walk in the supermarket
that the Weekly World News is garbage.
Right.
We know that The Sun is garbage.
We know, and that's The Sun in the States,
which is a tabloid.
Right.
We know the Enquirer is probably going
to be garbage, right? We know it's not real, or it's fake, or it's exploited, or it's just blown up. You know, as they were saying in one of these documentaries, there's a kernel of truth, but there's no real truth. But we ignore it, or we buy it for sort of a guilty pleasure. It's not a thing I'm going to get my information from.
Certainly not the Weekly World News,
which I think is a parody newspaper anyway.
It's gone now, by the way.
I looked it up.
Oh, really?
I didn't realize.
I know.
Weekly World News for many years was a Bat Boy type of thing.
I loved it.
I have an entire book,
or at least used to have an entire book,
that was just Weekly World News covers.
Oh, it's amazing.
And you'd scroll through it
and just see all the Weekly World News stuff on there. It was funny. It was hilarious. They had goofy stories, you know, 10,000-pound person or whatever. He was assassinated by UFOs and stuff like that.
It was hilarious, but you knew it was garbage. You knew it was garbage.
How do we get to this next level of human understanding where I don't have to tell
anybody if they walk into the supermarket that it's garbage right now, but we just know
intuitively that it is garbage, right? I don't have to be studious. I don't have to be intelligent.
I don't have to fact check to know that when I pick up those things, it's not, it's just junk. How do we get there now
with what we have in technology?
How do we get there with that sort of,
like with news sites,
with, how do we get there?
I was thinking about that.
And I think it's easier.
I think that there are real technological solutions
that are not actually that difficult
to put in a place.
So if you really gave a shit
about spreading fake news, right?
The first time
a link gets shared on Facebook, let's say, because we're talking about
on Facebook, it should have to go through an approval process, right? And you could do that
much simpler than it sounds like, right? Because you could say you could whitelist anything from
your major outlets, from WaPo, from Fox News, from MSNBC.
You could whitelist all the big ones immediately, right? So if the address matches the whitelist, then it just gets shared. And after the first verified source, every subsequent share is whitelisted, right? So that means that it's not like, oh my God, every time I want to
share a story, I can't share the story because very likely you're not the first person to find it, right?
So you could create a system where anytime something, not from a whitelist source, any
source could apply to Facebook to be whitelisted, right?
So, okay, I'm a legitimate news source.
I want to be whitelisted.
I go to Facebook.
I apply to be whitelisted.
There's a transparent program, a series of metrics that Facebook applies to say, yes,
this is a legitimate news source, not just a blog: a whitelisted media outlet. Right. If I want to share something
and I think Facebook should identify
a very clear banner
style with a different color for news. Entertainment
will get a different color and a different moniker.
So if I want to share my blog about my
fucking kids,
I just don't list it as news, right?
Now, anybody who's looking at their Facebook feed and they're scrolling through would get a big visual cue.
That's news, that's bullshit.
That's news, that's bullshit.
It would not be technologically that difficult
to put those systems into place.
It just wouldn't.
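To make that concrete, here's a minimal sketch of the whitelist flow described above, in Python. All of the names and domains here are hypothetical placeholders; a real system would need the transparent approval program behind it, and this is just one way the logic could look:

```python
from urllib.parse import urlparse

# Hypothetical whitelist of verified news domains; in the scheme described,
# outlets would apply and be approved against transparent, public metrics.
NEWS_WHITELIST = {"washingtonpost.com", "foxnews.com", "msnbc.com"}

# Links a human reviewer already approved the first time someone shared them.
approved_links: set[str] = set()

def handle_share(url: str) -> str:
    """Decide how a shared link gets treated and visually labeled."""
    domain = urlparse(url).netloc.removeprefix("www.")
    if domain in NEWS_WHITELIST:
        # Whitelisted source: shares immediately, flagged with a NEWS banner.
        return "share immediately, banner: NEWS"
    if url in approved_links:
        # Every share after the first verified one passes straight through.
        return "share immediately, banner: NOT NEWS"
    # First-ever share of an unknown link: hold it until it's reviewed.
    # (After review it would be added to approved_links, or rejected.)
    return "hold for approval"
```

The point is that the gate only has to run once per link, not once per share, which is why the cost of a system like this stays small.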
But they don't, like, there's no incentive to put that in place.
The financial incentive is the opposite.
It's a disincentive.
Right.
They want to make sure you're in front of it, and they prefer this sort of thing.
That's the thing.
That's the interesting thing about this is that they want your eyes in front of it.
And the more divisive it is, the more fake news is out there, the more chance they have of
having you in front of it. Because this isn't a complicated problem to solve. It's just not.
They pretend it's complicated, but it's not a complicated problem to solve because you can come
up like, I'm just some fucking guy. You can come up with good solutions to this in the better part
of an afternoon, which will then take time to build. Yeah, you have to build it. And you also
have to hire for it too, right? Because there's going to have to be a slew of people
that spend their entire day just going,
what the fuck is science blog slash this slash this?
What the fuck is that?
That's not news, blah.
But you could also just rely on the whitelist only.
You know what I mean?
So you just like, there is no approval process
except for getting yourself whitelisted.
That's the only approval process, period.
There's not like individual instances.
There is no interest in fixing this problem.
Right.
It is more financially valuable to not fix this problem, but to put out a commercial that says, hey, we're going to show you more baby pictures.
Yeah.
But the reality
is like, if I look at a baby picture
and I fucking heart-face it or
whatever, and then I say, oh, your baby's
so cute, I'm very unlikely to ever
go back to that picture 23 times
because I'm engaged in an argument about Hillary's
fucking email. About how cute this baby is. Right.
I mean, you might be engaged in that argument. Your baby's
ugly! I do that on everybody's
baby, though. I'm just like, your baby's super ugly.
Why do you put this shit on Facebook?
Show me a cat.
Right.
Because I remember, like,
when I used to play on Facebook,
like, I would get into arguments sometimes.
And then I would get, like,
I would get, like,
You'd get hot.
I would get fucking hot about it.
Yeah.
And I'd spend a huge chunk of time
back and forth on that shit.
I remember the first time
I ever got into an argument on the internet,
I was in college. I got into an argument on the internet and I was palpitating.
I was hot. I was all the things that happen in person was happening online. I was typing. I was
like, oh my gosh, this guy's such an idiot. What a fucking fool you are. And I, for years would go
back to the site, argue with this guy. I remember evolution was a main thing I was arguing about.
And I would argue with this guy
and I'm just like, you're a fucking idiot.
Like, oh my gosh.
And the arguments would get heated
and we'd get shitty with each other.
And I just saw, you know,
cause initially my argumentation style was very simple.
Like, let's talk about it.
Let's figure it out, blah, blah, blah.
But I was getting pushback.
Exactly, yeah.
The rhetoric started getting pointed and it was
snarky and shitty. And then I was
like, oh, fuck you. Like, oh, well, fuck
me, fuck you. Yeah, right. And there's
this level of anger
and I was upset and I wanted
to come back and like you say, check it six
times a day to make sure because back then
you didn't get notifications.
You'd have to go there and
be like, well, I want to go back to that Bell bulletin board
and see if that guy said anything.
Fuck that guy.
I'm going to tell that guy
he's wrong again.
I'll keep telling him
that somebody's wrong
on the internet.
I need to do something about it.
You know?
And I did.
I remember arguing about it.
And I know you've been
in arguments.
Oh, yeah.
I used to like,
I used to joke like,
I had this terrible job.
I hate it.
And I would have a bad day at work.
I'm like,
I'm going to go find someone
to fight with.
I would just go find somebody
and I would just fight with them online.
Cause it was a fun thing to do,
to go argue and bitch online.
Like,
it was just a,
it was a way to vent.
Sure.
Sure.
It was a way to just vent and be like,
I'm going to go be right about something.
Cause I was wrong all day.
I hated the way that felt.
You know,
I'm real right about this.
I bet somebody doesn't think so.
Do you remember what I said?
Oh, man. You would say
controversial shit that you knew
would draw people out. Right. Yeah. And then you'd
be like, I can fight this all day. Yeah. I can
roll with this all day. This is my day now. Yep. This is what I'm going to do.
This is my day now. You know, it's interesting too because
like, when I go to bed,
I'll take my iPad
in to watch things
on occasion, Netflix on occasion
once in a while
my wife will be like
hey let's look at cute animals
she'll say
let's look at cute animals
and so there's a page
r/aww on Reddit or whatever
and we'll scroll through
cute animals
there's nothing that can
put me asleep faster
than scrolling through
cute animals
right
Like, fucking cute cat being like, oh, look at how coy that cat is. That's super cute, you know? That's not going to keep me engaged, right? I won't go back day after day, after day, after day to go see
those because it literally puts me to sleep, right? I won't go back day after day. I won't do that,
but I will like check my, you know, check the politics subreddit and go back there a bunch
of times during that day, because I want to find out about this new story that developed.
What about the Jim Acosta thing?
How's that turning out?
What about this Ivanka email thing?
How's that turning out?
What about this other piece where he says he doesn't even want to listen to the Khashoggi tape,
or whatever that guy's name is?
They don't want to listen to that tape because it's a suffering tape.
What about that?
Like, how does it?
And what do the people on Reddit think about that?
I want to go into the comment section.
I want to read about it.
Because it's not only an echo chamber,
right?
Which we do retreat into, period.
It's just something we do.
I try to break out of that,
but I definitely retreat into my echo chambers.
I think just as much as other people do,
but you know,
there's also that feeling of there's a little bit of,
you know,
it's,
it's the drama of your life.
It's the,
you know,
like life wouldn't be fun
if there wasn't anything to conflict with, right?
Right.
If heaven was a real thing,
heaven would be fucking boring
because you're constantly traveling
to the most beautiful place on earth, period.
And you're staying there for an infinite amount of time.
There's no conflict in your life.
Life would be boring.
Right.
And so we seek those conflicts out
when they're not having them in a personal life,
just like you just explained
you did after a long, hard day at work.
Yeah.
Yeah,
absolutely.
It's just,
I think I don't,
I mean,
I want to be very clear.
Like these things aren't going away and human nature is not likely to change.
So if we know human nature is not likely to change and we know that these
social media systems are going to continue to evolve and they're continuing to get,
you know, more sophisticated,
which there's no reason to believe otherwise.
The thing we have to do socially,
culturally, we have to figure out now,
okay, well, we know some things we didn't know.
And that's okay.
Yeah.
It's okay.
We know some things we didn't know in 2014
and in 2004 when this was invented,
in 2016 even.
So knowing this, what do we do now?
Yeah.
What do we do now?
We have a responsibility now to reconsider the way that we look at these media platforms.
I think we need to consider the media companies the same way we consider television stations,
which are highly regulated.
Sure.
FCC, right?
Yeah.
We need somebody that says, okay, I am responsible.
Right. Yeah. Because right now, what we have is nobody responsible for any of this.
Two point two billion people on Facebook.
And the only guy who's in charge is some guy who built this in his dorm room and is financially incented to lie to you about the Russians influencing our elections. Sure. Those are all true things. Sure. You know, what's interesting too is we watched a documentary about sort of how the Russians did this, right?
How the Russians sort of, what they did.
And I know that there's going to be pushback because we get it all the time.
We feel like, oh my gosh, Russia, are you kidding me?
But, you know, there's things that are easily documented.
I mean, just read the Times article.
Yeah.
Just read the Times.
You know, don't believe me.
Go fact check this yourself.
And I think that should be sort of the,
I think that should be the message
of this entire thing,
which is don't believe everything
you see on the internet.
Fact check that shit, man.
We talked about this a little bit
while we were watching the documentary.
You know, every time, and I know I'm not an average media consumer, right? I know that. Every time I see something, memes,
they were saying one of the things, one of the throwaway lines in this was, you don't fact check
a meme. I immediately paused the documentary and said, I do. Like, I genuinely do. If I see
something flash across my screen, that's a meme about something. And it looks like,
you know, if it's something I know is true, it's true. And that happens a lot. I think I'm a pretty
well-rounded person when it comes to information that I gather. And I feel like I'm the kind of
guy who studies news a lot. So I am informed on a lot of this stuff that some people might not be.
So I'll see something, I'll be like, oh, I know that's true because I read a story about it. I read an article about it.
I've already been informed by a good news source. So I understand that it's true.
But then some things will come across and be like, is that true? And then I'll fact check it.
If something comes across that I know that I haven't read before, I will do my due diligence
to fact check it. I might not even be sharing it, right? I might not even be commenting on it. I just want to know whether or not that thing is a true thing. I know that I'm not,
I'm not the person who is the standard, right? I know there's a lot of people out there that
don't do that. And the guy's right when he says, when he's saying you don't, what he's saying is
the common person doesn't. It's shorthand for that. But sometimes the
insidiousness of this is that they put in
some important, really
true information.
Like they said, 80% of the story is true
and copied. And then 20%
isn't. And if you listen to that
daily podcast,
that's what they did.
They added their own
spin on these stories.
And the spin was the harsh, harsh headline,
tagline, viral line that's going to get you hooked.
But they're forgetting that the content of the story
isn't really as...
Yeah, and that's a common thing across a lot of news media.
And we talk about that a lot on this show, that that headline doesn't match what's in this article.
And you can do all of that, too.
Keep in mind, too, that you can do all of that message changing through omission.
Yes.
So you can take a story and just omit.
Absolutely.
You know, there was a great example from the documentary we watched, the NBC documentary we watched, where the moderator of the 2016 debates.
Fox News guy.
Yeah.
Asked Hillary Clinton a question about an email.
And he asked in such a way that he excluded the back half of her sentence.
So she had said something in an email about how she wants open trade and open borders.
And the rest of that sentence was with respect to energy.
He excluded the second half of the sentence. Absolutely. So is it true that Hillary said
she wants open trade and open borders? It's true that those are all words she said in that order.
But if you omit the back half that contextualizes that sentence, you have changed the message.
So you can have a story which is, in fact,
technically full of true things, but whose message has been changed. And that is literally a disinformation technique which has been and is being employed aggressively
in order to create divisiveness amongst us. This is something that
is in the playbook. It's part of how we get people worked up about things that didn't fucking happen.
You know, it was interesting the other day, I was listening to a politics podcast and there's
a right wing guy on there. And somebody had said something about the voter suppression in Georgia.
And when they said the thing about voter suppression in Georgia,
I haven't had a chance to go fact check the things he said,
but his response was,
well, a lot of that stuff had to do with the counties,
had nothing to do with Brian Kemp.
And he started naming off all these things
that had to do with counties.
Like, you know, maybe 50 people were gonna vote there,
and they were planning on closing down the voting center
and things like that.
But one thing he steered clear of, by omission, was when Kemp was closing down voter registration drives, holding up people's voter registrations, and putting in these weird, draconian voter registration laws that excluded tons of people who had just registered, things like that.
And he didn't include that stuff.
And he made it seem like Brian Kemp's hands were completely washed clean. Right.
Oh, they're not, though. But all you have to do is just omit the tiniest bit of information and suddenly it looks like, well, maybe it's not Brian Kemp's fault at all.
Maybe it was all about the counties. Maybe it was all about the way in which the state handled it. And he had no control as the secretary of state in that particular election over a dozen little tiny things that happened during an election.
But when you start talking about the big picture, no, he still had a lot of control over the election.
You just omitted it.
Yeah.
And one of the things that social media is really good at doing is making something seem like a story because of repetition.
I mean, we talked about this before. Because it's repeated a lot, it makes it seem like it's more
true. That's something that's part of how we as people process information. If I get a piece of
information once, maybe, if I get it 12 times and it all has the same basic message, it seems more
real, even though the reality of that message may not have changed. And it is an actual tactic by the Russian disinformation campaign to do that by changing
the frequency and changing the balance of what the audience sees. So if they see that more often,
that changes your perception of reality. You think that's what the world looks like.
It was funny because one of the things Zuckerberg said when he was denying that any of this happened is he said, people vote based
on their lived experience. And that's a funny thing for him to have said, because what it does
is it seems to deny that Facebook is part of your lived experience. But it is part of your lived
experience. It accounts for a huge part of how people interact and behave.
It's his own company. It's a hand wave. He's hand waving off his responsibility.
And I want to talk about that a little bit too, because that's something that people do
erroneously. It is always a mistake, always, always a mistake to say, it's just Twitter.
It's just Facebook. It's just online.
It's just online. As if that minimizes the impact,
the emotional impact,
the social impact,
and the intellectual impact of that message.
Look at how much bullying just gets washed under the rug
because it's just online.
But you know, like,
if you doubt that for a second,
look at the money.
Yeah.
If all of this was unimportant,
companies couldn't make billions of dollars.
Sure.
They can only make billions of dollars because it's important enough to change your behavior
to get you to move your money around. Yeah.
So if you ever doubt whether that's true or not, just look and say,
hey, that's free and it makes billions of dollars. How?
Oh, it must actually be important. People must be emotionally invested. So this idea that you can brush off an interaction
because it just happened on Facebook is bullshit. Zuckerberg tried to do the same thing about his
own company as if it's not part of our lived experience. Oh, but we're bringing people
together. Oh, our message is to bring people together. You're all together. Kumbaya. This
makes people happier. Oh, but it's not part of your lived experience.
Wait, which is true?
Which is true?
They can't both be true.
You can't hold both of those things in the same reality.
And we know that you follow the money.
That's following behavior.
It's true.
This is part of our lives now.
We have to recognize that like all of the onus right now is on you.
Because it's one thing for me to say like,
I think we need to have all these regulations.
I think we do.
I think we need to treat these
like a public utility.
I think we do.
I think we need to create a new,
public space for these things to exist.
I think we do.
But until that happens, all of the onus is on you right now.
Yeah, well, and I don't think that I want to push away
any of my personal responsibility
because I do want to be responsible for myself.
And so I absolutely will
check my security settings.
I absolutely will do that.
But I will say they should
make it a lot easier to do that.
Yeah.
They should have a big,
big fucking button in the corner.
It's like, turn that shit off, period.
And it shouldn't be listing
every tiny little thing
that I want to say yes or no to.
It should, you know,
turn it off or turn it on.
You want the bells
and whistles of Facebook.
You're going to have to have it on.
But if you don't want that and you just want to look at baby pictures, well, you know, turn that fucking thing off.
You know, stop them from collecting a lot of that stuff.
Make it easier to navigate for the average person to try to figure it out.
Like there are some, I don't know if you've ever seen these, Tom, but sometimes they'll post some really weird, esoteric, fucking "here's how you do it" instructions.
It's like fucking how to get 30 guys on Contra.
Like it's fucking crazy. It's like, what do I have to do? I have to go through like four different menus to dig myself down so that I won't be targeted by some weird crazy shit that I never wanted to be targeted by. But you have to, like, you have to dig in there. Nobody's gonna know that just organically.
There was a guy, I was watching a documentary recently. And by the way, I'm not going to post the link to it, because this is all over the news right now, but another great documentary happens to be the one that PBS did, Frontline did.
It's a two-part documentary.
You can watch it on Amazon Prime.
It's a two-parter on Facebook.
Excellent documentary, even more in-depth
than anything we covered on this show.
Really excellent stuff.
But, you know, they talk about this guy who's in one of those overseas countries,
I don't know where,
but they have different privacy laws there.
And so he asks for his data on Facebook
and they deliver hundreds of pages to him
of all the data that they have on him.
They have deleted conversations that he had
with a friend of his that was in a mental asylum.
He deleted those conversations and they still have them. Of course they do. They have them because they're only deleted from your view. Yeah. It's like, yeah, I can get away with it with the people I know, you know, those people that don't have access to the backend, but the people that have access to the backend have everything. Like, this is part of the thing, too, is remember, you didn't pay for it.
Yeah.
So you don't have any consumer rights.
Sure.
Because you didn't pay for it.
There's a reason Facebook is free.
Yeah.
As long as Facebook is free, no matter what happens, you don't suffer any monetary damages,
which means you can't join the other 50 million people whose data was collected against their
will and have a class action lawsuit.
You can't do that
because you can't prove any financial damages.
You don't pay for it for a reason.
They don't want you to pay up front
because then you're a consumer with consumer rights.
You don't have consumer rights
when you get something for free.
So I deleted the conversation.
No, you didn't.
You deleted your ability to view the conversation.
That's all you got rid of.
All you did was curate the way you get to see it.
Facebook already has it.
And that's okay because you gave it to them for free.
You're using their platform.
Stop. Stop.
And that's a solution.
That's a solution.
Make it cost something.
But they don't want to make it cost something.
Well, you had a solution.
You said, while we were watching this documentary, well,
nobody's even mentioned this, but why don't people just not use Facebook? Right? Like that's a
solution. It absolutely is a solution. It's absolutely a solution to turn the notifications off on your phone. It's absolutely a solution to limit your Facebook time to a certain portion of the day, so that that's the only time that you check it.
It's absolutely a solution
to delete the app off of your phone
or to make it so that your browser
doesn't remember that site
and you have to type it in every time
to log off every time.
So you have to type in your password every time.
There's plenty of hacks, life hacks,
that you can use to manage Facebook.
It's funny because all the life hacks are just,
don't use it. Don't use it.
Don't make it as easy. Like, don't make it as easy as possible to get to. Right. Don't. It's so funny. You know,
yeah, it's like it's no longer there for you to just mindlessly while away the hours on. It's now
something that you actively have to engage in and you might not do it. You might be like, well,
fuck, I don't feel like logging in. Right. I do that all the time, man.
Constantly do that.
I'm like, yeah, maybe should I check?
But no, I don't feel like logging in.
All right.
I'm doing something else now.
The end.
You know, it's so funny because like I will share this one personal experience.
And it doesn't even have to go on the show.
But like I've stopped using Facebook for several months.
So I haven't used Facebook in any substantial way since July or August.
I haven't.
I posted a few things.
I took them down pretty much immediately.
I don't miss any of it. Like, after a very short period of time,
you don't miss anything.
I look once a day at my wife's Facebook
because it matters to her that I look at it.
And then I don't look at anything else.
So I don't bother to look at anything else.
I don't look at my notifications, and like, your life goes on just the same, like nothing changes. Yeah. Like for most people,
you know, sure. It doesn't have to be a big part of your life unless you want it to.
But I think no matter how it is a part of your life, however you decide to interact with it. And I don't recommend the way that I do it. Like, I'm not saying that's necessarily for everybody.
Right.
But what I am saying is, have a thoughtful plan.
This is something that we do need to be thoughtful about how we engage.
It's like eating meat.
Whatever your decision is on eating meat, don't do it just because this is what you've always done.
Have a real plan because it's important enough.
It's significant enough in your life
that you should have some thought given
to how you want it to be a part of your life,
what role it should have,
like how important should this be?
And then act on that
so that how important it should be
is how important it becomes.
You know what I mean? There's a difference. Like, be thoughtful, like anything else,
be thoughtful about it. Yeah. You know, I know that I don't use Facebook in the same way that I did maybe a year ago. And I don't use it as much, specifically because I don't find a ton of value in it. I like to interact with certain people on Facebook
and I will check Facebook and check certain pages
and I will look at my notifications
and see who tagged me and what.
But for the most part,
there's a lot of things
I just don't want to engage on Facebook
and I'll scroll right past them.
There is a hobby that I try to keep up with on Facebook
and so I will check it for that
because I'm part of this hobby
and it's something that I do and it's something that I want to keep track of. And so I will use
it for those types of things. I will see what I missed, you know, this weekend and things like
that. So I'll play along there. But definitely, we had a conversation a while back
about how many times you check your phone because of the notifications. And I realized I was checking
my phone a lot because there would be a notification on there. I'd be like, oh, what,
what happened on Facebook? So-and-so tagged me on something. And so I turned off all those
notifications, Twitter and Facebook. I don't get them anymore. I'd never get those notifications.
And I find that Twitter, I check every third day, every fourth day, because I don't
have any reason to be on Twitter. And then I only check it for the show and I only have a show account.
So I only look at what those mentions are for the show and then I don't pay attention to it otherwise.
And then, most of those posts, anything that gets posted on social media for us, is not done by us.
It's done by our employee.
Our employee handles all that stuff.
We don't ever really do it.
I don't look at a lot of the interactions.
You know, our employee may bring our attention to some interactions.
But the best way to contact the show has always been an email
because that's something that Tom and I actually actively look at. We look at it every time we
listen to the voicemails and we look at the emails. That's how we interact with the audience.
That's something we've been doing for a long time. I rarely will look. If a show has a dozen mentions on it, I might not even bother to look at what those mentions are because
it's just not as important to me. But I did find that the phone
was sort of controlling how I was acting.
And I didn't like that.
I don't like being manipulated.
I know that I'm being manipulated a lot in my daily life.
But if there's any way I can control that or curtail that,
I will try to curtail that.
I will try to change my behavior.
I was curious how turning off the notifications changed your interaction.
It changed my consumption.
It absolutely does, because there's not a ding every time that happens. I have to
keep the Facebook messenger notifications open because our group that does the show,
the Citation Needed show happens to communicate that way. So I have to keep that open, but I would
prefer that it wasn't on there to be honest, because I would turn off those notifications
too. I would be, you know, when I check Facebook, my couple times a day that I check Facebook,
I would do it on those times and pay attention to those notifications then.
Because you're right.
I think that there is something to that.
I want to talk a little bit about, you know, another thing that we sort of got away from. This is a conversation, so sometimes that happens. But I want to talk a little bit about how the Russians were doing some of these things. Cause we touched on it a bit, and you were saying, you know, one of the things they do is inflate the numbers, make it
look like the numbers are different, make it look like some stories have more validity or more people
believe in this thing than actually do believe in it. And that's something that people don't
understand. You know, like when I first started hearing about the Russian interference, I started thinking, I was like,
well, what did they do, they bought ads? Well, I would never see them because I have ad blockers.
So it doesn't even make any sense. Like, I didn't think that would have been
a big change. Like, I was initially very skeptical of it, and I still don't know what effect
it had. So I want our people to understand, I am still skeptical of whether or not it affected
anything. Cause I don't know how much it affected anything. And I don't think we ever will know,
but I know that there was some effect, and it was coinciding with a lot
of things that were happening with the election. So I don't think you can just write it off.
But the reason why I wrote it off was, I was like,
well, if they post a story that's fake, maybe people will find out if it's fake. If they post an ad that's garbage, people won't really
pay attention to it. But some of the stuff that they were doing was really insidious. Commenting,
being just a commenter, having multiple accounts to go in and comment and like,
to do that sort of thing to make it look like they were a user rather than a provider.
And so, you know, they wind up
changing people's minds, not just by providing bad information, but by upvoting and sharing that bad
information back to you and finding these conspiracy nuts. We talk about the conspiracy
nuts every week, but the Russians have made it, you know, a target of theirs to find that
stuff that is really crazy. The crazy conspiracy stuff,
amplify it just enough.
So it makes it seem like it's actual news and send it back to us,
which we can't then distinguish from actual news.
Right.
Yeah.
And like, it's not like you go on and it says, you know, Vladimir. It says it's Bill. It's Bill Johnson. Right.
And it's not like those fake news sites are .ru websites.
Sure.
They're .com websites, because they just buy space in the American space.
They could be living here.
They don't even have to be overseas physically.
It is insidious as hell.
Did you see the New York Times article where it was like, can you spot which one is Russian and which one's not?
They had a thing where like you click and it's like, man, it's really hard to tell.
It's really hard to tell.
And like, if you're doing what I suspect most people do, which is scrolling casually, you're not thinking about this as your news source. You know, I think people are good if they're saying,
okay, now I'm going to go look for news.
And so now I'm going to behave
in this sort of concentrated way.
And I have a set of behaviors
based on how I'm going to go find my news
and who I get my news from.
I think it's different.
And that's part of why it's so insidious
when it's just like,
I'm gonna go on Facebook
and I'm gonna see your baby picture. That's so great. And then I scroll, and like,
wait, what was that fucking thing? Because when you're not looking for it intentionally, then
it's like, wait, what? And then you see it again. Maybe not. And, you know what I mean? Like,
that placement is part of what the problem is, because they're catching you off guard
by putting this in spaces where you're not intentionally trying to find it.
Sure. Yeah. You know, it's interesting too, another thing that they did was
they timed things really well. You know? Right.
That's the thing that we can easily pay attention to, right? It's something that we can look at
and say, oh, well, that's timed really interestingly. I should maybe scrutinize that a little more than
I should anything else, right? Because the timing of it is very suspicious.
Look at, I want to read off some of the dates that WikiLeaks dropped some things. WikiLeaks,
on October 12th of 2016, released, like, 2,000 emails or whatever that Hillary had.
Right.
So that was a big WikiLeaks drop.
But on the 7th,
Trump,
the Trump tape got released.
The Inside Edition Trump tape got released.
Access Hollywood.
Access.
Oh yeah.
And the, right, Access Hollywood.
I apologize.
I just, I didn't want to impugn Inside Edition.
Sorry.
My goodness.
For the pedantic correction I'm trying to get in front of.
Oh my goodness.
Oh man, that would have been rough for us.
I'm telling you, one celebrity out there.
I believe it was on A Current Affair.
It was on the Maury show.
You know, they dropped that pussy tape then, right?
So there's some timing things that are happening
that are real unfortunate for the Clinton campaign, right?
Really unfortunate.
And the WikiLeaks, we've talked about this
a dozen times on this show,
but do yourself a favor.
They have them on WikiLeaks.
Go look at those fucking Podesta emails.
There's literally nothing in there.
Search for all those search terms.
They're lying to you
when they say those search terms are in there.
They're not there.
The story is just the fact of the story.
The story is the timing of the story.
Yeah, yeah.
All they did was create, well, not all they did,
but the creation was, look, we need a smokescreen.
We need something else for people to pay attention to.
It's the Apple research piece, right?
Yeah.
It's like, hey, Facebook got in trouble.
What do they do?
They hired somebody to make Apple and Google look bad.
So now you're mad at kind of everybody
and you don't know what to do.
And it all seems like poison, so fuck it.
Yeah, yeah. It's interesting too. They're talking about another thing that the Russians do, which is they have a consistency of message over several, several tweets. And actually, that feels to me like they would be easier to spot that way. But what it really does is up the numbers, and it makes that thing seem more true because there's a consistency of message. Maybe someone's trying to mislead you when they keep saying the same thing
over and over and over and over again. You know, you say a lie enough times it'll become true,
you know? And, and so it's something to pay attention to. It's something that we, you know,
what I think you should walk away from this with is maybe start putting together a toolbox to
contend with all of this information that we're getting. Put together a toolbox. Because years ago, I read, in The Demon-Haunted World, about the baloney detection kit. And I've carried that with me throughout my entire life.
I've used that ever since I was young, when I found that baloney detection kit. I use that all the
time. It's how to spot logical fallacies. It's how to spot people that are trying to trick you. It's how to spot scam artists. And it tells you all those things, because they're easy to see if you just pay attention, if you just use that for a moment. Develop one of those for social media. You're a skeptic. You're listening to the show, you're a skeptic. Be a skeptic out there.
Don't just post something because you think you want it to be true. Post something because you know it's true.
That's important. You don't want to share just anything. I always feel so awful. And it's
happened a couple of times on the show where we shared something too quick, and we've said something that's false. And I've always felt
awful about it afterwards, because I want to make sure that, you know, I'm giving you a lot of opinion, but anything that I tell
you that I think is true should be true. I don't want to lie to anybody. I
never want to lie to anybody. So I want to make sure that everything I give is true. So I feel
awful when it happens. So develop something like that, carry that with you in life so that you can
use that. Because like I say, it's easy to do.
It's easy to fact check these things.
It's easy to find these things out.
There's plenty of sites out there that will tell you whether or not something is true.
That can show you why it's not true, give you sources on why it's not true.
All right, well, that's going to wrap it up.
We hope that you like this.
Like we say, it's a very big departure from what we normally do; we don't normally do this sort of thing.
If you liked it, disliked it, send us a message
and let us know what you thought of it.
dissonance.podcast@gmail.com is the best way to get in touch with us.
Don't leave a Facebook message.
If you left it, you didn't leave it for me.
That's going to wrap it up for this week.
We're going to leave you like we always do, with the Skeptic's Creed: credulity is not a virtue, it's fortune cookie cutter mommy issue hypno
babylon bullshit couched in scientician double bubble toil and trouble pseudo quasi alternative
acupunctuating pressurized stereogram pyramidal free energy healing water downward spiral brain aliens, churches, mosques, and synagogues, temples, dragons, giant worms, Atlantis, dolphins,
truthers, birthers, witches, wizards, vaccine nuts, shaman healers, evangelists, conspiracy,
doublespeak, stigmata, nonsense.
Expose your sides.
Thrust your hands.
Bloody.
Evidential.
Conclusive.
Doubt even this.
The opinions and views expressed in this show are that of the hosts only.
Our poorly formed and expressed notions do not represent those of our wives,
employers, friends, families, or of the local dairy council.