This Is Woman's Work with Nicole Kalil - THE INSTABILITY OF TRUTH: Brainwashing, Mind Control, and Hyper-Persuasion with Rebecca Lemov | 333
Episode Date: August 6, 2025

How do you know if your truth is really yours? In this episode of This Is Woman's Work, we're diving into the murky waters of brainwashing, mind control, and hyper-persuasion—because your thoughts might not be as independent as you think. We're talking psychological manipulation, eerie government-funded experiments, and the sneaky, modern ways we're influenced every damn day—by our feeds, our leaders, even our well-meaning friends. Our guest is Harvard historian Rebecca Lemov, author of The Instability of Truth. She brings the receipts—tracing the roots of brainwashing, the science (and pseudoscience) behind behavioral control, and why we're more persuadable than we'd like to admit. From Cold War experiments to TikTok trends, Rebecca helps us understand how easily truth can be distorted—and what to do about it. Because if you've ever reposted without thinking, felt weirdly pressured to agree, or just wondered, "wait, is that true?"—this episode is for you.

Connect with Rebecca:
Book: https://wwnorton.com/books/9781324075264
IG: https://www.instagram.com/rebeccalemov/
X: https://x.com/rlemov?lang=en

Related Podcast Episodes:
How To Rewire Patterns That No Longer Serve You with Judy Wilkins-Smith | 323
Unmasking AI with Dr. Joy Buolamwini | 259
Pants On Fire (The Truth About Lying) with Lauren Handel Zander | 219

Share the Love: If you found this episode insightful, please share it with a friend, tag us on social media, and leave a review on your favorite podcast platform!

🔗 Subscribe & Review: Apple Podcasts | Spotify | Amazon Music

Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
I am Nicole Kalil, and you're listening to the This Is Woman's Work podcast.
And friend, let's face it.
There are certain things in life that are always unstable.
Wi-Fi when you need it most.
Certain politicians.
My relationship with cheese.
And as it turns out, the truth.
Yep, truth isn't quite the sturdy anchor anymore that it really should be.
In a world filled with algorithms, clickbait, deep fakes, and influencers trying to sell you shit
while crying into a ringlight, it's getting harder and harder to separate fact from fiction.
Or to quote one of the greatest cinematic thinkers of our time, Bing Bong from Inside Out:
when Joy says these facts and opinions look so similar after knocking over boxes labeled facts and opinions,
Bing Bong responds, "Don't worry about it. It happens all the time." Bing Bong, in his infinite
wisdom, just nailed the problem. We are all just casually shuffling facts and opinions around like
it's totally normal. And if that doesn't scare you a little, it probably should. Listen,
I'm a big believer in intuition, in trusting yourself. Hell, that's how I define women's work.
But how do you know if what feels true to you is actually true?
How do you recognize when your thoughts, your beliefs, your truth has been influenced or
worse, manipulated by someone else?
Brainwashing, mind control, hyper-persuasion, these aren't just the plot lines of conspiracy
thrillers or cult documentaries.
They're real.
And in many ways, they're probably more effective now than ever.
And here's the kicker.
I'm guessing if we're being brainwashed, we probably don't even know what's happening.
Now, I'm no expert on psychological manipulation, though I have survived both the social media
comment section and middle school, but I did bring someone on the show who is an expert.
Rebecca Lemov is a historian of science at Harvard University, a visiting scholar at the
Max Planck Institute, and the author of The Instability of Truth.
Her research dives into the data, technology, and the history of human and behavioral sciences.
Basically, as I understand it, how we've been shaped and sometimes seriously messed with by the people who want to control how we think and act.
Today, we're unpacking the origin of the word brainwashing, some eerie experiments that paved the way for emotional manipulation, and the very modern ways our thoughts are being influenced without us maybe even realizing it.
Rebecca, I'm already fascinated by the topic, so thank you for being here.
And let me kick us off by asking, how do you define brainwashing and how would we know if it's happening?
Thanks so much, Nicole. It's great to be here and have this conversation with you.
So I would start off just by saying that I've been somewhat obsessed with this topic for longer than I'd like to even mention.
And long enough ago that when I started to want to teach a class about it at my university,
they thought that sounded ridiculous.
They let me do it anyway, but they said, why would you be interested in that topic?
It's so odd and can't really have much to do with anything.
But now it seems strangely relevant, or at least many people find it really relevant, as do I.
So it's great to define terms.
I mean, brainwashing is a great topic because it's had many different definitions,
but the one I find most useful and simple is a two-part phrase, which is coercive
persuasion, and that's because it emphasizes that brainwashing isn't simply about force or sheer
coercion or a kind of torture. It can involve those elements, abuse, but at the same time,
there's the persuasion part of it, so there's an attraction also, such as you see with cults.
Even in extreme situations,
there's often a nuance, or a set of choices,
that involves a persuasive element,
and even a kind of compliance or, I guess, an agreement.
So that makes it really tricky
and difficult also to recover from.
So what I'm thinking as you're talking
is it's almost a slow thing that's happening.
It's not like you get brainwashed in one moment
of one day, like this coercive persuasion and the idea that it's decisions or little agreements
that are happening that sort of lead you down the path of being brainwashed. Am I misinterpreting
that? No, I really like that idea of little agreements. And it's one discovery I made because what I
set out to do was kind of a thought experiment, or an actual research experiment, which entailed, you know, looking at the more extreme cases in the 20th century that we often refer to, the thumbnail sketch of the terrible thing called brainwashing, which involves POWs during the Korean War, who were subjected to thought reform in these prison camps. It seems like that must have happened quickly, that it must have happened just through sheer distress, which was definitely true.
But when you really read these soldiers' stories (they were often very young, some of them just 17 to 21 years old), having gone through starvation, having gone through forced marches, having gone through the humiliation of finding the U.S. not to be the power they thought it was going to be in 1951 or '52, they then did go through a period that, when you read their stories, does involve small things that really influenced them, small
decisions over time. In fact, one of the soldiers, one of the POWs I focus on, a person named
Morris Wills, described how brainwashing is a horrible, slow, drip-by-drip process. That's almost
a quote. So I think it's very insightful of you to pick that up. Okay. So then if it's slow and it
might be, you know, maybe a small agreement that feels slightly uncomfortable, but not so
uncomfortable or obvious that we look at it and go, oh, I'm being brainwashed, right? How do we know,
how might we protect ourselves from being brainwashed? And I have to say, too, I imagine this is
happening a little bit more regularly in our day-to-day lives today, you know, whereas if you
think about it in a prison camp, that seems like a very extreme situation where we could look at it and
go, okay, I could see how that's happening there. But I guess my question is more in today's
day and age. How might we know if this is happening to us? Right. That's a great question. I mean,
one of the surprising things about just to briefly revisit the prison camps is that those POWs,
they were just seen as almost criminals or culpable for what had happened to them. They
weren't granted compassion or just a sense of the incredible harrowing experiences they had had or even
the step-by-step nature of it. They were just seen as having betrayed their country,
essentially, and their trauma, which was significant and severe, was never acknowledged by a single
expert, really. So the comparison with today is, you know, there might seem to be very
little in common with the sorts of situations that most of us encounter, especially in
online life or in our minor interactions, like ordering a coffee at Starbucks or something
like that. But the lesson I'm trying to draw is a kind of attentiveness to the drip-by-drip
process, that these tiny decisions have consequences, and that we think they're just
the product of a moment. But that's exactly the experience people in cults also describe.
I mean, no one joins a cult deliberately. Most people, I mean, if you go into their stories,
it was something like, I happened to be standing at the bus stop that day. I unfortunately
had just broken up with my boyfriend or girlfriend. I, you know, stubbed my toe,
and someone offered me some kind comment or something, in some area of
vulnerability. So, not to make you paranoid, but just to make the case, I think the best
approach, and I think it's a potential of every moment, is that we be more aware
of the consequential quality of seemingly small decisions, especially online, like clicking,
liking, commenting, interacting. So in the examples that you
gave, what seems to be common, and I'm sure there's a lot more commonality, is the experience of hurt
or pain, and somebody in some way temporarily relieves that. Is that part of what's necessary?
I mean, I guess my question is, what are the conditions that need to be in place, or what is
being taken advantage of or manipulated in order for brainwashing to even occur?
Yeah, that's also at the core, you know, a really core question.
But there was a great scholar and psychiatrist named Robert J. Lifton, who set forth in 1961
what he thought were eight conditions, or really circumstances, that accompany
brainwashing, and one of the primary ones is milieu control: the extent to which the person
exists in an informational environment where the incoming and the outgoing stimuli
are highly controlled and even sculpted.
So that's Lifton.
But what I argue, and I agree with his conditions,
but I also argue in the book that what I found in my research
was that there's something preceding these eight conditions,
which is what I call ungrounding,
which I describe as a set of successive shocks
to the point of disorientation,
that kind of makes a person more vulnerable in a situation
where then the various other factors, you know, love bombing and informational control,
and even threats at the same time, threats of going to hell, of your family being hurt,
or of just being thrown out of a community, and various other things, are brought to bear from there.
The word ungrounding, I find fascinating, because I think we've all experienced that,
a feeling of not being firmly planted, maybe physically, but I mean it more in a mental
and emotional way. I am operating under the assumption that varying degrees of brainwashing
are happening way more commonly than we might think. Is that a fair assumption or am I being
dramatic? I mean, I argue that there is a connection, and some might disagree. But
I bring to bear research on current conditions. I mean, my last three chapters of the book are about
digital environments, social media interactions, including even AI-driven, you know, chatbots
and those kinds of relationships, and I kind of tell the history of what I call
hyper-persuasion. And I think under conditions of hyper-persuasion, which we find ourselves in,
which is just where these feedback loops and informational control are intensified and sped up,
we become ungrounded really quickly.
I mean, anyone can testify to this, including myself.
And this is why I bring myself into the book.
I mean, none of us is really immune to that experience of destabilization.
It does have a physical aspect to it, but also, as you said, mental, emotional, and even spiritual levels.
I mean, in a way, I'm not stripping it down; I'm including its extreme resonances,
but I'm trying to show the dailiness and the ordinariness of it.
Okay.
You've alluded to this already,
but where do trauma and brainwashing play a part, and where do they not?
Like if we have experienced trauma,
should we be a little bit more vigilant, or does that not really play a part?
I guess my question is,
where are trauma and brainwashing linked, or not linked at all?
I find it to be linked.
You could say that we live in a moment when trauma
is seen everywhere and perhaps over-attributed. So I wanted to ask, you know, does it
continue to make us vulnerable today? Is this link still present? And I do
argue that it very much is, when you think of trauma as each person's emotional
repository of unresolved, probably distressing events. To that extent, we are vulnerable.
I don't think it should make you more worried, but it
can make you more aware of the way that emotions of extreme negativity and
distress, really distress and trauma, can be capitalized on, as shown through
the famous 2012-to-2014 Facebook experiment. And the platforms, you know,
are very aware of that. I want to get into that experiment in a second, but it makes sense
to my brain that trauma and brainwashing could be somewhat linked because of that word
ungrounding that you had talked about. When we experienced trauma, I can't obviously speak for
everyone, but you feel ungrounded, you feel disoriented, you feel shocked, and it feels like
it could be trauma on top of trauma for that to be exploited or taken advantage of. Yeah, can I just
make a quick comment on that? One of the aspects of brainwashing, or coercive persuasion, that experts came to focus
on is dissociation, the dissociative effects. And in the 1950s and
'60s, once this started to turn up in military contexts, the U.S. government and CIA wanted to see
if they could create such states as a potential weapon to be used or deployed. And they did find,
in pursuing dissociative effects, that they could
successfully induce dissociation in people, sometimes through brutal methods or administering
LSD or sleep deprivation or things like that. So there was that connection that arose. But I think
dissociation is really a key because that's kind of at the core of the traumatic response, as I
understand it. And then again, that refers back to ungrounding because you sort of separate from
the body, the bodily response, and perhaps store these emotions.
Okay, so tell us about the Facebook experiment, for those that don't know, including myself,
and what it tells us, basically.
Yeah, so the Facebook experiment was the last one that Facebook's research
team published, you know, in a prominent journal in 2014,
before this became a topic they no longer really wanted to put in a banner headline.
It basically had to do with massive-scale emotional contagion.
There were about 693,000 users who were selected, without being told or informed, in 2012, for a period of about three to four months.
And during that time, each of their feeds, their news feeds, was altered.
One group had their news feeds altered in a negative direction.
The second group had their news feeds positively adjusted,
so the items that appeared in their feed would be algorithmically more positive.
And then there was a control group which had its news feed just delivered in the usual way.
And the discovery was that there was a measurable effect.
Both the positive and the negative groups were then demonstrated to post
correspondingly, more positively or more negatively, and the negative effect was even more statistically significant.
Those who had their feeds negatively altered were then more negative.
And when it was published in PNAS in 2014, it caused an outcry,
a kind of public examination, and headlines such as "Facebook is using you as a guinea pig."
And what I found even more heart-rending, in a sense, were user comments on the Facebook page,
the research team's page, saying things like, can I ever know if I was in this experiment?
I went to the hospital with suicidal thoughts during this time, and I think that I might
have been, can I ever know if my thoughts, if my emotions were adjusted?
Because really the conclusion of the experiment was that this was possible: that emotional
contagion could be achieved across a network without face-to-face contact and in the
absence of direct interaction. So besides being totally fucked up, was it the point to just see if
emotions could be manipulated via social media? I guess what were they attempting to find out?
Well, there was pre-existing work on emotional contagion, and it showed that it
could take place across a network. They used data from, for example, the Framingham Heart
Study, which showed that heart conditions and even, you know, weight loss and weight gain, and
even sometimes depression could travel across a network of people who knew each other, even quite
a complex network. I think this was done in Framingham, Mass, and they had data over many
decades. But the challenge for the team, and they sort of seemed to take it up almost like a
challenge, and they were a little bit saucy, I think. If a scientific article can be saucy,
they were a bit saucy because they basically announced, now we see that this can take place
across a digital network. You need not be physically present. You need not have a direct interaction.
the experience that babies have with their mothers or that one of the examples they used in the first sentence or second sentence of the article was a mother traumatically traumatizing her daughter in a famous memoir by Vivian Gornick in which the daughter was, you know, describes this extremely distressing relationship to herself.
And this was the example Facebook used of the kinds of what emotional contagion was.
And when they were announcing it, they were basically saying, we can now operationalize this.
We can do this at will.
We can almost turn up a dial and dial it down and up.
And the users will be affected.
I don't know if it's considered an experiment, but I think of the algorithm.
And you said earlier, like a feedback loop, I find that whatever it is that I believe, or that is important to me or whatever, it gets reinforced
at a really high level via social media, Google, things like that.
Like I always think of, I don't know if it was a study,
but if I were to Google the exact same word search
and somebody who's my opposite, teenage, white, male in the South, whatever,
and they Google the exact same things,
they would get a completely different set of information than I might get.
And so I guess my question is,
is the algorithm, the reinforcing of what we already believe, the feedback loop and the dismissing of
any information that could possibly go against it, is that contributing to our own brainwashing?
Yeah. I think that's what I'm arguing, that that can happen, and that, in a sense,
we're asking the question, what is one's responsibility within these loops? Because it's quite
concerning, when you think about it, that it's possible to direct a political ad at you as a
Facebook user such that no one else will ever see the exact same ad. It'll be slightly tailored to
you. Say there's data which has been circulated on your Big Five, the famous psychological
test that kind of gives your general configuration. They could direct an ad specifically at your
psychological makeup, and they already do make thousands of digital versions of an ad and
target them that way. So that's kind of what I mean by hyper-persuasion, or hyper-targeting.
And there's no record of it. Facebook keeps a library, but they don't keep a library of the
thousands or millions of versions that get disseminated across the network. So in a sense,
we are, and you can see this on your Netflix screen or whatever, wherever you watch shows.
You're not seeing the same menu as a kid in another town, or someone of a different gender,
whatever it is. But we forget that. I mean, for me too, there's a certain illusion that we're sharing
the experience, and I think increasingly we're not. And that is something to think about, just to be
aware of, and maybe to take steps to counteract in different ways. Okay. I have so many different
directions I could take this, but there are a couple questions I want to make sure to ask,
the first being the differences between terms like brainwashing, mind control, hyper-persuasion,
coercion. Are they different terms for a similar thing? Or are we talking about stages?
Or how are we meant to understand these different terms?
No, that's a really good question, too. I am in the field of history of science and I use this
kind of tool that historians use of looking carefully at how words are used because it's often
difficult to have an absolute definition. People have used brainwashing in many different ways
and no one can agree. It's often dismissed, you know, as being very incendiary. I think that
they just have different valences, is my short answer. So brainwashing has
its own kind of history. It gains headlines. It was seen as very spectacular,
but in court it's never held up, except in its use in prosecuting the Manson
girls. It's never been successful as a defense, as in the case of Patty Hearst, whose legal
team was unable to convince anyone that it could stand up in court. But a term like mind
control, I think, tends to carry a suggestion of a technological change in the brain, even.
And I look into that too in the book, in some of the chapters on people who had unwanted psychosurgeries done that were meant to have a behavioral effect, and perhaps to be useful across a social system.
And then the final term, hyper-persuasion, I kind of coined to describe the more digital environments.
But really, they have a lot in common and that's what I'm arguing.
Okay. So I think my biggest question is how do we find our truth and the truth? Because how often we see something differently than another person and we assume the other person is being brainwashed without allowing for the possibility that we could be too. And I find myself, you know, like in today's day and age politically, I'm like, how could anybody possibly see it this way? And then I'm,
I'm very aware that they think the same thing of me.
And so in the information age, or anti-information age, depending on how you see it. But, like, how do we make our way to the truth?
And by truth, I mean, like, factual.
And then how do we also honor our own truth based on experience?
How do we get at the truth, is what I'm trying to get to.
Yeah, it's a profound question.
I think that each of us is
being asked to wrestle with, to reckon with, today, because, I feel like each year, each week I live
longer, it becomes more highlighted that you can easily receive a whole different set of
facts, and there isn't an absolute. And what I mean by the instability of truth, I just came up
with that phrase when I was talking to my mom, and I was trying to describe what I was
arguing in the book: not that we're at the end of truth, not that we're post-truth,
not that truth is dead, but that it's been destabilized simply because our ways of knowing it
are undermined. And you could even see this during a period like COVID,
differences just in what scientific articles mean. How do you interpret the journal,
the published results? Experts can and do and should
disagree, but we're very uncomfortable with that. And so there's this kind of process that continually
destabilizes or undermines the sense you started off the interview with, that there was
once a time when we could kind of agree. I think part of it has to do with technological
changes in how media and information are disseminated. So my short answer is, I mean, we like
to talk about our truth. And I'm not a relativist about truth. But when
I talk about it in the book, I mean something more like how we reality-test.
And I think ultimately, truth with a capital T, the kind of thing that Max Weber or even
yoga teachers sometimes talk about is a different kind of thing that we each have to answer for
ourselves. And I even have a very nice quote from Solzhenitsyn that might be, I found, helpful,
which is, he says, the line separating good and evil passes not through states, nor between
classes nor between political parties either, but right through every human heart.
Great quote. And, you know, it feels sometimes a little overwhelming, I think, right now to be
responsible for our own truth and then also to, you know, want to get at the truth, as if there is one
truth. But when it comes to certain things where there is a fact, and then
acknowledging that we have an interpretation of the fact, I guess my question is, in the face of,
like, straight lies, how do we get to the fact and then wrestle with our own interpretation
of it as it goes through our heart? Yeah. I mean, that's a great question. I think it speaks to
a struggle, a real anxiety-producing struggle, that maybe we are all engaged in.
What I'm arguing in the book, and in general, is that
when you find yourself describing someone else as brainwashed or obviously deranged, it
mostly functions as an exiling device or an insult or a way of saying, not me, never me.
And you can see this in an extreme form in cults.
Why are we fascinated with cult documentaries?
Because there's a little part of ourselves that says, I would never have fallen for that.
or here's where I would have realized he's not the most handsome man in the world.
And I wouldn't have given him access to my checking account or things like that.
I wouldn't have allowed myself to.
So there's a way that brainwashing or this idea, even Stockholm syndrome, is kind of an exile,
which creates this other category.
And part of it is because we're uncomfortable with that.
And I argue that, clearly to me, we're all
subject to this phenomenon, and we don't yet fully know its nature. So, to be comfortable
with that uncertainty, and to see brainwashing as a kind of window
for self-examination, which is why I put myself in the book too and describe it. It's more
of a process. You know, I'm from this field of history of science, where we do talk about
the nature of facts. And in fact, a lot of my first couple of books were
about that. I mean, how we establish truth, how truth is essentially made by people with
reputations, with authority, under certain historical conditions. And it is a changing thing.
Yeah, I'm fascinated. I have so many other questions. And I think, you know, there's this notion of
thinking someone else is brainwashed, or that they're the idiot, or that, you know, you have
the answers and they don't. When I catch myself doing that,
because there is an element of wanting to feel that it could never happen to you, that
sort of separation, what I find myself trying to do in those moments is have a little bit
more empathy, like, what if I experienced the same set of experiences or information or
whatever? And then just that reminder that it probably is, to some extent, happening to all
of us, which is scary. And also, if we are moderately aware that it could be happening,
we can pay attention and do something about it versus if we resist and say, it's not happening
to me, then there is no opportunity to do anything about it. Yes. I agree. And I
really agree. And I think it's so important, because it just contributes to polarization to think
that this only happens to that other segment of the population, when it's demonstrably true that
we're all in this. We are all immersed in this set of evolving conditions. And I wish it were
as simple as just cutting off that arm or whatever. But where would that lead us? Yeah.
Well, I know I cannot be the only person who's fascinated. So I'm going to remind our listener
that you can get your hands on the book, The Instability of Truth. It's also available as an
audiobook. And Rebecca, thank you for being here today.
And also for taking on this work, I would imagine it must feel heavy sometimes.
So thank you for doing it and for sharing it with us.
Thanks so much, Nicole.
It was great to talk with you.
My pleasure.
All right.
Well, friend, if your brain is feeling a little wobbly right now, welcome to the club.
Because if truth is unstable and our thoughts can be influenced without us even realizing
it, then the work, our work is to pay attention, to ask better questions, to pause
before reposting or commenting, to check our sources and ourselves, and maybe, just maybe, to admit
that we're all a little more persuadable than we'd like to believe. This isn't about living in fear
or questioning everything to the point of paralysis. It's about staying open and curious. It's
about building enough trust in yourself to know when something feels off, even if everyone
around you is buying in. Trust your intuition and sharpen your discernment.
Know that being informed is just as powerful as being empowered, and that being both, well, that's the game changer.
It's also woman's work.