3 Takeaways - The Battle for Your Brain: The Emerging World of Neurotechnology, Brain Hacking and Thought Control (#190)
Episode Date: March 26, 2024
Brain sensors embedded in watches. Earbuds that decode our brainwaves. Neurotechnology that reads our emotions and thoughts, and can be used to manipulate them. It sounds like science fiction, but it's science fact. Don't miss this chilling and cautionary talk with Duke professor Nita Farahany, author of The Battle for Your Brain.
Transcript
Welcome to the Three Takeaways podcast, which features short, memorable conversations with the world's best thinkers, business leaders, writers, politicians, scientists, and other newsmakers.
Each episode ends with the three key takeaways that person has learned over their lives and their careers.
And now your host and board member of schools at Harvard, Princeton, and Columbia, Lynn Thoman.
Hi, everyone. It's Lynn Thoman. Welcome to another
Three Takeaways episode. Today, I'm excited to be with Duke professor Nita Farahany to talk about
the new era of brain tracking and hacking. According to Nita, we are rapidly heading
toward a world of brain transparency in which scientists, doctors, governments, and companies can peer
into our brains.
We already have little privacy with companies tracking us on our computers and our phones.
Imagine a world where your brain can be interrogated to learn your political beliefs or your sexual
orientation, or where your thoughts can potentially be used as evidence of a crime, or where your
own feelings can be held against you.
I'm excited to learn more about the coming world of brain transparency and brain hacking and tracking.
Nita's new book, which is wonderful, eye-opening, and also shocking,
and in many ways horrifying, is The Battle for Your Brain.
Welcome, Nita, and thanks so much for our
conversation today. Thank you so much for having me. I'm really looking forward to the conversation.
My pleasure. How far are we from a time where mainstream tech companies use neurotechnology,
that is our thoughts, as the new and potentially primary way that we interface with their platforms?
I think the first step is how far are we from major tech companies integrating brain sensors
into everyday technology? And I would say that day has arrived. Already, tech companies have
made major investments. Neurotech companies have shipped millions of headsets, headphones, and other neurotechnology devices
worldwide that people are using to track and hack their brain activity. Major tech companies are
hoping to take those brain sensors and embed them into our everyday technology like our watches or
our earbuds or headphones. And later this year, there are devices launching with exactly that: multifunctional headphones that have brain sensors integrated into them, multifunctional earbuds that have brain sensors. So not only can you take a conference call or
listen to music, but you can have your brain activity tracked at the same time.
Can you give some more specific examples?
I should start by saying every time that I tell people about this coming future,
which is technology that can decode our brain activity, people are pretty terrified of it. I
think that's their first reaction to it. I think this technology soon will be the way in which we
will interact with all of the rest of our technology. And I give that a two-to-five-year timeframe; that is, it'll be a primary way that people
interact with their computers, with the other technology around them to turn their lights on
and off, etc. And already, there are people who are using the technology for their own purposes,
not necessarily to interact with all the rest of their technology. And one of the ways in which people are doing so includes me: I use headsets for meditation.
So for example, I'm not great at being able to figure out how to meditate on my own. That is,
if I just close my eyes, I quickly am stressed out or thinking about what I need to make for
dinner or thinking about what I need to get for my kids or the next memo that's due. But guided meditation works for me, through a guided program that paces my breathing or gives me prompts about how to focus my inner attention and inner peace, together with neurofeedback.
What that means is I'm wearing a headset with just a few sensors that can pick up basic
brainwave activity.
That is the electrical activity in your brain.
And it can tell when my brainwave activity is associated with being stressed out, mind
wandering, or in a meditative state.
And it can give me feedback.
And that feedback can come in the form of something like birds chirping. There are different ways in
which you can get that feedback. It gives me a signal that my brain activity has reached a point
at which the stress levels are decreasing and the brainwave activities that are correlated with a meditative state of mind are increasing. And I've used that for a number of years. It helps me both by decreasing my stress levels and, because stress is a major trigger of migraines for me and I'm a chronic migraineur, by helping me keep my migraines in check.
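[To illustrate the kind of neurofeedback loop described above, here is a minimal Python sketch of one plausible shape such a system could take. It is not the software of any real headset: read_eeg_window and play_feedback are hypothetical stand-ins, and the alpha-band measure and calm threshold are toy choices that a real product would replace with calibrated, per-user signal processing.]

```python
# Minimal illustrative sketch of a consumer-style neurofeedback loop.
# All names and numbers here are hypothetical stand-ins, not any vendor's code.

import numpy as np

SAMPLE_RATE = 256      # samples per second, typical for consumer EEG
WINDOW_SECONDS = 2     # analyze the signal in short windows

def read_eeg_window(calm_simulation: bool) -> np.ndarray:
    """Hypothetical stand-in for reading one window of raw EEG from a headset.
    Here we fake it: white noise, plus a 10 Hz alpha rhythm when we pretend
    the wearer has settled into a relaxed state."""
    t = np.arange(SAMPLE_RATE * WINDOW_SECONDS) / SAMPLE_RATE
    signal = np.random.randn(t.size)
    if calm_simulation:
        signal += 3.0 * np.sin(2 * np.pi * 10 * t)   # strong alpha rhythm
    return signal

def relative_alpha_power(signal: np.ndarray) -> float:
    """Fraction of spectral power in the 8-12 Hz alpha band, which meditation
    apps commonly associate with relaxed, low-stress states."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / SAMPLE_RATE)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].sum() / spectrum.sum()

def play_feedback(calm: bool) -> None:
    """Hypothetical stand-in for audio feedback such as birds chirping."""
    print("birds chirping" if calm else "...quiet...")

CALM_THRESHOLD = 0.2   # toy value; a real app would calibrate this per user

# The feedback loop: classify each window and signal the wearer.
for step in range(6):
    window = read_eeg_window(calm_simulation=(step >= 3))  # user settles halfway through
    play_feedback(relative_alpha_power(window) > CALM_THRESHOLD)
```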
I was shocked at Berkeley computer science professor Dawn Song's experiment with
gamers who were using a neural interface,
a brain-computer interface to play a video game, and how she was able to steal information from
the gamers. Can you talk about that? While meditation might be a great thing to be able
to do on your own, once you hook your brain activity up to being able to play a game or being able to interface
with an application, it can also be interfered with as well.
And so researchers were interested in trying to discover whether or not the brain could
be probed without a person even being aware that it was being probed.
So they set up an experiment where one of the ways in which this technology is already
being used in a pretty widespread way is by gamers who want to interact with video games by being able to think about moving left or right or through a game.
So their brain activity is being used to control the environment that they're playing in.
But that also means that you can use the brain activity that's being recorded if you have access to that data to
be able to probe it for information that may be concealed inside.
And so the researchers subliminally primed, within the game, the numbers and combinations of numbers for PINs or for a home address, to see whether, with pre-conscious signals, that is, without the gamers even being aware that their brains were revealing information, they could flash up pictures or images that would then lead to a reaction in the gamers' brains signaling, yes, I recognize that, or no, that isn't the right combination, and then use that information to decode and decipher what their PIN and their home address were.
And they were able to do so accurately.
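[The researchers' actual methods and code are not reproduced here; the sketch below only illustrates, with simulated data, the decoding logic being described: flash candidate digits, score each one by the strength of the evoked recognition response, and keep the best-scoring digit for each PIN position. recognition_score is a hypothetical stand-in for measuring that response, for example a P300-like signal, from the recorded EEG.]

```python
# Rough sketch of the probing logic described above -- not the researchers'
# actual code. recognition_score() is a hypothetical stand-in for measuring
# the brain's "I recognize that" response while a candidate digit is flashed.

import random

def recognition_score(digit: int, position: int) -> float:
    """Hypothetical: strength of the recognition response evoked when
    `digit` is flashed as a candidate for PIN position `position`."""
    true_pin = [4, 9, 1, 7]            # simulated ground truth, demo only
    noise = random.gauss(0.0, 0.3)     # background EEG variability
    return noise + (1.0 if true_pin[position] == digit else 0.0)

def guess_pin(num_positions: int = 4, repeats: int = 20) -> list[int]:
    """For each PIN position, flash every digit several times and pick the
    digit whose average recognition response is strongest."""
    guessed = []
    for pos in range(num_positions):
        avg = {
            d: sum(recognition_score(d, pos) for _ in range(repeats)) / repeats
            for d in range(10)
        }
        guessed.append(max(avg, key=avg.get))
    return guessed

print(guess_pin())   # with enough repeats, this recovers the simulated PIN
```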
And that helps, I think, illuminate really well the risk that we're talking about, which
is once your brain activity can be hacked and tracked and opened up for others to be
able to interfere with or be able to probe, we can reveal a lot of information that we
wouldn't want
to otherwise reveal.
What neural interfaces exist now?
How are they being used?
We should differentiate first between what a lot of people have heard about in the media,
which are implanted neurotechnology devices.
This is what companies like Neuralink or Synchron or BlackRock Neurotech are developing.
And these are revolutionary, especially for people with paralysis or who have neurodegenerative
disorders where they have an array of electrodes that are implanted deep into the brain that
can be used to decode their brain activity and enable them, for example, to be able to
control their environment or to have even their intention to communicate decoded.
That's not the kind of thing I think will go widespread. What I'm talking about is wearable brain technology, primarily devices that decode electrical activity in the brain. I'm talking about sensors that can be applied dry to the skin and can be in your baseball cap or in your visor, or soon in eyeglasses, and coming this year in earbuds and in headphones, where the surround, the cushion around the headphone, has a bunch of sensors that, by touching your skin, can pick up the electrical activity from your brain.
And that kind of neurotechnology, some of it already exists right now. Mostly the
ones that exist right now require that some of the electrodes make contact with your forehead.
So they're worn like a headband or they're worn inside of a baseball cap or a hard hat.
The ones that are launching now are multifunctional devices. So Meta has invested a tremendous amount of money into integrating EMG, that is electromyography, sensors that pick up the information that goes from your brain down your arm to your wrist and decode your intention to type or swipe or move. Or companies like Emotiv have devices where there's a sensor in each ear, and you can listen to music and do your conference call while having your brain activity tracked and decoded. To be clear, how much brain activity is actually being picked up depends on the number of sensors and on where they're placed. And so
you might be picking up just a very narrow signal that can be used for meditation or for fatigue
management, but that doesn't pick up as much information as those implanted electrodes,
which would go deep into the brain, or even one that would have many, many sensors across the
brain. So for those types of tech with many sensors, what kinds of thoughts will third parties be able to pick up and read from people's brains? So everybody asks me, can we literally read
thoughts? And I think a starting place for my answer to that is it depends on what you think
thoughts are. So if you mean like, can somebody literally decode what you're thinking about or
every image in your mind? The answer is no. With neurotechnology and EEG technology in consumer wearables, I don't think we'll ever get to the place where we'll have your full complex thoughts decoded. But that doesn't mean we shouldn't worry, because there are a lot of cognitive and affective states that can be decoded.
Affective states are like, are you happy or sad? Are you tired? Is your mind wandering? Is your mind paying attention or focused? Those kinds of
things can be picked up. Simple words, simple numbers, like we were talking about for PINs, your brain can be probed for that kind of information. We can also probe your brain, and when I say we, I mean scientists, not me personally, for things like, do you recognize a murder weapon? So already, in criminal cases across the world, police departments have used this kind of brain interrogation technique, where they'll show images, say words, or play a sound or a voice to see whether or not the brain registers recognition memory. So we can decode a lot of what's in the brain, even if it's
not the complex thoughts. You can pick up things like a person's political persuasion. And when
you triangulate that information with everything that's being picked up about our digital presence
and the fact that there is no other form of privacy that I think truly exists, it's like the final frontier
of privacy falling. It's completely horrifying. Will they be able to read our emotions? Can they
read our blinking, the beating of our hearts, our sweating, our bodily functions that we don't
actually control? It depends on what kinds of restrictions we put
into place. We as a society get to make choices about whether we use these devices. And if we do
use these devices, what the term should be. I think now is the time where we can make those
kinds of choices. But if we do choose to use the devices, or for example, if we're required to, in some
countries, people are already required to wear the devices, then yes, there is a lot
of information that can be picked up.
We have to remember that these kinds of devices don't exist in a vacuum.
And so things like your heart rate, which may already be decoded by a watch that has an ECG sensor picking up your heart rate activity, or facial recognition cameras that are being used to analyze micro facial changes to try to figure out what you're thinking or feeling, can be combined with brainwave activity or EMG activity. When you put all of that together, it becomes increasingly precise at getting at what we're thinking and feeling.
I do worry a lot about manipulation.
And it isn't just that the information can be picked up.
Once you have that information, you can try to change a person's mind.
And you can try to change a person's mind in the way we do in everyday life, which is
to try to persuade another person.
Or you can look and see
how a person's brainwave is reacting. So for example, you want to know whether or not you
are adequately doing things to addict a brain to a technology. Already, we can see it based on your
behavior: how much time you spend on social media, or how much time you spend watching a television show on Netflix. But when you can literally see the signals of addiction starting to be activated in the brain in response, right? If the same company
has both your brainwave activity as well as your behavior and can tweak what they're doing to try
to addict you to their products, then yes, they can manipulate you. The First Amendment of the U.S. Constitution protects freedom of speech. What about freedom of thought? How do you see the issue?
I believe that if we had gone back in time and understood how our brains could be hacked and
tracked, the U.S. Constitution and other constitutions would have concerned themselves with
freedom of thought and not just freedom of speech. That is, the protections that we have
in those spaces would have been quite clear about the protection of freedom of thought.
There are some echoes here and there that speech begins with thought and therefore
thought ought to be protected. But there's nothing in the Constitution, there's nothing in existing case laws or constitutional law in the United States that protects freedom of thought.
The UN Declaration of Human Rights does protect freedom of thought. But that freedom as it's
currently protected has been pretty narrowly understood to be about freedom of religion
rather than about freedom of thought. The last special rapporteur for freedom of religion and conscience, Ahmed Shaheed, submitted the first ever report
on freedom of thought to the UN General Assembly in October of 2021, where he argued that we really
need to expand the human right to freedom of thought to include a much more robust understanding, including the potential risks that are posed to it by emerging fields like neurotechnology.
When the government can find out what's on someone's mind, whether or not they want to
share it, what does it mean to be protected from self-incrimination?
Unfortunately, not enough. Self-incrimination, as we interpret it
in the United States and in other countries, generally means a right not to be compelled to
testify against oneself. That is, that you don't have to create testimonial evidence.
That's generally understood to mean that you can't be compelled to speak against yourself.
You have a right to remain silent is the idea. I'm not sure how that right to remain silent helps you very much if your silence
can also be decoded, right? And the way that self-incrimination law has been interpreted in the
past has allowed bodily evidence to be used against you because you're not speaking. So,
for example, if the police want to prove that you're drunk, they can take a blood alcohol test and use that. And that doesn't violate your right
against self-incrimination. They can even require you to speak in some ways, which is to provide a
voice print, for example, or a handwriting sample. Those aren't protected by the right
against self-incrimination because it's not being compelled to create testimonial evidence. And when brainwave activity can be probed and decoded, we need a right to remain silent that
includes a right not to have your silent utterances compelled and used against you as well.
We're not there yet, which is part of why my book tries to create a pathway forward: the set of rights I believe that we need, given the lack of rights under the Fifth Amendment of the U.S. Constitution, under the First Amendment, and under the Fourth Amendment.
Companies can now sell so much about us. They can sell our locations, our political
affiliations, our financial records, our online activities, and even our health and genomic data.
What do you think about third parties now having essentially complete unfettered access to our
raw brain data, including the right to record it, to store it, to mine it and sell it?
I think it's chilling and disturbing, right? That's not even enough. I think it's terrifying.
And I say that because it's not just government use, it's corporate use and commodification of the brain that really drove me to write this book.
If you look at the past decade or more, companies have had unfettered access to our personal
data and increasingly more intimate data, where they can create very precise dossiers about everything about us, things that we may not even know ourselves. It's even more problematic than people realize, because while they might trust a particular company as having pretty good privacy practices, once that data is used or sold, what's the promise that the next person who gets access to that data won't do something problematic with it as well?
I'm worried people will give that information away just as easily as they have given all of
the rest of their information away. And we have a moment to get it right. But that's only if we
speak up and do something about it. What scares you the most?
What scares me the most is the use of this technology by oppressive regimes.
I'm Iranian American. When I think about the ways in which governments can misuse this information
to precisely and chillingly be able to find out how people are feeling, see patterns of collective actions starting to form,
chill even the thoughts of dissidents because people fear that even their brains are being
censored, I fear what it looks like in the hands of those governments. And as a mother of young children, I fear for their ability to grow up in a
world in which they can figure out who they are without feeling like even their thoughts
are being listened to or scrutinized. Before I ask for the three takeaways you'd like to leave
the audience with today, what else should I have asked you that I didn't? Or is
there anything else you'd like to mention? I think probably the biggest is, so what do we
do with this? Because I've written this book really to raise awareness, but more than that,
to create a pathway forward. And so it might be worth just talking about that kind of hopeful
pathway forward. Because I think what many people's first instinct and second
and third might be is, okay, well, let's just ban this technology. And I'm never going to use it.
And it seems horrible. So my response to let's ban the technology is, one, that hasn't worked well in the past, the banning of technology. But two, there is promise that this technology can provide in our own hands, to be used for our own purposes to track our own brain health. I think that there is a lot of promise and there is a lot of risk. There are terrifying risks. And so what I propose is that we recognize a right to cognitive liberty as a human right that would give us self-determination over our brains and mental experiences. That would require that we update our right to privacy to include mental privacy; our right to freedom of thought to cover the right not to have your thoughts read, used against you, punished, or manipulated; and the right to self-determination, which would mean the right to use the technology and the right to informational self-access about what the
technologies can reveal about ourselves. And I believe if we start there, we change the default
rules to ensure that the technology benefits humanity rather than becoming the most oppressive
new technology that we could have imagined.
It's so important.
Nita, what are the three takeaways you'd like to leave the audience with today?
First and foremost, I think it's important for people to realize I'm not talking about the future.
The technology is here.
It's in classrooms.
It's in workplaces.
It's in individual homes.
It is technology that is here and will only continue to grow over time.
The second is it can benefit humanity, but it also can threaten and does threaten our very last bastion of freedom, a threat that we must defend against.
And the third is that there is a pathway forward.
And that pathway is through recognizing a right to cognitive liberty.
It requires updating our understanding of existing human rights.
That can be done without much difficulty.
And it's so crucial that we do it now.
Nita, thank you for your very important book and this eye-opening conversation.
This has been great.
I highly recommend your book,
The Battle for Your Brain. Thank you so much for having me. My pleasure.
If you enjoyed today's episode and would like to receive the show notes or get new fresh weekly
episodes, be sure to sign up for our newsletter at 3takeaways.com or follow us on Instagram,
Twitter, and Facebook. Note that 3takeaways.com is with the number 3.
3 is not spelled out.
See you soon at 3takeaways.com.