Ten Percent Happier with Dan Harris - 296: How to Use Social Media without Losing Your Mind | Randy Fernando
Episode Date: November 2, 2020. Given that social media has been blamed for rising levels of anxiety, depression, loneliness, and political polarization, is it possible to use this technology wisely? That’s the question we dive into today with Randy Fernando, who is featured in a new Netflix documentary called The Social Dilemma, which is all about the many alleged pernicious impacts of Facebook, Twitter, Instagram, et al. Randy is the co-founder and Executive Director of the Center for Humane Technology, and a longtime meditator. We start by talking about what he sees as the dangers of social media, but then get into a fascinating discussion, where he ticks off a ton of techniques -- informed by his knowledge of Buddhism -- to use social media that won’t cause you to lose your mind. Where to find Randy Fernando online: Website: http://www.randima.com Other Resources Mentioned: • Center for Humane Technology - https://www.humanetech.com/ • The Ledger of Harms: The Facts about Social Media's Harms - https://ledger.humanetech.com/ • AllSides - https://www.allsides.com/unbiased-balanced-news • Your Undivided Attention Podcast - https://www.humanetech.com/podcast • Tips for Taking Control of Your Tech - https://www.humanetech.com/take-control • Resources for Families & Educators - https://www.humanetech.com/families-educators Additional Resources: • Ten Percent Happier Live: https://tenpercent.com/live • Coronavirus Sanity Guide: https://www.tenpercent.com/coronavirussanityguide • Free App access for Frontline Workers: https://tenpercent.com/care Full Shownotes: https://www.tenpercent.com/podcast-episode/randy-fernando-296 See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
Before we jump into today's show, many of us want to live healthier lives, but keep
bumping our heads up against the same obstacles over and over again.
But what if there was a different way to relate to this gap between what you want to do and
what you actually do?
What if you could find intrinsic motivation for habit change that will make you happier
instead of sending you into a shame spiral?
Learn how to form healthy habits without kicking your own ass unnecessarily by taking our Healthy Habits course over on the Ten Percent Happier app. It's taught by the Stanford psychologist Kelly McGonigal and the great meditation teacher Alexis Santos. To access the course, just download the Ten Percent Happier app wherever you get your apps, or by visiting tenpercent.com. All one word spelled out. Okay, on with the show.
From ABC, this is the 10% happier podcast.
I'm Dan Harris.
Hey, hey, a question for you.
Given that social media has been blamed for rising levels of anxiety, depression, loneliness
and political polarization.
Is it possible to use this technology wisely?
This has been an incredibly urgent question for a long time, but perhaps never more so than
right now as we head into an election.
And it's the question we're going to dive into today with Randy Fernando, who's featured
in a new Netflix documentary called The Social Dilemma, which is all about the many alleged pernicious impacts of Facebook, Twitter, Instagram, et al.
Randy is the co-founder and executive director of the Center for Humane Technology, and
he's also a long-time meditator.
We start out by talking about what he sees as the dangers of social media, but then we
get into a fascinating discussion where he ticks off a ton of techniques
informed by his knowledge of Buddhism for using social media in a way that won't
cause you to lose your mind. Here we go. Randy Fernando.
All right, Randy, nice to see you. Thanks for doing this.
Great to be here. Thank you so much, Dan.
Just by way of background, I'd love to hear a little bit about you. How did you arrive at this
confluence of meditation, slash mindfulness, and the perils of social media and technology generally?
It's been an interesting journey. I was born in Sri Lanka to a very Buddhist family,
so my parents both in theory and practice are just very serious about Buddhism and it's been very
useful for all of us in terms of navigating our lives. It's sort of the framework that I use and
we use at home. My wife and I, when we discuss life and its problems and challenges, it's a great framework.
It's been really helpful.
I'm very grateful to my parents.
They taught me the precepts when I was five.
And so I've really not had to look for anything else
to give me that guidance.
So I actually learned basic meditation about age eight
from my mother and programming from my father.
So those are the two threads that kind of diverged
and then came back together in quite unexpected ways
I would say in my career.
I loved programming, I loved making pictures
appear on the screen.
And so I just followed that.
I just happened to pick something that turned out
to be really relevant as I grew up.
And I ended up at Cornell to study computer graphics and then that same passion took me
to Nvidia in Silicon Valley.
I got to manage a bunch of different software projects and to author three best selling
computer graphics books.
So that experience, you know, which is one of those weird things you get to do in Silicon Valley at a young age, out of my master's. I had just come out and got these great opportunities.
And around that same time, I got back into meditation and more study.
I was reading a lot of suttas and meditating much more seriously in my late 20s.
And at the same time, I started volunteering. I started looking for things I could
do and just started learning. And that process led me to end up helping to build a nonprofit called
Mindful Schools, where I served as the executive director for seven years. And we ended up bringing
mindfulness to nearly a million children globally. And now I think that number is several million kids. And at the same time I started doing retreats regularly because
of that work. And so I was doing one to two retreats every year. And I think that was
all really helpful. And then I ended up co-founding the Center for Humane Technology with Tristan Harris and Aza Raskin. I came on right after he was on 60 Minutes and he was getting this flood of interest. It was crazy, and he asked me to help to kind of corral that and organize it. And that's what led
here. And so throughout my career, I've really been exploring all of these different intersections. I'm very interested in helping the deeper Buddhist teachings to survive in a world of mindfulness that often is very watered down.
And I think there's a lot of wisdom there.
And so I serve on the Board of Spirit Rock Meditation Center.
And also a smaller organization called the Buddhist Insight Network, both of which are really dedicated to preserving these teachings.
I'm really pleased, and I'd say pleasantly surprised, that all of these things have come
together with the work at Center for Humane Technology in a way that I had never anticipated.
Can you say more about that?
What is the Center for Humane Technology, who's Tristan Harris?
How do the Buddhist concepts get woven in there
in practice, et cetera, et cetera?
Sure.
So the Center for Humane Technology
is a non-profit organization of deeply concerned technology
and social impact leaders.
And we are focused on addressing the harms
of the social media platforms.
So from outrage, polarization, addiction, depression, political manipulation,
and ultimately the breakdown of shared truth, which is the one that keeps us from solving everything else.
When you try to think about what is humane technology?
What is it that you want?
It's easy often to define what we don't want.
It's easy to define what's going wrong and to say, oh, here's the problem. And then you get to the solution and it's often
much trickier and more subtle. And I would say where the Buddhist concepts really come
in for me personally is, humane technology first and foremost, I think, needs to reduce
suffering. And from the Buddhist point of view that largely is
going to be related to addressing greed and hatred instead of perpetuating
them and reducing ignorance as well. But a lot of technology right now actually
does the opposite. Instead of reducing greed and hatred, it promotes them.
Because a lot of times that's the way to sell more product.
You want people to be dissatisfied with their situation, with what they have, and that's a lot of,
I would say, bad marketing is all about that. And good marketing is looking at what the values are
that someone wants and where there's a real problem. And often that's a much narrower spectrum
and results in lower revenue. So humane technology reduces suffering. It has to be
values centric. It has to look at what people are needing in their lives, what their values
are. It has to recognize that technology is not neutral, that anything that we build is an expression of our value system. And when we place it into the world, it lands in this water that everyone's living in, and the technology itself ends up conditioned by that water. Humane technology has to be sensitive to human nature; it has to recognize
we have certain physiological vulnerabilities.
And that's just how we're built. Most of that is about survival; helping the human race survive is kind of what we're evolved for. I would say we're not really evolved towards happiness necessarily. We have to work at that. We're evolved to reproduce and perpetuate the species, but when we think about happiness and well-being and reducing suffering, often that requires some work. Another characteristic of humane technology is it
builds shared truth instead of dividing us. This is a huge problem right now, where people are more and more divided. A lot of the technologies out there, especially social media platforms, are dividing us, often in unintended ways. It's a big deal, and the way they're designed actually perpetuates that division and is driven by the underlying business model as well.
And the last thing is, humane technology has to account for the unintended consequences that it generates and try to minimize them.
And again, this is related to the Buddhist view because we just see everything through conditions,
one thing conditions the other thing and it's just like this endless stream of conditions
that are perpetuating.
So when you have the privilege of designing software that is going to be used by millions or billions of people,
you have to be aware that there are some serious conditions that you are perpetuating out in the world.
And so, it makes all the difference when we're trying to design more humane technology. So you're right that we will dive more deeply into solutions once we've walked
through the problems associated with social media in a more granular way. Just to not let
it hang though, you referenced this individual Tristan Harris. He showed up on 60 minutes.
I remember that piece. It was an interesting Anderson Cooper piece about how social media is designed to hook us.
Tristan, if I recall, worked at Google and then left and started becoming a critic from
the outside.
Can you say more about him and your work together?
So, Tristan, yes, as you said, he was at Google.
And what he saw when he was at Google was the immense responsibility that these engineers were having, that every keystroke, every decision, every product decision was shaping the way millions and actually billions of people were using products. And he started to see this attention economy game that was being played.
And then actually there was this race to the bottom of the brainstem, as he likes to call it, where the companies are competing for our attention in order to monetize. That became very clear.
And so then he started to speak out more and more. Now, others were speaking out as well.
He was very articulate in how he expressed these things and he was able to translate the
experience that people are having on their phones, that they can relate to.
You know, that's very resonant for them.
And to translate that experience into words that said, oh, that thing you're feeling on
your phone is actually part of a bigger problem.
And that's what happened.
So different versions of that.
And that bigger problem first looked like, oh, this attention thing, stealing our times,
stealing our attention.
And then the bigger things started to look like, oh, wait, it's impacting our relationships.
It's impacting democracy.
It's impacting polarization.
It's impacting shared truth.
It's impacting our ability to actually solve other problems.
And now at this level, it starts to become existential.
Because if you can't build common ground to solve problems,
you're going to be in big trouble as a human race.
So you and Tristan are featured in a new documentary
on Netflix called The Social Dilemma.
I've watched it.
It's very interesting and unsettling.
What would you describe as the basic thesis?
I think what's special about this film is it describes how social media works
and the harms it creates as told by the insiders who helped to build the
products. The people who were there and saw from the inside what's going on.
There's a lot of credibility that comes with hearing from people who were on the inside and saw what was going on, the decisions and the thinking, and who now can see, you know, 10 years later, roughly, the consequences. What has played out as a result? And this is
particularly true in a time when all of us especially children are on these platforms more than ever
We need to understand how they work. What are the implications? What are the mechanisms?
How does the actual attention grabbing work? What are the different types of notifications, or infinite scroll, or the way choices are presented to us in menus? All of these determine our behavior. And then to understand this process of how
actually we are the product, right? This is something that Douglas Rushkoff said back in 2011. We are the product. It's actually our attention and our behavior that is being sold.
It's access to our brain.
It's access to our next thought.
That is what is being sold.
That is what is being auctioned.
And it's not only being auctioned in a general way.
It's being auctioned in a very specific microtargeted,
highly optimized way, where a third party can say,
hey, I want to buy this ad, I want to target this specific group of people, they live here, here are the demographics, here's the gender, here's the race, here's the age, here are their interests. All the fun stuff everyone shares on Facebook, and all the other platforms, that information is then used to target you, right?
And it is not only the information we share directly, it is also the information that is inferred. So the algorithms also infer information about us. So for example, it can infer your
wealth level. So when you're an advertiser, you can choose, it's so detailed, but you can choose things like,
you can guess someone's net worth and say,
hey, if they're high net worth, send this ad to them.
And so like that, there are all these things
about what their interests might be.
And all of this gives access for a third party
to get access to your brain at specific intervals
while you are scrolling through a feed
or interacting with the product.
Something jumps out, right? It's decided: this is the moment that you might be susceptible to a specific kind of ad, because the platform's goal is to deliver this ad in a way that makes you click on it, right? Because that's when the transaction takes place. And so I think this is a very dangerous model, and it's very, very easy for malicious third
parties to hijack that and to use it for different motives.
So no question we've seen malicious third parties game the system, but just to play devil's
advocate isn't this creating an efficient market for advertisers
to reach people with products or services that might be useful to them. For example,
some of the targeted ads, I'm not on social media that much for reasons that we'll get into,
but sometimes like a pair of shoes pops up and I'm thinking, actually, I need a pair of shoes, and that's a good pair of shoes. I'm going to get that.
And by the way, just full disclosure, 10% happier,
advertises on some of these platforms,
and isn't it a good thing that we can figure out who our target customer is and reach them with something that's going to make them happier?
Yes, it's not always in conflict.
I think a lot of it comes down to, again,
the root principles of, is it increasing greed and hatred and delusion or reducing it?
And in some cases if you're offering a meditation app to people and you're sending it to the right
people and it's a sincere app and the whole model behind it is sincere, that part of it is not a
big problem. One of the things that would be really helpful is if the platforms let someone explicitly signal, I'm looking for a meditation app. Right now it's all inferred,
and this is where it gets very murky and very dangerous. If instead I was able to say, okay,
I'm actually looking for a meditation app right now, can you help me find that based on what
you the platform know about me and about the world?
Now that kind of relationship as sweet as it sounds can only work effectively
if there's a very high degree of trust. Essentially some kind of a fiduciary relationship
between you and the platform. Just like your lawyer has to protect you, your doctor has to protect you, they have to do what is in your best interest, and in return for that to work, you give them all kinds of information.
They have access to everything. And that's the point. But in this case, we know time and time again that these platforms have failed to protect people. In fact, they sell that information. It's easy to game. Some of it is definitely their fault.
Some of it is other people using the system as designed to manipulate in creative ways that were unanticipated.
So then each time the platforms go and patch up that part. But one of the key premises that we make and that the film makes is that the whole stage, the whole
platform, the whole ground is already tilted and it's tilted because of this
underlying business model. If you follow the money, you can see that the
incentives are not aligned between the people who are on the platform just happily
sharing their information and interacting, and the advertisers who are
buying access to their thoughts and their behavior changes.
That's the problem.
When those incentives aren't aligned, naturally you're going to have this tilt that is not
in favor of the actual person using the platform.
This becomes true when you look specifically, for example, at each of the different areas
where harms come from.
So I'll mention a few interesting factoids, which I think are very relevant, just so everyone understands what's at stake.
So in mental health, from 2016 to 2019, there has been a quadrupling in the number of
cosmetic surgeries for the sake of looking good on social media.
For social relationships and values, every time someone treats an AI like it has human
qualities, so like interacting with Siri, for example, the more they later dehumanize actual
humans and treat them poorly.
Children under age 14 spend nearly twice as long with tech devices as they do in conversation with their families.
So the time is about three hours, 18 minutes per day.
With truth and facts, fake news spreads six times faster.
And that's because it has so many more degrees of freedom
and is often so much more
appealing. And when you pair that with another factor, which is really troubling, people tend not to
change their minds back. Once they've been given a factoid and it's kind of planted in their head,
it's pretty hard to change their minds back. So even if you track down everyone and say, all right,
all right, all right, wait, hang on, that thing you saw is not true. You can try to do it, but it's a lot harder.
People are loyal to what they've heard. This is this idea of first impressions.
With children, children who have been cyber bullied are three times more likely to contemplate suicide than their peers.
There was a study that tracked 200 children from the ages of 2 to 5, and children with higher
levels of screen time showed greater delays in development across a range of important
measures, including language, problem solving, and social interaction.
With polarization and extremism, anger is the emotion that travels fastest and furthest on social media.
Every word of moral outrage added to a tweet increases the retweet rate by 17%.
So these are some of the examples, right? And so this is I'm just trying to explain
why we say that the platform is tilted by default. Because if the game is attention
and anger travels fastest and furthest, and fake news spreads six times faster, what is going to dominate
on these platforms? It becomes very, very obvious and it's exactly what we end up
seeing. And I think that's extremely dangerous. So all of these, you can read more
about all of these at ledger.humanetech.com, and also on our podcast, called Your Undivided Attention. It's at humanetech.com/podcast.
So yeah, it's a problematic situation, I would say then.
Just on the anger and the fake news. I mean, in the film, the cavalcade of experts,
all very credible, in my opinion,
prognosticated in quite horrifying ways
about where we're heading as a global society,
given the pernicious impact of social media on society
that we can't agree on a basic set of facts
upon which to have a debate.
Anger wins out.
Look what's happened in Burma with the Rohingya Muslims and the Buddhist majority there
carrying out what appears to be pretty clearly a genocide fueled by Facebook allegedly.
So the prognostications run toward civil war, like, all over the place.
So there's a lot to unpack here.
So one is that people have always had their perceptions,
their hatreds, their desire for fame.
All of these are there.
The film is not saying social media is causing all of these things to happen
out of the blue.
What it is is an accelerator because it is now the
infrastructure. It is a big part of our communications infrastructure and it's also a big part of how
we make sense of the world. It's taken over a lot of journalism because credibility is based on a
different currency. It's not based on how well did you research your article? It's based on how many likes and comments and shares is it having?
How much influence is it having?
And unfortunately, we just explained how that, that second metric, is much more related
to sensationalism, to anger, to fake news.
And so you end up in this world where everyone says things like, oh my
God, I saw the greatest thing. The most amazing thing I have ever seen yesterday. That kind
of language people use all the time. And that's the kind of language that now has become
necessary to get attention because everyone's competing. So you're saying, oh my God,
this is the cutest cat video I've ever seen. You've got to see this. Because everyone's like, oh, I've seen cat videos, Dan. I have seen cat videos. Yours is probably great, but I've seen them all. So to get someone to rise up and see the next one, you have to use language like that. And with cat videos, the consequences are not that great.
But when it comes to domestic civil war or disagreements, it becomes a major issue. And it turns out that protecting our physical borders, we've already got that, right? The United States has probably a trillion-dollar military, right? Many trillions of dollars are invested in that.
But in contrast, the digital borders are not secure.
And it's a lot easier to penetrate those. And actually,
it's pretty cheap. On the order of hundreds of thousands or millions of dollars,
you can start dividing people, you can plant narratives, you can make fake groups and invite
people to fake events, and all of these things have happened. And so you end up with people physically showing up on two sides, both of whom were manipulated. And I think
all of us, I'm sure, all of us have fallen for this stuff, where because we care so much
about a topic, we end up forwarding something without checking, right? We were just like,
oh my gosh, that has to be true, because we want it to be true. We can all be easily manipulated,
we can all be part of the problem all too easily, with very good intentions.
More of my conversation with Randy Fernando right after this
It's interesting because I was watching the film and you're talking about QAnon and PizzaGate and I'm thinking to myself,
well, anybody who believes any of that stuff is gullible,
right, to put it gently. But your last statement before I started talking, and the film, also
seemed to be arguing that even those of us who consider ourselves to be smart or whatever
can be hijacked by fake information, even if it doesn't mean believing that they're running a pedophile ring out of the basement of a pizza parlor.
That's exactly right. I think it's just our own good intention.
Our own well-meaning of wanting to share, oh my gosh, this has to be true.
This matches with my worldview.
Therefore, I want to share it with my friends, because everyone needs to know this.
At least we should do the diligence of looking at the actual story that we're sharing, reading it, or at least skimming it, and saying, okay, yeah, that seems credible. It's got reliable sources behind it. When we share an interesting infographic, share the source, so other people can take a look and say, oh, actually, that's been debunked, or, you know, they can have an intelligent discussion about it.
So it's increasing the friction a little bit.
So this then is parallel to mindfulness
of saying increase the space a little bit, increase the space
before we react, increase the space,
take a look at what we're doing
and respond with a little more wisdom.
I think that's one way to do it.
Now we're starting to get into practical solutions
for wise use of social media,
but before we really, really dive in on that,
let me just ask about one other area
of the pernicious impact of social media
that you talk about in the film,
which is mental health.
We've dwelt on this a little bit,
but how strong is the correlation
between social media use and
adverse mental health outcomes such as depression or anxiety?
The opposite of addiction is really, it's well-being, it's love.
You have to have the stability inside to be able to overcome a lot of these feelings.
And when we are feeling vulnerable, when we are depressed or angry or anxious, that's
exactly when we are most vulnerable to these algorithms. It's specifically those kinds of mental
states that make us more vulnerable to doing all of the things we just discussed, right? The
sharing of something, or going back to a post because we want some support, we want
some acknowledgement, we're feeling down. We're low on agency.
We post something where we want people to say,
I like it.
I like you.
You're good because we don't feel good about ourselves
in that moment.
Our own well-being, the reservoir is low.
It's not everyone that's affected in a large way.
Everyone's affected in a small way.
But the people who end up in more vulnerable situations
then get affected
in a larger way. It's easy for someone to get sucked into a rabbit hole and end up sort
of in an extremist rabbit hole, for example. When they're looking, when it's the middle
of the night, their cognitive ability is sort of lower, right? You're tired, you're not
as discerning, and you're more suggestible. You know, I see some negative impacts in my own mind
when I use social media.
As I mentioned, I try to be pretty sparing.
I will occasionally post on Instagram,
but I will delete the app afterwards.
I'm not really on Facebook.
I do use Twitter, which I don't find messes me up too badly, but I use it pretty judiciously. But Instagram, in particular, I notice if I'm using it regularly,
there are two things that I see that are
deleterious to my mental health.
One is I get obsessive about how many likes I've gotten
on whatever I've posted.
And two is that I start comparing myself
to these carefully curated images from
the lives of my friends and colleagues or at parties that I wasn't invited to or whatever.
And so I can see how both can fuel depression and anxiety. I'm just curious,
do we know for sure that are there studies that really show a correlation in terms of depression
and anxiety with social media use? Yes, there are.
I think one way to understand this, you know,
beyond the studies, is to look at it from a training lens and say,
again, from a Buddhist point of view,
and I look at it very much as conditions and saying,
okay, how are the conditions affecting us?
From the moment we're born,
or actually, before that, right,
our existence is highly conditioned.
So our genes, right, which is what we come in with.
Our genes, our environment, our relationships, our experiences, all of these things are highly
conditioned.
So we have to be very thoughtful about what are the ways that technology is training us,
individually, and as a society.
We have this intuitive understanding of the laws of social physics: the thing that matters in terms of your voice and your ability to communicate effectively is likes, comments, and shares.
Then you start to put this together in your head
and kids are wonderful at doing this.
They learn very, very quickly these kinds of patterns
and they start adapting them.
But so do adults. And so it's certain types of language. As you said, certain types of
curated posts, curated pictures. That's what you learn, that's what you observe. And you sort
of integrate that into your brain and say, okay, that's how this works. And I think that has very
dangerous downstream consequences because it ends up shifting
our value system.
So there have been studies recently on how basically being an influencer or seeking fame
has gone much higher on the list of things that kids care about from less than 10 years
ago.
And that's natural.
This is just practical.
It's not a subjective judgment. The fact is that in this economy, if you have something useful to say and you want to say it, you need attention in order to achieve that. And because of all the things we said about the default tilt, it's pretty hard to achieve that attention if you follow a path of being modest, being humble, being simple, being really accurate. As a journalist, I'm sure you've seen this trend.
And so then you end up in this era of clickbait headlines,
because that's the pinnacle of the fight:
you even leave the headline out of the headline, right?
You just put the dot, dot, dot,
"the one thing that you need to know about the election is..."
and you just wait for the click.
It's crazy.
So given all of the harm, macro and micro,
that you've just described from social media,
is there a way to use this technology wisely, given that it's so
pervasive? People need it professionally. They need it to keep up with their friends. If you drop
off of Facebook, you may not know what your family's up to. You may not hear about the family
gatherings or the parties that you actually do want to go to, that would be healthy to attend.
Is there a way to use this stuff in a healthy way
that's conducive to human flourishing,
or is the only answer abstinence: delete the apps?
Because certainly some of the people in the movie
are suggesting don't use this
and we certainly don't let our kids use it.
Yeah, I think there's a lot of facets to this.
One is age, for sure, because I think for kids to use this at a young age is
exceptionally dangerous. So that's one line to draw in the sand.
One of the beautiful things about this film is that it allows everyone to have some shared
understanding about this problem. There's a lot of safety now in bringing up this topic. You're
not going to be the weird one at your parent group or anywhere, because millions of people have watched it, including teens, right? So that safety the film brings
to allowing these conversations to perpetuate is huge. That's the first piece.
I think one of the fundamental problems, as we've talked about, is that the platforms are tilted,
right? So the tilt is there. If you're on it,
you are going to be subject to that tilt in one form or another. And you may think, oh, it's okay,
like I can overcome these things. And in many cases, that's true: we can figure out what's
true or not, and we won't be too manipulated. But remember, the game is always progressing.
And already we've probably made mistakes.
I know for sure I have shared things that weren't true
because I was excited or behaved in ways
that I think are not the way I would want to behave
because I was online, because of these specific conditions.
And all of this is only going to get worse
with deep fakes.
Now we can fake video, we can fake audio,
we can fake text very easily.
There's all kinds of stuff that we are
inevitably going to be vulnerable to.
And I would ask the question more about,
why are you going there?
Why are you using it?
What are you seeking from it?
So one example of where Facebook actually I think is useful
is if I'm about
to see someone I haven't seen in a while or talk to them, you can go on Facebook, type
their name, and scroll through their feed to see what they've been up to.
That works great. But the problem is most people go back to the platforms and they let
the default feed be their experience.
So when that's your experience, what happens is there's sort of this replacement of the timeline
and the sequence of actions that you are intending to do. And that gets replaced with a different
sequence of actions that you are not intending to do. And sometimes you can drift really far out.
I think YouTube is the best example of that.
If you go to youtube.com, like, just open an incognito window and go there, you will see, just on
the default homepage, so many interesting random things that you want to click on. And I saw one
which was really amusing. It was like little coffee cups made with Lego, right? Tiny micro things and I was like,
oh, that's really cute. I'd love to click on that. And you can see every single one of those is interesting in one way or another.
And the more it knows about you, the better it'll do at finding that.
So that's not the experience you want if you go to YouTube looking for how to learn a new skill, for example.
YouTube is great at that. You can go, get that information, watch the video, and then you should close it. This is where it gets tricky: there's autoplay, there's recommendations,
and our eyes, we can't help seeing those things and saying, oh,
wow, that could be interesting too. Let me just click one more video, and then it's been a bunch of time, right? That's an experience everyone is familiar with.
I think the challenge is the platforms are not well-incentivized to solve that problem,
to make it such that when you come,
you get exactly what you want and then you leave. So if I'm hearing you correctly,
one way to use these platforms wisely
would be to be pretty specific and intentional about it.
So if you want to go to YouTube to learn how to tie a tie
or how to hang a chin-up bar or whatever it is,
I just hung a chin-up bar.
Then you can go there and look for that, but then be
aware that you are facing off against supercomputers that are really good at getting you to stick
to the site. So go and get what you're looking for, and then turn it off. And
then also with Facebook, same thing, I heard you say that, you know, if you want to find
out what a specific friend is up to before you see them again, go look. But if you're just going to get sucked
into the random timeline, then you're likely to get manipulated.
Yes, that's where it's trickiest. And I think one of the real challenges is that, for example,
Twitter has lots of good news from people who are experts in their fields.
Many experts in many fields use
Twitter to share their insights, their sort of latest insights, the stuff that hasn't made
it into articles or into Wikipedia or any other reliable source yet.
And so if you want to know the latest, you end up going there onto Twitter.
But you can make a pretty good argument that this sort of very short limited character tweets
and sequences of tweets is not the best way to communicate real insights or to have deep
conversations.
That's the problem.
You're using it outside of the fundamental way it's being driven to operate.
There are many well-intended technologists at these companies who are also frantically trying
to patch all of these things, right? They're trying to put out all the different fires,
they're trying to patch all the different problems, but they are unable to change the actual
default tilt of the platform, because they don't have access to the business model, so they can't
change the way it actually works or change those actual incentive structures.
So they're forced into, in some ways, an impossible battle: to just put out fires constantly and
find new ways of putting out fires.
One way to think of this is, is it a good idea for all of us to just have a megaphone
and go into the public square and all start talking at once, right?
It's just not a natural way of interacting. And so you end up in a set of assumptions
that just don't match what's actually healthy for people.
And that works on a lot of different axes.
So let me just say a few words about Twitter
and just to be clear,
I have no investment in Twitter
and I'm certainly open to the many, many critiques of Twitter.
But I don't find it personally problematic.
Yes, I find it deeply unattractive that people that I know who are otherwise sane are on
there just basically spewing a lot of venom.
I think it brings out the vituperativeness, if that's even a word, in otherwise calm,
reasonable people.
So, I see that critique and I do see it in my own timeline,
but I don't feel that sucked into it.
When I look at Twitter, I find it's very interesting
for me on a couple of levels.
One is I'm very interested in what's happening
with the pandemic and lots of epidemiologists on there
and I can go and read their threads
and it's very interesting.
I'm also very interested in the election
and I like to read
perspectives from people on both sides. Sometimes the hot takes are too hot, but I like
seeing what articles people are posting, because, by the way, that then takes me into a deep dive
into something that's well researched and not just a few characters, et cetera, et cetera.
And by the way, one other thing about Twitter is that random people can reach out to me.
Sometimes they're saying nasty things, but usually they're saying really nice things,
or they're asking me a technical meditation question,
and I can take a few minutes to answer it.
So I don't know.
What do you think about, have I sold out?
Am I deluded?
What do you think of my take on Twitter?
No, that's great.
I think your take on Twitter is accurate for you.
Again, I think the point is about the way the platforms are tilted by default.
You're being careful, you're curating correctly,
and the way you're using it is very different from the way a lot of people use it.
And so I think that is helpful.
There's a great website by the way called allsides.com
that will show you the left, the center, and the right for many different articles. It's
a way to see all of the different views. I do this too. It's actually one of our recommendations:
don't go to the most sensational opposite sides, but go to the ones who represent
different viewpoints and try to understand. I actually go into the comments and I try very much to understand what is going on, because
I think ultimately this is about understanding other people's viewpoints.
That's what we're missing in the overall political landscape.
We don't spend enough time understanding the viewpoints of others.
Is there a way to affirmatively decide to use social media and to be a vector of positivity?
Just on a personal note, yesterday, the day before we were recording this, was my son's
first day of in-person school.
We're living in a little town where you have the option of doing remote or in-person.
And we had let him do remote for the first couple of weeks.
And then he decided to go in person. And so I went to drop him off and he was freaking
out. He was so scared. Everybody's in masks. And it was really hard. And his cousin, who's in
first grade, she was with me at the time, being very supportive, and she ultimately coaxed him to go into the school.
And I got this picture of her carrying his jacket, walking Alexander into the classroom,
all the way down this long hallway that I was not allowed to go down.
And so on his part, it was remarkably brave, and on her part, it was extraordinarily kind.
And I had the thought, I haven't done it yet, just because I'm lazy:
maybe I should post this, you know, like, I hereby interrupt whatever political death spiral
you're in right now to show you something really cool that this six-year-old
did for a five-year-old yesterday. And that's all a long way of just saying, is there
a way to use social media with the intention to be positive?
Yes. So let's talk about that.
I think there's a lot of interesting discussion there, especially, as usual, kind of using
the Buddhist lens to analyze it.
And I think this is exactly what we should be talking about.
At the same time, there are a lot of tips that are really helpful at humanetech.com slash
take-control that are actually somewhat unconventional and different from the typical tips that you hear.
But for now, let's talk about the Buddhist analysis
of this, or at least my version of it.
So let's start with what it feels like as we scroll.
So on one hand, right, there's a cat video, super cute.
There's a political thing, super heavy.
There's an inspiring picture of the two kids that Dan mentioned.
Then there's an ad somewhere, right?
So your brain is really getting sliced around.
And I definitely feel that when I'm scrolling through, I don't find that to be a pleasant
experience.
I find that to be a very sliced experience.
And paired with that, we have to look at what is the intention that we came in with? What is our
intention when we post? One really simple exercise we can all do, write your intention when you post.
Just write it along with the post, at the bottom: here's why I'm posting. When you do that a little
bit, you'll stop posting, because a lot of the time the posts are related to, well, the deep one is, I'm feeling
bad about something. I'm feeling anxious or depressed or angry, and I'm trying to make up for that.
I'm not looking at it; I'm making up for it in a different way.
So then it takes you to this point where you say, all right, well, there's two options. One option is to create distance between us and the devices.
I think that is wise.
I don't think of these things as a binary thing.
It's kind of a spectrum.
And you know, knowing yourself, you have to create the right amount of distance.
Probably it's more than you think.
And then the second part is to address the underlying instability.
Like, what's the problem underneath? How do we become more stable underneath? And actually,
I think it's very closely related to the Brahmaviharas, which I think I've heard you talking
about on the podcast as well. So this idea of loving-kindness, compassion, sympathetic joy, and equanimity.
When we are practicing those, we end up weakening our sense of the fixed self. That's the thing
that causes all of our problems, really. So not understanding that, okay, our self, we're just
just this thing that's highly conditioned, it's very much in flux. It's always changing.
And it's shaped by the conditions outside of us. It's shaped very much by our relationships.
And this is why it's so important when we look and we say, if we have a kid, who are their friends?
And it's important for us to look who are our friends. And the reason is that they have the largest footprint in terms of shaping us.
But now there's this invisible friend, right, or unaccounted for friend, which is your phone.
And that's actually around you more than a lot of these other friends, a lot of these other companions.
And it's training you all the time.
So this is also about more than just what you went to the platform for.
It's about all of the different interactions.
Every time we click a button and are instantly gratified,
it reduces our tolerance for when things don't go our way.
So back to this connection idea,
deep connection with others is a very good solution, a strong
way to develop protection for ourselves and deep connection to others is best achieved
in person, in video, on the phone, not online chat or texting, because online chatting and
texting are highly interactive. Say I'm chatting with you on Facebook. I'll send you a message.
It'll go "bling," and then you will look, and then you'll start typing something. Meanwhile, I'm looking at something else,
and then it goes "bling," and then I'm brought back, and then I say, oh, okay, so now I'm typing. And while you're seeing my dots,
you're looking at something else again. And so this process keeps repeating. And in effect, you're doing a great disservice to your friend, because you're doing
this repeated interruption thing, which from a mindfulness point of view,
it's the exact opposite of what we're trying to cultivate in terms of just a healthy,
sustained attention, sustained mindfulness.
We are slicing it up constantly.
And also, practically speaking, what you'll also find is that when you do this kind of chatting, you end up spending
a lot more time for a very small number of words. It just takes a long time to do.
Well, I really appreciate you coming on and talking about healthy ways to use social media.
Going in, I wasn't sure that there were going to be any affirmative answers to that, but
you gave many, with the appropriate caveats.
We will be putting links, everybody, to the various web pages that Randy cited in the show
notes.
If you want to go check out the Center for Humane Technology, we'll make that easy for
you.
We'll put a link to the documentary on Netflix.
Randy, anything I failed to ask you before we close
here? No, it's fantastic. I think we covered a lot of the
things. Let me just take a quick scan and see if there's any
other big thing we should touch on. Oh, there is one
thing. It's the topic of what do we do?
After watching this film, I get it.
I'm really concerned.
What's next?
And how do we fix this system?
One thing we can do to be really helpful is watch the film with others and have conversations
with them.
So then after that, there's this movement.
There's a movement for humane technology that's building up.
You can sign up at humanetech.com.
That's going to take us to this kind of the bigger impact, the bigger change that we all want to see.
Randy, thank you to you and your team for doing all this work. Really appreciate it.
Thank you for coming on today.
You're very welcome. It's been a real pleasure. Thank you so much for the opportunity. Big thanks to Randy. Really appreciate him coming on.
One last thing before we go, we, as I hope you know, care deeply about supporting you
in your meditation practice and feel that providing you with high quality teachers is one of the
best ways to do that. Customers of the 10% happier apps say they stick around
specifically for the range of teachers
and the deep wisdom they impart
to help them deepen their own practice.
For anyone new to this app,
we've got a special discount just for you.
And if you're an existing customer,
we thank you seriously for your support.
To claim that discount, go to tenpercent.com slash reward. That's tenpercent, one word, all spelled out, dot com, slash reward.
Finally, big thanks to the team who work incredibly hard to make this show a reality on the regular. Samuel Johns is our producer, and Maria Wartell is our production coordinator.
We get a ton of massively helpful input from our TPH colleagues, including Nate Toby,
Jen Poyant, Ben Rubin, and Liz Levin.
And finally, big thank you, as always, to my ABC News longtime comrades, Ryan Kessler and Josh Cohan. We'll see you all on Wednesday for a special
post-election day episode. We're going to record this one late at night, once we have,
hopefully, some sense of what's happening, with Lama Rod Owens, who you may remember
from back in June, one of the most popular podcasts we've ever done. So very
excited to have Lama Rod back on at a time when we're gonna need him. See you then.
[♪ OUTRO MUSIC PLAYING ♪]
Hey, hey, prime members.
You can listen to 10% happier early and ad-free on Amazon Music.
Download the Amazon Music app today.
Or you can listen early and ad-free with Wondery Plus in Apple Podcasts.
Before you go, do us a solid and tell us all about yourself by completing a short survey
at Wondery.com slash survey.