Your Undivided Attention - Pardon the Interruptions — with Gloria Mark
Episode Date: August 14, 2019
Every 40 seconds, our attention breaks. It takes an act of extreme self-awareness to even notice. That’s why Gloria Mark, a professor in the Department of Informatics at the University of California, Irvine, started measuring the attention spans of office workers with scientific precision. What she has discovered is not simply an explosion of disruptive communications, but a pandemic of stress that has followed workers from their offices to their homes. She shares the latest findings from the “science of interruptions,” and how we can stop forfeiting our attention to the next notification, and the next one, ad nauseam.
Transcript
We carried around stopwatches, and we timed every single activity that people did to the second.
That's Gloria Mark. Back in 2004, she convinced a random sample of office workers to essentially ignore the clicking of her stopwatch, as she timed their every move.
They turned to their email. That would be start time. Click on the stopwatch. Then they turn away from email. That would be stop time. Click on the stopwatch. And we recorded all these things, so we could be as precise as possible.
Precision mattered because Gloria wanted to know exactly what people meant by the
word multitasking. Remember, you know, this was the early 2000s at the height of the multitasking
craze. But it was a totally vague concept. I mean, just how many tasks did the average worker
juggle? What is a task? It was anybody's guess until Gloria and her colleagues answered the
question with scientific precision. And we found that people switched actions about every three
minutes. So that's not just what they're doing on the computer, but what they're doing on the phone
and interacting with people. And at the time, this was 2004, people were shocked that it was
three minutes. They thought that was a very short amount of time. Three minutes of uninterrupted
attention by today's standards is remarkably long. I mean, it's luxurious. The most recent
statistic we have is back from 2016, where people's attention on their computer for any screen,
the median length was about 40 seconds.
40 seconds before your attention breaks.
It takes an act of extreme self-awareness to even notice all these hairline fractures.
When our attention breaks so chronically and so pervasively, we're not even noticing the full toll that it takes on us.
We need a researcher like Gloria Mark with her stopwatch to measure the problem with precision.
Over the past 15 years, she's created a whole new set of methods
to trace our cursor movements and our eyes and even our heartbeats as we work.
On today's show, Gloria Mark, a professor of informatics at the University of California at Irvine,
will share her latest findings from the science of interruptions.
The symptoms are shocking, but they shouldn't be.
I mean, we're all experiencing it.
And for the sake of our sanity, we have to stop pardoning the interruptions.
I'm Tristan Harris.
And I'm Aza Raskin.
This is Your Undivided Attention.
We're surrounded by distractions, and yet they're invisible to us.
I want to know how did you start noticing this thing we can't notice.
So the trigger was a cultural change that I went through.
And whenever I do research, it's always inspired by some kind of personal experience.
And so I moved in the year 2000 from Germany, where I had been living for some time, back to the U.S.
When I was in Germany, there was a common practice that you would go out to lunch with your colleagues.
And then we would take a walk around this campus.
It was a beautiful campus.
It's called a round.
You just take a round.
And then you get back to work.
And I come to the U.S., and it was all I could do to buy a sandwich between my classes or between classes and a meeting, rush back to my office and sit in front of my computer and eat my lunch.
And as I walked down the hall, all my colleagues with their doors open were doing exactly the same thing.
because we just didn't have the time to do all these things.
So I really, really began to wonder to what extent do other people experience this.
And so that's how I became interested in distractions, interruptions, multitasking.
Are there a lot of people besides you working on, like just measuring and seeing what's happening to our attention?
There are some researchers.
I wish there were more.
I find this topic to be so profound and so important, and I just wish that there would be an entire field around this topic.
It's not just computers, though. It's just a wider spectrum across society.
I was just recently looking at statistics on television viewing, and there's something called the average shot length,
median shot length. And I found a site that had tracked these shot lengths over the years. And it's
actually quite short. It's amazing. And so actually, I'm looking at this graph now, and I see that
if we're looking at average shot length, it was about 13 seconds in the 1950s. And the last statistic, the last year they have, is 2010, and it's probably about three and a half seconds.
Wow.
And similar things in films, it depends on the director, but the shot lengths also seem to be
decreasing. So we're getting bombarded from all directions. And I'm not sure of the cause and
effect. You know, I'm not sure if people are being trained to have short attention spans from
TV and films and then applying it to computers or if their short attention spans have
developed from computer activity. But I will say that when people go on a computer, we have
access to more information and more people faster than we ever have had in history. And so that's,
that just feeds into the natural curiosity of humans. Wasn't it in one of your studies you mentioned, I know that there's probably a more recent stat now, but how many hours information workers spend in their email? People average about 74 times a day checking their
email, but each time they check their email, they spend about 32 seconds on average. So you can
imagine all these interruptions that are happening throughout the day, checking the email
and spending this amount of time and then having to reorient back to work.
I remember the highest person in your sample checked it 435 times a day.
Oh, my God.
Wow.
Yeah.
I'm really curious.
What happens, like, in that moment, you get distracted or you self-interrupt.
You go to your email.
What does the process look like to get back to what you were doing?
Well, so people have to cognitively reorient to where they were.
And this is an extra cognitive load, right?
Because you have to recall, for example, where you were, if you're working on a document, what was your train of thought
before you got interrupted? And I want to mention one thing that I think is very relevant.
We looked at types of interruptions, and we tend to think of interruptions as only coming from
external identifiable sources like email notifications or social media notifications, but about
half of all interruptions are due to oneself.
We call these self-interruptions.
Half of interruptions are coming from the inside.
From the inside.
So the way to explain it is that you're observing someone
and they're typing in a word document
and then for no explainable reason,
they suddenly stop and they check email.
Or they stop and they pick up their phone
or they stop and they just look up something on the web,
which may or may not be related to what they're doing.
I found this part of your research on self-interruptions profound
because you have sort of a theory about how these external interruptions
beget internal or self-interruptions.
Yes.
So we had looked at the data and we divided the data into hourly segments.
So we had collected a fair amount of people
because we're tracking them over multiple days.
and the frequency of what we call external interruptions, that's from, say, email notifications,
some external source, and we've looked at the frequency of what we call self-interruptions.
And we find that when the external interruption frequency wanes, it's very interesting,
the internal interruptions tend to kick in.
And it's as though people are just habituated to being interrupted, to having these short attention spans. And if the interruption is not coming from some
external source, then people self-interrupt. One of your studies, you cut off information workers
from email for five days. What happened? Yes. We did. We had people walking around the office
wearing heart rate monitors. So we measured them for a week. And we got their average stress experience.
And the next week, email was cut off. And when email was cut off, people became significantly
less stressed and focused significantly longer. The flip side of focus is looking at switching
between screens, and people switched about half as frequently as they did with email. And, you know,
one might think, oh, yeah, you take away email, of course you're going to switch less.
Not necessarily, right?
Because email is just one source of switching, right?
And the way I interpret this is that people have this habit of checking email, right?
And if that stimulus is removed, so the email is removed, people can slowly change their habits.
And so maybe they're not going to switch other screens as fast either.
One of the things that strikes me as you told this story is that,
you know, we are all dosing ourselves with email.
So it's like what's happening in that office is happening the entire world over,
not only just the information workers, but, you know, our political leaders and like our scientists,
the people doing the most important thinking.
Like, I know it's hard to do an extrapolation, but I would love to just know what is,
what do you take away from this study from the specific to the general?
Yeah, I see it as a trap.
It's like a spider's web that's very sticky.
We're all interdependent in this web of communication, right, which is email.
And if any individual were to pull out and say, I'm not going to check my email that often,
they're going to be penalized.
And I'm thinking especially in the workplace, they have to be on top of things, right?
And this is just generating more and more email traffic, and email has become a symbol of work.
And so that's another reason why when we think about email, we become stressed.
Hey, this is Tristan. I want to pause the interview here on the sticky, tangled web of our
incessant communications. Silicon Valley product designers can hack our way out of this mess.
And Aza and I discuss a few simple solutions, some of which, you know, already exist. They just
haven't scaled yet. You know, I downloaded an app called Quitter, which basically
realizes that when people leave their email open in the background, which most people do,
they just end up switching to it all the time. And one of the easiest things you can do
is just make sure you quit your email so it's not actually open so often. So what Quitter does
is it lets you set up, you know, here are the apps on my Mac, whether it's Slack or email
or messaging, that it just automatically hides them on a timer. So after five minutes,
it can either automatically hide them or automatically quit them. And that's great,
but no one's ever heard of this thing called Quitter. And imagine that that's built into the way
operating systems work. Or another example is a full screen mode. So, you know, not too many people,
I bet, use the full screen button on reading an article or writing something in Google Docs.
But I've found that it's super helpful if you go full screen on something. And Aza, I remember you
used to even set up two different computers side by side. Yep. And I still do that. I don't have Slack
or other communication on my big work computer, my desktop computer, and I only do
communication on my iPad so that I have clean separation of space and mind. And so I can't
spin out that way. And our attention isn't just going into the screen. It's also being occupied
when we're off the screen and we're worried about missing that email that I know is going to come in.
And if I miss it, what's going to happen? So, you know, imagine that the sort of mandate for every
company is attention minimization, that I'm not just minimizing people's attention on the screen.
I'm also thinking genuinely about how I'm occupying attentional footprints in their mind off the screen.
So a simple example is imagine if every messaging communication app lets you say,
this is something I want to say, but send it later.
Send it tomorrow.
Send it in five minutes.
Send it in an hour.
Because if I have that thought and I know I don't want to mess with Aza's attention right now,
I sometimes hold on to that thought because I realize, oh, that's not the right place
to send it.
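A minimal sketch of what a "send it later" queue like the one Tristan describes could look like, in Python. This is purely illustrative: it is not any existing app's API, and every class, function, and parameter name here is a hypothetical stand-in.

```python
# Hypothetical sketch of a deferred-send queue: capture the thought now,
# deliver it only when the delay has elapsed, so it doesn't become an
# interruption for the recipient right away.
import heapq
import time

class SendLaterQueue:
    def __init__(self):
        # Min-heap of (deliver_at_timestamp, recipient, body) tuples.
        self._heap = []

    def send_later(self, recipient, body, delay_seconds):
        """Queue a message to go out after delay_seconds, not immediately."""
        deliver_at = time.time() + delay_seconds
        heapq.heappush(self._heap, (deliver_at, recipient, body))

    def deliver_due(self, send_fn):
        """Hand every message whose time has come to send_fn(recipient, body)."""
        while self._heap and self._heap[0][0] <= time.time():
            _, recipient, body = heapq.heappop(self._heap)
            send_fn(recipient, body)

# Usage: hold a non-urgent thought until tomorrow instead of pinging someone now.
queue = SendLaterQueue()
queue.send_later("aza", "Saw this article, no rush at all.", delay_seconds=16 * 3600)
queue.deliver_due(send_fn=lambda to, text: print(f"to {to}: {text}"))  # nothing due yet
```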
Another good idea is rerouting.
So what dictates whether or not I'm going to message you, Aza, with Facebook Messenger or with WhatsApp or with iMessage?
Am I going to think consciously of like, oh, this isn't that urgent?
Let me just email you, Aza, instead of sending you a text.
Well, in the ideal world, that's what I would do.
But in a messy, busy, distracted world where I'm not going to think hard about it,
if I'm already in Facebook and I see something and I want to send it to you,
I'm probably going to use Facebook Messenger because that's the closest thing to reach for.
So imagine if all messaging apps had this sort of rerouting mechanism where, you know,
when you start an email, it's very easy to reroute to WhatsApp or it's very easy to reroute to iMessage. And when you start an iMessage, it's really easy to reroute to email.
Because right now that's hard, and especially when we don't distinguish between things that genuinely are urgent that are worth interrupting for versus things that are not, that's a huge issue.
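And a sketch of the rerouting idea, in the same illustrative spirit: the channel names and the idea of a recipient "focus mode" flag are assumptions for the example, not any real messaging API.

```python
# Hypothetical channel router: urgent messages may interrupt; everything
# else is nudged toward the least interruptive channel, especially when
# the recipient has signaled that they are trying to focus.
from enum import Enum

class Urgency(Enum):
    LOW = 1      # can wait until the recipient next checks in
    NORMAL = 2   # should arrive today
    HIGH = 3     # genuinely worth interrupting someone for

def route(urgency: Urgency, recipient_in_focus_mode: bool) -> str:
    if urgency is Urgency.HIGH:
        return "imessage"    # interruptive channel, reserved for real urgency
    if urgency is Urgency.NORMAL and not recipient_in_focus_mode:
        return "whatsapp"
    return "email"           # default: don't break the recipient's focus

# Usage: a link shared while the recipient is heads-down gets routed to email.
print(route(Urgency.LOW, recipient_in_focus_mode=True))  # -> email
```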
If we can get just a few designers in the room with the simple mandate to minimize user engagement and to maximize the attention on the things that we actually care about, then the ideas will grow exponentially.
You can help us think bigger by joining the Center for Humane Technology's next video conference. Visit humanetech.com slash podcast to sign up and join the conversation.
I have some stats here in front of me from RescueTime. RescueTime is a product that people install on their desktop. It tracks basically your usage across different apps and gives you charts and graphs and does that across devices. And so they track that 40% of productive time at work is spent multitasking. Only about an hour and 12 minutes of uninterrupted productive time per day. 70% of all emails received were opened within
six seconds of their receipt. When you check an email, it takes an average of 64 seconds to resume
an original task. Another study found that when an email involves doing something outside
your inbox, it takes over nine minutes to return to your original task. And a lot of these
studies were done before the age of Slack and before the age of... Yes, I'm not surprised at all
by those statistics. That's what I would expect. Also, these stats about the amount of time that it
takes to get back to a task after you do email, I'm not surprised at that either because we found
that when people were interrupted, it took them about 25 and a half minutes to get back to the
original task. But the reason is we're looking at a different kind of granularity. We're looking at
the level of project. So we had clustered smaller activities into a cohesive, cognitive theme that we
call a project. And so you get interrupted. You work on another project, and then it turns out
you work on another project. And then you go back to your original task. I think what's happening
with when you talk about these gaps, I think you said nine minutes to reorient back. People are doing
intervening things. And so these intervening things are making it even harder for them to
reorient back to that original task before they got interrupted. Their focus keeps shifting.
That 25-and-a-half-minute stat is also, that's sort of mind-blowing. Just putting it into context, I just typed it out. I think that's 2.5% of your waking day is spent trying to return back to the task you got interrupted from. Yeah. Well, but I mean, people are doing things during that 25 and a half
minutes, and they're doing things that are distracting them from that original task. When I was thinking
about these things years ago when I first met you, one of the things that concerned me wasn't just
the present state of affairs in 2013. I felt like you did, Gloria, that, you know, when you came back to
the U.S. from Germany and that slow clock-rate lifestyle and then moving to the hyper-fast, you know, squeeze-your-lunch-in clock-rate lifestyle, that I was getting a sneak preview of the future.
So when I landed at Google in 2012, it was the first time I'd worked at a company that big
that had, you know, just boatloads of email coming in every single day, just an insane barrage
of information. And then I was also in the bleeding edge of using a lot of the early social media
apps because a lot of my friends were making them. And I saw as the sort of tech environment
in San Francisco, it was just peak. It was the worst. It was almost.
like hitting peak oil and saying, oh my God, what are we going to do? And then people started
fracking for attention, realizing, hey, we could actually double the size of the attention
economy by getting you to pay attention to two things at once. Then we can get you to pay attention to four things at once. And we can sell all that attention as if it's the same price to more
advertisers and quadruple the size of the attention economy. And I'm just curious, as you've been
working on this for more than a decade now, how do you see the trends of where this is going?
So I am hopeful that we will find solutions because we just can't continue at this rate. I just don't believe so.
You know, stress is just increasing. And it's not just acute stress, but it's cumulative chronic stress.
And so we have to think how we're going to break these habits. And it's going to have to be a partnership with technology companies as well.
It seems like there's this missing piece that if we're so damn sophisticated and we're so damn good at building great advanced tech, why is it that we're overloaded?
We can hardly think. The quality of our public discourse is going down.
It says to me that we're missing something, which is that we have these paleolithic primitive brains.
We feel intrinsically as an evolutionary instinct, we have to get back to those people.
Or, you know, back to the email example, your boss sends an email to, you know, you and your peers.
who work underneath that boss, and you see one of your coworkers get back to that boss in, like, less than two minutes.
What does that do to everybody else who's sitting there?
Yeah, this is social signaling and social obligation, time slot machine mechanics.
Right, so you're just, you're adding all of that into one cocktail, and then now you jack that into the back of a human being 80 times a day, or the 74 times a day you mentioned people check their email, and you say, oh, my God, this is like a psychological outbreak.
We've epidemiologically jacked in all these people without even knowing what we've done to ourselves, and we didn't make it safe first before we started spreading this all over the world.
You know, I'm just curious, you know, hearing all that and given how long you've worked on
this, how much have you seen companies improve the situation? And if, you know, and to what extent
if they haven't done that, why aren't more things happening to protect our finite limits of our
attention in our minds? So I totally agree with you, Tristan, the technology is not fitting our
practices. I mean, we are still in the wild west of technology development, and there's not
enough attention given to what human practices and human cognitive resources are. And that's because
technology has just been developed, you know, like mad, without really thinking about how it
fits to human beings without really doing what's called user-centered design.
We have a lot of designers in technology companies who listen to this podcast, and I don't
think anyone inside of Google or Facebook would say that their goal is not to do user-centered
design, and yet there's this thing where we still haven't gotten it right. And we can segment
between two different components of this, right? So there's clearly the incentivized attention
economy, you know, scooping attention, you know, scoop after scoop, out of our brains, because that's the business model the stock price is hooked up to. You can't tell YouTube to stop doing it. That's one thing. But then there's this other part, like email and text messaging. These are not, you know, Gmail does not just want to rake in the cash and pump up the stock price by just making people stressed and burnt out on email every day. They don't want to do that. Um, you know, iMessage doesn't want to do this with text messages. WhatsApp doesn't want to do this with WhatsApp. So for these neutral communication products, why hasn't there been, like, just this design renaissance? What do you think, you know, those designers of those products, the neutral ones, the ones that are communication products especially, need to hear or understand? Yeah, so I believe that
they're just trying to think about very short-term kinds of goals. How can we design a technology
so that it's user-friendly? But the deeper, farther-reaching goals are things like the user
should have agency and control over their actions. The user should have a choice whether
to participate. There needs to be a very deep understanding into the psyche and the behavioral
practices of people, not just in a laboratory setting where a lot of user-centered design gets
done, but look at people in the course of their daily lives, how they're using technology.
And another thing is that, and gosh, we've known this in psychology for many, many years, that there are individual differences and one size does not fit all.
And so we've tested interventions, and they tend to have this premise that it's one size fits all.
And it turns out that there are some people who actually are pretty good at self-regulating.
And if those people, you know, maybe they just want a little bit of self-regulation, or they want to be able to regulate distractions in certain contexts, but not in all contexts.
But using a lot of these interventions actually harms people who have good self-regulation ability.
It actually increases their cognitive load because they're already pretty good at knowing when to take breaks and when to get back on track.
But the flip side is there are a lot of people that are just very susceptible to distractions that don't have this ability to self-regulate.
And they do need support. User-centered design needs to approach this problem in a much deeper way.
What is that goal that we're after? And how can technology be designed to help us reach these goals?
So if you were to rewrite Facebook's or Twitter's mission statements, like what would your version look like?
Well, my version would be how can we use this technology to help people do things like develop deeper relationships, better social connectedness, to be able to get better support from other individuals?
how can we help people achieve a work-life balance?
How can we help people achieve goals that are really important to them in life?
How can we enable people to do lifelong learning?
Because that's what we should be able to do with technology.
So one of the things we'll always hear is,
and I think you have the perfect purchase to talk about this,
is, okay, yeah, those are nice,
but how do you metricize those?
How do we know when we're succeeding?
You would be the best person in the world, actually,
to think about how would we know
we're succeeding at doing those goals?
Well, there are a number of measures that we can use.
I don't think there's any single measure.
And I do believe that we have to look at
how people are using technology in their everyday lives.
So we do what we call living laboratories,
where instead of bringing people into a closed room laboratory, we go to where people are.
And then you can use different methods to try to understand to what extent people are achieving their goals.
First of all, we want to try to understand what are the goals that each individual wants. And then you take a really deep dive and try to understand, is this technology inhibiting people from getting there, right? And if it is, why? What is it about that technology?
It's Tristan again. Gloria raises a lot of intriguing questions here, which are actually some
pretty great design prompts. These questions would force a designer into an unfamiliar
conversation with the user. I mean, not just looking at their behavior, but asking them about
their values and their goals. How do you do that with technology? Take any in-person
community gathering, it takes work. And you can think of all that work as kind of a pipeline.
Like, where is it going to happen? Who's hosting it with me? Who's going to invite people?
That pipeline right now is really kind of hard. Each step has high drop-off rates. And because it's
so hard, people don't do it. But imagine a world where our devices strengthen that pipeline.
And we go through each step as the Apple designers, as the iMessage designers, as the WhatsApp designers and say, why do people find this hard and how can we make this a lot easier?
Like, imagine if Facebook events wasn't this big formal thing for, you know, groups of 100 people
or 50 people or 30 people, but was more of a lightweight thing where, you know, it was easy to
kind of negotiate these things in small group threads, and there were templates. So, like, who's going to be the co-host and where are we going to meet? And there were, like, clickable buttons that people could kind of vote up and down or say what they want to do and provide reminders in that group chat, so it's kind of merging group chat and calendaring, and all the
design choices were meant to instrument and strengthen that pipeline because we know that it's a
harder thing to do than to simply go by yourself into a wormhole. And I think that's a whole
design area, a whole design project, that anyone who's working on Facebook events or working
on messaging, communication applications could easily dedicate resources to and get a huge
payoff.
If you want to reimagine the choices people make as they interact with your company's
products, check out the Center for Humane Technology's design guide.
It's a one-page list of questions that helps your team quickly get attuned to the vulnerabilities
that we all share.
You can download that worksheet at humanetech.com slash design guide.
One of the challenges that comes up in the conversation about persuasive technology
is when you ask people what their goals are for their life or something like that,
a lot of people don't actually know their own goals.
And moreover, if you take Instagram's goals, you know, you can actually, as a persuasive
technologist, you can colonize the goals of the human social primates sitting in front of
you.
So I can manufacture a goal of getting you to be addicted to getting attention from other
people.
So now if you actually ask people, well, what is your goal? What do you want? They say, no, I really want more followers on Instagram. That's my
goal. Can you help me with that goal? And so one of the challenges is, what does it mean to
ethically persuade someone when they don't have their own goals? And then you have asymmetric
power to get your goals in there. And I think this actually speaks to a second part of the
research, Gloria, that you've done that I've found really profound, is your research on the
predictability of people's personalities. I think it's 100 Facebook likes and you can know
someone's big five personality traits with different degrees of accuracy.
I think it's just 150 to know better than a, I think, as a spouse, and 300 to know better
than the person can predict themselves.
Right.
But, you know, that speaks to, okay, well, at least that's gated by whether or not the data
is safe.
Like, well, maybe we can lock up that data on Facebook likes.
Maybe we can protect people's sovereignty so that they, no one can do that full checkmate
hijacking on a human social being.
But your research shows something different.
What is that?
So I think you're talking about a study we did where we logged people's computer and phone activity, and we applied machine learning, and we looked at only temporal features of the phone and computer use.
And by temporal features, I mean things like how often did they check their smartphone, how often did they check social media, did they check it all at one time? Was it dispersed throughout the day? How routine was the pattern? When did they start and stop checking it over the day? And so we got a whole lot of different features just about these temporal factors and found on that alone, we could predict people's personality traits from the Big Five, the OCEAN traits, with a fair degree of accuracy.
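For readers curious what a pipeline of this general shape looks like in code, here is a minimal, purely illustrative sketch in Python with scikit-learn. The feature columns and numbers are invented for the example; this is not the study's actual data or model, just the idea of "temporal features in, a Big Five trait out."

```python
# Illustrative only: predict one Big Five (OCEAN) trait from temporal
# features of device use. The data below is invented for the example.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# One row per participant. Hypothetical columns: phone checks per hour,
# mean minutes between email checks, and a 0-1 "routineness" score for
# how regular the daily checking pattern is.
X = np.array([
    [12.0,  9.5, 0.71],
    [ 3.5, 41.0, 0.88],
    [22.0,  4.2, 0.35],
    [ 8.0, 15.0, 0.64],
    [17.5,  6.8, 0.52],
    [ 5.0, 30.0, 0.80],
])

# Self-reported neuroticism scores (one OCEAN trait) on a 1-5 scale.
y = np.array([3.8, 2.1, 4.4, 2.9, 3.6, 2.4])

model = RandomForestRegressor(n_estimators=200, random_state=0)

# Cross-validated fit; a real study would need far more participants,
# richer features, and careful held-out evaluation.
scores = cross_val_score(model, X, y, cv=3, scoring="r2")
print("mean cross-validated R^2:", round(scores.mean(), 2))
```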
Wow. I think I remember seeing another paper from last year that used eye movement to predict personality traits as well. And I guess one of my takeaways was like human beings just throw off entropy. Like we're constantly tossing out signatures of underlying traits.
Yeah. And people are getting better and better and more creative at finding ways to discover things about people using technology, using new inventions. So, you know, machine learning is getting better and
logging tools are getting better and people are just becoming more inventive about the kinds
of data that they can use. When people go on a number of different social media platforms,
what they do is just there for anyone to mine. But we really need to have better methods of
preserving privacy or sanitizing data so that we can break this kind of asymmetry of power.
Yeah, I mean, I think especially when you realize as people worry about their privacy and the data that
they hand over to Facebook. I mean, can you tell Facebook not to collect the sequence of clicks that I'm
making through the app? I mean, what does it mean to not collect that? I mean, Gloria, when we met,
I think we talked about this. I was at a lab at Stanford, the Persuasive Technology Lab, and the last
class was on the future of persuasive technology. And we talked about this future world where you'd
have a persuadability profile for every person. Like imagine you had, I think we actually even use
the Big Five personality traits as one of the features, just so you know, okay, this person is
particularly convinced by appeals to authority. So if I say Harvard University or Stanford University
says that, the Brookings Institute says that, you know, this thing is true, you and your mind would be very easily persuaded by that. But maybe you're the kind of person instead where maybe appeals to strong authority leaders work, or your friend Susan, you trust your friend Susan for everything on these specific matters. Increasingly, you know, are we going to get worse and worse at predicting things about people? No, we're going to get better and better and better with increased computational power. So the interesting thing that I see about your work, on two sides, to link back
to distraction and the predictability research, which is that there's two ways to predict human beings
and to make ourselves predictable.
One is you simplify human beings. You take them from their full, rich, dimensional complexity, and all the things that they might creatively do, and all the thoughts that they might be able to have, and, you know, the rich expression of their full selves, and you say, no, no, no, let's just turn them into lab rats and Pavlovianly condition them to check, like little, you know, predictable rats. Let's simplify your thinking, let's simplify your clicks, let's actually turn your tweets from big, long paragraphs or long books into, here's a hundred characters. Let's just make you super predictable.
But then that runs out, because you can only make people so simple.
So let's instead build bigger and bigger supercomputers to predict the other side of it,
which is no matter how rich and complex and unique you are.
Let's actually just use bigger and bigger supercomputers to predict more and more things about you non-invasively,
so we don't even have to attach a sensor, which is what your second paper is about.
It's about the ability to predict people's personalities.
And once we're predictable, and that predictability is hooked up to asymmetric systems
with power that grows exponentially from the power, from the profits that they make,
get reinvested into bigger supercomputers that are even better at predicting step T plus one,
suddenly that's checkmate. So I'm just curious what do you think of that? I mean, that's a
big and dangerous trend that sounds like sci-fi, but we actually find ourselves in that reality
right now. Yeah, I'm just wondering as you're speaking, so you've got this one side, which is a very
reductionist perspective, and the other side is a much more complex perspective treating humans as
complex beings. And I'm just wondering if at some point people are just going to rebel and go offline,
because what other option would there be? I mean, if you're going online and information is being
gathered about you to be able to be used for purposes that may not meet your goals, what other
options are there for people? Well, this is why I think, you know, this
is our project, our project meaning the collective we, like not just Center for Humane Technology,
but we call it Humane Technology because the point is, as you said, we, and the answer isn't
to just unplug from everything and to go into a forest and not use any email. As you said,
you know, you can't do that. There is only one way to solve that, which is to make technology
in service of humans being their more full, expressed, creative, unpredictable selves to be
sovereign. So first of all, just curious, since you've studied this for such a long time,
on the distraction interruption side.
You must be the world expert on how you have curated your attentional life. Or have you?
I mean, what is it like for you and what do you do?
So I am asked this question all the time,
and I am not much better than the people I study.
I have to admit.
I can pull back and objectively look at the problem,
but it's a challenge.
I am as much sucked into social media and the web and email as everybody else is. I do think, though, that, you know,
I come from a psychology background. And, you know, in psychology, we can think about
automatic and controlled processing. And a lot of what we do in technology is automatic processing.
So checking your smartphone, checking email, checking social media, just kind of surfing mindlessly on the web, these are all automatic kinds of thinking.
Now, technology should not be reinforcing us to have these kinds of automatic actions and automatic thinking.
Technology should be helping us to think more deliberately about what we're doing with it.
So the recommendation that we need to be more deliberate, I agree with, and then it goes back to this point that we should feel a dramatic need to protect the very, very, very finite amount of that deliberate, conscious choice-making, creative energy that we have.
So if I'm giving you the ability to connect with your friends, you know, then – sorry, I got a text message ironically in the middle of that, so I'm going to say it again.
Anyway, now I forgot what I was saying.
And that's interesting, right, just to see the way that I, like you, Gloria, have studied this topic for about, you know, close to a decade now.
My brain just, I just lose the signal, you know, the second, the thing I was saying just 10 seconds before.
And that's actually what is so alarming is I see us all gasping for air while the air is slowly burning out.
And it's like, we're the few people trying to say, we have to like just fill the chamber with a ton more oxygen and totally change the incentives so that everyone's not trying to steal oxygen out of the system.
I actually think that we need to start with young people because, you know, you've got, I sometimes will be on, like, public transport and I see a three-year-old kid playing with their parent's smartphone.
And I'm thinking, oh, man, that's just really bad. It's bad training because these kids are developing habits so young.
But, you know, can we counteract that?
Can we do something so that kids don't develop the habits so that kids can learn resistance
and they can learn that they can get gratification from so many other things?
You know, studies show that taking a 20-minute walk outside can make people significantly
in a better mood with more positive affect.
They don't have to be on their phones for the whole time.
Well, and to your point, Gloria, about, you know, the power of a walk, I mean, the premise of this larger project that we, with the Center for Humane Technology, are trying to call designers into, all of us, you know, making this change together, is by better understanding these invisible dynamics in human nature. Like you just mentioned a fact, which is that a 20-minute walk, as simple as that, can make a real difference for someone's well-being. And back in a talk I made, I think it was like four or five
years ago now, a simple change to the Google Calendar invite system, you know, if you're inside of a corporation, you use Google Calendar, and it says, which room do you want to book? And no one ever thinks beyond the menu. The menu is, well, which room do you want to book? Sounds like a pretty good menu. There's lots of rooms. We'll automatically calculate which ones are available, not already
booked, et cetera. And the design demo... Meet outside. You could meet outside.
Exactly. Well, that's what the demo did. So I had this design, you know, showing.
that, hey, imagine it let you, based on the duration of the meeting that you're booking,
say, oh, there's a 30-minute walking loop, there's a one-hour walking loop, and it let you set
those up for your company. And then, boom, just like that, one little change, and, like, 10, 15% more people are just on walking meetings more of the time, because we don't actually even want
to meet in rooms. And so invisibly, because our cognition's overloaded, we're just going
with the default choices, picking from menus that have been pre-chosen by some basic, you know,
algorithm that's picking from a set of data objects, which happen to be rooms and managed
perfectly through this optimization engine, but it has nothing to do with the well-being ecology
of a social fabric or what makes a life well-lived. And I think that's just the thing I want to end on
is, is how do we think about this at multiple different levels at the same time? These are the
kinds of things that we as designers need to train up in, and it really ultimately is an education
and understanding of the full-stack dynamics of human nature, which is not to say that there's this
complete encyclopedia and we all know it, it's that there's this thing we have to explore together. So
I'm thankful to you, Gloria, because that's what we were so excited about, to examine one really important part of that stack, and just really grateful for your time and being here.
Oh, it was my pleasure, and thank you for giving me the opportunity to talk about this.
Can you change something if you're not paying attention to it? I mean, is it possible to change something without actually pointing your attention at the thing that you want to change? No. And some of the most important things we've got to
change, like climate change or, you know, inequality. They're going to require a devoted use
of our attention that's sustained, that's shared. A lot of other people have to pay attention
to the same things as we do at the same time. And, you know, attention isn't just this precious
resource for ourselves. It's also precious to do it together. The most precious thing about
a weekend is that everyone has nothing else to do at the same time. And there's something that's
extra special about attention being allocated together. This is not some kind of inconsequential thing
about being distracted at work or feeling stressed. It's that this is the most precious finite
resource that we have. And if we're going to get control over it, we have to have so much more
awareness about the mechanics of what drives it to certain places. So, you know, we can develop technology that liberates our attention and helps us direct it to things we care about, or we can let it be the status quo.
But then we'd be forfeiting our attention to the next incoming notification and the next one and the next one, ad nauseam.
So, you know, it's up to us. Where do we want it to go?
Your undivided attention is produced by the Center for Humane Technology.
Our executive producer is Dan Kedmi
Our associate producer is Natalie Jones
Original music and sound designed by Ryan and Hayes Holiday
Special thanks to Abby Hall, Brooke Clinton,
Randy Fernando, Colleen Hacchus,
David J, and the whole Center for Humane Technology team
for making this podcast possible.
And a very special thanks goes to our generous lead supporters
at the Center for Humane Technology
who make all of our work possible,
including the Gerald Schwartz and Heather Reisman Foundation,
the Omidyar Network,
the Patrick J. McGovern Foundation, Craig Newmark Philanthropies, Knight Foundation, Evolve Foundation, and Ford Foundation, among many others.