Your Undivided Attention - Stepping Into the Metaverse — with Dr. Courtney Cogburn and Prof. Jeremy Bailenson
Episode Date: October 6, 2022
The next frontier of the internet is the metaverse. That's why Mark Zuckerberg changed the name of his company from Facebook to Meta, and just sold $10 billion in corporate bonds to raise money for metaverse-related projects. How might we learn from our experience with social media, and anticipate the harms of the metaverse before they arise? What would it look like to design a humane metaverse, one that respects our attention, improves our well-being, and strengthens our democracy?
This week on Your Undivided Attention, we talk with two pioneers who are thinking critically about the development of the metaverse. Professor Jeremy Bailenson is the founding director of Stanford's Virtual Human Interaction Lab, where he studies how virtual experiences lead to changes in perceptions of self and others. Dr. Courtney Cogburn is an associate professor at Columbia's School of Social Work, where she examines associations between racism and stress-related disease. Jeremy and Courtney collaborated on 1000 Cut Journey, a virtual reality experience about systemic racism.
CORRECTIONS: In the episode, Courtney says that the average US adult consumes 9 hours of media per day; the actual number in 2022 is closer to 13 hours. Aza also mentions the "pockets of 4.6 billion people," implying that there are 4.6 billion smartphone users. The global number of social media users is 4.7 billion, and the number of smartphone users is actually 6.6 billion.
RECOMMENDED MEDIA:
Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do
https://www.amazon.com/Experience-Demand-Virtual-Reality-Works/dp/0393253694
Jeremy Bailenson's 2018 book exploring how virtual reality can be harnessed to improve our everyday lives
Experiencing Racism in VR
https://www.ted.com/talks/courtney_cogburn_experiencing_racism_in_vr_courtney_d_cogburn_phd_tedxrva
Courtney Cogburn's 2017 TEDx talk about using virtual reality to help people experience the complexities of racism
Do Artifacts Have Politics?
https://faculty.cc.gatech.edu/~beki/cs4001/Winner.pdf
Technology philosopher Langdon Winner's seminal 1980 article, in which he writes, "by far the greatest latitude of choice exists the very first time a particular instrument, system, or technique is introduced."
RECOMMENDED YUA EPISODES:
Do You Want To Become A Vampire? with L.A. Paul
https://www.humanetech.com/podcast/39-do-you-want-to-become-a-vampire
Pardon the Interruptions with Gloria Mark
https://www.humanetech.com/podcast/7-pardon-the-interruptions
Bonus - What Is Humane Technology?
https://www.humanetech.com/podcast/bonus-what-is-humane-technology
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Transcript
Today on Your Undivided Attention, we're stepping into the Metaverse.
At the moment, the Metaverse is a series of disconnected virtual reality spaces
where people hang out with friends, play games, try extreme sports, or pretend to be unicorns.
Mark Zuckerberg believes that in the future, a billion people will spend their money and time in the Metaverse every day.
And he's bet the future of his company on it.
To prepare for today's conversation, our executive producer,
Stephanie Lepp, and I spent time inside a virtual reality experience designed by our two guests,
Dr. Courtney Cogburn and Professor Jeremy Bailenson.
You want to make sure that you're looking through the center of these lenses here.
So if it's blurry, just adjust it vertically on your face, and that should help clear things up.
Together, Courtney and Jeremy created 1000 Cut Journey,
an experience that puts you inside the body of a young black man, Michael Sterling,
as he experiences racism over the course of his life.
Okay, putting on the hand controllers.
We're in a moment of opportunity.
As we'll hear Courtney say in our conversation,
the metaverse is still somewhat of a blank slate.
No one owns it and no one governs it.
How might we learn from our experience with social media
and anticipate the harms of the metaverse before they arise?
What would it look like to design a humane metaverse
that respects our attention,
that improves our well-being and relationships,
and strengthens our democracy?
I'm Tristan Harris. And I'm Aza Raskin.
And this is Your Undivided Attention, the podcast from the Center for Humane Technology.
Welcome to Your Undivided Attention. I am so excited for today's conversation. We have not
covered topics like the metaverse or virtual reality before, and we cannot be talking about
the future of technology that is humane to the society riding on top of it
without talking about where technology is going. And one of the major
places that technology is going is virtual reality. And today we have with us Dr. Courtney
Cogburn, who's an associate professor at Columbia University in the School of Social Work.
And we have Professor Jeremy Bailenson, the founding director of Stanford University's
Virtual Human Interaction Lab. Welcome, Jeremy and Courtney.
Thanks for having us. Yeah. Hey, hello.
Super excited for this conversation today. And, you know, many listeners might be thinking
to themselves, are we really going to live in a future where we have TVs
on our faces for hours a day, as Elon Musk kind of said? But just to cite why this conversation
matters for people. Bloomberg Intelligence has estimated the market opportunity associated with the
Metaverse is nearly $800 billion in annual revenue by 2024. McKinsey says the Metaverse could
generate up to $5 trillion in economic impact by 2030. And Meta just sold its first ever corporate
bonds to raise money for its metaverse project. That's Meta, you know, Facebook. And Aza and I actually
have friends who know Zuckerberg and heard that during the pandemic, he spent a lot of his time
inviting his friends to join him in the Metaverse. And he was sitting up there probably in
Montana or wherever he was. And he was playing in the metaverse hours and hours a day. And the
reason for that is that he has banked the future of the company on virtual reality succeeding.
Especially if you consider that, you know, the stock price of Facebook has tanked and been,
you know, cut in half since October 2021. So I thought I would open, Jeremy, by asking you,
you actually had Zuckerberg visit your lab in 2014.
Let's set the stage for listeners about kind of how did we get to this moment where VR is not just this momentary fad,
but how do we compare Zuckerberg's focus on the metaverse and VR now to when you met him in 2014?
Yeah, so thanks for that lead-in, Tristan.
So I run a virtual reality lab.
It's a physical place, and we do lots of research there, but we also do outreach.
So in any given year, we'll have a couple thousand people that come to the lab.
They could be government officials, they could be third grade kids on a field trip, they could be CEOs of companies, they could just be somebody who's wandering through the building.
And when we hosted Mark in 2014, we brought him into the lab and we showed him all the demos that we tend to show on these tours.
We had him do climate change simulations to learn about the future.
We had him walk the plank, which is a demo that shows you something called presence, meaning the plank feels real, like there's a real pit below you, and you actually get scared and aroused.
We had him go into a virtual mirror
and to change his identity
to learn about things that we'll talk about today.
So he did all the tours
and got a real sense of what immersive VR is.
And then we talked about technology applications
and how a company like Facebook
might use this technology in their business model.
He came prepared with lots of questions.
He was a good scholar about VR.
I'll say he is what we call high presence,
meaning some people just get VR and it really affects them.
Some people actually don't get the medium.
It's not for everyone.
And for example, when we do the plank demo, we hit a button and the floor drops.
You feel vibrations on your feet and hear the spatialized sound.
It's a pretty intense moment.
And that moment, he put his hand to his heart and his knees buckled a little bit and really got that.
The surprise to me was we hosted him not knowing anything about their interest in actually purchasing Oculus,
which was a small company that made this VR hardware.
Facebook bought Oculus for about $3 billion a week later.
And I actually had no idea he was there on a due diligence fact-finding mission for Oculus.
We were just there like we do with anyone that comes to the lab,
teaching them about the medium, answering questions.
Yeah, I remember when I first encountered your work, Jeremy,
it was at a Children and Screens conference in Irvine, California, in 2015 or 16,
I heard you speak.
And you talked about this first experience, just this moment of walking the plank
and how you hit a button and suddenly the ground disappears beneath your feet
and you're just on this one single plank.
And even though you know you're sitting in a room, in a physical room, you said in that talk, your brain doesn't know the difference between that and the feeling that you're experiencing through your eyeballs.
Your eyeballs are telling you that you are on this high-altitude plank and you could fall over.
So if someone pushes you, you will kind of flounder as if someone just pushed you off of a plank.
Yeah, look, the first plank I did was in 1999.
The resolution was ridiculously low, the frame rate was less than 30 times a second, and the latency was about a quarter of a second.
It was the most ridiculously low-fi VR experience you could ever imagine,
and I was terrified on that plank.
So, you know, we like to say that the brain treats these experiences
in a similar way to the real world.
As I get older, my language has evolved and how I describe it.
I tend to be a little more careful than perhaps some of my early talks.
But Tristan, that's exactly it, which is that when VR has done well,
you don't know it's a medium.
The medium disappears.
Right.
And that's what we want to get into today, especially as we turn to Courtney's work.
But I first wanted to just refer everyone back to the work of Marshall McLuhan,
who said that the medium is the message.
A lot of people get confused by this statement.
And McLuhan further refined his statement when he said that our conventional response to all media,
namely that it is how they are used that counts,
is the numb stance of the technological idiot.
For the content of the medium is like a juicy piece of meat carried by the burglar
to distract the watchdog of the mind.
And I think that it just really speaks to what we really want to get into with this
podcast today, which is that this is a new medium.
And when we talk about humane technology, what we're really talking about is humane or
conscious medium design.
How do we consciously make mediums that symbiotically ride on top of humanity and
bring out hopefully the light triad rather than the dark triad of human psychological
characteristics and especially empathy?
And so, Courtney, I wanted to turn to you.
I would love for you to explain what 1000 Cut Journey is.
I did this experience in your lab in New York City.
But just to introduce yourself and your background for listeners, and how you decided to build this 1000 Cut Journey experience to build empathy, which is one of the key things that this medium really can deliver.
Sure. I think media in general, mass media, is such an important component of our lives, right? The average adult consumes about nine hours of media a day. We have some sense of how that affects people psychologically, their behaviors, their decision making. We know less about the effects on their health. And we're definitely in new
territory with VR in terms of how are we really affecting people when they go through these
deeply immersive experiences. So for me, I'm a psychologist by training, but really would
describe myself as transdisciplinary, working across multiple disciplines and methods. I am interested
in questions of racial equity in society. I study how racism affects health. And I found myself
at this point where the science and the data didn't seem to be convincing people
well enough that racism is a reality and we need to do something about it. And I was spending
a lot of time convincing people to care. I think we're in a different social political moment now,
post-George Floyd, but certainly over the course of my career, that was not the case.
So I wanted to use VR to build on this adage of being able to walk in someone else's shoes
and really think about ways for people to experience firsthand what this feels like in your body.
Do you understand what I'm talking about or what someone is talking about when they say they've been treated badly on the basis of race?
And I think more importantly for me, how do you understand systems of inequity?
How is it built into housing?
How is it built into policing?
How are racial inequities built into the health care system and our financial systems, et cetera?
And as I learned from Jeremy, it's really difficult to do that in VR.
So we opted to put you in the shoes of an individual,
to take this first-person perspective-taking approach,
but allow that user to be in that life
at multiple points in time in multiple contexts.
So we're able to capture structure and patterns
through an individual perspective,
but still convey that it's not just the one time
someone calls you a name.
It starts very early in life.
It goes across the life course.
It's not just the classroom.
It's not just policing.
It's all of these ways in which it can show up.
So I did your experience, 1000 Cut Journey.
The key moment for me was just in the introduction,
where you're in this kind of dark room with a wooden floor,
and you turn over to the side and there's a mirror.
And when you walk up to the mirror, you see that you are a black man.
Look at your reflection in the mirror.
Take one step closer and examine your new body.
Wave to yourself.
You've now become Michael Sterling.
Your friends call you Mike.
And I'm still surprised by how visceral and impactful that was.
And I know that you call that the body transfer or like when we talk about empathy,
we can often just conjure ideas of, oh, empathy, empathy, empathy.
But the body transfer of actually seeing your own body in a different way is a profound experience.
Then you are, you know, bounced from that experience into, I don't know,
you're basically in kindergarten or first grade.
Is that right, Courtney?
Yeah, first grade, yeah.
Yeah, and you're playing with blocks and the teacher starts basically yelling at you
and the kids are making racial statements about you.
Well, I'm making a robot, and it can spit fireballs,
and you have to admit, that's pretty cool.
You should make a fireball, Mike.
Mike is a black fireball.
Ah, I'm from the scary black fireball.
Yours will be the scariest because yours is black,
and black is always the scariest.
Mike, Mike, look at me.
The teacher's telling me to look at her.
And she's saying I threw something.
You're being dangerous and you're going to hurt someone.
You want to talk a little bit about how you designed this experience together
and some of the things people may not know about what it's like to create something like this
and what your motives were?
Yeah, I mean, that was the beauty of this collaboration with Jeremy and his team, right?
You have me and my team at the School of Social Work, deeply embedded in the work of racism
and racial inequities, et cetera.
And you have Jeremy and his team engaged in social behavioral applications
of VR for 20 years, coming together, right, and bringing our expertise. And it was truly
transdisciplinary in nature. And I'm going to stop to describe this because I think it's important
to the bigger point you're asking about how do we not end up in a place where we're replicating
inequity and how do we build and imagine new ways of being together that might relate to this
bigger question about the metaverse. It's a process question. Who's at the table and how are they
working together, who has power? So Jeremy's work in VR is not more important than the social
work knowledge at the table. Both of those things have to coexist and work together. So we're learning
a lot about body transfer and presence and what works in VR based on previous work, being able to
see yourself in a mirror, using the body, not just standing around and looking, but that you have to use
the body in particular ways. And so then we started to imagine what are some meaningful ways to use
your hands and body in this experience that align with the narrative, but that also match up with
the research in VR around presence and body transfer. It wasn't that we came up with an idea
and handed it over to a VR team that went to design and create a piece for us, right?
At every step along the way, we're going back and forth and iterating to try and decide
what's meaningful. So the use of the body, looking at brown hands, seeing yourself in a mirror
came from the research that Jeremy has done and understood,
and then grabbing and playing with blocks
and the ways in which kids are looking at you
and interacting with you, right, comes from our team
or why would you need to get down on your knees
and put your hands up, right?
Which is one part of the experience,
which was quite tricky technically to achieve.
And I think, Jeremy, if I'm remembering correctly,
there was pushback from the tech side saying that changing the height level, the eye level, in VR is
really hard, let's not do that. And we pushed back and said, if there's a police scene,
you need to get down on your knees and put your hands up. And so we figured out this tech solution.
So it was a lot of iteration and back and forth. That was an important process for developing
the content. That reminds me of actually the way Sesame Street was developed, where in order to
understand how do you make television that is effective at teaching, they had to bring a lot of
different expertise into the room. So developmental psychology, people who are good at making
like sticky content like advertisers and marketers, teachers, you have the puppeteers, you have
an entire group of people that are seeing like the whole individual of what is a child and
how do they learn in a wider context because this is a thing that would strike me as you're
speaking. You know, there's this really nice line between what is it to be ergonomic to
fit and understand the human body, the human mind, and externalities, because the degree to which
you don't understand the human experience, the human mind, how we bend and fold, is the degree
to which you're probably going to create breaking harm. And so the more totalizing the medium
as is VR, the more harm that you can do, and the more you're going to have to bring in like
a diversity of experiences and knowledge to be able to fit well that whole.
whole human being. You know, to link this for listeners back to the work of social media and the
social dilemma that many people have seen, you know, Instagram is impacting the development of
teenagers. But it's not designed with the child developmental psychologists and the teenager
development psychologists at the table saying, what does healthy teenager development look like?
How could it be designed around those considerations from the beginning, with that design expertise
at the table? I think, Aza, your example of Sesame Street is a similar thing. It's like that was one of the
rare instances where up until that moment, and McLuhan and Neil Postman would criticize television,
you would think that the race for where television goes is just towards junk TV. And it did go
towards junk TV. But Sesame Street's like, hey, wait a second, what if we did this in a conscious
way as best we can? And we actually brought the child developmental psychologist to the table
to get something that was specifically purpose built for the values that we know we want
to include there. And so with the two of you working together and actually exemplifying what
that process looks like, you know, I think about more broadly, does Facebook when they're designing
comment thread interfaces, do they have scholars in democratic theory and people who understand
what does it mean to create a pluralist society in which healthy, turn-taking conversation and
deliberation is happening? They didn't start with that as the design premise. They didn't start
with that with the expertise at the table. So what we're really trying to say here is like,
okay, what can we learn about doing this the right way, especially since we have a brand new medium
that didn't exist before? So I'd love to hear anything that brings up for you.
Yeah, and it's easier, frankly, when you have Sesame Street.
You have a very specific target audience that you're trying to help develop healthy citizens of society who are caring of other people.
You have clear goals, clear outcomes, and a clear target group.
It's easier to define who you need at the table at the very beginning.
When you're trying to influence all of society, the scope becomes very, very different.
And it's difficult to anticipate that you need to have someone on your team at the beginning until you start negatively impacting teens.
But there are people who understand inequity, marginalized groups, who's most vulnerable, who's going to be most negatively impacted.
And if those people are at the table at the beginning, they're more likely to raise the concern and the red flag earlier at the beginning, rather than bringing them in after you've already caused harm.
But ultimately, that also doesn't matter if the end goal is profit and money, right?
That I can raise red flags all day.
But if you're not interested in building a healthy democratic society, if that's not your number one priority, you're interested in market share of the Metaverse, then it may not matter that I'm at the table, right?
So there's a shared value system that's important here as well.
So when Jeremy and I work together, you know, if I had just gone to straight-up computer scientists who knew how to design VR but couldn't care less about racism in society, this may not have
worked very well. That may not have been what they were particularly interested in, and we wouldn't have
a shared value system. So Jeremy didn't need expertise in racism, but it was clear that he
cared about it. It was clear that he thought it was important. It was clear that he respected
my work and expertise so that when I did say something or offer an idea that it was received
with that in mind. You know, Courtney, one of the things that's impressed me most about the last
few years was this incredible study you've done at Columbia where you've made Thousand Cut part of the
curriculum and followed a cohort over time. So I think in answer to the question, you know,
one of the things that we're doing to think about bad possible effects of Thousand Cut is studying it
out in small field studies before we release it to the world. So Courtney, maybe you could tell us
more about what we've learned there. Yeah, you know, I think Jeremy and I both being at the root
psychologists, although we've both meandered into various fields and disciplines since then,
is that we want to understand what effect we're having on people. And I think in the studies
that we've designed, we have that in mind. And they're starting to get a sense of not just
things like empathy, which I think is important, but I think is a limited frame of what VR is
capable of. I've been really interested in how can we shape the ways in which people understand
society and analyze structural problems. Not just how I feel when I experience racism. What is your
relationship to racism? We all live here. We're all a part of this system. So it's not just sort of
looking at the target group and how they feel and what they're experiencing. It's actually
looking out at the context and saying, how did we get here and what can we do about it? And in order
to intervene on the problem, you have to understand the problem. And so I think we're getting some
evidence in our empirical work that in a short period of time, we're shifting people's willingness
to identify race and racism as factors that we have to account for. We're shifting,
understanding that it's not just the matter of individual choice and behavior that's leading
to these outcomes. It's policies and legislation and other types of structures and systems
that are creating barriers and inequity. And that framing and understanding is really important
to what do we do next. And it's not just about
feeling badly. And do I feel badly when I'm having this experience? Of course I do. It's
racism. It sucks for everyone. It doesn't feel good. But I think we're finding some evidence that
VR can be quite powerful even beyond. I think I'm hearing you say that it's not just I get to
teleport into one other person's shoes and then I get the knowledge of what it feels like to be
them. But in so doing, I'm starting to see systems from many different perspectives and it's that
multi-perspectival knowledge that lets me better understand the elephant that we're all feeling
versus just the immediacy of like, oh, now I have an empathy bond with one other perspective.
Yeah, even the focus on empathy in the context of racism, let's say, places the focus of racism
on the person experiencing racism.
What I'm trying to do is have us focus on where is the racism coming from?
And it's not me, right?
So you understanding me is not understanding racism.
There's even questions about whether you could ever really understand or empathize, right, with my experience.
But what you can understand is society.
And if we're going to shift anything, that's definitely a piece we have to shift to.
Yeah, totally.
For me, experiencing a thousand cut journey, there's a point at which you're, I think it's like 15 years old,
you're going to go to a basketball game and your mother is there on the couch and she tells you, you know,
be careful, because there's this guy on TV
who the police say they're going after, and you look like him.
You're wearing a blue tank top, just like the guy on TV.
I'm just, I'm worried about you.
You say, yeah, yeah, mom, you go outside,
and then suddenly a cop car pulls up to you,
and it comes up fast.
Like, I literally jolted back
from seeing this cop car kind of physically come to me.
Get down on your knees and put your hands up.
It just came up and told me, over a siren and a speaker,
to get on my knees and put my hands up.
I put my hands up.
Keep your fucking head down.
Don't make me warn you again.
As much as I as a white male
know about these different experiences
theoretically, I have never had to put my hands
above my head. I have never
had to get on my knees. That's just never an experience
I've had. And when I said that to you,
you said, I don't know a black friend who's not had that
experience.
So Tristan and I are going to take a brief interlude,
and then we'll get back to our conversation with Courtney and Jeremy.
First, we haven't really defined our terms yet.
So what is the metaverse and how is it different from virtual reality?
Virtual reality is a place that you go.
You put on a headset, you put your hands in some gloves,
and you're interacting in a virtual world, you have a virtual experience, and then you leave.
The Metaverse, it's sort of the same thing, except you are now living some part of your life inside of it.
So you're buying things, you're keeping your identity there, you have more invested.
It's a place you return to for social interaction.
Yeah, and it's more like an iteration of the Internet.
So in the same way the Internet, this connected set of websites I can go to, different worlds.
I can go to the Reddit world and go to the Facebook world.
The Metaverse would be this connected set of virtual worlds.
And I'll be able to keep my identity there, pay for things there, and it becomes kind of a new,
immersive internet. Right. The Metaverse is a digital habitat that we collectively live within.
With that established, let's talk about the transformative potential of the Metaverse relative to
other technologies, as well as some of the risks that Tristan and I want to identify that we haven't
really heard talked about as much. So the reason why an interview like this on virtual reality is
so important is we're talking not just about these light persuasive nudges to what we do with our
fingers and our behavior and notifications, we're actually talking about transformative technologies.
The metaverse and virtual reality will be mass transforming people as they spend time in it
because it transforms so many more dimensions of the human experience than a flat screen
does. So something I want to highlight here, and something I did not understand earlier in my
career is every time you invent a technology, you also invent a responsibility because it's a new
power and that new power is going to do something. And the more persuasive your technology,
the greater responsibility is to understand how it's going to be used and what perverse
incentives are going to guide its impacts on society.
So you invented Infinite Scroll, which is kind of a new power.
How did that one turn out?
Well, not super well, right?
It's a crazy thing to realize that this thing that I invented in my early 20s is in the pockets of 4.6 billion people.
I didn't realize when I invented it that actually it was going to become a pawn in a larger game
of extracting engagement and time from people.
So if I could roll back the clock,
what I would say is when I invented Infinite Scroll,
I had a responsibility to invent regulation, philosophy,
to bind the way that Infinite Scroll would be used.
Now, that's a hard thing,
but note if we don't do this
as the power of technology continues to go vertical,
the externalities, the bad things that happen,
are just going to break us or break our societies, break our relationships.
You know, what you're referencing reminds me of a quote by Langdon Winner,
which is that the biggest window to change a technology is in the emergence of that new technology.
And the quote is, by far the greatest latitude of choice exists,
the very first time a particular instrument, system, or technique is introduced.
And just to tell a little bit of a story, when I was at Google,
working on the issues of social media, the attention economy, the race to the bottom of the brain stem,
you know, I was saying, okay, well, the reason I should stay at Google and not go somewhere else is that Google's sitting on top of Android, you know, a mobile phone platform used by half the world, billions of phones and devices.
And if we can change how notifications work or how the Android Play Store works and the kind of way that apps are paid, we can change the dynamics of the attention economy.
But the problem that I ran into is at that point, this is probably around 2012, 13, 14.
Android was probably, you know, close to a decade old by that time.
So one of the hardest things is after an operating system is 10 years old, it's very hard
to change the core assumptions, the core design, the core ways in which that would work.
And I really want listeners to look into the magic eight ball of how VR is going to go.
We know that all the social media platforms are competing for their content to go viral fastest.
And so we should expect that virtual reality and the metaverse is going to go the same way.
And then there's one other point I think we should cover in this
interlude, and that is empathy, and the weaponization of empathy.
Empathy is always a good thing, isn't it, Aza? Well, empathy is both a beautiful thing
and also the biggest backdoor into the human mind. Like, if I can convince you to empathize
with me, then I can get you to sort of come over to my side. So that can, of course, be used for
connection, but it can also be used for manipulation. And virtual reality is going to make it
easier than ever to have people empathize with people.
I forget the paper, Aza, we looked at it a long time ago,
about AIs that can generate trustworthy faces.
We can actually now generate an artificial face,
a face of a person that never existed,
and the AI can predict how trustworthy a person will find that face.
And then we can do that personalized increasingly for each person.
So trust and empathy,
as we can increasingly hack these human vulnerabilities,
what Yuval Harari calls the hackable human,
hacking human beings.
As the human becomes increasingly hackable, we have to start protecting more things. And this is just something that we always deal with.
The law moves slower than tech. We don't need a right to be forgotten until tech remembers everything.
But when tech remembers everything, we start to need a new right to be forgotten. And the problem is, as tech is continuing to develop more and more capacities, we're going to need more and more rights, more and more protections, more and more guardrails.
And that's really, in summary, kind of a meta description of what's gone wrong with social media up until this point, and what's likely to go wrong with virtual reality and with crypto and a bunch of other things.
if we don't radically move our attention
to what new forms of responsibilities
go with these new technologies.
So we want to ask the question,
why should any of this matter
to a regular listener out there who's saying,
this all sounds like something that's in the future?
Am I really going to put a TV on my face
and wear it for hours a day?
Can you both just sort of maybe leave us with,
why should our listeners care about this
and how is this going to relate to their daily lives?
Last year, in 2021,
we don't have an exact number of headsets sold.
Our best guess is there were between 15 and 20 million VR headsets sold in the United States.
So it's not at the trajectory like a hockey stick that you see with social media where we go from zero to hundreds of millions.
However, we're starting to creep up at numbers where you probably have a family member or someone you know that has a VR headset.
It could be the Quest 2. It could be the PlayStation VR.
But these headsets are starting to leave the labs and enter our living rooms.
Yeah, I'd echo Jeremy's position on this, the importance of understanding VR and the
metaverse before it's in every household, right? What is the effect on kids? How long should we
be in VR? What is the effect on attention span and belief systems and identity? So I think we need
as many people engaged and thinking about this and aware that this is coming and really understand
the scope of investment that's being made into this space. I think we have to anticipate that
this will be a part of our lives and not wait until it is a part of our lives and then try to
backtrack. Yeah, just to layer onto that, I talk to many parents who feel the pressure, you know,
come Christmas, what's the $200 gift I'm going to give my kids? And a lot of them are giving
their kids VR. And they're not saying, is this really what's good? It's just kind of the next thing
to buy. And then once it becomes the thing that all the kids get, then it's like, I'm going to be
socially excluded if I'm not on it. And that's what happened with social media, right? It's
not about an individual who's addicted to social media, it's that it becomes the basis for social
participation. It's where social participation happens. And so in this arms race, this is the
key moment where we get to actually decide, how do we want this to work in our lives? What is the
ethical and humane version of this symbiotically riding on top of the back of society,
meaning VR plus society equals a stronger society in the real world versus VR plus society
equals a virtualized society that has less contact with the real world in the first place?
We talk about in our work on social media, persuasive technology, and people are familiar
with the idea that technology can be persuasive. And oftentimes, I want people to think about a
spectrum of how persuasive a new medium might be. You know, writing and reading text and a printed
book is persuasive. I can have rhetoric that causes you to feel and think things. It's the degree
to which I'm able to write those words down and elicit those reactions from a person, from someone.
That's a persuasive medium.
When I can get access to your visual cortex with television or with imagery,
I can persuade you more.
But in terms of the spectrum of persuasive capacities, with social media and interactivity,
you know, when you're Facebook, you're designing social awareness cues, you know,
is the other person online right now?
Do they have a green dot next to their name?
Did they read the message?
Checkbox, they saw this message.
Does that create social pressure of, oh, they read this message?
Now I feel like that person's going to feel like they have to get back to me
because I know that they know that they've read the message,
so now they feel pressure to kind of respond to me.
These are the dimensions of a persuasive axis.
And virtual reality gives new dimensions of persuasiveness
up for grabs in the design experience.
And I thought it would be important just to distinguish
the different levels of persuasiveness in VR from other experiences,
including what the impacts of that are.
So you can have more lasting impacts on memory.
It's more likely to change future behavior.
I'd love just to walk through with both of you
some of those differences in what you can get as persuasiveness of VR compared to say the persuasiveness
of social media. I think when we're thinking about these broader sorts of social impact and identity
formation uses of virtual reality, we don't only have to think about what happens in the headset.
We can also think about what we do with people outside of the headset, to reflect on what they
just did, to journal and write about it, to read a body of literature related to the thing that
they've gone through, that starts to shift what we're thinking about in terms of influencing
and shaping people. So yes, do we understand that if you go through this experience, can you
plant a false memory? What does it mean to go through an experience like this in the ways in which we're
trying to, you know, connect it to a course here at Columbia where you sit and reflect carefully
and read and dialogue with other people who have also gone through the experience? The effect on
you potentially changes and becomes something really, really different when you use it in that way.
So, you know, validating, we should absolutely understand these effects.
And then we're also not restricted to what happens in VR in order to think about the effects that we can have on people.
So supporting independent critical thinking based on existing knowledge is also a piece that might complement VR that could also be really powerful.
Yeah, you're making me think about the difference between a hit-and-run experience, where I just quickly put on the VR headset, have the experience, and then I'm done and I move on to the next thing in my life, versus, you know,
if you just think about the power of large group awareness trainings like Landmark, or in psychological counseling,
just getting to hear what 10 other people are experiencing totally deepens and changes the way that I reflect on my own experience.
It has incredible power. It can shift identities. It can plant false memories. It can do a lot of different things.
It can change your embodiment. But what is the right relationship between how and where VR fits into a society that is actually integrated and healthy and humane?
And what I'm hearing from you now is you wouldn't just have people pop into a VR experience, do
a thing, and then kind of go back to their lives; there would be a way in which we integrate
VR into the learning that is the telos of the experience. So what would it mean to integrate
this with literature, reading about institutional racism and examples of other people and other
videos and material and other conversations with folks? So as we're thinking about, you know,
we have many policymakers who listen to this podcast, we have many technologists who are working
at the major companies who listen to this podcast, if we were to imagine and envision a right-sized
relationship where VR is healthily integrated into society, what are some of those
additional structures that we would want in place to have it be the thing we want?
From my perspective, there's no one answer to that, right?
It depends on what it is you're creating and what it is that you're trying to do.
I think some VR gets created for maximum emotional impact.
I want you to cry.
I want you to be scared, right?
And I don't care what happens to you after that.
That's the effect I want while you're in this headset, and I'm going to design accordingly.
When we designed 1000 Cut Journey, we weren't thinking of it as a standalone experience necessarily.
The way I've described this in early thinking and talks about this is I wanted white people in particular to come out of the headset and say, I thought I understood this, but I don't.
Designing with that intention is very different than I want you to cry and feel really bad about racism, right?
And so Jeremy and I are continuing to try and wrap our heads around:
what do we do with people when they come out of this
headset, or what can we do with them inside of the headset to help them reflect and make
meaning out of what they've done? Those are design decisions, and it depends on what it is
that you're trying to do. But that intention and designing accordingly, I think, is really important.
Yeah, I'm thinking about cars without seatbelts versus cars with seatbelts, where you're changing
the shape of the artifact, the shape of the technology, to layer in the extra value, which is
safety. Safety isn't just, hey, I'm putting my hands on the steering wheel, and it's up
to the developmental level and consciousness of the driver;
we should also change the shape of the artifact.
And even when you don't put the seatbelt on, the car does a little five beeps,
which is a little persuasive technology there,
a little nudge to remind you to put that seatbelt on.
And I'm thinking, what would an integration belt for VR look like
where we don't make the mistakes of smartphones turning into social media phones?
And then just to complicate the equation,
one thing that we think about in VR is the perceptual arc of an experience.
So you've got a narrative arc, and Courtney's talking about
maybe you want to inspire someone to have an emotional reaction at a certain time.
But there's also, VR is pretty intense on the perceptual system.
And so in my lab, we've got a 30-minute rule, and people are always surprised to hear that,
which is, after 30 minutes, I want you to take the headset off.
I want you to go get a glass of water.
I want you to go touch a wall, talk to a human being.
But, you know, that, as you said, that's a really blunt tool.
And, you know, people are often surprised who are VR enthusiasts because, oh, I use it for hours at a time.
And so one thing we're working on is, you know,
how can you design a VR experience
to minimize what's called simulator sickness,
that kind of wonkiness you get
after wearing a headset for half an hour?
And within an experience, should there be any camera movement?
And if the camera's moving,
it should be translating, not rotating, at a certain speed,
and we're building models now of how to extend that out a little bit.
So I don't want people in headsets for hours a day.
I really don't.
It's not how we've designed the purpose of our lab.
It's really about experiences that are designed
to help you rethink a certain situation, to give you a new skill.
But I like to think of it for a 30-minute chunk here and there.
You know, one of the places I want to bring this is back to a conversation about the medium versus the specific messages.
Because what I heard you say, Courtney, was how the medium is used sort of depends on what message, what experience you're trying to craft,
and then you have to carefully craft that.
What structurally happens to us, or what does VR do to us as a society, as we start to adopt
it? Well, look, I love the question and I love the forward thinking. And what I'll say, what we know, is that VR lacks large-scale empirical study. So what I'm so impressed by about Courtney's study at Columbia is that it's a fairly large sample that is being, you know, studied over time. And what the field of VR lacks in general is studies that have thousands and thousands of people. And so, you know, your questions are exactly the right ones. And we
should be thinking about, well, how do we start to think about that?
Unfortunately, the field of immersive VR is riddled with work like my own.
I'm criticizing myself, which is looking at, you know, a few dozen people,
a single time, maybe two times, and what we need is to start looking at this over time.
I think one of the ways you can start to answer your question is just to poke around Steam,
which is a place where you download VR experiences,
and seeing what's popular, and just to give a sense of how wrong we are all the time
at predicting what's going to be good in VR, one of the most popular
things for people to do is to stand in VR, to have a lightsaber in each hand,
have techno music blasting, and smash boxes that come at them while they're doing it.
And it's just really hard to predict what's going to be sticky in this medium.
The things that will win, in my opinion, are things that are based on body movements,
going back to that concept of any screen is fine to give you a nice image.
VR allows you to move your body and have the scene respond accordingly.
So as you project out to the things you're worried about, the things you're excited about,
think less about what it looks like and think more about, well, how do I use my body to interact with this content?
I think there's a piece of your question as well that's existential, right?
So how do we want to exist and be?
And I'll go back to the point that it depends on how we design it, right?
It depends on how we set it up.
Do we set it up in a way that restricts freedom and control,
where we tell people how they're supposed to represent themselves and how they're supposed to interact,
and we create a bunch of rules and regulations, and this space is for you and this space is for
them and we control the space or do we create a space that's more open and we see what people
make and create and people might build their own societies and decide their own rules
and ways of interacting and their own notions of democracy represented in a digital
space. And I think depending on how we approach this, we'll end up with different shifts
in how this changes, both how we interact in the digital space and also in the physical world.
And I think it's part of what's so intriguing to me about the metaverse, at least in principle, in the abstract, is that to some degree it is a blank slate.
We could decide that we want to experiment with different ways of structuring society.
We want to see what people come up with and how they would organize an election or how they would organize a democracy.
And you have this version and that version and we're trying all of them out,
for instance, as opposed to the path that we're currently on, which is a replication of who we are
right now. We're going to create the same kinds of systems, the same ways of representing
ourselves, the same ways of having an account and a profile. And, you know, those are all design
choices. So I think it really depends. A few things come to mind when you're talking about both
of you. One is that we often come back to this quote from Daniel Schmachtenberger, which is that
you can't have the power of gods without the love, prudence, and wisdom of gods. And we are in the
era of new godlike technologies that have more impact than we will be aware of at the point of
deploying them in society. So we're affecting the Amazon rainforest faster than we're understanding
how we're affecting the Amazon with all the changes we're making to deforestation, climate change,
pollution, dead zones in the ocean, et cetera. We're affecting social relationships and children's
development faster than we're understanding, right? So we'll ship Instagram to a billion people
and hundreds of millions of kids faster than we'll understand what it's doing to those kids.
We will shape attention spans faster than we'll understand
how we're shaping attention spans.
With VR, I mean, Jeremy, I remember from the talk
that you gave back at the Children and Screens conference
ages ago, and this is an example I know well
from the persuasive technology literature,
if you get someone to imagine their future self,
it actually changes some of their decision-making.
And specifically, Bank of America, I think,
did this thing where they say,
take a photo of your face, then we'll age it.
So now you get to see a photo of your face when you're 80.
And then that by itself gets people to think more
about their retirement, their savings,
etc. You can imagine a world where baked into the way VR works is my long-term self. Everyone has
an 80-year-old avatar. As you've written about Jeremy, in the military, we use VR for
simulations and for combat training and for flight simulators because it works. And when we're
thinking about what would the conscious integration of VR look like into 21st century democratic
societies? If empathy is key to creating tolerant, pluralist democratic societies, I can easily
see a world where the kinds of things that you created with Thousand Cut Journey are integrated.
I could see a world where, you know, the U.S. Congress has a VR room. So you can't actually
vote on a decision that's going to affect all these different stakeholders without every
member of Congress going into a VR room and experiencing what it would be like from that
stakeholder's perspective in a first-person experience. That's an inspiring vision to me of technology
plus democracy equals stronger democracy. Now, not to be technotopian, by the way. Of all people,
we're very critical of how technology can go wrong and the externalities. But I wanted to kind of
make space for what are some of the unacknowledged risks that we need to be thinking about when I
mentioned that example of like the VR room in Congress. What would that look like? What do you
imagine the kind of right sizing of how technology can actually affect policy in an effective
way so that we're not just using it for better empathy for personal experiences, but we're
systematically making those changes that we want to see? I think the biggest risk is putting the
design of this into the hands of a narrow group of people. If white straight men are the only
group of people deciding how we're going to exist and be in the metaverse, we're going to
recreate the same stuff we have right now. We will never explore the boundaries of what we could
make and how we could be. And so diversity is not just in vogue or niche or a cute thing.
It's critical. It's critical that the ability to design is put in the hands of
as many people as possible and we see what they come up with.
That's when we'll exceed the range of what's possible.
But if there's a small group of people who have the power to determine what my avatar looks like,
then that's what we're going to get, their vision of the world.
Ruha Benjamin at Princeton likes to say,
what is it like to live in someone else's imagination?
That's exactly what we'll be doing.
We're living in the imagination of a small group of people
and we'll never understand the range of how we could actually be
and how we could actually live.
And the Metaverse is potentially, VR is potentially an opportunity to reimagine and redo and transform.
How different would TikTok be if the leadership of TikTok were all mothers of teenage daughters?
We know teenage girls have a problem with Instagram.
If the entire design team was made up of mothers, of teenage daughters, how quickly would that problem get addressed?
And that's skin-in-the-game decision-making.
The people who are in the design position have direct skin-in-the-game, and so they're likely to change it.
One of the interesting things about VR is that, you know, for a person to have skin in the game,
usually they have to have a personal stake in that problem.
VR can actually extend skin in the game.
So now I have virtual skin in the game because I can experience it.
If I'm saying, don't vote for a war that your own children won't fight in, that's skin-in-the-game decision-making for Congress.
If I say don't vote on refugee camps until you've actually visited the refugee camps, well, now you can do that virtually.
So we can have a world where people are voting on refugee camps, climate change, coral reefs, institutional racism,
and actually have direct experiences if you had that kind of VR room.
Jeremy, you're going to jump in?
Yeah, so I want to give two examples of your congressional lawmaker VR experience
that we've actually done, one in the U.S. Senate and then one in Palau.
And I'm going to start in Palau because it's actually a bit of a success story.
We went to Palau, which is an island nation in Micronesia.
They have about 20,000 citizens, but they get about 100,000 visits from tourists each year.
And their economy is largely based on those tourists going to see their
incredible coral reefs and to go scuba diving, snorkeling, and other types of nature experiences.
Now, for Palau's economy to succeed, they need these reefs to stay healthy because that's why the tourists come.
So we went to Palau for two weeks, a couple of students at Stanford and myself, a climate change scientist named Rob, and we spent two weeks diving, snorkeling, and using underwater 360 footage to capture parts of the scene.
One of the places we went to was Soft Coral Arch, and in Soft Coral Arch, they have this amazing flowery coral that
only grows in the shade. And because it's an arch, it's a destination from people all around the
world. So we got there at about 7 in the morning, and we filmed it underwater pristine, meaning
there was no one there, and it's just awe-inspiring. The first tourist boat gets there at 7:30,
and there's 40 people that get out into the water. They're snorkelers. A lot of them can't even
swim. They're on these rafts, and you can see their fins just kicking and dislodging and killing
the coral. We got it all on tape, and that wasn't a rare event. That occurs from 7:30 a.m. until about 6 p.m.
every day. The fascinating part of the story is that we had the opportunity to present to
about 90% of the lawmakers of Palau three or four days later. So Elise Ogle and Tobin Asher,
they pulled an all-nighter and they built a VR scene that showed the before and after, that
showed the pristine spot and then the tourists that were destroying the coral. The first thing to
note is that most of the lawmakers that experienced the VR had actually never just
experienced the awe of being in this amazing spot. They knew of Soft Coral Arch, of course,
because it's an economic tourist destination,
but they'd never been there.
So just hearing the utter connection they were making
with this spot in their country was pretty neat to see.
And then when they saw the tourists come in and destroying that,
I mean, the spontaneous utterances and ideas about how they can put this in schools
as education, make this be in dive shops for people to see.
And what occurred a few weeks later was a change in regulation
on how they informed tourists about diving
and a couple of different changes. You know,
we can't claim it's 100% because of the VR,
but we know that VR was part of the experience,
showing that this does lead to a policy change later on.
The fail that we had was going to the U.S. Congress,
and we were lucky enough to work with the Ocean Conservancy,
and we set up a VR demo at the Senate.
And we had a bunch of senators come through.
More important, we had a bunch of their staffers come through,
as you know, they're the ones that are doing all the work,
and we had a lot of folks come through and experience the VR.
Unfortunately, we had a few climate change deniers who were scheduled to come on tour, in particular Senator Coburn,
who had attacked me publicly previously about my work in VR in climate change.
Most of them canceled and didn't come.
So it was an instance where we did bring this experience to the Senate floor
and gave our lawmakers an opportunity to engage with this material, but we just didn't really get the engagement that we wanted.
You still have a kind of selection effect there, where there are the senators and Congress members who are willing to go into the VR experience and then those who don't want to go there. I mean, I'd love for you to share a little bit more about, yeah, examples of where VR can make the difference that makes the difference. And you actually have a framework for VR, the DICE framework. I'd love for you to walk the listeners through that. Yeah, so DICE is the acronym we use for when you should use VR. And it shouldn't be
used for everything. I don't want you reading your email in VR. I don't want you going in VR to do
a video conference that works pretty well right now, a one-on-one video conference. You don't need to be in
VR to do that. So when do you use VR? We like to save VR for experiences that if you did them in the
real world would be dangerous, impossible, counterproductive, or expensive. And I can break down examples of each of those. Dangerous: what's one of the best examples of VR? Training firefighters, training people who are doing jobs where you can get hurt during training. For example, the flight simulator. The way you learn is by doing things, making mistakes, getting feedback, and repeating. And so VR is great for things that would be dangerous in the real world. Impossible: 1000 Cut Journey is an amazing example of this. It's impossible for me to become a seven-year-old Black male, so it allows you to do things that are just simply not possible in the real world.
Counterproductive: one example of this is just what I talked about with Palau. So we've got this beautiful spot, Soft Coral Arch, and showing the damage to it from tourists is transformational in your understanding of how climate change, the environment, and human interaction with the environment work. If we were all to fly there to experience this, think about the fossil fuel that we'd use.
And then expensive is kind of a broad category, but these are things where, if I'm going to, you know, have you be in a room of 50 people to practice your public speaking because you've got social anxiety, it's pretty expensive to hire 50 actors, and so it's a nice use case. So we like to think of the DICE framework as a guide, you know, when you start thinking about when you should use VR and not just a 2D movie, for example.
But Tristan, to go back to part of your question, which was, in this specific example of Palau, why VR? The real reason is because of this sense of awe, A-W-E, which is one of the main emotions VR can induce: when you're in this gorgeous reef and you're looking around, you just feel this incredible sense of awe.
And that's really what drives the second part,
which is then you see these people come in and they're destroying it.
It's the kind of awe that you just can't get from a 2D screen.
And to close this, I get asked a ridiculous question often.
Jeremy, does VR cause empathy?
And I say, you would never ask that question about a podcast or about a video.
It's a medium.
It depends what you do with it.
And VR is the same way.
And it's such an obvious, quote unquote, revelation.
But so many of us are still just treating the medium as independent of the message, when it's really about what you do in there.
I just find it fascinating that there is a measure of how effective a message will be in VR, like how engaged you are, how much you're looking around. That's, of course, really exciting. And also, when I hear that, terrifying, because it immediately makes my mind think about,
well, what is the Cambridge Analytica of VR?
If I can measure and understand how engaged you are
with a message in real time,
that's the kind of data that can be used
to learn what messages most affect you.
And then, of course, you'd expect that to be used
by whatever political powers there are to activate us
sort of like emotional tuning forks and find our resonance.
And where that leads me is the question,
as this technology is both incredibly persuasive
and you can get incredibly granular data
on that which affects you most
and will affect your future behavior,
what, if anything, do you think should be illegal
or should be out of bounds for this medium?
So let's talk about body movement data.
If I'm posting to social media,
I have some control over what I say and which image I choose.
When you're talking about nonverbal tracking data, current commercial systems track your body movements at 90 frames a second, typically about 18 separate movements. So the tracking data are pretty insightful in terms of what they can tell about you.
So we've run studies that predict things from it. If I'm giving you a VR lecture and I'm tracking your body movements, I can predict with machine learning fairly accurately how well you're going to do on a test. I can predict aspects of your identity. Mark Miller, a brilliant PhD student, recently published a paper that looks at how reliably you can identify someone after you've taken their name off the data file, simply from how they're moving their body in VR. And he shows it takes about 20 seconds of tracking data to identify you with about 90% accuracy, when chance was far less than 1%. In other words, you can take someone's name off a VR data file, but people can still figure out who you are.
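[Editor's note: to make the idea concrete, here is a minimal, purely illustrative sketch of the kind of pipeline being described, with synthetic data. It is not the method from Miller's paper; the feature choices and data shapes are assumptions that only mirror the numbers mentioned in the episode.]

```python
# Purely illustrative sketch -- not the pipeline from Miller's paper. It shows the
# general idea: summarize short windows of body-tracking data into simple motion
# statistics, then train a classifier to guess which user produced each window.
# All data below are synthetic; shapes mirror numbers mentioned in the episode
# (90 frames/second, ~18 tracked channels, ~20-second windows).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def window_features(window: np.ndarray) -> np.ndarray:
    """window: (frames, channels) array of tracked positions/rotations.
    Returns per-channel mean, standard deviation, and mean frame-to-frame change."""
    motion = np.abs(np.diff(window, axis=0)).mean(axis=0)
    return np.concatenate([window.mean(axis=0), window.std(axis=0), motion])

rng = np.random.default_rng(0)
n_users, windows_per_user = 20, 50
frames, channels = 90 * 20, 18  # 20 seconds at 90 fps, 18 tracked channels

X, y = [], []
for user in range(n_users):
    style = rng.normal(size=channels)  # stand-in for a user's idiosyncratic movement style
    for _ in range(windows_per_user):
        window = rng.normal(loc=style, scale=1.0, size=(frames, channels))
        X.append(window_features(window))
        y.append(user)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("re-identification accuracy:", clf.score(X_test, y_test))  # chance here is 1/20 = 5%
```

With real tracking logs the structure would be the same; the substantive questions are which aspects of motion carry identity and how little data is enough.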
We call this trend the end of the poker face,
that as technology accelerates,
it will be able to use fewer and fewer signals
with increasing AI capacities to predict who you are
or things about you. Like Gloria Mark, who we've had on this podcast before, even did a study predicting with reasonable accuracy, I think closer to 70%, some of your Big Five personality traits just by looking at the way that you move around on a computer screen, like the timing of how you use the back button in a browser, going back and forth and switching between windows. You can predict someone's personality traits from that. In China, they have this gait detection interface where, just by looking at your body movements as you walk, they can predict your exact identity with 94% accuracy, just from how you move.
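[Editor's note: as an entirely hypothetical sketch of how coarse behavioral timing signals like these could feed a trait prediction, here is a small example with invented features and synthetic data. It is not Gloria Mark's study design; the feature names and relationships are assumptions made for illustration only.]

```python
# Entirely hypothetical sketch of predicting a personality score from interaction
# timing -- not Gloria Mark's actual study design or features. Synthetic data only.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_sessions = 400

# Assumed per-session features: mean seconds between back-button presses,
# window switches per minute, and variance of per-page dwell time.
X = np.column_stack([
    rng.gamma(shape=2.0, scale=5.0, size=n_sessions),
    rng.poisson(lam=6, size=n_sessions).astype(float),
    rng.gamma(shape=2.0, scale=3.0, size=n_sessions),
])

# Synthetic trait score loosely tied to switching rate, just to exercise the
# fitting step; in a real study the labels would come from a questionnaire.
y = 0.4 * X[:, 1] + rng.normal(scale=1.0, size=n_sessions)

model = Ridge(alpha=1.0)
print("mean R^2 across folds:", cross_val_score(model, X, y, cv=5).mean())
```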
And obviously, instead of requiring 20 seconds of body movement like today, that number will go down to fewer and fewer seconds, you know, tomorrow. I actually remember being at Google, and Sergey Brin said it: it turns out there's no law in California that says that driverless cars are illegal, because no one ever thought there would ever be such a thing as a driverless car. And every new technology
opens up a new can of worms that has
to then get dealt with. The can of worms
are the externalities. When I think about
VR, I think about an entire universe
of externalities. Think of the kinds of assumptions we make about what is legal or illegal. With social media, you have a few externality areas to deal with: addiction, mental health, kids, polarization, democracy. With VR, I mean, it's like we just open up the entire dimensionality of what a reality is. And now anything is possible.
So when I think about it institutionally, maybe the state has incentives to say that anybody who can actually name risk areas or externalities of a new medium or technology can earn bounties for discovering what those harms might be. How do we incentivize closing that universe of worms so we can identify harms up front? Because one of the problems is, how do you move at the pace of irreversible externalities? And I'm curious about any of your reactions to that.
I mean, I agree with it, but I also think that we have to consider the alternative: that we won't be able to regulate it, period, right? So we also have to consider the question, what if we can't regulate it? What do we do then? Right? What if we can't keep pace? What do we do then? Maybe we need to even reimagine, you know, legal structures and how they function, right? The whole
notion that a body can regulate something that's happening over here. That works when you're
dealing with a state or a country. But if you're dealing with the metaverse, there's no way
that that model in my mind can function. So what if we start with the question that we can't?
And if we can't, then what? And that might open up other possibilities for what we do.
To me, it's such a significant potential shift in how we function as a society that this is not
one that you get to opt out of and just ignore. You can make decisions about whether you're going to participate or not, but you can't just disregard it. This is a significant discourse and a potentially transformative shift in technology and in our lives. I think everyone has to be thinking about this.
Jeremy and Courtney, thank you so much for being on Your Undivided Attention. I think it's so critical that we have this conversation and get people thinking, both inside technology companies and among those making policy, about how we get this right. And you've just given us a real gift today.
Thank you so much for joining us.
Thank you.
Thanks for having us.
This has been a great discussion.
Dr. Courtney Cogburn is an associate professor at the Columbia School of Social Work,
where she directs a research group that investigates the effects of racism on mental and physical health.
Her research also explores how emerging technologies like virtual reality can lead to changes in attitudes and social
perception. Courtney is the lead creator of 1000 Cut Journey, an immersive virtual reality racism experience, which premiered at the Tribeca Film Festival in 2018. Your Undivided Attention
is produced by the Center for Humane Technology, a non-profit organization working to catalyze
a humane future. Our executive producer is Stephanie Lepp. Our senior producer is Julia Scott,
mixing on this episode by Jeff Sudakin. Original music and sound design by Ryan and Hays Holladay,
and a special thanks to the whole Center for Humane Technology team
for making this podcast possible.
You can find show notes, transcripts,
and much more at HumaneTech.com.
A very special thanks to our generous lead supporters,
including the Omidyar Network,
Craig Newmark Philanthropies,
and the Evolve Foundation, among many others.
And if you made it all the way here,
let me just give one more thank you to you
for giving us your undivided attention.