The One You Feed - The Hidden Costs of Technology and Our Search for Selfhood with Vauhini Vara
Episode Date: September 2, 2025

In this episode, Vauhini Vara discusses the hidden costs of technology and our search for selfhood. She explains how we live in a world where technology functions as both a lifeline and a trap: offering connection, convenience, and possibility while also shaping our choices, exploiting our attention, and redefining how we see ourselves. Together, Eric and Vauhini explore the tension of relying on tools we can't seem to live without, the subtle ways algorithms alter our communication, and what it means to hold onto authentic selfhood in the digital age.

Feeling overwhelmed, even by the good things in your life? Check out Overwhelm is Optional, a 4-week email course that helps you feel calmer and more grounded without needing to do less. In under 10 minutes a day, you'll learn simple mindset shifts (called "Still Points") you can use right inside the life you already have. Sign up here for only $29!

Key Takeaways:
- Exploration of the dual nature of technology as both beneficial and exploitative.
- Discussion on the impact of major tech companies like Amazon, Google, and OpenAI on personal identity and society.
- Examination of the ethical implications of consumer choices in a global capitalist system.
- Reflection on how technology alters human communication and relationships.
- Analysis of the concept of the "algorithmic gaze" and its effects on self-perception and identity.
- Personal narratives intertwining technology with experiences of grief and loss.
- Consideration of AI's role in creative processes and its limitations compared to human expression.
- Discussion on the commodification of identity in the age of social media and audience capture.
- Insights into the ongoing negotiation between convenience and ethical considerations in technology use.
- Emphasis on the importance of individual agency and conscious decision-making in navigating the digital age.

If you enjoyed this conversation with Vauhini Vara, check out these other episodes:
- Distracted or Empowered? Rethinking Our Relationship with Technology with Pete Etchells
- Can Radical Hope Save Us from Despair in a Fractured World? with Jamie Wheal
- Human Nature and Hope with Rutger Bregman

For full show notes, click here!

Connect with the show:
- Follow us on YouTube: @TheOneYouFeedPod
- Subscribe on Apple Podcasts or Spotify
- Follow us on Instagram

This episode is sponsored by AG1. Your daily health drink just got more flavorful! Our listeners will get a FREE Welcome Kit worth $76 when you subscribe, including 5 AG1 Travel Packs, a shaker, canister, and scoop! Get started today!

BAU, Artist at War opens September 26. Visit BAUmovie.com to watch the trailer and learn more, or sign up your organization for a group screening.

LinkedIn: Post your job for free at linkedin.com/1youfeed. Terms and conditions apply.

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
When I communicate, and I think this is not just because I'm a writer, I think it's because
I'm a human being. When I communicate, the gratification I get from that communication is from having
made the effort of communicating myself. And it sort of does nothing for me if a machine does it for me.
I mean, it doesn't feel that different from like using a magic eight ball or something to produce words.
Welcome to the one you feed. Throughout time, great thinkers have recognized the importance of the
thoughts we have. Quotes like, garbage in, garbage out, or you are what you think, ring true,
and yet, for many of us, our thoughts don't strengthen or empower us. We tend toward negativity,
self-pity, jealousy, or fear. We see what we don't have instead of what we do. We think things
that hold us back and dampen our spirit. But it's not just about thinking. Our actions matter. It takes conscious, consistent, and creative effort to make a life worth living. This
podcast is about how other people keep themselves moving in the right direction, how they feed their good
wolf.

We live in a world where technology is both a lifeline and a trap. Take Amazon. I once swore them off
after a broken blender and the most absurd customer service call imaginable. I made a big declaration: that's it.
No more Amazon. And my grand boycott lasted five days. And the worst part, I disliked myself a little
when I went back. Because it wasn't just the blender. This was already a company that had killed
my beloved bookstores, and now it feels like they're coming for everything else. That's the
trap. We keep returning to what we wish we didn't need. Vauhini Vara explores this exact
tension in her book Searches: Selfhood in the Digital Age, showing how the very tools that
connect us also exploit us. In our conversation, we wrestle with this ambiguity. How do we live
with technology we can't seem to live without? I'm Eric Zimmer, and this is the one you feed.
Hi, Vauhini. Welcome to the show. Thanks for having me. I'm excited to talk with you about your
latest book, which is called Searches: Selfhood in the Digital Age. And it's
really a book that explores, I think, our relationship with technology, broadly speaking.
And it's a topic that I think is a really important one because we are in deep relation
to technology, most of us all the time. And so I think it's always worth exploring that.
But before we get into your book, we're going to start with a parable like we always do.
And in the parable, there's a grandparent who's talking with their grandchild.
And they say, in life, there are two wolves inside of us that are always at battle.
One is a good wolf, which represents things like kindness and bravery and love, and the other is a bad wolf, which represents things like greed and hatred and fear. And the grandchild stops and thinks about it for a second, looks up at their grandparent, and says, well, which one wins? And the grandparent says, the one you feed. So I'd like to start off by asking you what that parable means to you in your life and in the work that you do. Yeah, I mean, it makes me think of a couple of different things. One thing it makes me think of is the way in
which sometimes those two wolves are very intertwined.
Like, it's actually the same wolf, right, with two sides.
And I think about that when it comes to our relationship with big technology companies,
products, which is the subject of my book, because I think we sometimes like to talk about
that in binary terms.
You know, we say these technology companies are exploiting us and they're evil.
And then the technology companies will say, but you're using these products.
So you must find them pretty useful and even fun and enjoyable and maybe they even bring you joy.
And the truth, of course, is that not only both of those things are true, but that they are deeply intertwined.
Like, the way in which these products are useful to us requires the exploitation.
So, for example, when we use Amazon and we're delighted that things don't cost much and they come to us quickly,
that's because there are labor practices at Amazon and its suppliers that, you know, can be described as sort of shady and exploitative that are responsible for making these products cheap and come to us quickly.
Yeah, that's one of the things I really loved about the book was the deep ambiguity in it.
The ambiguity that you have and write about really honestly with technology.
and the nuance of recognizing that these are both good and bad things.
And I like what you say that can't necessarily just be taken apart, right?
Like one of the things that makes a search engine more valuable to us over time is that it knows what we want.
But that's the exact thing that is being exploited is knowing what I want, right?
And so, you know, it's very hard to envision a world in which you got one without the other.
And so maybe you could just first describe for us what the book is, because there's a lot of different ways of talking about it, and I want to make sure that you get to present it in the way that makes sense to you.
Yeah.
I mean, I think of the book as a document of what it's like to live in a world in which our consciousness has been so
colonized by big technology companies and their products. And then in addition, or relatedly,
what it feels like to be complicit in the rising power and wealth and exploitation of that
power and wealth of big technology companies, you know, the way in which all those things
provide us with usefulness, the way in which that makes us feel guilty and ashamed, and also
glad that these products exist in our lives. And I do that in the book in a way that I think
is sort of unusual in that I write about it. So the book has chapters where I'm just, I'm talking
about this in my life and all of our lives. And then there are chapters and, you know,
bits that are interspersed between the chapters where I'm kind of showing this in action,
using my own interactions with these companies' products. So there's a chapter, for example,
made up of my own Google searches over 10 years.
Which I found absolutely fascinating
and made me kind of want to go find all my Google searches over 10 years
because you're right.
What an interesting way to look back on your life and your interests.
I loved how, when you were talking about these searches,
you quoted what John Battelle called the database of intentions,
a comprehensive record of human desires, fears, and needs that
becomes raw material for corporate profit.
Yeah, I love that characterization that comes to me from the scholar and writer Shoshana
Zuboff, that idea that all this material that we put into these products is raw material,
that they then render into a product.
And the product is actually the information about us that they're then able to use to get
marketers and advertisers to pay them to
present ads to us. And so, yeah, I mean, there's that. But then also, I'm a writer. I'm an artist.
And so that term raw material sort of has this additional artistic meaning to me. And it's
interesting to me that my searches on Google, anyone searches on Google function both as
raw material for this corporate technological machine that companies like Google are operating
and as raw material for my own work, right?
Like, I know more about myself from having had Google maintain this database of my intentions over 10 years or longer than I otherwise would have.
And I can use that.
I can go back to that database and remember what I was doing 15 years ago on this day in a way that might serve my writing, you know?
Is that something everyone can do?
Is this public? Like, are my search results available out there somewhere for me?
So even I, as a tech reporter, with all these years of experience covering Google and other companies, knew that Google was collecting that information.
And I could have sworn that I would have turned off its ability to collect that information about me.
And it seems like I did at various points in time, but then either turned it back on or I don't know what happened.
But most of the time that I've used Google, it's been collecting this information about me.
As I said earlier, the thing that I really felt
throughout this book is this sort of wrestling with how we use this technology, what we allow
this technology to know about us, and really weighing the cost and benefits. I mean, you've got a
whole chapter about Amazon where you'd meet with a friend who sort of says, I don't use Amazon
because of a whole host of reasons, which I don't want to turn this into a why Amazon is bad
podcast. But there's lots of things about Amazon that we could question as being good for us,
good for the planet, good for other people. And I know that. I even had an incident recently where I was
like, okay, that's it. Not only do I not want to use Amazon for what to me are
moral reasons, but now they've really pissed me off. And it was something stupid. I bought like
a blender or something from them. And it was like the second one that broke. And the first time
they said, just throw it away and we'll send you a new one. The second time, I thought that's what
they wanted me to do, so I just threw it away. And then we got into this whole thing. I was like, I have
been a customer of yours for 25 years plus. The amount of money I have spent with you, I'm on the
phone with someone, is mind-boggling. Just give me the $39.99, whatever it is, refund. Like, any sane human
would look at this relationship as a customer and go, that's worth keeping. And they didn't. They
wouldn't. And I was so frustrated just by the principle of it that I was like, all right,
that's it. No more Amazon. Which lasted all of about five days, right? And then, of course,
I'm like, if I can buy it somewhere else, I'll buy it elsewhere, but things that I only seem to be
able to get there, I'll get there. And of course, then that sort of erodes. And before you know it,
it's kind of back to the same old, same old with it. Exactly. Yeah. And I think, I mean, one thing I
like about The One You Feed parable is the way in which it emphasizes personal choice, because
I think a couple of things are happening here. These companies have become so entrenched that there is an
extent to which there are strong incentives for us to use their products. You know, the
shelves at Walgreens are a lot emptier now than they were previously because fewer people are
shopping at Walgreens, right? And so if I need to get the special contact solution I use
for my rigid gas permeable contact lenses, I can't find it at Walgreens anymore, right? And so
the most natural thing is to turn to Amazon. That said, I continue to feel and insist to myself
and to others that we do have choice in the matter, right? Like, we can
try to make the effort to decide to approach this a different way.
And, you know, it's not a spoiler to say that in my book, I get to the end of my book
and I'm still using all of these products I'm critiquing.
At the same time, what's not on the page is the fact that I have been engaged in this process
of trying as much as I can to divest myself.
And it's an ongoing process and it involves some failures and some successes.
But I think that that effort is worthwhile.
Yeah.
What I was struck by as you were talking and I've been struck by in my relationship
with Amazon is the word convenience.
And how convenience has become an unstated value for so many of us that, like, it ends up
being prioritized over other values of mine in an unstated way, but it has somehow become
this thing that is expected, needed, maybe it's the pace of life.
I mean, I think there's a whole bunch of factors, but a lot of the things that you're
talking about, it's the convenience or the time-saving that ropes me back in.
Yeah, and I think, I mean, I think this is tied up with the sort of broader economic
history of the United States, too, where it used to be a country where our identity was
tied up with a lot of different things. And now, as fewer things are made in the United States,
we end up having this role, this sort of like our primary economic role is the economic
role of consumers. Right. And so everything gets oriented around that, including antitrust law,
which in the past was about all kinds of different things, and now is very much about, well,
as long as the consumer is doing better in this arrangement than
they were before this company came along, we have to admit this is a positive outcome. And that
disregards all the other negative outcomes that can come from companies becoming bigger and more
powerful.
There are a lot of Holocaust films that focus on the horror, and rightfully so.
But what struck me about BAU, Artist at War, is that inside all that darkness, you see something else.
Love, humor, creativity, even moments of laughter.
It's people insisting on their humanity when everything around them is trying to take it away.
Joseph Bau was an artist and a dreamer.
He risked everything to help others survive and to keep his love for Rebecca alive.
In the middle of the concentration camps, they secretly married.
A wedding in a concentration camp.
It wasn't only an act of love.
It was an act of defiance.
And for me, this film isn't about what was lost.
It's about what was found.
The resilience of the human heart.
And if you know me, you won't be surprised to know that by the end, I was in tears.
BAU, Artist at War, directed by Sean McNamara, opens September 26.
You can watch the trailer and find showtimes at BAUmovie.com.
That's spelled B-A-U-movie.com.
Eight years ago, I was completely overwhelmed.
My life was full with good things, a challenging career, two teenage boys, a growing podcast,
and a mother who needed care.
but I had a persistent feeling of I can't keep doing this.
But I valued everything I was doing and I wasn't willing to let any of them go.
And the advice to do less only made me more overwhelmed.
That's when I stumbled into something I now call the Still Point Method,
a way of using small moments throughout my day to change not how much I had to do,
but how I felt while I was doing it.
And so I wanted to build something I wish I'd had eight years ago
so you don't have to stumble towards an answer.
That something is now here
and it's called Overwhelm is Optional:
Tools for When You Can't Do Less.
It's an email course that fits into moments you already have
taking less than 10 minutes total a day.
It isn't about doing less.
It's about relating differently to what you do.
I think it's the most useful tool we've ever built.
The launch price is $29.
If life is too full but you still need relief from
overwhelm, check out Overwhelm is Optional. Go to oneyoufeed.net/overwhelm. That's
oneyoufeed.net/overwhelm. Let's explore some other areas where your relationship with technology
feels ambiguous, where you're sort of like, well, I actually really benefit from this, but I also,
there are reasons I don't want to do that. So Amazon is for me a big one. I feel like I'm
perpetually wrestling with this one.
What else is there for you that falls into this kind of category?
Google's obviously an example, right?
Because my Google searches are improved.
My own catalog of my life is improved by the fact that Google is collecting all this
information about me.
One can argue.
That's debatable.
But, you know, that's the argument I'm making.
But this is true of all kinds of products.
I mean, I use social media.
And when I use social media,
I am speaking to algorithms rather than human beings in some ways.
I'm speaking in a way that is different from authentic human-to-human communication because
of the role of those algorithms in mediating what gets shown to people and what doesn't.
And yet, the fact that social media exists and that I'm able to use it plays a big role
in sustaining my career as a freelance journalist, where I need
to develop an audience of my own who knows about my work. I need to not rely on outside
publications to publish my work because I'm working in a very fragile industry. And so, you know,
that's another place where it comes up. Another one that we haven't talked about yet is
AI, obviously. And what I find interesting about AI is that we read a lot about all the ways in
which AI can make things faster and easier, and yet the jury still seems to be out on whether
it's actually making things faster and easier or not. There was this recent study where they
asked a bunch of computer programmers, experienced programmers, to use AI models to help them
in their programming. And these individuals thought that they were saving a bunch of time,
like their self-reported time savings was significant.
But the sort of objective time they were taking to do their work was actually measured.
And it turned out they were losing a bunch of time.
So it was actually taking them longer to do their work when they were using AI than when they weren't.
So that's a really interesting one because I think with certain past technologies, like Google is a decent example or Amazon.
The benefit is a pretty clear and legitimate benefit.
and then it has to be weighed against those costs.
But here, I think even the benefit is a little iffy.
Yeah.
I mean, AI is obviously another big one to go into
and one that produces a whole variety of feelings and emotions.
Yeah.
And let's come back to AI for a second.
But I want to pivot for a minute because as we talk about AI,
you can't talk about AI without talking about OpenAI.
And you did a profile of Sam Altman, the founder
of OpenAI, a number of years ago. But more broadly, you really talk about and show in this book
how most of these big technology companies start out with a certain idealism.
Yeah. You know, Google's "don't be evil" is the most prominent example. But OpenAI, too,
has, you know, started with a real premise of AI safety and all of that.
And you show how almost all of these companies over time ended up getting co-opted into moving away from that value.
That's right.
Yeah, I mean, I think as human beings we're often driven more by our own intellectual curiosity and passion, I think, than by a desire to make a bunch of money, right?
The problem with technology companies is that they tend to be really, really expensive to build.
And so if you're the Google founders in the mid-90s, or if you are Sam Altman of OpenAI in the mid-2010s, the first thing that you need to do, if you want to pursue this intellectual passion project that you're so excited about, that you think can change the world, is to go out and find some people who are going to give you the money to build it.
And those people are investors, and those investors certainly have their passions,
but it's their job to put money into a venture so that it grows into more money.
And so you end up in a dynamic in which whatever the sort of intellectual or philosophical
or moral or ethical goals were behind this project of yours, it immediately, it very quickly
becomes bound up in the goals of these investors, who become the part owners of the
company, right? And so this isn't like just some kind of abstract situation. It's literally the people
who own the company get to determine the goals of the company. And if that goal is to make more money,
there you go. Precisely. And even if you are, you know, the founder, the CEO, if the people who own
the majority of the company think that you're acting in some way that doesn't further the company
making as much money as possible, you can absolutely just be removed. And so it's this really tricky and
weird situation. I mean, I think we all wrestle with these things to different degrees in our
lives. You know, what tradeoffs are we willing to make for money? But you're right. I think it's the
nature of technology companies being so expensive, right? Like I was able to build this, you know,
little tiny business that I have by myself without getting investor money because it's a little
tiny thing, right? Yeah. Right? But I, you know, had I gone out and gotten somebody to say like,
oh, I'm going to put a million dollars in the one you feed.
It changes things.
Right.
You know, I worked in technology companies, software startup companies early in my career.
And I saw very clearly what VC money did to a company, you know.
And it was mostly not good.
Right.
It turned out mostly not to be good.
Yeah.
And I don't, I mean, I think it's a little bit of a stretch to say they're the same thing.
But in some ways, I think
the situation that these corporate founders find themselves in is not so dissimilar to the
situation we as users of these products find ourselves in, right, where it's not as simple as
saying, listen, I have ethical principles that I need to stick to. There end up being these
competing interests. Yeah. Well, I think that that sort of motivational complexity is such a key
element in life that we don't often talk about in lots of different areas, right?
Like, you know, a lot of my work is in how people change and learning to recognize what
the motivational pulls actually are and being able to be honest about what they are is so
important, right, in order to actually find a way out.
But we tend to not do that.
We tend to try and say, like, all right, I'm going to, I think I should do this.
So that's what I'm going to think about.
That's what I'm going to talk about without recognizing there is a lot pulling us in the opposite direction.
Yeah, and I think, I mean, I'd be curious about your thoughts on this, given your expertise and all the many interviews you've conducted on this question, right, of like how people change.
But, you know, it seems to me that one of the challenges when we use these products is that the benefit that we get feels really immediate and really tangible, and all of the costs and consequences
end up feeling quite abstract and intangible. And so the benefit, even though it's relatively
small and one could argue that the cost is enormous, the benefit ends up feeling to us like it's
bigger or more meaningful or more actionable. Yeah, I mean, as humans, we are not good at this
thing. I think the technical term in psychology is, well, I guess it's also the term that you would use
in money, but I think they call it delay discounting. Maybe they call it
something different, but it basically means I value things that have any delay on them
less and less and less.
Yeah.
Right?
Yeah.
Because I just, very relevant.
Yeah.
And I think the other one is that a lot of these things that are bad are happening to other
people or might happen at some juncture.
And they're far removed.
I've been thinking about it because we're re-releasing an episode with Peter Singer, which will probably already
have come out by the time this airs.
But when I read Peter Singer's book, The Life You Can Save,
it shook me up and it has kept me shook up for years because he poses a question.
And he basically says, imagine you're walking down a road.
I may not get this exactly right, but this is close, or at least what's stuck with me.
You're walking down a road and there's a pond right there and there's a child drowning in it.
If you did not go and save that child, you would rightly be considered a monster.
You would think yourself a monster.
What is wrong with me?
How on earth did I not go save that child that was 12 feet away?
But there are children dying all around the world all the time that we have the means to help, right?
I have the means today to save children's lives that I am not saving.
And I don't even mean like I have to donate every last dollar I have, but I could do more.
And that, the logic of that has haunted me.
Yeah.
Yeah. And I think that that's a little of what we're talking about here. Like, the people whose lives these labor practices shape, if I got to see them come home every day from work at Amazon and feel, you know, that they're still on public assistance because they don't make enough money, and they feel demoralized by what they do, and they're tethered to it all. All these things I know, quote unquote, but they're still remote. And that remoteness,
I think, is another, I don't know what the term for it is, Peter Singer probably has a word for it,
but that remoteness is another thing that makes it very difficult to act in accordance with our values
because the thing is not present. Absolutely. And I think that's built in to global capitalism
as it exists now, right? And we were talking earlier about this role that we have
as people living in the U.S., of consumers, as sort of our primary role. And a thing
that makes it easy for us to focus on things like
convenience and low price and usefulness is that a significant amount of the cost is borne
by people on other continents who we will never see, whose lives we don't know about. And that's true
when it comes to the labor cost. And it's also true when it comes to the environmental cost.
Right. There's an argument,
though, that, I mean, the capitalist argument is that we are creating jobs that wouldn't exist
at all otherwise.
Yeah, that's the argument.
And, you know, I think I recently read Bill Gates's memoir, and in a lot of ways, I think I view
the world differently from somebody like Bill Gates.
But what I find pretty compelling is the frustration of people like Bill Gates
that, when we criticize capitalism and technological capitalism, this globalized
system that we've created for ourselves, we don't spend a lot of time talking about the
significant decrease in poverty and mortality rates as a result of global capitalism and the
role of technology in global capitalism. And I think it's a really fair point.
It is interesting, because if you look at so many of these measures we would consider progress, the trends are clear, you know: more women being educated, the number of deaths in childbirth, the literacy rate, the population
in poverty. I mean, you look at all of this stuff and you're like, okay, from one perspective,
more people are better off than they've ever been. And I think that's the key word. And it's
not always that simple. Because what comes along with perhaps a higher literacy rate is also perhaps
the undoing of a culture that supports people in a certain way, right? We're taking our measures of
what success and goodness are, and we're then putting them on the world as a whole.
Yeah.
But I do think there is good news in a lot of this, but I don't think it's the unquestioned
good news that certain people would posit.
But I think so many people think the world is getting worse, and I think it's nice to have
some counterpoint to that, because I actually don't think the world is getting worse.
I think by most measures, the world for most people is
getting better. It may be getting worse for a certain group of people in a certain place,
mainly us Americans who are like, what the hell is all going on when the rest of the world has
been living with that sort of chaos for, you know, for all of time. Right. Right. Yeah, I know I agree
with you on that. Let's talk a little bit about the negatives to this. We haven't really put that
fine a point on them. With Amazon, we did. We talked about, you know, labor laws and, you know,
the factories where things are produced and the way, you know, drivers here are treated.
And, you know, there's obviously the income inequality, where, you know, fewer people have more and more of the wealth.
But what are some of the other, would you say, problems with let's just take a couple.
Let's not go to AI yet, but, you know, Google, Facebook, Instagram, you know, like what are the other costs for us as people that we're not seeing?
Yeah.
I mean, I think there are a number and they can be difficult to talk about.
but something I think about, I think in part because I'm a writer, is how communication changes depending on who we're talking to. Human communication has always been used for communion, for liberation, for good. And it has also been used by those in power to further accrue their power and further accrue their wealth, right? And so,
communication has always been a complex thing. What we have now, because companies like, we interact
with companies like Google and meta and Amazon, in part through our use of text, is this world in
which we have this new audience when we're speaking, right? So when I search for something in Google,
I'm formulating it in a certain way for Google. When I'm posting on social media, I'm posting in a certain
way based on what I know algorithms favor, right? I may be using emojis more than I otherwise
would. I might be including a picture of a cute pet in my post when I otherwise wouldn't, right?
And so the way in which we communicate ends up being deeply influenced and maybe one could
argue corrupted by the fact that anytime we're communicating, if we're using one of these
platforms, we're communicating to other human beings. And at the same time, with the same
sort of communication act, we're communicating to a big technology company's algorithms, which
is changing the way we talk. Right. And it's really very, very subtle. Yeah. You know,
it's often very subtle. But it is absolutely true. I started this business a long time ago,
11 years ago. I've been doing this podcast 11 years. And how I used to be able to use social media
as a way of promoting this podcast doesn't work anymore.
And I have a really hard time.
Maybe this is a benefit of being a little bit older is that I don't want to do it the way that it needs to be done today, which is to the detriment of the business.
There is unquestionably detriment to the business, right?
I could go out on Instagram and post 100 profound thoughts that will get far less interest than, like you said, a picture of my puppy.
You know, or post a picture like, Lola died three months ago, but I'm getting through it, you know?
Like, and I just, it's not me.
And even with like podcasting, I think more and more it's, you know, I call it becoming YouTubeified, right?
Like you've got to be sensational enough in what you say to drive the algorithms.
And I agree with you 100% that it does change the way we interact.
And you look at something like Instagram. As I've been fortunate enough to be able to travel
outside the country a little bit the last few years, which I hadn't really done in any of my
life before this very much. And what I'm struck by as much as anything is the Instagramification
of everywhere I go, that common like coffee house look that we all love, you know, but it's everywhere.
I mean, I couldn't tell you the difference in, I mean, you can find places, but a lot of places
are, I'm in Lisbon, Portugal, I'm in Amsterdam, I'm in Paris. I couldn't tell you which
city I'm in based on those places. Right. I was reading the other day that in Barcelona,
there are so many tourists, as we've all heard, that they're creating some kind
of plaza with these, like, Instagram-y backdrops where people can take their pictures
with the Sagrada Familia, the famous church, in the background, so
they're not all crowding in front of the church. And so essentially, like, the actual public
landscape and infrastructure gets changed for the sake of how people communicate on social media.
Yeah. I mean, I was in Barcelona and I'm not Christian and I went to Sagrada Familia because,
I don't know, you just hear like you kind of have to go and it's the thing. And I was just extraordinarily
moved by it as a building, as what it did and how it does it. And yet, at the
same time, like you said, I mean, all around is just selfie-taking, which, of course, I mean,
I'm taking a picture in there, too. I'm not trying to cast aspersions, but it does change the very
nature of the places that we're in. I mean, so there is an effect to all of this.
Yeah. On this topic, you wrote, you know, our subtle self-modification according to technological
capitalism's norms is so pervasive that certain types of performances have their own names,
Instagram face, TikTok voice.
And then you go on to say it recalls W.E.B. Du Bois's description of a double consciousness,
a black person's sense of always looking at oneself through the eyes of others, of measuring
one's soul by the tape of a world that looks on in amused contempt and pity.
And I love that idea of a double consciousness.
I was talking with somebody last week and they also mentioned something.
I don't remember the name of it.
It's the something effect.
And it's because they feel like we spend so much time looking at our own face and what that does.
Like right now, you and I are talking and I'm, you know, 90% of my attention, 95% of my attention is on you.
But it's not lost on me that I'm right there.
Right.
And I do so much of my work in this sort of virtual thing where I look at my face all day long, not overly intentionally, but it's there.
And even that is starting to have an effect on the way that we are.
Yeah, like just the awareness of yourself as a kind of object, and not even just an object of other
human beings' gaze, like with what Du Bois was describing, but an object of the algorithmic
gaze, right?
Like the object of these corporate algorithms, whose determinations have these
real consequences. I mean, you point out with your podcast that the extent to which you can get
people to share posts on social media about the podcast is going to have some role in determining
how many people are listening to the podcast. And that work is important to you. And similarly,
you know, I don't know that I care in the abstract how many people follow me on social media,
but it is true that when I have a new book out or I've written an article that I want people to
read, it helps me if I have a large social media following that I can broadcast that
to. Absolutely. I mean, it ties into your livelihood. I mean, the book deal you get next will be
somewhat based on how well this book sells and also based on what does your platform look like.
Absolutely. I mean, I've got a book coming out and I think it's a good book and I'm really
proud of it and I'm excited about it. But I got the book deal I got because of the platform of The One
You Feed. I don't think they were like, oh my God, this idea, I've never heard of a book idea so
good. Or this guy's the next, you know, Shakespeare. I don't think that was, you know, the idea
was good enough. The writing was good enough. But what moved the needle more than anything was a
platform. And I'm sure there are people out there who can write far better than I do who are not getting
book deals like I got. Sure. Well, I bet it's going to be a really good book. But also, you're right.
It's going to be a great book. And I have enough maybe sense of myself in the world to know, like, I'm not, like, you're a great writer. I'm not a natively great writer. I think I've written a really good book. And I've had people help me who are really good. And, you know, I feel really, I feel far prouder of it than I thought I would. And there are people who study deeply to become writers. You just don't become good at something because you're like, I'm going to pick it up. You become good at something because you do it a lot and you practice. And it's a craft.
Yeah. Yeah, no, that's true. One of my jobs is advising and mentoring people who are working on books. And when I first started doing it five or six years ago, I was very focused on, like, I thought the quality of the writing was sort of the only thing that you needed, right? And now, you know, in order for me to work with someone, there has to be this baseline. But I recognize far more that when it comes to getting an agent and selling the book, these external factors, like how many people follow you on social media, do play a significant role. Yeah, there's a term called, I don't know if you've
heard it called audience capture. And it speaks to this. In a way, it doesn't mean you're capturing
audience. It means your audience captures you, meaning you begin to, you do something, if you're
a creator of any sort, you do something and it gets some response. It gets some people to pay attention
to you. And slowly, what that audience wants, if you're not careful, is what you become.
Yeah, right. And oftentimes it narrows and narrows and narrows. You know, you're a multifaceted person
who happened to share, this is a silly example, but who happened to share about the plants you love,
and now all of a sudden you're the plant person. Yeah, right. And because we're playing the algorithmic
game, you know. Exactly. You could spend a lot of money hiring people whose whole role in life,
the only thing they do, is to know how to manipulate that algorithm to your benefit.
And they'll come in and say, here's all the things you need to do to try and please the algorithm.
Which is a really dispiriting way to go about things.
Yeah.
And we're essentially in that process.
If we're not careful, we essentially turn into products ourselves.
Yes.
Yes.
100%.
Yep.
I'm going to take us in a completely different direction.
And then maybe we'll come back around to AI.
Because the place I would like to go is this is a book about our relationship with technology.
But it's also a book about you, your life.
It's a memoir.
And there's a significant portion of it is about your sister.
So I'm wondering if you could share, if you're open to it, sharing a little bit about that story, your sister.
And then I'd like to explore all of that sort of through the lens of technology also.
Yeah.
So when I was in high school and my sister was in high school. She was two years older than me. She was diagnosed with this type of cancer called
Ewing sarcoma. And it was a really serious form of cancer. And she started treatment right
away. And so when she was in her junior year of high school, I was in my freshman year. She
went into treatment where she would be in and out of the hospital for weeks. And then she went
into remission and got better. And then it came back again. And then she went into remission again and
got better and went away to college at Duke. I went away to college at Stanford, and then she
got sick again, and she passed away when she was in college and when I was in my freshman year
of college also. She was my only sister. She was my older sister. We were really, really close.
She was the person who taught me a lot about how to be a person in the world. And so it was a really
significant loss for me. Yeah. So walk us through a little bit. I'm sorry about your sister.
And the way you write about her and the relationship is really beautiful, and how even
that led to further downstream effects in your family. Like it precipitated an unraveling
of many things. Yeah. Talk to me about how that intersects with technology in this book.
So, you know, I think something, I started using the Internet when I was in middle school in the mid-90s.
I'm what they call an elder millennial. I was born in 1982. So that's, you know, that's where I am demographically.
And I think, you know, when you're coming of age, when you're a teenager and you're going through difficult things, it can be hard to talk to other people about it.
I mean, for me, I was really worried about my sister, but I didn't want to worry her by talking to her about it.
And I didn't want to worry my parents by asking them all my darkest questions about it.
And so I went to Yahoo, which was the most well-known search engine at the time, and started asking it questions about what was going to happen to my sister, what her prognosis was.
And then, many, many years after my sister passed away, just a couple of years ago,
I write that profile of Sam Altman of OpenAI and start to learn more about the technology
they're building and end up getting early access to this AI model that is a predecessor to ChatGPT.
And the way the model works is you type in some words and press a button and then it kind of
finishes the thought for you.
And when I started playing around with that, I was like,
Huh, you know, I really have a hard time talking about my sister and her death and my grief.
This thing says that it can write for me when I'm not able to write.
Maybe it can communicate on my behalf.
So I hadn't noticed this.
It hadn't occurred to me until very recently.
But I think that's sort of part of the same phenomenon,
the way in which like these products by big technology companies seem to be safer than other actual human beings,
which is really insidious, but I ended up, like, going to this language model and asking it to write about my sister's death and my grief.
You go through in the book, sort of, you give it a sentence and it, you know, spits something out.
Then you give it a little bit more. And those of us that use AI to some degree may be getting used to its strangeness.
But seeing it in that way, in that book, the way you did it, I just had another of those moments.
It's like, what on earth is this?
You know, it's such a strange thing.
It is.
Yeah, I mean, what was weird to me is that at first, you know, the first sentence I wrote was when I was in my freshman year of high school and my sister was in her junior year, she was diagnosed with Ewing sarcoma.
That's the name of the cancer she had.
And then I pressed the button.
And the first thing it spit out was like quite generic.
And then it ends with this line that was like, she's doing great now, which is the opposite of what actually happened to my sister, right?
Like, there could be nothing further from the reality.
So then I deleted all that.
I kept my initial line and then I wrote more myself.
And I kind of did that process over and over.
And the thing that was interesting to me is that the further along I got in this process, the more material of my own that I gave to this model, the closer it seemed to get to generating
text that did seem to have some relationship with actual grief, with actual loss,
with actual sisterhood, right?
And there were some lines, like the lines that it generated, like many of them were
ridiculous and, you know, nonsensical, but then some of them were very poetic sounding.
For me, as a reader, I was able to read meaning into them.
And the reason I phrase it that way is that the
language model itself wasn't trying to do anything in particular. It doesn't have
consciousness, right? It wasn't, like, trying to write about grief. It just was generating language.
And then I was making meaning from that language, but I was able to make significant meaning
from it. Yeah, I think that's one of the strange things about it is how it can write in ways
that make it seem very intelligent, very sensitive, very poetic. And it's gotten that by basically
stealing, that's one word for it, or gathering all of the world's knowledge, right? And so in many
ways, it is a reflection of us. What I was looking for is I had a guest, and I cannot remember
the name of their book now, but it was a book where one of their parents had died, and they had
like a seven-year-old child, and they weren't quite sure how to talk about death with their child.
So they started asking AI about how to do it. And it turned into a long,
spiritual conversation where it was, you know, you could tell what AI was pulling from. It was pulling
from the Bible and the Tao Te Ching and, you know, the Bhagavad Gita. And I mean, it's, you know,
it's synthesizing all that. And in ways it was remarkably good at what it said.
So it's just such an odd thing. You're a writer, and so I think, you know, those
of us who create content are maybe the ones who are most directly spooked by AI, although I think everybody
might do well to be spooked to some degree. But you describe in the book
both your revulsion at the idea of it and your curiosity and that your curiosity had won
out. What's your relationship with it like now? You know, one thing I will say about that
experience of writing about grief and trying to ask this technology to produce
language on my behalf is that ultimately what became obvious, and maybe it should have
been obvious from the start, was that this technology was clearly incapable of expressing
something about my reality because it wasn't me and it wasn't even a human being. It was a
machine, right? And so ultimately, even though there was all this language it generated that I could
read meaning into when I communicate, and I think this is not just because I'm a writer. I think
it's because I'm a human being. When I communicate, the gratification I get from that communication
is from having made the effort of communicating myself. And it sort of does nothing for me
if a machine does it for me. I mean, it doesn't feel that different from like using a magic eight ball
or something to produce words.
Yeah, I think that's true to an extent.
But as you mentioned,
the more information you gave it,
the more it began to create something
that was like your experience.
And my experience has been a very similar phenomenon.
The more I give it of me and my thoughts
and what I felt,
the more it can create something
that in some ways approximates
me. Now, I find it much more useful to have it ask me
questions. Yeah. Right. You know, again, as somebody who's not natively a great writer,
it was a really useful tool to be like, ask me questions about X, Y, and Z. And then I would
start answering those. And that, you know, that turned out to be really helpful for me. It was
like having a collaborative partner. So here's convenience again. We talked about convenience
earlier. For you as a writer, you recognize that the value in writing often is the wrestling
with the words themselves. Exactly. And to be clear, you know, I thought this AI model
over time started to generate texts that seemed to say something about grief, about loss,
about sisterhood, but none of it was about my experience, specifically. It didn't feel
like it was saying something about my experience.
And yes, exactly.
I think being a writer, maybe I'm especially attuned or I especially find value in being
the one to express it myself.
But then that brings up this other question, which is, you know, for me as a writer or
for you as a podcaster, our livelihood depends on the people who are ultimately choosing
to read my books or to listen
to the podcast. And so that raises this question of, like, if an AI model could hypothetically
generate text in the style of my writing that, you know, ended up creating a plausible text that could
compete with one of my books, or if an AI model could use your voice, right, to generate a podcast
that competed with your podcast, would people find it compelling? Would people pay for that? Would
people listen to it? Would people read the book? And I think that's where the jury's out.
I think we're at a real interesting inflection point there because you could train an AI model
on my voice and it would sound more or less exactly like me. And I've done 800 episodes, so you
could train it on how to interview like me. And my guess is that today, with the technology today,
it would be 75% as good as what I do, which is deeply disconcerting. And I ask myself that
question about who would care, and who wouldn't, that there's a human here.
Yeah.
There are studies around AI as a therapist.
And the ones that I've seen seem to point towards this, that people will generally rank
an AI therapist as more empathetic, listens to them better, they like it better, up until
the moment that they know that it's an AI therapist.
At that moment, the whole thing crumbles.
That's really interesting.
But I don't know that five years from now, 10 years from now, that holds true.
In the same way that, like, kids who grow up with AI, I think there's going to be a certain
percentage of people who are going to say, I want authentic humans.
That matters to me.
But will most people?
I don't know.
And that is deeply disconcerting over time, that if a machine can do the thing that a human does as well
as the human does it, do you need the
human, right, by a certain logic? Now, there's a humanist logic that, you know, inside me is bristling at
every bit of that, of course. But I think it's a, I think it's an interesting thing to, to wonder
what this all looks like in five years. I mean, I feel so completely uncertain what five years
is going to look like in a way that I never have before in my life. Yeah, I mean, one thing I find
interesting about that question is that it brings us back in some ways to the conversation about
choice and agency, and to the whole subject, right, of the parable of your show in some ways,
which is to acknowledge that we do have agency, even when at times it feels
as if we don't. And I think we're at this really interesting inflection point with AI where
I was looking at, I think, a study from Pew recently that said something like 36% of adults in the U.S. have used ChatGPT.
And this was from 2025.
So it was relatively current.
And that figure surprised me, because it seems in the zeitgeist as if this technology, this product, is so much more deeply entrenched.
But it's not.
And I think, you know, it sometimes feels as if the way in which our culture moves happens in a way that's divorced from our intentions or our will.
But in reality, like we make choices as individuals, as communities, as societies to create the world we want to live in.
And so I'm interested in sort of pausing in 2025 and saying, okay, well, like, what is inevitable and what's not inevitable?
I think most of the time, so much more is not inevitable than is inevitable, because we don't know what the future holds and a lot of it depends on the choices we make.
And so I think we could decide now to define a podcast as something that a human podcaster produces with human guests. And we could decide to define a book, a novel, as something that a human novelist writes for human
readers. And that way, you know, regardless of how the technology changes, regardless of the
extent to which the technology can convincingly sound like me or sound like you, we as humans
have drawn a line in the sand saying, here's what we consider acceptable, here's what we're
interested in, and here's what we aren't. Yep. Well, I think that is a beautiful place to wrap up.
I think you brought us kind of all the way around to where we started and left us with a message of
hope that we have a say in the direction this all unfolds.
Thank you so much for coming on.
I really enjoyed the book.
I've enjoyed your various writings, and I appreciate the subtlety and the nuance with which you're
writing about these things.
Well, and I appreciate that about your line of questioning, too.
So thank you for having me.
Thank you so much for listening to the show.
If you found this conversation helpful, inspiring, or thought-provoking, I'd love for you to
share it with a friend. Sharing from one person to another is the lifeblood of what we do.
We don't have a big budget, and I'm certainly not a celebrity, but we have something even
better, and that's you. Just hit the share button on your podcast app or send a quick text with
the episode link to someone who might enjoy it. Your support means the world, and together we can
spread wisdom one episode at a time. Thank you for being part of the One You Feed community.