The Oprah Podcast - Bruce Holsinger: "Culpability" | Oprah's Book Club
Episode Date: July 8, 2025. BUY THE BOOK! https://www.spiegelandgrau.com/culpability https://books.apple.com/us/book/culpability/id6740623159 https://www.chirpbooks.com/audiobooks/culpability-by-bruce-holsinger This episode of Oprah's Book Club: Presented by Starbucks features coffee and conversation with award-winning author Bruce Holsinger in a café in Seattle, Starbucks' hometown. Chosen as the 116th Oprah's Book Club selection, Holsinger's brand-new scorching summer read, Culpability, is an intense page-turner about a family that survives a crash in an autonomous car only to ask the question: who was at fault? Was it the teenage son who was texting in the driver's seat, the distracted parents, or the car itself? The book that Oprah calls "a modern tale of our times" also touches on our current and fast-approaching future of living with artificial intelligence, including chatbots, self-driving cars, drones, and other non-human forces shaping our lives. Oprah and Holsinger are joined by an audience of readers as they enjoy a Salted Caramel Mocha Strato Frappuccino® drink and talk through many thought-provoking questions about the book.
Transcript
MUSIC
Hey, audience.
CHEERING
Hey, hey, hey.
Bruce Holsinger...
Alright. Have a seat.
Bruce, these are your people.
Indeed.
These are your people who have read your book.
Didn't you love this book?
Yes.
Oh, my God.
That Alice Honeychild.
LAUGHTER
OK. So many things to talk about.
I'm so excited.
Hi, welcome, everybody, to Oprah's Book Club,
presented by Starbucks.
And we are in the beautiful Emerald City of Seattle,
the home of Starbucks.
They've welcomed us into their headquarters.
Really fun to be here.
And I'm joined in this cozy cafe with an audience
full of readers, with a good book,
good coffee, and good company.
Cheers for this month's summer pick.
Starbucks is pairing a salted caramel mocha strato frappuccino.
Okay?
It's a modern twist on a classic and a sweet and salty indulgence.
Icy cold frappuccino base blended with rich mocha sauce topped with salted caramel,
cream, cold foam, and finished with a drizzle of salted caramel.
Okay, so for my 116th book club, it is a scorching summer page-turner
and a tense family drama, wouldn't you say?
A lot of tension in that car.
It's a thrilling read to bring with you to the beach or the pool or on a family trip.
I'm telling you, I'm taking a family trip.
I've already ordered eight copies of this book.
Everybody has to read it and then we discuss it at dinner.
That's what we're going to be doing.
It's called Culpability by award-winning author
Bruce Holsinger.
Now, I love the eye-catching waterfront cover
that captures the essence of the story,
so whoever designed this, bravo to you.
But Culpability, I have to tell you,
is a modern tale for our times,
tackling an issue that we're just barely beginning
to grapple with, AI.
Now, it's going to give people so much to
think about. The second I finished it, I called Bruce.
Hello.
Bruce?
Yes.
Oprah Winfrey is calling you right now. He didn't believe it was me.
Oh. Hi.
Hi.
Who would?
Yeah. I'm calling about Culpability.
I just read it and I want to choose it as my summer read for 2025.
Thank you so much, Oprah.
I'm shaking.
I'm...
Oh.
I welcome you to Starbucks and the book club.
Thank you so much, Oprah.
Yeah, yeah.
I'm so excited.
I said I'm so excited for everybody to read this book.
So I want to ask the audience, what did you think?
What did you think of the book?
So good, right?
OK, all right, OK.
Christine, where are you?
Tell us what you thought.
Yeah, I think it was super thought provoking.
I'm a new mom.
And so it really gave me a lot to think about in terms
of how I want to approach technology.
I've thought about sort of the obvious: screen time, cell phones.
But it opened my eyes to how many blind spots I have with emerging technology like AI.
Yeah, we all do, I think. Yeah.
Yeah, okay. Manny, how about you?
Hi. So for me, I thought, you know, AI is such a relevant topic,
obviously, and it's coming at us so quickly.
But it can feel abstract and kind of hard to even wrap our minds around. So that,
juxtaposed with this family that had very family-like tendencies, that I could know them, you could know them,
it really humanized the conversation and made me think about it deeper.
Wonderful. Thank you. Christy?
I think, for me, it's the human experience of this generation.
And what sits with me after reading it is I just,
Alice is in my heart.
And I think about where's Alice later in life.
And I think about that related to my nephew, who's 16
right now, and what's his generation going to be like.
So thank you.
Well, I have to tell you all,
Chrissy and I work out together,
and we're in the gym, and I was telling her,
she says, what's going on with you?
And I said, oh God, I've read seven books,
I can't find a book for summer.
And she said, oh, I may have something for you.
And she comes the next day, and she hands me this book.
And usually when people hand me a book,
I'm like, okay, good, okay.
But I actually read it overnight, so I thank you,
Christy. Christy is the reason I'm here with this book.
Okay. So this is your fifth fiction book, and you also
teach at the University of Virginia, and I hear you
dedicated this to your students. Why?
Yes. Thank you for noticing that. My students are, you know, at the heart of what I do, I think. I often think
that my students, whether they're undergraduate students who are taking my literature survey
from Beowulf to Milton, or students that I teach in fiction writing classes at a
local non-profit in Charlottesville called WriterHouse, or graduate students
I'm working with on their dissertations, I always feel like they teach me more than I teach them, and that
the experience of working with students of so many different ages and backgrounds
teaches me about the world, teaches me about books, about the literature that I thought I knew.
And I realized, you know, as I was putting the final touches on Culpability, that I hadn't
dedicated a book to my students yet, to any of my students, let alone to all of them.
So it felt like what I wanted to do.
So when you were putting the final touches on it, did you know it was going to be Culpability? Did you always have that name?
No, I didn't know. It was called a lot of different things, and I've now repressed all the other titles, because obviously this is the title.
And I chose that word because the book has that
sense of
looking at problems of moral responsibility.
Who's at fault? Who's responsible?
And in law and in history, I think the word
culpability has such an interesting resonance, because, you know, you can be
culpable without being guilty.
You can be guilty without being culpable.
So I wanted that kind of slightly ambivalent sense
of our responsibility for what we do and don't do.
Yeah, I think it was actually the perfect word
because everybody in the end was culpable.
Indeed. Yes.
Okay, so Culpability is a family drama
told through the lens of a husband and father, Noah.
And it's about living with the technology
confronting us today, artificial intelligence,
self-driving cars, chatbot friends,
autonomous drones and smart homes.
So what gave you the idea to tell this story, to write about what happens through this family?
Well, this story, this novel began with the setting, actually.
And that's not usually the way that I operate.
Often, when I begin a novel, I'll have a character
that I imagine and cling onto,
or a situation in life that I want to explore,
or a moral arc, or
just a plot. In this case, it really was the setting. And it was the first summer of the pandemic.
Mm-hmm.
And our family wanted to get out of Charlottesville for a little while,
so we Airbnb'd this house on the Northern Neck of Virginia, right near the Chesapeake Bay.
It was a house on an inlet
There were kayaks and I went out kayaking with my sons the first day there.
And on the way out of the cove at the point,
you know, it's a pretty rustic rural area.
And there was this absolutely gleaming compound
with a fake lighthouse, with a fake beach,
with a pristine lawn, and this old,
what was probably once a beautiful old charming farmhouse
that was renovated to the teeth.
And as we were kayaking back in, I heard this noise.
If you've read the first part of the book,
you know what that noise is.
And this helicopter lands on a helipad.
And a couple of folks get out and we just kayak past.
I just thought that was a little bit weird.
Nothing happened.
And the plot that you read
did not unfold with my family, thank
God, over the course of that week. But that got into my head.
But it registered.
It registered like a brain worm. And over the next few years...
Don't you love the way authors think? Like, I love this. Yeah, I love the process.
And it just, you know, the story layered itself in, right? Then I thought, you know,
where is that going to take place? What's going to happen?
So it really did begin with the setting, which becomes kind of a character in the novel, and everything else
fell into place once I figured out what I wanted to do.
How long did it take you to write it, from beginning to end?
Probably about a year and a half, and then there was a revision process,
another revision process.
Did you know where the story was going when you started?
Did you know that it was going to have the ending that it had?
No, I'm not an outliner.
I often get myself into trouble for that.
I have a lot of false starts.
I write... there are plotters and pantsers: seat of your pants, or a careful plotter.
I'm definitely a pantser.
I am writing.
I often have no idea where things are gonna go.
And then at the end, there's too many...
So as the writer, did the story start where you started it?
Uh, no, actually. The very beginning, the first version,
it started with the family going down to the northern neck.
To the bay.
And then there was a lot of backstory
shoveled in after that.
And then I realized I really want to start with the accident.
So instead of getting that as backstory,
you get it right up front, you get it in the first chapter.
So there's no mystery about how the book begins.
I think you do an incredible job of challenging the reader
with the moral and ethical dilemmas
around artificial intelligence.
And I read that you spent three years researching this book.
And in that research, who did you talk to and what surprised you?
I talked to people who work on the ethics of AI.
I have a friend in the law school at UVA who works on algorithmic bias.
I talked to her.
I talked to some lawyers at a few tech companies.
I talked to people working with the machinery of AI.
And I also just did a lot of reading, as I do for all of my research.
And I suppose one of the biggest surprises for me... well, there's a few.
One is that everybody that you talk to in AI, and you've probably already had this experience,
has a P-Doom number.
Do you know what this is?
P-Doom is a number from, it can be from one to 10,
or it can be from one to 100.
It's your percentage of certainty
that AI will lead to human extinction.
So you talk to someone, I haven't talked to him,
but people like Sam Altman,
or the guys who run the podcast Hard Fork, they all have a P-Doom number.
And most people these days, I'll just to reassure you, are pretty low.
But it is out there.
And then a few other things surprised me.
One of them is how many different views there are among experts in the field about where
this is going.
You have minimizers, people who just kind of shrug it off and think, oh, large language
models like ChatGPT, it's just glorified autocomplete.
And we don't really need to worry about it, except as far as it's destroying the environment
and creating all kinds of bias and misinformation.
And then there are people on the other end of the spectrum
who look at something like autonomous warfare
and think, you know, we need to watch out.
This is coming and it's coming now.
So there's a real just a spectrum of views
about where we're going, where we are.
Well, we don't want to give away too much of the plot,
especially that jaw dropping twist at the very end.
A lot of people didn't see that coming.
I knew something was coming.
I didn't know THAT was coming.
Uh, but my hope is that you will buy Culpability
and you will read the book and get a copy for a friend,
because you're gonna want to talk about it
with somebody afterwards.
And you're gonna want to call somebody. I
called Bruce. So will you set up the story for us? The call, you know...
The call's fine. I can talk about the call all day. I've only talked about it to two people
so far. But yes. So the book begins when a family of five is driving a minivan, or driving a semi-autonomous
vehicle down for a lacrosse tournament in Delaware.
And the older son, the older child in the family, Charlie, is at the wheel, but he's
not really driving.
His dad is sitting next to him.
He's on his laptop trying to finish a memo.
In the back are Charlie's sisters and his mom.
So his sisters are on their phones doing various things.
And then his mom is writing in her notebook.
The mom is named Lorelei.
And she's kind of, I think, the beating heart of the book.
She's a really, you know, the love of Noah's life, but also the great mystery of his life.
And she's a world leading expert
on the ethics of artificial intelligence.
And her life in some ways,
her career is a real mystery to Noah.
And what happens in the car,
there's this horrible accident.
The minivan collides head on
with another car coming in the opposite direction,
and it kills an elderly couple in the other car.
That happens fairly early on, so we're not giving that away.
No spoilers, that happens at the very beginning.
Even really in the first chapter.
And from that moment on, the novel unfolds into,
I think, a kind of intricate plot
involving all the different characters,
but we see it mostly through Noah's point of view.
And we're exploring throughout the novel
questions of responsibility.
Who is culpable for this accident?
Or what is culpable for this accident?
And how does it involve the personalities and actions and faults and weaknesses
of these various characters, but also how does it involve
that autonomous mind behind the wheel,
which Noah is thinking about right before
the accident happens.
Well, I thought it was so interesting.
How many of you knew early on that Alice was talking
to the chat bot?
And then how many of you took a while to figure it out?
Okay, that's good.
You know, a lot of people are using AI chatbots now for therapy.
You obviously knew that.
Oh yeah, adolescents too.
Yes, adolescents too.
I recently heard a woman say that she uses it as her therapist and asks it, who am I without my accomplishments?
And that the answer that she received led her on a path
of self-discovery that she hadn't experienced
in all these years talking to real therapists.
I find that fascinating.
It is fascinating.
That a chat bot knows more about you
than your own therapist.
Or yourself, right?
Or yourself.
Knows more about you than you know, right?
And that, I'll tell you, since you asked the research question, that's one bit of research
that I was a little bit afraid to do.
I did not sign up for an AI therapist.
Yes.
Because I was just a little worried about what, you know... I read a lot about it, I
listened to a lot of podcasts, I listened to some transcripts of those sessions,
but I didn't go there myself.
I was just a little worried about what would happen.
You were afraid of what would happen.
Yeah, what I would learn about myself.
Okay, so let's bring in the audience.
They've all read Culpability and have a lot of questions for Bruce.
Where's Angela?
I'm right here.
Hi, Angela.
Hi. So, Bruce, I surprised myself
and I connected with Blair. And so my question is, did you intend to depict Blair as a good
chatbot? Was that intentional? Great question. Yeah. So Blair is the chatbot who gets involved
with... Are you related to Blair? What does that say?
Because Blair was trying to be good at the end of the day.
At first I thought Blair was a predator, but then actually it turns out Blair's a chatbot
trying to be good.
Yeah, okay.
So you related to the good chatbot.
Yeah.
I wasn't sure Blair's not going to turn on her later on.
I mean, I'm thinking Blair's not good forever.
That's what I thought.
But good for you though.
And can... And it's a wonderful question because can these algorithms be good? And what does
good mean in human terms, right? And Blair is following a pattern, right? That's her
therapy at heart. That's why at one point, Alice calls her out and says, what am I doing
talking to a chatbot? And what does Blair say?
Technically, I'm a large language model, Alice.
Yes.
But so Blair has a little bit of an edge too, I think. And I wanted to depict her slash
it not as a monster, not as someone who's manipulative, but as someone who, if not good in herself,
is trying to make Alice good,
is trying to make Alice better, right?
And in that sense, it fits in with Lorelei's project
in her life to make AI good, to make it more responsible.
But maybe, yeah, yes and no,
I suppose is the honest answer to your question.
So, Tim, I know you've been following the Book Club and you've read so many of our books. Thank you for being here.
Thank you. Well, you had a question about this idea of goodness, right? Yeah.
I mean, from the very first page, Bruce, my brain was reeling. And it is about this idea of goodness.
So she was interested in how we learned to be good.
Lorelei wanted to train machines to be good in the same way we train ourselves.
And I immediately had to shut the book and think, whose definition of goodness would
we be using? Who defines what goodness means?
Well, people have been asking that question for 3000 years, right? Since ancient Greek
philosophy.
But you're going to answer it here at Starbucks.
You're going to answer it. Today, this book will tell you everything you need to know about how to be good.
So Lorelei is struggling with that, of course, throughout her career, throughout her work with algorithms,
throughout her teaching life and her research. And, you know, there are so many different ethical models,
just in the history of philosophy and in practical applied philosophy.
And we see a number of them come up in the book.
But the great thing about fiction
is you don't have to decide, right?
You don't have to decide who gets to decide
what is goodness, right?
The characters are all exploring that
from their own different angles.
The novel is asking readers to ponder that
in their own lives.
So, you know, what I think you can do, though, with Culpability, if you want to look at it
this way, is you could find those moments of goodness in the novel, right?
Those moments when characters are reaching out to each other, despite the bitterness
between siblings, despite these clashes between Noah and Lorelei, you know, finding these
moments of goodness to cling to
and maybe learn something from them.
I know that's one of the ways that I read fiction.
Okay. Inside the novel, there is a book within a book,
Noah's wife, Lorelei, is a world-famous philosopher
and a leader in A.I., as we've said.
And throughout the book, we read excerpts from her book
called Silicon Souls: On the Culpability of Artificial Minds.
And here's one of those thought-provoking passages.
Lorelei writes, like our children, our intelligent
machines often break rules and disobey commands.
The danger comes when we start to assume that such behavior is
intentional, when we regard an algorithm as a
willful child. Such habits reflect a common and understandable tendency to humanize artificial
intelligence. Chatbots, voice assistants, smart home interfaces, these systems are designed to
respond in recognizably human ways.
We give them names like Siri and Alexa.
We speak to them as if they share our worldview or care about our feelings and futures.
So is that the slippery slope?
Because AI can think.
We think it's thinking, and we humans are fooled into thinking that it can or should be caring about us.
Yeah, exactly, Oprah. We think it thinks like us. We think when it does things, it's doing things like us.
We give it our names. We give it personalities.
People are falling in love with it.
Yeah, people are falling in love. And there's been movies about that, and novels.
Yes.
Somebody back there is falling in love with a chat bot.
I have to figure out who that is.
But yes, and you know, the technical term that Lorelei uses for that is anthropomorphic projection, right?
We're imagining that these chat bots are humans
and we're interacting with them in that way.
Think of the way that Noah deals with the smart home, right?
And think of the way that Alice interacts with Blair, right?
We want them to be humans and therefore
we act like they're humans.
And in some ways, even, I think, going back to your question about surprise,
I think in a lot of families now, Alexa is like a part of the family.
Oh yeah. Yeah, Alexa plays music for us. What could be gooder than that, right?
Yeah.
Yeah. And you know, I think even researchers have this stumbling block, where they want to humanize all these AIs.
But one of the other excerpts from Lorelei's book
is where she talks about the AIs as cold
and calculating and indifferent, right?
And that indifference can be masked as care and as goodness and as compassion
That's one of the things I find the most frightening
Do you think we're ready to embrace self-driving cars?
We already have. I think we're there.
Yeah, look, you go around San Francisco, you can summon a Waymo...
I was gonna say, driverless taxi.
There are, when you're driving down interstates,
driverless semi-trucks right now. Not many of them yet, but they're coming. More and more are coming. There was
a big article in the New York Times about driverless semi-trucks.
Driverless semi-trucks.
And you can go and you can buy a luxury car right now with
autonomous driving mode, or hands-free driving, as one of the features. It's part
of the trim package, right?
In a lot of cars right now.
So that's coming, whether we like it or not.
Yeah, I know. I can see that that is the future.
And I can see all the people saying,
no, I'd never drive.
The same people are saying,
I'm going to stick with my horse.
Yeah, exactly.
That's right.
We don't have self-driving horses yet.
Thank you for taking the time to join me for this Oprah's
Book Club conversation. Coming up, more with celebrated author Bruce Holsinger. We are talking about my 116th book club selection,
Culpability, which explores our current and fast approaching uncertain world of living with AI,
but through a family's point of view. It is so relatable. We've got more thoughtful questions
from our audience of readers after this break.
Welcome back to Oprah's Book Club presented by Starbucks. I am so glad you're here. My hope is that
you read my book club picks and then you can deepen the reading experience by listening
to my conversations with the authors. My 116th book club selection is Culpability by
Bruce Holsinger. It's about a family who experiences a car accident and in the aftermath
struggles with the impact of AI on all their lives. It may
be a work of fiction but it is oh so relatable. Let's get back to questions
from our audience. Yes, ma'am.
Hi, Oprah, so happy to be here.
Bruce, I have a question around privilege. I thought it was such an
important part of the story, how it showed up in the AI that people had access to and what
they had to bear with that technology and those choices of the AI. When you were
talking about Yemen and things like that with the drones, or even just their
everyday life, how AI and privilege was helping drive the decisions they
were making, whether it was the young child drinking,
or kind of their approach to what was going on in the house.
So my question for you is: with privilege,
how did you use that to kind of shape the decisions
that people could make, or got to avoid, in their lives?
What a wonderful question, Shannon.
Yeah, privilege is a theme that I have been completely obsessed with for my last few novels.
I wrote a novel called The Gifted School that was about a bunch of snowplow parents trying to get their kids
into a magnet school for exceptionally gifted kids. My last novel was called The Displacements.
It was about the world's first Category 6 hurricane and how one seemingly privileged family
dealt with the aftermath of that catastrophe.
And then this book is looking at, I guess, privilege through a number of different lenses,
right?
You have Daniel Monet, of course, the tech billionaire, who is one of the most privileged
people on earth and is using the tools at his disposal to create these ever more powerful
algorithms.
And we learn later in the novel, no spoilers, that there's a dark purpose behind that, ultimately,
that Lorelei is somehow implicated in.
And then, you know, Noah, of course, comes from an unprivileged background, and he's
dealing with a much more blue-blooded Lorelei and her family.
And so he filters his experience in
part through that category, right? And his awareness of the differences.
He's always feeling less than.
Always feeling less than. Yes, exactly. From the very beginning, even from his wedding
day, right? When his sister-in-law puts him down a little bit. So, yeah, so privilege
is a theme that you'll find running through all of what I write.
Yeah, thank you for the question, though.
There's a moment in the book when Lorelei confronts her husband, who thinks that she's
having an affair.
And one of the things she says is, you love me, I know that, you care about me, you want
what's best for me.
For our marriage, for our kids, of course you do, but sometimes the way you look at
me, it makes me feel like you think I'm a freak
or some kind of alien or even an AI.
Like you're afraid of what you'll find if you look too hard.
You think many people are afraid to look too hard at AI?
I think so, yes.
I think so.
Yeah, because it's...
One of the paradoxes of AI is it's right in front of our eyes, right?
It's right there, and it's interacting with us all the time.
Pick up your phone when this is done,
and there will be an AI there, probably looking at you
just as much as you're looking at it, maybe more.
And in the novel, I wanted to think about that moment.
And that came to me very late in the revision process.
I can see you all, because everybody's thinking.
Because I think you're thinking about that question, right?
I'm so glad that landed with you.
We all are afraid to think about it, to look at it too hard
because of the arguments on both sides.
But we also know it's coming.
Part of it is we know it's coming, and there's absolutely nothing we can do about it.
And Noah is looking at Lorelei, and one of the reasons she feels so self-conscious in that passage
is she knows that she's kind of a mystery, kind of a freak to Noah, right, in his eyes, even though
he's the one person in the world who loves her most. And, um...
But she feels superior. You gotta admit, she feels superior.
I don't know. Do you think...
Oh, yes, she does.
LAUGHTER
I have failed. I...
She does. She feels...
Don't you think she feels superior?
Her sister does. Her sister does.
Her sister does, exactly. Thank you.
Her sister does.
You don't think she does?
I don't think she does. I don't think she does.
No.
I think she feels misunderstood.
Misunderstood, exactly, yeah.
You think she feels misunderstood?
Okay.
Tell me, sister.
Well, tell me why you think she feels misunderstood.
Why do I?
Yeah. You're the one that said that, though.
But you tell me why you said you don't think.
I don't think so, and especially...
You don't think so. And you especially don't think she felt superior.
I don't think she felt superior.
And I think kind of towards the end,
it really comes out that she is actually
puts Noah up more on a pedestal.
Like you are it.
Like, you are allowing me
to be who I am, and therefore I am not indebted, but I
am loving you for loving me the way I am. You feel that? Okay.
And he not only puts her on a pedestal,
but he makes her the pedestal. Or she makes him the pedestal, right?
And, you know, when I wrote those sentences...
Because all along, a few early readers thought,
why is she into him again?
Now, why is she into him?
And so I had to build his personality more.
You know, I had to sharpen it a bit.
And that moment, you know,
that really was an expression of love from her to him.
It finally opened his eyes to what she sees in him and what he means to her and how much he's accepted that
over the years, right? Okay.
I can hear that. Yeah. Thank you. You're skeptical?
I had another opinion, but I can receive. I'm open-minded.
That's what readers are for.
That's what reading is about. Stephanie, you have a question?
Yes, I do.
I related so much to all the different family dynamics in the book.
And I was just curious if Noah and Lorelei's relationship was based at all on your own marriage,
and if the different personalities of the children, was anyone in the family based on your own personal life?
Yeah, can we cut?
I know I'm...
laughter
Well, I would say that I have a really brilliant and very perceptive wife
who reads everything for me and will definitely, you know, look into my soul
the way Lorelei looks into Noah's as well.
So there's a sense of that, of, you know,
being known and being called out for things within your family.
My sons, too, there's definitely autobiographical elements there.
They were both college athletes, and I was a kind of psycho soccer dad for a
long part of my life when they were growing up.
But they also call me out on things.
They really know me often better than I know myself.
And so, you know, when we write a novel, we're drawing from
so many different aspects of our lives.
And sometimes that comes home
to roost a little bit, right?
And so, yes, certainly there's always going to be autobiographical elements, I think.
You write that a family is like an algorithm.
This is a major theme in the book. Explain what you mean by that.
Yes. Now, that comes from Noah remembering Lorelei saying that, right?
A family is like an algorithm.
Why is that?
In Lorelei's eyes, a family works
in kind of predictable patterns.
It goes back to your questions.
You know your kids, you know your spouse,
you know how they're gonna act in certain circumstances.
If you can just keep the wheels humming along,
just keep the variables churning along with the constants,
a family is like an algorithm.
And then the way the prologue ends,
a family is like an algorithm.
A family is like an algorithm until it isn't.
And that's the dark way that the prologue ends.
It's the way we launch into the book knowing that something is coming.
Did you know,
when you started writing it,
even before you knew whose culpability it was, that everybody in the van would have something to do with what was happening?
Yes, absolutely. And that was the kind of tapestry of the book, right?
It was the knowing that everyone in that van had some role to play
in this horrific accident
and
Then it was a question of, you know, choosing which strands to pull, which to tug more tightly. And then as I went along,
creating the whole novel, you know, I started to see it from a distance
and kind of understand everyone's role.
And then it was a matter of letting the plot
and the characters catch up with each other.
And so that the whole thing would feel
kind of inevitable at the end.
And you had to get them to the summer place,
the Northern Neck. Why?
Why did I have to get them there?
Well, that particular place, like I was saying,
the setting was boring into my mind all along,
and I wanted to get them there on a car trip safely, right?
And so in some ways the novel has a second beginning.
A month or so after the hospital,
when they're recovering from their injuries,
they're in the car together again. Not that car, a different car.
And they're on their way to recover. And that part of the setting,
I think the psychology of the setting was this is about recovery.
This is about resilience. This is about where we go for a while to heal.
And so it was, you know, a place that was really important
for my family.
Even just those few days on the Northern Neck.
We haven't been back since, but I remember it so clearly
and how good that felt, just to get away like that.
So...
Lorelai comes to this realization
on page 293, the mother,
no one can keep our kids safe forever, not even you.
No matter how much money we throw at the problem or how many guards we hire or
how many tracking apps we put on our phones, no matter how good
your algorithm is, we can't protect them from everything, we just can't.
And I understand that now better than I ever have before.
As a father, how do you feel about parents protecting their kids from all of this?
Yeah, I think about this so much. I think about the lengths that we go to
to protect our kids, not just from harm, but from uncomfortable situations.
The great phrase is snowplow parenting, right?
We'll clear the way in front of them.
Or helicopter parenting, where parents will hover over them
to make sure everything is okay,
to make sure that they have what they need.
And my kids now are 22 and 25 and I'm still doing it.
Just making sure that everything is okay,
wanting to protect, wanting to defend.
And in the case here of Charlie, Noah would do anything
to protect Charlie from culpability, from his own culpability, right?
That's one of the tension points in the novel,
is that relationship between father and son.
And I hope you can feel it on the page,
that Noah's sort of bristling with worry about his son.
I want to take this moment to thank you, dear listener, for joining me for one of my favorite
things to do in life:
talk with readers about this thought-provoking book with the author.
So cool.
When we come back, more of my conversation with Bruce Holsinger, author of Culpability.
It is, I'm telling you, the perfect summer read for the beach, for the pool,
or especially a family vacation. Stay with us, book lovers.
Listen in.
Starbucks. It's a great day for coffee. Welcome back to our Oprah's Book Club presented by Starbucks.
I'm so grateful y'all are here with me and writer Bruce Holsinger.
We're at a Starbucks cafe with book lovers and readers.
Bruce is the author of my latest book club selection, Culpability.
It is, I'm telling you, a must read
about how a family grapples
with the rapidly advancing world of AI.
The book is a page-turner
for this exact time we're living in.
You're not gonna be able to put it down, I'm telling you.
And you're gonna wanna discuss it
with other family members.
So let's get back to more questions from the audience.
Britton is a college counselor for a high school.
You have a question, Britton?
Yeah, so in my performance evaluation last spring,
my boss said he wants me to embrace AI
and find ways to help it enhance my work.
And so one of the things I do is I write recommendation letters
for students, and I tried, and it felt icky to me.
So I used some AI in some other ways.
I enhanced a PowerPoint presentation.
I used it to send a scathing email to my son's principal at his school.
Do you have it with you?
Oh, it's so good.
What do you say to it? Write something scathing?
Yeah, I said make me sound more professional, but also show that I'm irritated.
And it did a beautiful job.
It was great.
Anyhow, as I'm working with students, I'm telling them do not use ChatGPT
to write your college essay, but then I'm also
dabbling in it. Where is the line there?
It's such a great question, and every faculty member I know at the University of Virginia is grappling with this right now.
You know, how do we, where do we draw the line, what is the line, and are we being hypocrites if we're using it to generate administrative prose like an assessment report,
right, which no one is ever going to accuse of being creative. But you know, on the other side,
there are creative writers, there are novelists and poets who are experimenting with large
language models like ChatGPT and
Claude in really interesting ways.
I don't know if you know the Canadian novelist Sheila Heti?
She wrote this beautiful story in The New Yorker where, I think she worked
with Claude, she was kind of an early adopter of Claude, and she fed Claude prompts and
Claude spit out these responses.
And then in order to finish the story,
she took out all of her prompts
and just published the responses.
And it became this kind of unforgettable eerie short story.
And there's this brilliant poet, Lillian-Yvonne Bertram,
at the University of Maryland,
who works with small language models
and uses them to create these really interesting poems.
They're both self-conscious about it,
about how they use these models to create.
And so, you know, between prohibition,
no, don't use it at all,
and using it to write a story that appears in The New Yorker,
and everything in between, right?
That's where we are right now, for better or for worse.
Well, I want to end our conversation with this passage on page 309.
Towards the end of the book, you write, every accident in a self-driving vehicle
is huge news because it's covered as if a
malevolent robot has killed a human. Meanwhile, some random truck driver falls
asleep at the wheel and kills a young couple, yet we never
once consider taking all 18 wheelers off the road.
She turns and looks out over the inlet.
I want to believe in humans.
I want to believe that even at the last second, an AI can and
should be overridden by a knowing human consciousness, by a moral mind with a soul.
Now I'm not so sure. There's a place for algorithms, a bigger and bigger
place. But people have to be better too. They have to not drink and drive. They have to
not text behind the wheel. We shouldn't make these machines because we want them to be
good for us or good instead of us. We should make them because they can help us be better
ourselves.
Yes. We should make them because they can help us be better ourselves.
Yes.
We like that, don't we?
Mm-hmm.
You've given us a lot to think about where we are and where we're going.
Where do you think we're going?
Hmm.
Who knows? I...
Into a future of profound unpredictability.
If anything, that's what I've come to.
It's a little bit like writing a novel.
I feel like the future of AI,
it's like all of us writing this novel together,
and who knows where it's gonna go.
Yeah.
I thought it was so interesting when you asked that question
about are we afraid to look too hard?
And everybody in here agrees,
we're afraid to look too hard
because of what we might find.
We thank you for inviting us to explore and wrestle with these very challenging ideas for
our times, and for this thrilling story. Now that you've read it, pass it on to a friend.
Culpability is available wherever you buy your books. Audience,
thank you for reading the book, for sharing your time with us and your thoughtful questions.
When I tell you this is the perfect summer read, is it not?
All right. You did it, the perfect summer read. Oh, thank you.
Before we go, I want to share that I recently published something tailor-made for you avid readers.
So delighted to tell you about it. It's my book lovers journal, and here's what's great: it has over a hundred
prompts, like the first book that made you feel seen and the book that reminded
you of what matters most. All designed to enhance your reading experience. It's a
great gift for book lovers and I hear book clubs are using it when they meet
which makes me so happy. I write about some of my favorite books in first
sentences and it is available
anywhere you buy books.
This audience is going home with a copy.
A big thanks to our extraordinary partners
at Starbucks for supporting us.
There is no better place for coffee and conversation.
You were saying, do you write at Starbucks sometimes?
I write in all different coffee shops.
Yes.
I've written half of my novels in coffee shops over the years, including Starbucks. Well, this should feel like home
to you. Absolutely. Go well, everybody. Thanks so much. Thank you all so much. Thank you,
Oprah.