Factually! with Adam Conover - A.I. Companies Are Stealing Your Face with Kashmir Hill
Episode Date: March 19, 2025

Get 20% off DeleteMe US consumer plans when you go to http://joindeleteme.com/Adam and use promo code ADAM at checkout. DeleteMe International Plans: https://international.joindeleteme.com/

The concept of privacy has drastically eroded over the past 30 years. The internet made our personal data more accessible than ever, but the most alarming development yet? Even your own face is no longer private. This week, Adam sits down with Kashmir Hill, New York Times reporter and author of Your Face Belongs to Us: A Tale of AI, a Secretive Startup, and the End of Privacy, to discuss how companies are exploiting our most personal information—including our very identities—for profit, and what that means for the future of privacy. Find Kashmir's book at http://www.factuallypod.com/books.

SUPPORT THE SHOW ON PATREON: https://www.patreon.com/adamconover
SEE ADAM ON TOUR: https://www.adamconover.net/tourdates/
SUBSCRIBE to and RATE Factually! on:
» Apple Podcasts: https://podcasts.apple.com/us/podcast/factually-with-adam-conover/id1463460577
» Spotify: https://open.spotify.com/show/0fK8WJw4ffMc2NWydBlDyJ

About Headgum: Headgum is an LA & NY-based podcast network creating premium podcasts with the funniest, most engaging voices in comedy to achieve one goal: Making our audience and ourselves laugh. Listen to our shows at https://www.headgum.com.
» SUBSCRIBE to Headgum: https://www.youtube.com/c/HeadGum?sub_confirmation=1
» FOLLOW us on Twitter: http://twitter.com/headgum
» FOLLOW us on Instagram: https://instagram.com/headgum/
» FOLLOW us on TikTok: https://www.tiktok.com/@headgum
» Advertise on Factually! via Gumball.fm

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
This is a HeadGum Podcast.
Hey everybody, welcome to Factually. I'm Adam Conover. Thanks for joining me on the show again.
You know, we're living in a world where our data is out of our hands, our habits,
our connections, and of course our faces are just out there flickering on some
company servers waiting to be scraped by whoever for whatever purpose they choose.
The very idea of privacy today seems to be something different
or seems to be altogether gone
versus what it would have been 30 or 50 years ago.
So what happens when an ambitious new company
decides it wants to degrade our privacy even further?
Well, my guest today is one of the best privacy reporters working
and she wrote a fascinating and terrifying book
about exactly that, a brand new company
that is eroding our privacy in ways
we have never even thought of before.
And that might grow even darker still.
Before we get to that, I want to remind you
that if you want to support this show,
you can do so on Patreon.
Head to patreon.com slash adamconover
for five bucks a month.
Gets you every episode of the show ad-free and a bunch of other wonderful community features as well. We would love to have you as part of our online community.
Of course, if you want to see me live, and I hope you do, I'm touring my new hour of stand-up comedy all across the country, in fact the world. March 22nd, I'm gonna be in London at the Leicester Square Theatre. I would love to see you there. March 26th, I'll be in Amsterdam at Boom Chicago. I cannot wait for those shows.
And then after Amsterdam, I go to Providence, Rhode Island.
What's better than Amsterdam?
Everybody says Providence, Rhode Island.
That's no shade to Providence, Rhode Island, okay?
You can do just as much acid and mushrooms in Providence
as you can in Amsterdam.
And I'm gonna have a great time doing it.
After that, I'm headed to Vancouver, Canada,
Eugene, Oregon, Oklahoma City, Oklahoma, and Tulsa, Oklahoma.
Head to adamconover.net for all those tickets.
I'd love to give you a hug at the meet and greet
after the show.
And now let's get to this week's episode.
Please welcome Kashmir Hill.
She's a reporter at the New York Times
and the author of the new book,
the thrilling exposé Your Face Belongs to Us: A Tale of AI, a Secretive Startup, and the End of Privacy.
Please welcome Kashmir Hill.
Kashmir.
Thank you so much for being on the show.
Pleasure to be here.
So privacy.
We know we don't have any of it, but you have discovered some
distressing new dimensions of how little privacy we actually
have. Tell me about this company, Clearview AI. What is this company?
Clearview is a New York based startup
that went out and scraped billions of photos
from the internet, public photos from Facebook and LinkedIn,
all your favorite social media sites elsewhere
and built a facial recognition app
where you can take a photo of someone's face
and see everywhere it's appeared on the internet.
And at the time I discovered them, they had been secretly selling this superpower to police departments around the country and around the world and no one knew about it.
Okay. There's a lot to unpack there. They have created a database of everybody's face.
So they take, they have an app. You take a picture of me and it shows you what.
I'm walking down the street, you're like,
wait, that guy looks suspicious,
or just, I don't know, I'm in love with him
and I wanna track his movements,
or I'm a neo-Nazi, I wanna kick his ass, whatever it is,
take a photo of me, put it into the app,
what does it show you?
I mean, it's gonna bring up your Facebook page, your Instagram page, your Venmo profile.
You know, it's gonna lead someone to your name,
maybe where you live, who your friends are,
maybe even turn up photos of you on the internet
that you don't know about.
You're in the background of someone else's photo.
It really is quite powerful.
It kind of, it basically reorganizes the whole internet
around people's faces.
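[Editor's note: for readers curious about the mechanics, here is a rough sketch of how a face search engine of this kind is typically built: every scraped photo gets turned into a numeric "embedding" vector, and a probe photo is matched by finding the nearest stored vectors. This illustrates the general technique only, not Clearview's actual code; the embed function below is a toy stand-in for a real neural network.]

```python
# Illustrative sketch of a face search engine: NOT Clearview's code.
# A real system uses a deep face-embedding model; embed() here is a toy
# stand-in so the example runs end to end.
import numpy as np

def embed(face_image) -> np.ndarray:
    # Stand-in for a neural net that maps a face to a unit vector such
    # that photos of the same person land close together.
    v = np.zeros(128)
    flat = np.asarray(face_image, dtype=float).ravel()[:128]
    v[:flat.size] = flat
    return v / (np.linalg.norm(v) + 1e-9)

class FaceIndex:
    def __init__(self):
        self.vectors = []  # one embedding per scraped photo
        self.sources = []  # the URL each photo came from

    def add(self, face_image, source_url: str):
        self.vectors.append(embed(face_image))
        self.sources.append(source_url)

    def search(self, probe_image, top_k: int = 10):
        # Vectors are unit length, so a dot product is cosine similarity.
        # Returns a ranked candidate list, not a yes/no answer.
        sims = np.stack(self.vectors) @ embed(probe_image)
        best = np.argsort(-sims)[:top_k]
        return [(self.sources[i], float(sims[i])) for i in best]

rng = np.random.default_rng(0)
index = FaceIndex()
index.add(rng.random((8, 8)), "https://example.com/u/alice")  # hypothetical URLs
index.add(rng.random((8, 8)), "https://example.com/u/bob")
print(index.search(rng.random((8, 8)), top_k=2))
```

[Scaled up to billions of photos, that nearest-neighbor lookup is the whole trick: the index doesn't know who you are, it just knows which scraped pages contain a face close to yours.]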
You know what this is?
Is it's like the scene in a CSI style show
where they've got a photo of the perp
and then they go to the, you know, the computer genius
who's like, oh, let me look in the database,
boop, boop, boop, boop, boop, boop, boop, boop, boop,
and then it does the scan of all the different places
you've seen their faces.
But the thing about that is that A,
I always thought that technology didn't exist,
that's just fiction for television.
B, I never thought about, wait, that would be bad.
We actually don't want the police to have such a database
of everyone's whereabouts at all times.
What is dangerous about this technology?
Yeah, I mean, I think a lot of people would be in favor of this, right? Like, police using
this to solve crimes can potentially be a good thing. And I've talked to a lot of police
officers who say, yeah, this is amazing, particularly child crime investigators who get photos of
a child being abused. And now if there's a photo of the perpetrator, they can look them
up or even this is one of the first databases
that has children's faces in it.
And so sometimes you can search the child's face
and figure out who is this kid who's being abused.
And so that is a positive use case.
On the other-
But I can imagine some negative use cases
for being able to search for photos
of children across the internet.
I mean, my blood ran cold when you said that. I can imagine why a cop might like it, but.
Yeah.
I mean, when I talked to, and it was pretty hard to track down the company behind
Clearview, uh, originally the people who had created the technology didn't really
want people to know who they were.
But once I finally discovered who they were, um, the CEO, Hoan Ton-That, said, you
know, this is the best possible use of facial recognition.
It's being used to find pedophiles, not by pedophiles.
Um,
But their tagline, Clearview AI: find pedophiles, not by pedophiles. Or I guess we'd reverse it.
We're not by pedophiles.
We help you find pedophiles.
It's, you know, um, that's the best pitch they have for the product?
Is that the catchphrase?
My "we're not by pedophiles" slogan is raising more questions.
Sorry, please go on.
I just cannot help but stop you at every moment
to marvel at this.
Their actual taglines were stop searching, start solving,
and artificial intelligence for a better world.
But yeah, but I mean, on the downside of this
is giving government the power to know who we are.
And we've seen this go wrong
in different places around the world.
Like Russia has used this kind of technology
when people are protesting its war in Ukraine. So then they go and intimidate
those protesters or even arrest them. In Hong Kong protesters against mainland
China taking over started actually painting over the cameras there because
again police officers were showing up at their door after they protested,
ticketing them, intimidating them.
And so it gives the government a lot of power.
Nowadays in modern America, like if you're a federal employee
who is in favor of DEI efforts, you
wouldn't want to probably show up at a protest
in favor of diversity, because what if they take your photo,
and now all of a sudden people are looking into
whether you're promoting diversity within a federal agency
against the executive order that President Trump has issued.
Right, and the frightening thing about this
is it's literally just a photo of your face
can connect you to, connect them to your identity,
which was not such an easy thing to do before.
I can imagine how that's a problem
in the hands of the state.
Are there any private actors that have access to this?
Yeah, that's what I found so fascinating
about facial recognition.
I've been writing about privacy for a long time,
about data getting used in ways we don't expect.
And you know, like, cookies are kind of frightening, people are tracking us around the web, your phone is tracking everywhere you go.
Like these are all privacy concerns.
But the idea of your face being this key
that now links everything that's knowable about you
from the internet to you as you're moving
through the real world has real repercussions
for our ability to be anonymous,
our ability to be private.
One of my favorite examples of this
that happened just as I was finishing this book
was that Madison Square Garden, the events venue, had incorporated facial recognition
a few years back for security reasons
to keep out people who threw beer bottles down on the ice
during a Rangers game or somebody
who had a fight in the crowd.
And they decided, well, we've got this facial recognition
in place.
James Dolan, the billionaire owner of the venue said,
I wanna keep out my enemies.
I don't want the lawyers coming in here
who work for firms that have sued us.
And so they went through the 90 or so law firms that had suits against them, went to their websites, found photos of their lawyers, put them on a ban list. And when lawyers from these firms tried to get in, one parent tried to take her Girl Scout troop to see the Rockettes at Radio City Music Hall, which Madison Square Garden owns, somebody tried to go to a Mariah Carey concert, tried to go to a Knicks game, and they turned them away at the door and said, you're not welcome in here, because you work for a law firm that has a suit against us.
So until you drop that, you're not coming to any of our games. And they can do that because there's no
discrimination law that protects you based on where you work.
That's not a protected class.
Really? So they're allowed to,
they couldn't turn away a person based on what race they are because there's a specific protection for that and for other identity-based, uh, qualities,
but they can turn you away for more arbitrary reasons that are not specifically
protected in the law. Like, there's nothing in public accommodations laws preventing Madison Square Garden from
turning away just random people they don't like.
I understand you ban someone for committing a crime or stealing or fighting, but there's
nothing to stop them from kicking you out just because they don't like it.
That's fucking insane, Kashmir. I know.
I mean, that's why I find this so alarming.
Because I think this allows us to take a lot of the activities
that have been happening online for a long time
and bring them into the real world.
So we're in such a polarized place right now.
You could imagine people being banned
for their political views, whether they're
pro-vax or anti-vax, you know, pro-Trump or anti-Trump,
you could just imagine businesses or people
starting to make these lists, and they scan your face
and they just know who you are and kind of
what your values are, and they can judge you based on that.
And I just think it has the ability to make our society
even more polarized.
Yeah, I mean, any sort of restriction on who can enter
or leave a place tends to polarize us.
Certainly, you know, anti-vaxxers were very upset
about vaccine mandates because it prevented them
from entering or leaving certain spaces fully.
Now there's a difference, you'd say that that's
for public safety, but that's, it's something that like violates a sort of American
sense that we have of we should be able to move freely through the world.
And of course we can't in so many ways, you know, in so many parts of the country you have to purchase a car to move freely through the world, all these other sorts of things, you know, depending on your,
uh, race or gender, you can be barred from certain places, but the idea that we could be specifically followed and tracked and
targeted on an individual basis as a, it's one thing for a group identity,
that's bad enough, but on an individual basis, that you could be blacklisted.
I mean, the Madison Square Garden example is really chilling. Was there any pushback to this?
Are they still doing this?
I remember reading this story about the lawyer
being turned away from Radio City Music Hall
as being like a shocking story.
Can that lawyer still go to Radio City Music Hall or no?
Like is this still the case?
I think that, I'm not sure if that law firm,
if they wrapped up that litigation or not.
I actually went at one point with a personal injury attorney
just to see this happen and it was incredible.
Like we, I bought tickets to a Rangers game.
We, there's thousands of people.
So a cheap game.
You're like, we're not gonna go see an NBA game.
We'll go see hockey.
So that our little experiment doesn't cost so much.
Exactly. The New York Times was paying, so I don't want to spend too much money.
Well, the New York Times is paying.
Then go see the Knicks when LeBron is in town.
That's what I say.
All right.
Go on.
I should have gone to a better event.
But yeah, I mean, we walked through the door.
We put our bags down on the conveyor belt to go through the security machine.
And by the time we picked them up, a security guard came over,
said, I need to see your ID to confirm the match that had been
made by the facial recognition.
And he said, Hey, my manager needs to come over.
The manager came over and wrote her this little note that said,
you're not welcome here and kicked her out.
So it was, it was very powerful.
And this lawyer had previously sued Madison Square Garden.
You said they had a personal.
No, she hadn't sued.
One of her colleagues at the personal injury law firm
that she worked for.
She just worked at the same law firm.
She just worked there.
But this is a personal injury law firm.
I'm gonna guess they sued Madison Square Garden
because someone had been injured
at Madison Square Garden, presumably.
Somebody at a Drake concert had a beer bottle thrown, hit them in the head, and one of her colleagues had a suit against Madison Square Garden and Drake over that, and so that is why she wasn't allowed in.
I mean, suing...
Personal injury lawsuits are legal to file, Kashmir.
Like, this is part of civic life. If you have a public business that millions of people go into a year,
some of them are going to sue you sometimes and you're going to have to fucking deal with it.
Like sometimes Madison Square Garden might be at fault.
Like it needs to be okay to sue Madison Square Garden sometimes.
MSG's justification originally was we don't want anybody from these firms coming in here and conducting a legal discovery, like coming into the venue and asking.
Get the fuck out of here.
But then later, but then James Dolan was just like, hey, you know, I'm a businessman and I get to decide who comes into my business.
I don't want these people coming in.
He was like a bit more blunt about the real reason. And you know who I want?
I don't want any good basketball players to come into my business.
I want to ban any successful NBA players because I want to keep my team mired in misery.
And I also don't want any lawyers who could sue me when fans get so angry,
they start kicking each other's asses.
And as you might imagine, if you ban a bunch of lawyers, you're going to deal
with a lot more lawsuits.
Like there were definitely a lot of lawsuits over the ban and some people were
trying to fight this under, there's a very obscure New York City law that says that
if somebody has a ticket, you have to let them into a performance.
Uh, it was actually, it was actually passed to protect theater critics
because theater owners would get their production, you know,
panned and then they would ban the critic.
And so this one critic got like the legislature
to pass this law to protect him.
And so, yeah, so lawyers sued and they were fighting it.
But eventually the lawyers, uh, I was following some of these cases, they dropped it because their lawsuit over the ban was what was keeping them from being able to go to Knicks games.
They said, okay, I'm just gonna drop it.
I want to get in.
They're calling the other lawyer's client going like, yeah, I'm sorry, but we gotta drop the case. I really gotta go out and go to a Knicks game. My kid, he loves, uh, I can't name a Knicks player.
Porzingis.
I don't know.
That was years ago.
So, so presumably this practice is still in place. That's my point, is that this is legal for Madison Square Garden to do.
Again, this is chilling.
I'm imagining, all right, well,
if there's no protection against that, right,
what's the protection against, I don't know,
think of any of our monopolistic retail companies
that provide basic services like your pharmacies,
like a Walgreens or a CVS, you know, banning people from ever entering a Walgreens, right?
Which would prevent people from getting medical care.
I could start listing examples like crazy of what could.
Leave a bad review for a restaurant
and then they're like, okay, well,
you're never coming in here again
or any of our sister restaurants.
I mean, there's all of these things that are not protected.
Madison Square Garden, of course, owns the Sphere in Las Vegas. And so I've been waiting to hear about lawyers who can't get into the Sphere.
One place where you are protected, and I want to mention this because laws can work.
Madison Square Garden owns the Chicago Theater, and
they cannot ban
lawyers based on their faces there
because Illinois has this law
called the Biometric Information Privacy Act
that says that you can't use people's biometrics,
including their faces, their face print,
without their consent.
So if they wanted to ban lawyers there,
they would have to say,
hey, can we get your permission to use your face print
to keep you out of here?
And so this is one of the places where, yeah,
they can't institute the ban that way.
And so one of the chapters in my book is about how that law got passed,
but we can pass laws that govern AI
and how technology affects us.
We just, it doesn't often happen.
Yeah, that's all we have to do is pass some good legislation
in America.
That shouldn't be hard, right? In the United States, to get some legislation
to protect people, you know, based on expert consensus.
We'll take care of that in the next couple months.
This technology, how good is it? In that, you know, AI is in the name of Clearview AI. And by the way, I'm sure there's other companies that are providing similar services, right? It's not only Clearview. This is like a trend, I would imagine. AI is in the name, and AI, we've talked about many times on this show, gives the appearance of being accurate far more than it actually is. The error rate even on ChatGPT is quite high. How high is the false positive rate? I mean, you're saying they're taking photos of lawyers from their websites, just to take that one example, and using them for facial identification.
I'm imagining like a 200 pixel GIF of a lawyer's headshot
might produce some false positives when the same person
is walking into Madison Square Garden
and being observed by a security camera, right?
Yeah, I mean, it's part of the reason why they ask
for her ID, right?
Like, they want to confirm they have the right match.
It does work pretty well.
I actually heard of this one lawyer who was a twin,
and his twin would get stopped when he was going in.
They'd say, hey, we know you're that lawyer.
And he's like, no, I'm his brother.
I'm fine, let me into the Phish show.
And that's gotta be really good
when you're on a date.
You're getting stopped in a massive,
sorry, sorry, these guys are gonna,
just my twin is banned from Madison Square Garden.
It's a whole thing.
I'll tell you about it when we're at our seats.
I'm the good twin.
I'm the good, I'm the good twin.
But, so like, uh, I mean, there's a range, right? Facial recognition has gotten much better.
I mean, there's this one federal lab called NIST.
We'll see if there's still funding for it in the years to come.
But NIST is amazing.
We almost covered them on my show, The G-Word.
We did not cover them, but it's like an amazing federal agency
that does like all this really cool science shit.
Like as soon as you start looking into it, it's like really, really cool stuff.
NIST is incredible and they have been testing the accuracy of facial recognition algorithms since 2000.
So for more than 20 years now. And so you can actually see the kind of advance in the technology
over those years. It used to be terrible. It really didn't work that well.
It especially didn't work that well
on different types of groups.
So it worked best on white males
and less well on everybody else.
But it has improved over the years.
And now in testing, you know, it's accurate to like 99%, but it depends on the conditions.
So if you have a high resolution photo of somebody,
like from the law firm website,
so this may well be high resolution headshots of them.
And if you have that,
it makes it much easier to identify somebody.
If you have like a good camera and high resolution headshot,
that's gonna get a good match.
But let's say you have a kind of grainy surveillance camera,
that might lead to an unreliable match.
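[Editor's note: a quick back-of-the-envelope calculation, with made-up but plausible numbers, of why even a very accurate matcher still surfaces lookalikes against a database this large:]

```python
# Illustrative numbers only. Even a tiny false-match rate, multiplied by
# a huge database, yields many innocent "candidates" per search.
database_size = 30_000_000   # scraped face photos (illustrative)
false_match_rate = 1e-5      # 0.001% chance a random stranger "matches"

expected_false_hits = database_size * false_match_rate
print(expected_false_hits)   # 300.0 lookalike candidates per probe photo
```

[Which is why a match is supposed to be treated as a lead, never as an identification on its own.]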
And we've seen that happen in law enforcement.
We know of a handful of people who have been arrested
for the crime of looking like somebody else.
The facial recognition match came up,
the police are supposed to do more investigating
to make sure that's the right person, but they didn't. In many cases, they would go to an eyewitness and say, okay, here's a, they call it a six-pack photo lineup, here's that person and five other people, who looks the most like the person who assaulted you? The eyewitness would agree with the computer. Yeah, I think it's that person. They would basically go out and arrest that person. And so that has led to false arrests, people spending a night in jail,
weeks in jail, months in jail,
an eight month pregnant woman got arrested
and charged with carjacking.
Oh my God.
By a woman who was not pregnant the month before.
So it really can go wrong
if police don't do further investigating.
The AI looked at an eight month pregnant woman
and a non-pregnant woman and said,
these are a match.
They looked at surveillance camera footage.
It's only matching the face.
It's not matching the entire rest of the person.
And it wasn't looking at a current photo of her.
It was her driver's license photo or a mug shot, if I'm remembering that case right, from many years earlier.
And so yeah, you can see how these things can go.
They, they can be powerful tools to help law enforcement, but they can also
send them down the wrong path.
Well, it reminds me of, you know, our own coverage of even fingerprints, right?
As a forensic tool, uh, the public and the police have such faith in them as perfect, you know, oh, every fingerprint is different, this is a perfect way to identify people. It makes the false positive effect very strong when it happens. It's, well, this fingerprint is a match, this must be the same person.
There's a famous case of, I forget what country,
but it was like a terrorism case
where someone who was like never in the city,
their fingerprint was in a fingerprint database and they were
arrested. Uh, I wish I could remember the details cause I covered this years ago.
Um, but you know, they were arrested like very far from,
from where the event took place. They're like, I've never even been there,
but you know, there was a match in the database and it took like years for this
person to finally get free. I can imagine, uh,
I can just see the same thing happening with AI.
I can imagine the witness going,
oh, well, if AI says so, yeah, that is the person.
And then the judge saying the same thing
and the jury saying the same thing.
Oh yeah, the AI is very powerful.
Oh yeah, we all trust the AI.
Yeah, Elon says it's very strong.
So away to prison.
You know, our faith in this technology is like a bug.
Well, I mean, it can be powerful sometimes. And so it's just like, how do we bring the skepticism
we should bring to all, honestly,
policing and police investigations,
where, yeah, I mean, the confirmation bias
is not just a problem with facial recognition.
This can happen in other arenas of policing,
where once you focus on that person,
every single piece of evidence seems to confirm
that yes, they are it, and you kind of ignore any evidence
that conflicts with that.
And look, I'm gonna editorialize for a second,
but my own belief based on all the stories
on criminal justice reform that I've done
are that when the police are trying to solve crimes,
I put that in quotation marks,
they're not approaching it like a scientist would
and saying, hey, let's really test what we find
and make sure we have the right person and doubt ourselves.
It's how do we put the cuffs on somebody
so we can put a check mark in our little book
and move on?
It's a tool to get a, you know, if not a conviction, a charge, right?
Rather than, hey, let's make sure we're getting to ground truth here.
And that's like one of the fundamental problems with how we prosecute crime in this country is it's like,
oh, is this, can we make the case about this person? Then we're going to do it.
And that means any evidence we're going to add to the pile and we're going to ignore any negative evidence because we don't want to know that we're wrong. We just
want to confirm. I don't know if you have a view on that, but that's, that's my understanding
of how it works.
Yeah. Um, yeah, I think that it can go wrong. There's actually a really interesting example of the other side, though. So, facial recognition, when you run one of these reports, the way facial recognition matches work is it isn't just, oh, here's a picture of Adam, and it says like, yep, this is Adam, you got him. What happens is you put in what they call a probe image, and it'll give you a list of a whole bunch of different matches, and then a human facial recognition expert actually picks somebody. So just so people understand how it works. But there was this case in Ohio
where there was a shooting death
and they had surveillance camera footage
of the perpetrator.
They ran it, they got a facial recognition match
for somebody and you're not supposed to rely on that.
Like the facial recognition match report
that the detective would get would say,
this is not probable cause.
Basically, you can't get a search warrant based on this.
But in that case, they did.
And they got a search warrant.
They went to his house.
They found a gun that seemed to be the gun that
was used in the shooting.
The authorities allege.
They arrested him, charged him, and his defense attorneys
found out that they just did this facial recognition
search to identify him.
And they challenged it with the judge, and the judge ended up throwing out the evidence that they got and said that that warrant was obtained illegally, that they needed more evidence to tie him to the crime.
So we are really seeing this start to get pretty nuanced in the criminal justice system.
Like judges are starting to realize, hey, the cops are using facial recognition.
We need to make sure that they're doing it,
you know, ethically, legally.
So that's good, I think that's reassuring.
That's a good, positive case,
and we would hope for more like that,
but it's, you know, in the American criminal justice system,
it's not something that we can necessarily count on,
but thank you, thank you to that judge
for not railroading this person.
Who founded this company?
Where did this company come from, Clearview?
Yeah, so Clearview was started by two people.
The mastermind, though, is Hoan Ton-That.
He was a young guy, kind of like in his 20s,
one of those tech wizards who'd been,
he was born in Australia, moved to Silicon Valley
when he was like 18 years old, 19 years old
because he wanted to make it as a tech guy,
was trying lots of different things.
He had done iPhone games, he had done Facebook quizzes,
like Would You Rather, back when that was big on Facebook.
He did a Trump hair app where you could put Trump's hair
on someone's head in a photo, like back in 2015. He was a little bit of a fan of MAGA.
And then, all of a sudden,
he gets together with this kind of New York media politico guy,
and they decide to build this facial recognition app,
which was unlike anything any other tech giant had really built or released.
And they just kind of scrapped it together.
And it was very advanced, worked really well.
Originally they were matching people's faces to Venmo
and then they started expanding, expanding,
gathering billions and billions of photos.
They got funding from Peter Thiel,
$200,000 to get started.
They started matching people's faces to Venmo?
So Venmo was one of the places where it was really easy
to get people's photos because Venmo did not have
very good kind of protections over people's images
at that point.
When you signed up for Venmo, your account was public
by default, and if you went to Venmo.com,
they showed real transactions that were happening
on the network, like Adam paid Kashmir.
Yeah.
And so this is why I'm not on Venmo is because they had such bad privacy that
like people started searching for me because they knew my work.
I started being flooded with requests for money and there was no way to turn it
off. Like there was no way at the time to make your Venmo account private.
And by the way, a couple of years later, I think Vice reporters found like Joe Biden's Venmo account and were able to like see all of his transactions. Like it's insane for a payment processing company, they specifically had a no-privacy policy.
They were like, no, no, no, you cannot make anything private on Venmo. It's all going to be public,
which is insane for a payment software.
But how do you split dinner checks?
I got five bucks in my pocket. I say, I'll get you next time.
Like it is, I'm sorry.
Is it a, is it really a great thing at the end of dinner to be like,
now let's pull out our phones and send each other money?
Like as we're looking at the chat, does anybody feel like, wow,
what a great end to a dinner now that we're like adding up,
oh, let me send you $15.75. Oh, I didn't get a drink.
So fuck off. Someone just put down a goddamn Amex and move on with your lives.
This was solved.
Now you memorialize it forever so you never forget the dinner. The little cocktail emoji to your friend.
But so Hoan Ton-That discovered he could go to Venmo.com.
He could send a scraper there
and it would just hit Venmo.com every few seconds
and pull down all these people's photos
and links to their profiles, which have their names.
And so, yeah, one of the first things they were doing is, I'll take a picture of a stranger and I'll show you your Venmo account.
Oh my God.
Yo, I'll take a picture of a stranger. And, uh, this person just sent $20 for tree emoji, squirt emoji.
What do you think that means?
Uh, like Venmo is basically all emojis, meaning drug and sex work transactions.
So, uh, you know, of all the databases they could be pulling photos from to identify people, it's one of the most alarming.
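[Editor's note: the scraper Hill describes amounts to a simple polling loop. The endpoint and field names below are hypothetical, not Venmo's real feed, which was shut down years ago; this sketch only illustrates the mechanics being described.]

```python
# Generic sketch of the polling scraper described above. The URL and
# JSON fields are hypothetical placeholders.
import time
import requests

FEED_URL = "https://example.com/public_feed.json"  # hypothetical endpoint

seen = set()
while True:
    for item in requests.get(FEED_URL, timeout=10).json().get("items", []):
        if item["id"] in seen:
            continue
        seen.add(item["id"])
        # Each public item exposed a display name, a profile link, and a
        # photo URL -- exactly the pairing a face index needs.
        print(item["name"], item["profile_url"], item["photo_url"])
    time.sleep(5)  # hit the feed every few seconds, as described above
```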
Yeah.
I mean, what was shocking to me though is like this guy went from Facebook
games and iPhone games, you know, to building this de-anonymizing app that can identify anyone in the world. And I'm like, Hoan, like, how did you do this? And he said,
well, you know, I went on Twitter and I followed machine learning experts. I went on GitHub, which is like a code repository where people share code, and I searched for face recognition. And I kind of laughed and he kind of laughed.
He's like, I know it sounds like I Googled how to build a flying car and then I just built one.
But what was happening in the kind of bigger picture here is that this was a time
when open source was really big and people were sharing their code, and Hoan was able to just basically look at what other really advanced groups had done in facial recognition technology, study their code, use their code, build on their code. And what was unique about what he had done is it wasn't so much a technological achievement as it was, I call it in the book, ethical
arbitrage, where he was just willing to do what other big tech companies and other entrepreneurs
hadn't been willing to do is gather all these photos and pair them with this powerful technology
and create an identification app.
And we're really seeing that happen kind of all over the place in AI.
We're seeing with generative AI right now
where everybody got shocked because DeepSeek came along
and it was created by this tiny little Chinese team.
It's because a lot of this technology now
we're sharing the methods and it means that
it's not just the gatekeepers in Silicon Valley anymore.
It's easier for other people to develop these really,
really powerful technologies.
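[Editor's note: the open-source building blocks Hill mentions are still a search away. As one concrete illustration, and an assumption on our part since the episode doesn't name the code Ton-That studied, the widely used open-source face_recognition Python package does a basic match in a few lines; the filenames are hypothetical placeholders:]

```python
# pip install face_recognition -- a popular open-source package.
# Filenames here are hypothetical placeholders.
import face_recognition

known = face_recognition.load_image_file("known_person.jpg")
unknown = face_recognition.load_image_file("stranger_photo.jpg")

# Each encoding is a 128-number vector describing one detected face.
known_enc = face_recognition.face_encodings(known)[0]

for enc in face_recognition.face_encodings(unknown):
    match = face_recognition.compare_faces([known_enc], enc)[0]
    dist = face_recognition.face_distance([known_enc], enc)[0]
    print("same person?", bool(match), "distance:", round(float(dist), 3))
```

[The hard part was never the matching; it was being willing to aim it at billions of scraped faces.]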
So he's, so he's going, all right,
I'm maybe not that great of a programmer.
I don't have the advanced technology,
but what I am is a piece of shit.
And that gives me an advantage.
I can do something that other people are not willing to do.
I can violate people's privacy in a way that, like, Google has not decided they're okay with yet.
That's, that's what you're saying in terms of ethical arbitrage is he's just like,
I'm willing to do something other people aren't.
Yeah. I mean, I think Hoan was ambitious. I think he wanted fame. Like he was like one of these tech
entrepreneurs who wanted to build something that a lot of people would use. Specifically,
when he first started building this, they were thinking, he was pretty politically motivated. One of his collaborators said they
came up with the idea for this at the Republican National Convention when Trump was the nominee
the first time around, and that they started thinking about it because there's all these
strangers at the convention and they kind of want to know who's a friend, who's a foe,
can we identify the liberals here? I mean, they were really thinking about how do you figure out who somebody is?
How do I figure out how I should treat this person in the real world?
And so they had that kind of feeling about why they wanted this technology to exist.
But I think he built this just wanting like, I want a lot of people to use this thing.
Originally when they first built it, they were thinking they would sell it to
private industry, maybe to make it available to individuals.
They kind of just stumbled upon police as the best possible customers
who are willing to pay for it.
Well, police budgets, you know, are skyrocketing everywhere. But, you know, every police department is being given more money and yet can't hire more cops, because not that many people actually want to be cops. And so they have all this money sloshing around that they can spend on technology that they can use to monitor and surveil people.
So look, I understand that we talked about a couple cases where this technology could be actually
helpful to prevent child abuse or something like that.
But every time you're talking about it, I'm imagining this becoming all pervasive.
You're saying people at the RNC are, oh, well, could we figure out who the liberals are in the crowd?
I could imagine them wanting to do that.
And I think that's horrifying.
Right.
I also think it'd be horrifying for the DNC to do the opposite, to have some eye
in the sky that's like, oh, that guy writes for the Babylon Bee or whatever. Let's like tail him. But it's also entirely plausible.
And I can also imagine, you know, the LAPD or the NYPD having blanket surveillance of people,
monitoring people's faces at all times,
what would be bad about a future
in which this technology becomes pervasive?
Yeah, I mean, where I worry about this becoming pervasive,
and while Clearview is limited to the police,
there are other face search engines that anyone can use.
Some are free, some you pay for, and so it is possible right now for you to be at a bar and for you to take a photo
of someone else and find out who they are or for them to do that to you. And so I think about this,
like every time I'm out in public and I'm at dinner and I'm having a juicy conversation,
and there's somebody next to me who can hear it, I'm like, well, what if they, I don't know,
snap my photo, figure out, you know, who I am. And all of a sudden, they kind of understand the context of the conversation.
Or you're at the pharmacy and you buy like hemorrhoid cream and somebody there just takes
a photo because they think it's funny. And then they tweet it out, you know, or they X it out, I don't know what we're calling it these days. And they're like,
hey, look who just bought some hemorrhoid cream. It just would mean that you could no longer kind of move through society,
assuming that most people around you don't know who you are.
Um, and that would just change how we operate.
And I think it could be, you know, chilling in some ways.
You know, what's funny is, uh, I have some minor celebrity because of my past television
work and you know, so when I'm out in public, people will occasionally come up to me and
say, Oh, I like your work.
And that's very nice.
Right.
It's very nice to have that conversation.
I always remind myself for every person that comes up and says, hello, there's 10 or 20
people who go, Oh, I recognize that guy, but they don't say anything.
Right. Cause I don't say anything when I recognize,
you know, someone from television or a film.
I'm just like, oh, look, there they go,
down the street, right?
What that means for me is I don't pick my nose in public.
You know what I mean?
I'm like, all right, people are, I don't want someone to go,
oh, I saw Adam Conover and he was picking his nose,
or he was shoplifting from the Walgreens,
or whatever it may be.
You know, I try to, I'm like, I'm my best behavior
because I'm a little bit like,
oh, I'm being watched all the time.
That's unusual, right?
That's a special circumstance that I'm in.
People ask, what is that like?
Because they are used to the presumption of anonymity
moving through the world.
A world in which none of us have the presumption
of anonymity,
where every action you take could be traced back to you by anybody who wants
to, whether it's an authority or another person on the street is a dramatically
different social world. Is it not?
Yeah. And some of the people around Clearview I talked to were like,
that's a better world. It's good if everyone's accountable all the time.
But who are they to fucking say?
Oh, no, our technology, our technology is going to fundamentally reshape human society.
We think that'll be good.
Well, what if we don't want you to do it?
So please go on.
I just wonder.
I mean, I wonder people listening right now, you know,
do you think there's gonna be a switch
that flips where it's like, okay,
we don't assume that we're anonymous anymore,
we assume that we're known all the time,
and would you want that world or not want that world?
I mean, I don't know what it would be like.
I guess you know what it's like.
And it sounds like, yeah, you just have to all the time
be like, oh, is somebody gonna report?
I guess Gawker's not around anymore,
but is somebody gonna report this?
Is somebody gonna tweet this?
Is somebody gonna, they see me and they know me
and they're gonna act on it in some way.
We're taking baby steps towards that world all the time.
Something that happens to me sometimes
is I'll walk into a business
and the person behind the counter
will act as though they already know me. You know, they'll be like, Oh,
Hey Adam, yeah, here's some ice cream. Like they'll literally just say my name.
Right. Um, and I don't mind that cause sometimes,
sometimes I get a free cup of ice cream. They're like, I like the show.
That's a nice version, but,
but we've started to step into a world where that's happening in a more routine
way. For instance, if you use Global Entry. I came back from Canada, I was doing shows in Toronto, and the way Global Entry works
is they scan your face and then you walk up to the podium
with the agent, with the border patrol agent,
and you know, in the past you would give that person
your passport, now that person just looks at you and says,
Adam?
And you're like, yes, he says, okay, you can go because they did it with a facial recognition.
And the first time that happened to me, I was like, Whoa, creepy. This time I was like,
Oh yeah, yeah, they know about my face. And, you know,
pair that with the kind of technology,
I don't know if you've been to a sports arena where you just,
they've got one of the kiosks where you walk in and you just take the water and walk
out, right? And a camera is just tracking you and billing your credit card.
You swipe your credit card on the way in, grab a water,
and then you do nothing else on the way out. People find this so weird.
They literally have someone standing there to tell you, no, no, no,
just walk out. The cameras are going to charge you.
The cameras are going to charge you, because it's such an eerie sensation.
But that's only going to be new for a couple more years.
And if you combine those sort of technologies together,
you know, imagine a place like when you go to Whole Foods
and, you know, it's integrated with your Amazon account.
How long before I walk into the Whole Foods
and, you know, the kiosk is just like,
welcome to Whole Foods, Adam.
You know, do you want to,
do you want your oranges today or whatever the fuck?
Like, it's, I can really imagine this growing
so that we start to just have a pervasive sense
of being known and followed throughout the world
by systems and by people.
And for you, when you're recognized now,
like you walk into a store and you get free ice cream
because you're-
Sometimes, sometimes.
I don't expect it.
I never expect it.
But there's- And I tip, I tip if they give me free ice cream.
I tip them even more than the cost of the ice cream.
Because honestly, I have to at this point.
If you want a good tip from me, give me something for free
and I'll put 10 bucks in there.
What I mean is that it comes, it's a positive, right?
It's a positive fame.
There's a certain amount of power. You know, it's a good thing. It's a bonus.
For the rest of us who might get recognized, it doesn't come with that same kind of power
or celebrity.
It might be that you're recognized and the worst thing you ever did is attached to you now in a way that it wouldn't have been if your face wasn't Googleable,
you know, that you'll walk into a store and they know exactly, you
know, how much you have to spend and how to treat you based on that.
Yeah, for a lot of people it will be disempowering to be recognized, not empowering.
And there's this one book I love by Gary Shteyngart, Super Sad True Love Story.
And it's all about this world in which as you're walking around, people know your credit score,
they know your romantic score,
how you've been rated by exes.
You know, and we're kind of moving towards that world
where we have an Uber score, we have an Airbnb score.
Yeah, you just imagine a world in which your score
is attached to your face and it's how people treat you.
The way that people act when they're in an Uber, right?
Where they, I know a lot of people who are sort of terrified
in the Uber because they don't want to get a bad Uber rating
or, you know, in the Airbnb.
I was in an Airbnb with a couple friends
where this man had turned his Airbnb
into some kind of Christian conversion camp.
Like it was blasting religious music and he had like Jesus stuff on the walls.
And a couple of my friends were Jewish and we were like, this is creepy.
Like he didn't disclose in the listing that it was going to be like the Airbnb was going
to be actively proselytizing at us in a way that was like very off putting.
And I was like, you should, you should give a rating on the guy's page to let people know.
And he said, no, I don't wanna do that
because he's gonna rate me.
You know, there's this way that people feel
in those situations where they know they're being rated,
where they feel too paranoid to be themselves, right?
Where they have to act self-protectively.
And if you imagine moving through the entire world
like that, that's a bad world.
Yeah, yeah, playing ratings chicken all the time. I also imagine,
you know, there's a lot of places in this country where, you know, a felony conviction follows people around, right? They serve their time and they're still cut off from parts of society. For example, they can't vote, they have to check a box on employment forms.
That sort of discrimination,
of something that you've done in the past,
following you into every social situation,
I can imagine being a lot more pervasive as well.
And we saw that with the protests,
the Gaza Israel protest on college campuses
where people were taking pictures of protestors, identifying
them, putting it up on websites saying, do not hire certain people. I mean, we are really
kind of seeing this happen in society a lot and face recognition makes it much easier.
You're a jerk in a crowd. There was, like, recently a Philadelphia Eagles fan who was being a jerk to this one podcaster's fiancée. And he tweeted, he said, who is this guy?
And the guy was quickly identified, fired.
I mean, we'll just see more and more of that happen.
Like you have to be responsible all the time for what you're doing or it may come
back to haunt you.
And I'm sure there'll be plenty of stories trumpeted about how somebody was
protected by such mass surveillance.
But there's also going to be lots of stories of false positives or lives being ruined that probably won't be covered as much.
It feels like this sort of topic, these kinds of privacy threats, are something that we were very worried about a couple years ago.
That there was a lot of press about them. Now it kind of feels like we have maybe gotten used to them,
that we have failed to a certain extent as a society
to push back against the growth of the surveillance state
and the growth of surveillance capitalism
as two simultaneous trends.
Do you worry that society's becoming inured to this
and we're just going to allow it to happen?
I think what's hard about privacy is that people
don't really care about it until it's been violated.
And so, yeah, like we all put our photos up on the internet
over the last 20 years.
We're induced to do so in some cases,
like Facebook told us, put your photos on the internet,
put your real name attached to those photos.
Venmo said, if you have a profile, your photo is going to be public.
Like we were really kind of pushed to do it.
But we also did it ourselves because we liked, you know, sharing our experiences with other people.
But most of us who were doing that weren't thinking that a company like Clearview was gonna come along and find all those photos
and make them findable by people we might not want
digging them up.
So it's hard to predict, yeah,
like how these choices you're making
are gonna hurt you down the line as technology improves.
And sometimes you just don't know
the compromise you're making.
So we talked about this before the show began,
but this last year I was reporting on cars.
Yes.
This is my next question.
Please tell me about this.
Yeah, so cars are connected now.
You know, you can pull up a map.
You can download a smartphone app so you can turn on your car from afar.
On a cold day, you can turn it on, have it warm by the time you get out to it.
You can find it in a parking lot.
You know, all these benefits of connectivity.
What a lot of people didn't realize is that giving their car an internet connection meant
that their automaker is collecting data from their car.
And the automaker knows where they drive, how fast they drive, you know, when they're hitting
the brakes, when they're accelerating, just lots of data, some that they use in beneficial ways, like seeing whether your car has a problem
and then sending you a notice that you need to bring it in.
But in the case of General Motors, they started selling this data to risk profilers who work
with insurance companies who would then score the driving and say kind of like whether you were a good driver or a bad driver.
And it started affecting what people were paying for their insurance.
And they had no idea until they would get an adverse notice under the Fair Credit Reporting Act.
And they'd be told, oh, yeah, your insurance went up because of your LexisNexis report. They would go to LexisNexis and then find out that LexisNexis had every single trip they had taken in their car, how many
miles they drove, how many times they rapidly accelerated, slammed on the brakes, sped. They
just had no idea that General Motors was selling this about them. And this is kind of the world
that we are living in where there's not a lot of protections around our data. It's really unclear to us what we're agreeing to.
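[Editor's note: the actual GM/LexisNexis scoring is proprietary; the fields and weights below are invented purely to illustrate the mechanics being described, turning logged trip events into a single number an insurer can price against:]

```python
# Toy illustration only -- invented fields and weights, not the real
# LexisNexis model. Shows how telematics events become a "risk score".
trips = [
    {"miles": 12.4, "hard_brakes": 3, "rapid_accels": 1, "speeding_secs": 40},
    {"miles": 230.0, "hard_brakes": 0, "rapid_accels": 0, "speeding_secs": 0},
]

def risk_score(trips) -> float:
    miles = sum(t["miles"] for t in trips)
    events = sum(t["hard_brakes"] + t["rapid_accels"] for t in trips)
    speeding = sum(t["speeding_secs"] for t in trips)
    # Events per 100 miles plus a small speeding penalty (made-up weights).
    return round(100 * events / miles + 0.1 * speeding / miles, 2)

print(risk_score(trips))  # higher = looks riskier to an insurer
```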
Um, yeah, I mean, we aren't fixing it. I think I got that one fixed for a lot of people. Like, GM stopped selling the data after my stories came out, so I got that one particular thing a little bit fixed.
Yeah, that one is so shocking because as much as we all hate bad drivers and I hate them more
than almost anybody, we also all believe that bad driving is your right as an American.
You know, you should be able to tailgate, you should be able to slam on the brakes,
you should be able to be an asshole on the freeway.
Like, you know, that's just, that's just part of your behavior as a human.
And we all, one thing we hate more than bad drivers is insurance companies.
And so, so this is a story that, yeah, you tell almost anybody in America,
they're going to be shocked.
And I can imagine why, at least, you know, in the recent past, exposing that
could cause them to change their behavior.
But I do wonder, after this sort of surveillance under capitalism is completely endemic, how effective is reporting like yours going to be at stopping the next company that wants to, I mean, to literally track your own customers, some of whom have probably just paid the car off, right?
They bought the car free and clear. They have no connection with GM.
And yet GM is tracking them in the car that, you know, they sold them, and selling the information to their insurer
to hurt the person who purchased the car by making them pay more. Like disgusting.
It was pretty egregious. And I did talk to experts who said, you know, for road safety, it's a good thing for people to sign up for programs where their driving is rated and it determines their insurance
rates. I mean, anyone who has a Tesla is probably doing this, where their actual insurance rate
can change from month to month based on how they drive. But people need to know they're
being monitored. Otherwise, they're not going to drive more safely. And that wasn't happening
here. It was just secretly being sold about them. And so, yeah, it was egregious.
But there's probably more stuff like that that's happening out there.
Well, that's my question.
I mean, like I said, this is an egregious example
that you feel you were single-handedly able to stop.
So thank you very much.
But is this a trend that we have any chance
of arresting as a society?
I can only imagine what, you know, Hyundai and, uh, you know, other companies are doing.
I mean, I think this had a chilling effect on the auto industry in general.
Um, but there certainly are programs like this with your insurance company where you can say, hey,
yeah, I do want to be monitored on my driving. And they'll offer discounts,
like if you are willing to let your driving be monitored.
And those things can be good for public safety.
Actually, one of the crazy things I discovered
when I was working on all this
is that so much of your insurance
is based on your credit rating.
So if you have bad credit,
even if you're a good driver,
you're probably paying more for your insurance
than somebody who has good credit but has had lots of accidents.
Why would that be?
It's just what insurance companies have done
for a long time.
They just assume if you have good financial credit,
that means you're a responsible person
and you're gonna be charged less for insurance.
It's like heavily weighted.
But if you have good credit, you're probably just rich.
Like if you're, if you're paying off your credit card, it just means that your
monthly income exceeds your, you know, daily Starbucks purchases or whatever.
It just means that, you know, you're not fiscally constrained.
Whereas if you are putting your groceries on a credit card, it's going to, because
you don't make that much money, it's going to tank your credit score.
Like it's not a mark of virtue to have a good credit score.
Right.
And that kind of fundamental unfairness of that system is why some consumer advocates
told me, yeah, actually we would like to see insurance companies move toward rating people
on how well they drive because it's not fair that they're putting so much of an emphasis
on what your financial credit is.
So now you're poor, you're having a hard time paying your bills, you have bad credit, and that means your insurance is higher.
It just is, it's kind of unfair.
So they were like, you know, it could be good
if they're surveilling their driving,
but they need to know, they need to say yes to this.
It should not be done secretively and without consent from the consumer.
I mean, do you have concerns about this, the more pervasive growth of surveillance, again, both in the state and in capitalism, combined with how the political situation in this country is changing even right now as we speak, right? The situation for law enforcement, the, we're undergoing a regime change here,
right? And we're also seeing, you know, the tech side of capitalism really grow
in a completely unfettered way. Like it seems like it could be a really toxic
combination to me.
Yeah. I mean, I think there's benefits to all this technology and data collection to
a certain extent that we benefit from.
But I do think there are downsides.
And what I really worry about is when people's data is being used in ways they don't understand
that they haven't consented to.
And they do it differently in other places.
Like after I did my reporting on Clearview,
Europe, Australia, Canada, they said what Clearview
had done was illegal, that they can't just gather
people's photos without their consent and use them
in this facial recognition app, that they would need
people to say yes to that.
And so they basically kicked Clearview out of their countries, they fined them, and we didn't do that in the US.
In the US we're like, this is fine.
These other places have privacy laws that say
you can't just collect people's information
without their consent, or you can't collect
their information and then use it
for something else entirely.
Like you're collecting data from their cars
to give them notices about safety issues.
You can't just also sell it to insurance companies
unless they say yes to that.
And so I do think there are changes we could make
to the legality of what corporations can do
with our information, and other countries have done this.
What can people do individually to protect themselves
from this kind of surveillance?
I know you recently wrote an article
about switching to a flip phone for a month.
Did that improve your privacy at all? I mean, a flip phone presumably is not tracking you quite as much.
I switched to the flip phone
just cause I was feeling too addicted to my smartphone.
I mean, I think in general,
like technology is getting between us too much.
And Chris Hayes has a new book about this,
about the kind of attention economy.
And yeah, like I hate that I'll be playing with my daughters
and then all of a sudden have this desire to check my phone.
And so every February now,
I try to like use my phone less and really reduce.
And so switching to a flip phone is a really effective way of doing that, or with a smartphone, just using it less. I mean, I tell people, like, the basic
privacy advice, like get a password manager so that your accounts aren't easily hacked.
Make your social media accounts private unless there is some benefit, like you're making a living as an influencer. Why are you putting stuff publicly on the internet? Because strangers who may have nefarious intent might see it. And all of these companies that are scraping data from the internet, if you make it public, they're going to get it. So I tell people to make their accounts private.
What else?
I'm going to shout out our sponsor, DeleteMe, which, you know, I have used for many years. If you use this service, they literally pull your data off of these data broker sites. Keeping as much of your data off of the internet, especially your private data, your phone number, your address, is really, really valuable.
It actually really works.
But it is hard as an individual to protect yourself because the way we live now, you
create data trails all the time.
And so unless there's pressure on the company
not to kind of abuse the access they have to your data,
which is kind of on regulators to do, then yeah.
I mean, it can be used in ways
that you wouldn't want it to be used.
So ultimately I think regulators need to step in here,
but as individuals, yeah,
create less of a data trail as often as possible.
Let's just check on the regulators in Washington.
How are they doing right now?
Oh, fuck.
That's a, oh, they're all dead.
They were lined up and shot in the head.
Oh, that's bad.
Well, maybe things will get so bad
that we'll remember the need for regulation
of the tech industry in America.
Like once our privacy is being violated en masse, maybe there'll be a mass uprising and we'll get some regulation back, perhaps.
It's a joke, Adam, but seriously,
like states do have a big role to play here.
They can pass state privacy laws.
So in California, for example, where you are,
you have this privacy law that says you have the right
to access information that a company has on you
and delete it.
So if you don't like the idea of Clearview having your photo, you can actually go to Clearview AI and say, hey, I want to know what photos you have of me. I want to see, like, my report. Then you can say, I want you to delete me.
And in New York, I do not have that power.
It's state by state.
Basically, where you live determines how protected your face is and your privacy rights. So take advantage of those state laws if you have them.
I love that. And thank you for reminding us of how much power states have in this situation.
And if you get enough states to pass laws like this, it becomes a de facto national
standard. So, you know, hope is not lost here. We can pressure our state legislators to pass
more laws like this, which would be a real benefit.
Totally.
Well, Kashmir, thank you so much for coming on the show.
Is the book out now?
It's out, yeah.
Your Face Belongs to Us.
It's been out for a little while now, so.
Fantastic.
Get it from your library or however you consume your books.
Well, yeah, or pick it up at our special bookshop,
factuallypod.com slash books.
We'll put a link up there so people can get a copy of it.
Kashmir, thank you so much for coming on the show.
Where else can people find your work?
So I'm a writer for the New York Times, and I'm on Bluesky.
And so, yeah, I write about once a month or so.
I'm a features writer, so it's not an overwhelming amount to read.
But my last piece was about a woman in love with ChatGPT.
Oh, my God.
We'll have to have you on again to talk about that.
We didn't even get to that story.
That sounds fascinating.
Kashmir, thank you so much for coming on the show.
Thanks, Adam.
Well, thank you once again to Kashmir
for coming on the show.
Again, if you want to pick up a copy of her book, Your Face Belongs to Us, you can do so at our special bookshop,
factuallypod.com slash books.
Just a reminder, every purchase there supports
not just this show, but your local bookstore as well.
If you'd like to support this show directly,
and I hope you do, the wonderful conversations
we bring you every single week,
head to patreon.com slash Adam Conover.
For five bucks a month, I will give you
every single episode of this show ad free.
For 15 bucks a month, I will read your name
in the credits of the show and put it in the credits
of every single one of my video monologues.
This week, we got some new folks to thank.
I want to thank 90 Miles from Needles,
Aaron Matthew, Andrew Harding, Alaska, Amy, and Thor-Tron.
Thank you so much for supporting us at the $15 a month level.
Head to patreon.com slash Adam Conover if you'd like to join them.
Of course, if you want to come see me do stand-up
in London, Amsterdam, Providence, Vancouver,
Eugene, Oregon, Oklahoma City, Tulsa, Oklahoma,
pretty soon I'm going to Charleston, South Carolina as well.
Head to adamconover.net for all those tickets and tour dates.
I'll see you out there on the road.
I wanna thank my producers, Tony Wilson and Sam Roudman,
everybody here at HeadGum for making the show possible.
Thank you for listening,
and we'll see you next time on Factually.