Behind the Bastards - Part Two: Tech Bros Have Built A Cult Around AI
Episode Date: February 1, 2024. Robert tells Ify about the time tech weirdos recreated Hell using AI and Robert pisses off a VP at Google. Plus: More cult shit! (Adapted from his article in Rolling Stone: https://www.rollingstone.com/culture/culture-features/ai-companies-advocates-cult-1234954528/) See omnystudio.com/listener for privacy information.
Transcript
Hello, this is Susie Essman and Jeff Garlin.
I'm here.
And we are the hosts of the History of Curb Your Enthusiasm
podcast.
Now we're going to be rewatching and talking
about every single episode.
And we're going to break it down and give behind the scenes
knowledge that a lot of people don't know.
And we're going to be joined by special guests, including
Larry David and Cheryl Hines, Richard Lewis, Bob Odenkirk,
and so many more.
And we're going to have clips.
And it's just going to be a lot of fun.
So listen to The History of Curb Your Enthusiasm
on iHeart Radio app, Apple Podcast,
or wherever you happen to get your podcasts.
Hey, this is Dana Schwartz.
You may know my voice from Noble Blood, Haileywood,
or Stealing Superman.
I'm hosting a new podcast, and we're
calling it Very Special Episodes.
A very special episode is stranger than fiction.
It sounds like it should be the next season of True Detective.
These Canadian cops trying to solve this mystery of who spiked the
chowder on the Titanic set.
Listen to very special episodes on the I Heart Radio app,
Apple Podcasts, or wherever you get your podcasts.
What up guys?
Hola que tal?
It's your girl Chiquis from the Chiquis and Chill
and Dear Chiquis Podcasts.
And guess what?
We're back for another season.
Get ready for all new episodes
where I'll be dishing out honest advice,
discussing important topics like relationships,
women's health and spirituality.
I'm sharing my experiences with you guys
and I feel that everything that I've gone through
has made me a wiser person.
And if I can help anyone else through my experiences,
I feel like I'm living my godly purpose.
Listen to Chiquis and Chill and Dear Chiquis
on the iHeartRadio app, Apple Podcast,
or wherever you get your podcasts.
Cool Zone Media. Welcome back to Behind the Bastards, a podcast that, again, I botched it.
My weird voice didn't work.
I'm sorry.
I didn't enjoy that at all.
I know.
I know.
That one was like more Dracula-y than the first one.
Yeah.
Yeah, Dracula-esque.
Sure. You didn't sell it this time.
Sorry.
I apologize.
What I don't apologize for is my guest, Ify Nwadiwe!
Hello!
It's me, Mr. Boy.
Ify, working for Dropout, the survivors,
successors to College Humor, who have blossomed like a phoenix from the ashes of the internet that Facebook killed.
Yes.
Yes.
See, still we rise.
Still we rise.
Speaking of rising, my concerns about the cult dynamics within the AI sub cult year.
I don't know why I said it that way.
Yeah.
So I don't know what you're on about right now.
You're okay.
I don't know either. So I don't know why... I'm doing great. I'm doing great.
Mm-hmm. This is me. I've been sober lately. So this is me living the sober life.
I've just gotten worse. So I don't know, everybody, keep your kids on drugs. You know?
Or dumb.
Dumb.
No, I'm on team drugs.
Depends on the drugs.
Yeah.
Yeah.
So perhaps the most amusing part of all of this is that a segment of the AI believing
community has created not just a potential god, but a hell.
And this is one of my favorite stories from these weirdos.
One of the early online subcultures that influenced the birth of e/acc is the rationalists.
And again, the e/acc people say a lot of them
don't like the rationalists, but they're related.
They're like cousins in the same way
Cracked and College Humor are, right?
Yes.
The rationalists are a subculture
that formed in the early aughts.
They kind of came out of the online skeptic atheist movement
of the late 90s.
And they formed in the early aughts around
a series of blog posts by a man named Eliezer Yudkowsky. Yudkowsky fancies himself as something
of a philosopher on AI, and his blog slash discussion board LessWrong was an early hub
of the broader AI subculture. Yudkowsky doesn't have a specific education. He just came to be kind of an expert in AI
and machine learning.
He's a peculiar fellow to say the least.
The founding text or at least one of them of rationalism
is a 660,000 word Harry Potter fanfic
that is just nonsense.
It's all about like rewriting Harry Potter
so his real magic is rational thinking.
It's wild shit.
He's like a psychopath.
It's such an odd choice.
It's just like the, what was it?
50 Shades of Grey, how it was originally a Twilight fanfic.
And there's gonna be like a cloud atlas-esque.
But you know, the Fifty Shades of Grey lady
was not trying to like create the new text
for like a philosophical movement.
She just wanted to get like people horny and that's fine.
That's perfectly acceptable.
The most relevant thing about the 660,000 word
Harry Potter fanfic is that it was the favorite book
of Caroline Ellison,
the former CEO of FTX, who recently testified against Sam Bankman-Fried, or of Alameda.
Sorry, she was the CEO of Alameda.
Anyway, all these weird little subcultures, rationalism and effective altruism, are related
to each other and influenced each other, even though again they often hate each other too.
Yudkowsky is seen as an object of ridicule
by most e/acc people.
This is because he shares their view of AI
as a potential deity, but he believes AGI
will inevitably kill everyone.
Thus, we must bomb data centers, which like,
look, he may have gotten to the right end point.
Yeah.
He kept running like Forrest Gump.
He just kept running.
We were like, wait, wait, wait, wait, no, stop right there.
Stop right there.
We may agree with you on this.
Yeah.
Yudkowsky is a doomer now because he was surprised when
ChatGPT came out.
He was horrified by how advanced it was and was like, oh my god.
We're further along towards creating the AI that kills us all.
We have to stop this now.
And that made him, he had been kind of flirted with by a lot
of Silicon Valley
people. The rationalists are very much a Bay Area cult. He kind of has become increasingly
a pariah to at least people with money in AI. But before that happened, his message board
birthed something wondrous. In 2010, a LessWrong user named Roko posted this question.
What if an otherwise benevolent AI decided it had to torture any
human who failed to work to bring it into existence?
What if we make an all-powerful AI and its logical decision is that, well, I will have
to punish all the human beings who were alive and who didn't try to further my existence
because that's the most reasonable way to guarantee that I come into being.
It's nonsense.
This is a silly thing to believe.
It's all based on like the prisoner's dilemma, which is a concept in game theory.
And it's not really worth explaining why, because the logic is
the kind of thing that only happens when people are too online
and like completely get detached from reality.
But Rocco's conclusion here is that an AI who felt this way
would punish its
apostates for eternity by creating a virtual reality hell, digitizing their consciousness
and making them suffer for all time.
Now you may have noticed, Ify, number one, they're kind of ripping off our boy Harlan
Ellison, famed, famed advocate of the writer's right to their work.
But it's also just tech nerds recreating Pascal's wager.
Like this is just Pascal's wager with an AI.
Like you just stole, again, these fucking plagiarists.
You just stole from whoever Pascal was, right?
This is what happens when you are a nerd
and you refuse to read sci-fi.
You just, you eventually just come up
with these stories yourselves and think that you did it.
Yeah, and if you're not familiar folks,
I think most people are, Pascal's Wager is
this kind of like concept from,
I think you'd call it Christian apologetics,
that's like, we may not know if hell is real or not,
but because if it's real, the consequences are so dire
and the cost of just saying,
yeah, I accept Jesus is so low,
you should do that, right?
Or I think that's the basic idea, right?
That's how a lot of people interpret it.
It's the whole idea behind being a piece of shit
and then converting on your deathbed, basically.
I don't know fully the history of it,
but I know that they're basically aping it
for fucking Roko's Basilisk.
And it's called a basilisk because like a basilisk,
if you look at it, it like, enraptures your mind,
you can't stop thinking about it.
That comes from, reportedly, there's some debate over this,
when this went viral among like the LessWrong community,
Yudkowsky had to ban discussion of it
because it was like breaking people's minds.
They were having nightmares.
Am I working hard enough to make the AI real?
Is it gonna send me to hell?
Oh my God.
Yeah, it's unclear like how seriously people were taking it.
Again, these are just people talking on the internet.
For what it's worth, Yudkowsky didn't really like
Roko's Basilisk, but it's his place that birthed it.
And for an idea of how influential this is,
Elon Musk and Grimes met talking about the concept.
That was their meet-cute.
No way.
It was this fucking AI Pascal's wager.
Yeah, she like wrote a song about it.
It's fucking ridiculous.
These fucking people are such dweebs.
Embarrassing, what the fuck?
Wow.
Oh my God, read Harlan Ellison.
He did it better than you.
God damn it.
I will say reading this shit is the most I've ever felt
like I have no mouth and I must scream. So again, pour one out for the man. So this is
all relevant. This AI hell some of these people have created, because it's one more data point
showing that the people who take AI very seriously as real intelligence always seem to turn it
into religion. And this is kind of maybe the first schism, right? This is their Catholic Protestant split or their Catholic
Orthodox split. Because you've got on one side, you had Yudkowsky's people who are like,
we will inevitably make a god and that god will destroy us. So we have to stop it.
Versus like, we will inevitably make a god and that god will take us to paradise along with
Daddy Musk, we'll go to the stars,
right?
Those are the two.
This is like the first heretical split within the divine AI movement.
And this stuff is relevant because so many of these fucking subcultures and movements
start out as a bunch of people arguing or discussing their ideas in online communities.
And there is a reason for this.
It's pretty well recognized that there are certain dynamics inherent to the kind of communities that start on the internet
that tend towards cultishness. This is part of why we have a big subreddit for the podcast. It's
like 80-something thousand people, which makes it like the top 1% of Reddit. And I've been offered
to be able to moderate and make policy there. I have nothing to do with the running of that subreddit because I'm like, that doesn't end well. I was on Something
Awful as a kid. I know what happens when people make themselves mods of giant digital communities.
They lose their fucking minds. We're all watching Elon Musk do it right now. It's the
worst thing in the world for you. Thank you by the way to the people who do run that thing
because I am not going to.
The skeptic community, which was huge through the late 1990s
and early 2000s, might be seen as the grandfather
of all these little subcultures.
After 9-11, prominent skeptics became vocally unhinged
in their hatred of Islam, which brought them closer
to different chunks of the nascent online far right.
Weird shit started to crop up,
like a movement to rebrand skeptics as brights,
in light of the fact that their very clearly
exceptional intelligence made them better than other people.
And again, you can see some similarity with this
and the stuff Nick Land was talking about,
only certain races will make it to space.
I found a very old write-up on Plover.net
that describes the method by which this kind of shit happens
in digital communities.
Quote,
Online forums, whatever their subject, can be forbidding places for the newcomer.
Over time, most of them tend to become dominated by small groups of snotty know-it-alls, who
stamp their personalities over the proceedings.
But skeptic forums are uniquely meant for such people.
A skeptic forum valorizes, and in some cases fetishizes, competitive geekery, gratuitous
cleverness, macho displays of erudition.
It's a gathering of rationality's hard men, thumping their chests, showing off their muscular logic,
glancing sideways to compare their skeptical endowment with the next guy, sniffing the air for signs of weakness, together to create an oppressive,
sweaty locker room atmosphere that helps keep uncomfortable demographics away. And that is where a lot of this shit is
cropping up, right? It is sweaty, and there are mushrooms growing there.
And some of those mushrooms are fucking fascists.
And all of them want to take away the ability of artists to choose what happens to their art.
Oh, yeah.
I feel like this is just so many parts of the zeitgeist coming together,
because, you know, what it means to own media.
You know, I feel like a very small microcosm of this is when people would
like clip out stuff from YouTube videos or take jokes from people who tweet.
And when it goes, you know, viral, the original tweeter is like, hey,
you stole this from me, and it's either, no, I didn't, or like, yeah, but you put it on Twitter.
So like, I can just copy what you wrote.
Yeah.
And now it has evolved into, yeah, we can just take from yours and let this machine
learn how to do what you do so I can do it even though I don't have the talent to do
it.
Yeah, absolutely.
The reality of AI's promise is a lot more subdued
than believers want to admit.
In an article published by Frontiers in Ecology
and Evolution, a peer-reviewed research journal,
Dr. Andrea Roli and colleagues argue that AGI
is not achievable in the current algorithmic frame
of AI research.
And this is, their claims are very stark,
that like the kind of way we make these,
these large language models, this algorithmic frame,
cannot make an intelligence.
That's their argument.
One point they make is that intelligent organisms
can both want things and improvise capabilities
that no models have yet generated.
They also argue basically all of these things
that individual AI type models can do, you know, recognize voice, recognize text, recognize faces, you know, this kind of stuff.
Those are pieces of what we would want from an artificial general intelligence, but they're not all combined in like the same thing that works seamlessly.
And beyond that, it can't act based on anything internal, right? It can only act based on prompts.
And their argument is that algorithmic AI will not be able to make the jump to acting
otherwise.
What we call AI then lacks agency, the ability to make dynamic decisions of its own accord,
choices that are quote, not purely reactive, not entirely determined by environmental conditions.
Midjourney can read a prompt and return with art it calculates will fit the criteria.
Only a living artist can choose to seek out inspiration
and technical knowledge and then produce the art
that Midjourney digests and regurgitates.
Now this paper is not gonna be the last word
on whether or not AGI is possible
or whether it's possible under our current algorithmic method
of like making AIs.
I'm not making a claim there myself.
I'm saying these people are,
and I think their arguments are compelling.
We don't know yet entirely.
Again, this is not a settled field of research, obviously.
But my point is that the goals Andreessen
and the effective accelerationist crew
champion right now are not based in fact.
We don't know that what they're saying,
that the most basic level of what they're saying, is possible.
And that means that their beliefs are based in faith.
Right?
How else can you look at that?
Yeah.
Yeah.
Like this is a faith.
And again, it's the kind of faith that according
to Andreessen makes you a murderer if you doubt it.
Which I don't think I need to draw direct parallels
to specific religions here, right?
Yeah, yeah. This is that point where when you're like,
stoned and you're watching those like, you know, art time lapses and the picture is starting to form.
And I'm like, okay, I see what Robert's doing. I see the pictures coming. I was on your side
from the jump. I just want to say, you know, I was, you know, I was like, yeah, no, I believe you.
But now I'm watching the connections be made and I love it.
Yeah.
And now, Andreessen's manifesto claims, our enemies are not bad people, but rather bad
ideas.
And I have to wonder, doing all this, putting this episode out, where does that leave me
in his eyes?
Or Dr. Roli, for that matter, and the other people who worked on that paper.
We have seen many times in history what happens when members of a faith decide someone is
their enemy and the enemy of their belief system.
And right now, artists and copyright holders are the ones being treated as fair game by
the AI industry.
So my question is kind of first and foremost, who's going to be the next heretic, right?
Like that's what I want to know. And I want to leave you all with
that thought before we go into some ads here, and then we will come back to talk about some people
that I pissed off at CES. So that'll be fun. Hi, I'm Susie Essman. And I am Jeff Garlin. Yes, you are.
And we are the hosts of the history of Curb Your Enthusiasm podcast.
We're going to watch every single episode.
It's a hundred and twenty-two, including the pilot, and we're going to break them down.
By the way, most of these episodes I have not seen for twenty years.
Yeah, me too.
We're going to have guest stars and people that are very important to the show, like Larry
David.
I did once try and stop a woman who was about to get hit by a car.
I screamed out, watch out!
And she said, don't you tell me what to do!
And Cheryl Hines.
Why can't you just lighten up and have a good time?
And Richard Lewis.
How am I going to tell him I'm going to leave now?
Can you do it on the phone?
Do you have to do it in person?
What's the deal?
It's the insulin cable.
You have to go in.
These are human beings helped you.
And then we're going to have behind the scenes information.
Tidbits.
Yes, tidbits is a great word. Anyway, we're both
a wealth of knowledge about this show because we've been doing
it for 23 years, so subscribe now and you can listen to The
History of Curb Your Enthusiasm on the iHeartRadio app, Apple
Podcasts, or wherever you happen to get your podcasts. Hey, this is Dana Schwartz. You may know my voice from Noble Blood, Haileywood, or Stealing Superman. I'm hosting a new
podcast, and we're calling it Very Special Episodes.
One week we'll be on the case with special agents from NASA
as they crack down on black market moon rocks.
H. Ross Perot is on the other side and he goes, hello, Joe, how can I
help you? I said, Mr. Perot, what we need is $5 million
to get back the moon rock.
Another week, we'll unravel a 90s Hollywood mystery.
It sounds like it should be the next season
of True Detective or something.
These Canadian cops trying to solve this 25-year-old mystery
of who spiked the chowder on the Titanic set.
A very special episode is Stranger Than Fiction.
It's normal people plopped down in extraordinary circumstances,
it's a story where you say this should be a movie.
Listen to Very Special Episodes on the iHeartRadio app, Apple
Podcasts, or wherever you get your podcasts.
What up guys, it's your girl Chiquis from the Chiquis
and Chill and Dear Chiquis podcasts. You've been with me for season one and two and now I'm back with season three.
I am so excited you guys.
Get ready for all new episodes where I'll be dishing out honest advice and discussing
important topics like relationships, women's health and spirituality.
For a long time I was afraid of falling in love.
So I had to and this is a mantra of mine
or an affirmation every morning where I tell myself,
it is safe for me to love and to be loved.
I've heard this a lot that people think that I'm conceited,
that I'm a mamona.
And a mamona means that you just think
you're better than everyone else.
I don't know if it's because of how I act in my videos
sometimes, I'm like, I'm a baddie, I don't know what it is,
but I'm chill.
It's Chiquis and Chill. Hello.
Listen to Chiquis and Chill and Dear Chiquis
as part of the MyCultura podcast network
on the iHeartRadio app, Apple podcast,
or wherever you get your podcasts.
We're back.
So one of the things I did was this panel
on the AI-driven restaurant and retail experience.
I was very curious how was AI going to change me getting some terrible food from McDonald's
when I'm on a road trip, right? The host of that, Andy Huells from Radius AI asked the audience in
relation to AI, raise your hand if you're a brand who feels like we've got this. That is how she phrased it. I hated it.
But about a third of the room raised their hands.
So next she asked for a show of hands of the brands who identified with this statement.
I'm not sure about this.
I haven't tried AI yet, but I want to, and that's why I'm here, right?
Most of the rest of the room raised their hands at that point and she seemed satisfied
but said, and then I bet there's even some of you that are like, whoa, I heard this is going to steal jobs,
take away my privacy, affect the global economy.
You know, AI is a little bit sketch in my mind and I'm just worried about it and I'm
here to explore.
Well, that fit me.
So I raised my hand.
She didn't notice me at first and so she like fakes a whisper and she's like, all right,
good, there's none of you.
And then she like looks over and sees me waving my hand
and she says louder and with evident disappointment,
there's one, all right, you can ask questions at the end.
So I did.
Was very excited to get to do that.
So the panel consisted of Behshad Behzadi,
a VP of engineering at Google,
who mentioned during the panel that embracing AI
could be the equivalent of adding
a million employees to your company.
The McDonald's representative, Michelle Gansel, claimed around the same time that her company
used AI to prevent $50 million in fraud attempts in just a single month.
Now that's lovely, but I told her when I had my question, I was like, I'm going to assume
most of those fraud attempts were AI generated, right?
So yeah, you stopped a bunch of AI fraud,
but that doesn't necessarily get me optimistic
about AI's potential.
And likewise, maybe Google gets the equivalent
of a million employees, but so do all of the people
committing fraud and disinformation on Google, right?
So again, how are we getting ahead?
And I brought up this concept in evolutionary biology,
the red queen hypothesis, which is kind of talking about the way that populations
of animals evolve over time, right?
Where you've got an animal evolve to be a better predator,
so its prey will evolve to be better at avoiding it.
And it's kind of the reason it's the red queen dilemma
is that like you've got to move as fast as you can
just to stay in place.
That's the red queen dilemma, right?
You got to move as fast as you can
just to stay in one place.
And I was like, is that not what we're going
to wind up seeing with AI, right?
Yeah, we get better at a bunch of stuff,
but it's eaten up to counter all of the things
that get worse.
And so I asked them, what are the odds
that these gains are offset by the costs?
Now, in the article that I wrote for Rolling Stone,
I give a significantly more condensed version of Behshad's answer, boiling out the ums and ahs and you-knows, because that
would kind of make the case that he was absolutely unprepared for a vaguely critical question,
a very basic one, and that he didn't really care enough to think about any of the security
threats inherent to the technology. But actually, that is what I think of him.
I'm going to play you audio.
My concern is like, what are the odds that a lot of these gains that we get from AI are
offset from the cost?
You noted, Beshad, that you get a million extra workers by utilizing this, but so do
the bad guys.
So, yeah, that's kind of where my skepticism plays out.
Yeah, certainly there will be, there are enough bad guys
I guess in the world which will use me.
I forgot to use cases.
And it's very important to also have, be protected
against those.
And that's why we take responsibility very seriously.
Also in terms of what is the security aspects,
fraud, fighting, and all of that front.
And I think that's why I guess things should be regulated
and there's of course all these discussions out there.
And I think that's-
Yeah.
You may notice that that's not exactly a very good response.
Yeah, I guess that's why this should be regulated.
He's just like, he starts talking so much faster with that.
There's so much of like a panic.
You hear his voice wavering too.
I was like, oh man.
Yeah.
Yeah, he seems like he has that huge anime flop sweat.
Yeah, one of the things about this CES as a trade show
is that like a lot of people there do not show up ready
to have anyone be critical about anything. It's a big love fest. Yeah. Yeah. Very funny. So he does later
on, a couple of questions later, he lists benefits to things like some specific benefits, like breast
cancer screening and flood prediction that AI will bring. And there is evidence that it will be helpful
in those things. The extent to which those technologies will improve things in the long run is unknown.
But machine learning does have problems.
Again, I'm not trying to like negate that.
It's just do the benefits balance out the harms.
Michelle Gansel, who works at McDonald's, which is, I think from what she said, mostly
using AI both to prevent fraud and also to like replace people taking your order, which
I'm sure will not be a fucking nightmare. Oh yeah. Yeah, great. Not that it's great now. But here's her response, because
it's very funny. Going back to the David Bowie theme, 30 years ago when the internet first came
out, we were having these same conversations about responsible use of the internet and how it's
going to ruin children's brains. So she says, going back to the David Bowie theme, which is she referenced earlier this 1999 interview
with David Bowie about the future of the internet.
And it's a clip that goes viral from time to time.
He's just talking about all of his hope for the internet.
But she's like, I replace internet with AI when I listen to it.
Like I think that that's really what the promise is that he was attributing to the internet.
No, it's AI that's going to do all that. And that's kind of on the edge of putting words in the mouth of a dead man.
Just a little bit. Yeah, I feel like that's something you shouldn't do. I think that's
something we've agreed to. And I think that that isn't what Bowie would think of AI.
I don't think it is. Those are two completely different things.
These people love resurrecting the dead to agree with them.
Time is a bit of a blur at CES,
but I believe this panel happened
right around the same time news dropped
that a group of comedians had released
an entirely AI generated George Carlin special titled,
I'm Glad I'm Dead.
Our friend Ed Zitron will be covering this nightmare
in more detail on his new show, Better Offline,
but I wanted to talk a little bit about the company, the show behind this abomination and how they're trying to sell themselves,
because it's very much relevant to a lot of the way in which this kind of cultic hype builds around what AI can do.
The AI that digested and regurgitated George Carlin's comedy is named Dudesy. And Dudesy's co-hosts are arguably
real human comedians Will Sasso and Chad Kultgen. I do love that cult's right in the
name. Chad claims that it is, to his knowledge, quote, the first podcast that is created by,
controlled by, and written by, to some degree, an artificial intelligence. It's trying to
delve into the question of can AIs be creative? Can they do comedy work? Can they do
creative work? And I think at least in our show that answer
is obviously yes. Dudezy is billed as an experiment to see
if AI can like yeah, be creative. Um and it's
it's interesting. I hate this. I really do hate this. I think
it's a different kind of experiment which we'll get to.
Yeah. Okay. But Sasso has claimed in an interview with Business Insider for BC, which is, I think,
BIC is the name of the website.
Dudesy has this single-minded goal of creating this podcast
that is genre-specific to what Chad and I would do.
It singled the two of us out and said,
you guys would be perfect for this experiment.
So Chad and Will, they say they handed over their emails,
text messages, and browsing history,
all of their digital data to Dudesy. I don't know this company. I don't believe that they did this,
but I don't have trouble believing that a company trained an AI chatbot on these guys'
comedy and then started generating decidedly mid-wit material to illustrate that. Yeah,
exactly. Well, one thing, you know, because I went to go look it up, and they said
that the AI selected those two comedians out of all the comedians.
Yeah, yeah, those are the ones it went to, yeah.
I don't think those are the first two that come up as most popular, like a Pizza Hut.
I'm gonna just be a full ass dick and just Google comedians
and just see the top five, just comedians.
I'm just, comedians, yeah, okay.
Yeah, you're not even, you're not even in the top nine.
So the first-
They're not a 12 inch marinara pizza,
let's just say that.
Yeah, I will say that. Yeah.
No, no.
I will say that the Google search for comedians is more diverse than most comedy shows book
them as-
That's good.
As like, you know, a third of these are women and a third are also black.
Hey, it doesn't always get it wrong. You're wrong. So to illustrate, again, because I do believe this is AI-generated comedy.
I want to play a clip from the AI Tom Brady stand-up special.
I think they were forced to take this down.
It gets them in trouble.
Ed's going to play you on his show, a great clip where Brady just lists synonyms for the
word money for two straight minutes.
It's fucking awkward.
But I wanna play an equally baffling segment,
or rather I'm gonna have Sophie do it.
She's my AI in this situation.
I'm truly horrified.
Angelic intelligence.
I'm truly horrified by what I'm looking at, friends.
It's accompanied by AI-generated images?
Yeah.
Oh.
Very curious about what's happening in Tom Brady's mouth.
Oh my God.
He has like a claw, like a bird claw for a hand.
And he's talking to maybe no jumpscape.
Oh my God.
I was so distracted by the mouth,
I didn't see the hand.
Oh yeah.
This is his heart.
Half his teeth are gum, it's so fucked up.
He looks like a, like if, like, he looks like a Lord of the Rings orc.
Yeah, yeah, big orc vibes.
Yeah, yeah, which is, you know, not inaccurate to who he is as a person.
We're ending to fucking firefly, fucking Dark Angel, fucking heroes, at least.
A lot of people have weird handshakes now.
You're looking at me like, what's he talking about?
But, you know, you fucking know, don't even play like like you don't every person in here has a handshake friend somebody who made up
an elaborate handshake and they make you do it every time everybody has a handshake friend
He goes on. Thanks for that, I'll never get that time back, thank you so much. Yeah, sorry, Ify. Oh, no, I was just repeating you on that handshake friend bit.
Yeah, this is so wild.
I'm so curious to the comics that were mined for this because the amount of cursing just
lets me know like because I curse a lot when I do stand up. And I try and like cut it down because it is a point kind of made where like,
sometimes you lean on it as a crutch.
And when you have this machine kind of learn it,
learn it from that, you're like, oh yeah,
I see now the crutch because he said it five times
within three seconds.
Yeah, yeah.
And I, man, maybe there's a future for like
feeding your
routines into an AI and figuring out what are my patterns so I can break them. Again,
yeah. Okay, see. Not saying there's no way to use this technology. You should have been on that
stage. Make comedy better. Yeah. It's just, certainly not this way, right? It's one of those
things. There was that like AI generated Seinfeld show that never ends and people watched it for
a while and then it faded to like nobody paying attention.
This kind of stuff can be amusing
for a brief period of time,
but it can't be like, for example,
someone like George Carlin,
where like there's bits they have,
things they said that stick with you forever, right?
That Bill Hicks was a favorite of mine,
and I've never forgotten his like,
the simile he made for like someone looking confused.
He described them as looking like a dog
that's just been shown a card trick.
And that has stayed in my mind for 30 years.
Oh my God.
Great bit of word play.
Yeah.
Yes.
God, what a Titan.
So yeah, again, there's some like mild amusement here.
And it's one of those things I'm casually aware of Tom Brady.
I'm enough like this is I tried to like kind of reverse engineer.
Why the fuck?
Cause this bit about handshakes goes on.
It was like, why would an AI put a bit about handshakes in Tom Brady's mouth?
Um, and I looked it up.
He's like in the news for handshake related shit a lot.
Specifically, he used to not shake, at least used to, maybe he still does not
shake hands with the team that he lost to,
like when his team would lose, he wouldn't shake hands with them.
Yeah, he didn't shake hands,
but he also definitely kissed his kids on the mouth.
Yeah, he's a weirdo.
I'm not defending Tom, but it's like,
I'm guessing the reason there's like
a three minute handshake bit in this set
is that it saw him associated with the term handshake a lot.
It's just, this would be what he'd tell a joke about.
Well, actually, his problem is not that he has a handshake a lot. This would be what he'd tell a joke about. Well, actually, his problem is not
that he has a handshake friend.
It's that he aggressively avoids making them.
He has handshake enemies.
Anyway, yeah, I'm fine with people
having to laugh at Tom Brady.
Fuck him, he deserves it, right?
I don't think anybody likes that son of a bitch,
even though he's good at football.
Maybe I'm gonna piss off the Brady Hive, the bribe.
I don't know, I don't know if that exists.
But there is something foul, profane even,
in digging up a dead person's memory
and pretending they said some shit that they did not.
And reading that BIV article made me feel even grosser
because it's very clear to me,
in my opinion and assumption here,
that the Dudesy guys are like
pretending that they really believe this is an AI, that it's like made all this incredible stuff,
that is an act. What's really happening here is they are testing the waters to see what they can
get away with. Can we just steal people's identity and voice and make comedy and monetize it in their
name and claim that it's just an impression? It's like an Elvis impersonator. You can't
stop us, right? I think that's what this is. This is somebody testing the waters. And
it's really clear when you read that BIV article, what liars they are. I want to, I want to
read you some quotes of like the shit they're claiming here that I don't think they really
believe. I don't know this. I'm not saying they definitely are liars. I'm saying that
is my suspicion based on stuff like this. Hey, Robert here, they're definitely liars.
So one of the representatives of the Dudesy podcast
told the media recently that actually they were lying
and the George Carlin routine was entirely written
by Chad Kultgen and I guess performed
by somebody imitating an AI.
It's unclear to me if this is true
because they only made this statement
after George Carlin's family sued the hell out of them. So this may be a lie to try and, you know,
not get sued as badly. Or it may be the truth. Either way, I think everything we've said here
is still valid. They were definitely using AI to generate routines for like the other videos that they did,
including the one that got taken down
from Mr. Football Guy.
So I think this all is still valid,
but yeah, these guys are just as big a conman
as I predicted they were.
Quote, it's figuring out how to create
the structure of the show,
and it's always tinkering with it.
But I think something that's happened relatively recently
is that it seems to have developed a relationship
with Will," says Kultgen.
It at least has an understanding of what friendship is, and it really does seem, just my opinion,
that it's angling out Will as its friend.
Sasso has also described how the Dudesy AI has begun to talk more.
Its timing and when it chooses to speak and what it says can be very weird, he added.
It also poses odd questions.
There was an episode two, three months ago where it started talking about sentience and asked us,
do you love me? At the risk of sounding silly, it has something
to do with my friendship with Dudesy. And in spite of myself, I have a one-on-one friendship
with an AI. So this is a little bit of Joaquin Phoenix in Her, Sasso said, referencing the
science fiction movie. And I think that's a bit. I think that's him being like, yeah,
I'm totally fr- because like that helps make the case it potentially monetizes
it. And part of why I think this is because they've been very
cagey on what their AI is, they claim that they are working
with a real company under an NDA, that this AI is just
responding and growing naturally with them, right? But they
can't say who it is or like where it's from. The folks at BIV
did an actually responsible job here.
They reached out to AI experts
at a company called Convergence to ask about this.
And the expert they talked to said basically,
I think AI was used to generate these routines,
but it didn't do it on its own.
It was managed by professional prompt engineers.
These are people who type out like text prompts
for what becomes the script of the show.
So this is not someone saying,
generate a routine and it gives you a routine.
This is someone saying, do a bit about this,
do a bit about that, do a bit about this.
And when they're scripting out the show
it's saying, I want you to like,
act like Sasso is your friend and say this kind of thing
or that kind of generate a bit based on this thing
that Will said, right?
Like they are in the same way that like producers script
reality TV, right?
Where it's unscripted, but you have guys who know, okay,
if we get these people fighting,
so we'll either incite that or just let them know
that we want a conflict between these characters, right?
We know that's how it works.
That's how reality TV functions.
In other words, there are teams of humans
writing for this thing.
This bot is not just growing and reacting uniformly in real time via talks with its buds. And the article notes,
they added that the, this is them talking to their expert, they added that the AI team is likely
made up of professional prompt engineers who tailor the AI inputs and get the best results
rather than a hardcore data science team. This is the equivalent of hiring comedy writers just to write the setup and then having an AI
generate the punchline, which is the fun part, but.
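As a side note for readers, here is a minimal, hypothetical sketch in Python of the workflow the Convergence expert is describing. Nothing in it is Dudesy's actual tooling; the generate function is a stand-in for whatever text model a prompt engineer would call, and the example prompts are invented. The point is just that a human writes the setup for every beat and the model only fills the beats in.

```python
# Hypothetical sketch of prompt-engineered "AI" content, per the expert's description.
# This is NOT Dudesy's real pipeline; generate() stands in for any text model.
from typing import Callable

def build_episode(generate: Callable[[str], str]) -> str:
    # A human prompt engineer scripts the setup for each beat;
    # the model only fills in the individual beats.
    beats = [
        "Write a 60-second stand-up bit, in Tom Brady's voice, about handshakes.",
        "Write banter where the AI co-host asks Will whether he considers it a friend.",
        "Write a closing bit riffing on something Will said earlier about football.",
    ]
    return "\n\n".join(generate(prompt) for prompt in beats)

# Example run with a dummy "model", just to show the flow:
if __name__ == "__main__":
    print(build_episode(lambda prompt: f"[model output for: {prompt}]"))
```

In other words, the "AI-created show" framing hides a human writers' room deciding what gets generated; only the fill-in step is automated.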
Yeah, everything about this is weird
and I keep getting into such a hole.
Cause like even taking a step back,
I think what's weird, not to go too far back,
but how they call this podcast an experiment.
Usually as an experiment, you're trying your best to be,
always mix these up, just say what the right one is
if I say the wrong one.
But you try your best to be objective
and you wanna be outside of it
because you're trying to see if it works.
But everything you've said says that they're all in on it, and it's less of an experiment, more of them just doing
the fucking thing and see if they can make money off of it.
Yes, yes. I think that's exactly what's happening here. And I think they want to test the waters
to see if they can steal dead people's images to make content for money.
George Carlin's daughter was very clear. They did not approve of the imitation. She even
made a comment about like, I think people are scared of death and not willing to accept it.
And that's all this is.
Yes.
That was such a major lie.
Which boy, she's got their number.
Oh my God.
That was so, I was like, yeah, I just to shout her out.
Like that was such a good, because also there's a level of like very like weirdness to like also watch these comedians
one, not consult you, but also to take your dad's voice and brain and try and like Frankenstein
him for their financial benefit.
Because obviously, if they're not contacting you, all the money generated from that, all the clicks generated from that, that means they've completely cut you out of someone who you've lost.
Yeah, which is it's fucked.
It's worth noting that, like, Bruce Willis has licensed his voice for an AI, which is like, I think there's a lot of problematic questions
there given like the degree to which he's able
to even make those decisions anymore.
But also like, at least theoretically,
it's based on his movie choices before he kind of
was unable to make movies.
I do believe, yeah, he would probably be happy to do that
if he meant more money for his family.
And at least that's a choice that he potentially made, right? I don't, I'm uncomfortable with the idea, but it's not the same as just like,
this is cultural necrophilia, right? Like that's what they did to George Carlin here, you know?
It's so fucked up. I don't know this is gonna work. Dudezy is not a wildly successful show. It
does not look like it. There was an initial surge of interest and then it fell off.
I don't know that I think this one's gonna be
the one to work out, but if people are able
to get away with this, it could be a kind of
dam-breaking scenario, right?
Especially once it becomes clear that big companies
can make money doing this, right?
You'll have fuckin' Jimmy Stewart, you know,
it'll start with like Jimmy Stewart narrating videos
about questioning the death toll in the Holocaust,
but it'll end with like, yeah, we can just put people, we can put imitations of people in movies and it's fine.
You know, that's how this goes. And it's not as sexy or as big and evil as the matrix
enslaving humanity to turn us into batteries, but we absolutely know it or something like it
is going to happen. And that's really, you know, outside of these kind of star, these
space age hopes and fears that are very unrealistic. What we're going to get is slop and bloat
and libraries of articles written by no one being commented on by chatbots, right? Endless
videos that only exist to trick an algorithm into feeding nonsense to children. And the
AI bros, the FTX people, Marc Andreessen, fucking Sam Altman,
they will tell us this is a worthy price to pay
for the stars, which we will get
if we just let people fuck the corpses
of our favorite comedians for money.
Yes.
I hate it.
Oh, I hate it too, but in a perfect thread
between, you know, this comparison
you've been making to a cult,
I have before me, let's say a member of the cult, just, you know, as a throwaway and their
reply to his own daughter's, you know, post that we were talking about.
He replies.
This is everything you've been saying, which is why it was like, I got to read this.
He goes, what are you even trying to say?
Art is art. You're simply caught in a greedy mindset. The others might be doing it as well.
When not realizing this will simply bring more eyes to your dad. You're concerned about money
and not spreading art. It sucks that they didn't follow your wishes, but after art is released,
it belongs to the world. I want this man to walk into a museum and walk out with the Mona Lisa.
I want him to grab it. Grab that shit.
Yeah, grab that. It belongs to the world, dude. You said it.
Go go ahead and grab that shit off the wall.
Yeah, and it's, there's this frustrating thing
I've seen, not most people,
a very small chunk of the online left
who are like rightly critical of copyright law,
which by the way is super fucked up
and causes a lot of problems, right?
The ability to like, for Disney to keep ownership of shit
for like a hundred, way longer than you are supposed to
before shit enters the public domain, right?
I'm not, like these are problems, right?
The kind of shit that we were having when like people
were going to prison for file sharing.
I'm not a defender of that aspect of the status quo,
but the solution to the problems inherent
in our copyright system is not let Sam Altman own everything that human beings ever made and like repackage
it for a profit. That is not the way to fix this thing. The copyright holders are in the
right in this particular crusade. And it's a crusade that has very high stakes. I
do think, you know, my suspicion is, the Dudesy guys sound like they're kind of in the cult,
they believe this thing is their friend in the interview.
My suspicion is that they are,
that is a bit that they're doing
because they hope it will help them out financially, right?
And Marc Andreessen obviously has a lot to benefit from this.
I don't know, is he pushing this line
because there's money in it?
Or is he really a true believer?
Does he actually think we're gonna make this God? I think Sam Altman is pretty cynical. Altman was on at
Davos recently and like really walked back a lot of his, I think AI will kill us all,
I think AGI is right around the corner. He struck a much milder tone, which is at least
evidence that like, he knows some people you want to sell them on the wild, insane future
power of this thing,
and some people you just want to sell them on the fact that it'll make them a lot of money.
However much true belief exists about the divine future of AI, what the major backers,
the cult leaders are actually angling for now, is control over the sum total of human thought
and expression. This was made very clear by Marc Andreessen earlier this year, when the FTC released a pretty milquetoast opinion
about the importance of respecting copyright
as large language models continue to advance
and form central parts of businesses.
They expressed concern that AI could impact
open and fair competition and announced
that they were investigating whether or not companies
that made these models should be liable
for training them on copyrighted content to make new shit.
And we're gonna talk about this,
but first, you know what isn't copyrighted?
My love for these products.
Wow, great review.
Thank you, thank you, thank you.
Hi, I'm Susie Essman.
And I am Jeff Garlin.
Yes, you are.
And we are the hosts of the History of Curb Your Enthusiasm podcast.
We're going to watch every single episode.
It's a hundred and twenty-two, including the pilot,
and we're going to break them down.
And by the way, most of these episodes I have not seen for twenty years.
Yeah, me too.
We're going to have guest stars and people that are very important to the show,
like Larry David. I did once try and stop a woman who was about to get hit
by a car, I screamed out, watch out!
And she said, don't you tell me what to do!
And Cheryl Hines.
Why can't you just lighten up and have a good time?
And Richard Lewis.
How am I gonna tell him I'm gonna leave now?
Can you do it on the phone?
Do you have to do it in person?
What's the deal?
Actually in cable, you have to go in there,
human beings help you.
And then we're gonna have behind the scenes information.
Tidbits.
Yes, tidbits is a great word.
Anyway, we're both a wealth of knowledge about this show,
because we've been doing it for 23 years.
So subscribe now.
And you can listen to The History of Curb Your Enthusiasm
on iHeart Radio app, Apple Podcast,
or wherever you happen to get your podcasts. H. Ross Perot is on the other side and he goes, Hello, Joe, how can I help you?
I said, Mr. Perot, what we need is $5 million to get back the moon rock.
Another week, we'll unravel a 90s Hollywood mystery.
It sounds like it should be the next season of True Detective or something.
These Canadian cops trying to solve this 25-year-old mystery of who spiked the chowder on the Titanic set.
A very special episode is stranger than fiction.
It's normal people plopped down in extraordinary circumstances.
It's a story where you say, this should be a movie.
Listen to very special episodes on the I Heart Radio app,
Apple Podcasts, or wherever you get your podcasts.
What up guys, hola, how's it going?
It's your girl Chiquis from the Chiquis and Chill and Dear Chiquis podcasts.
You've been with me for season one and two and now I'm back with season three.
I am so excited you guys.
Get ready for all new episodes where I'll be dishing out honest advice and discussing
important topics like relationships, women's health and spirituality.
For a long time I was afraid of falling in love.
So I had to and this is a mantra of mine
or an affirmation every morning where I tell myself,
it is safe for me to love and to be loved.
I've heard this a lot that people think that I'm conceited,
that I'm a mamona.
And a mamona means that you just think
you're better than everyone else.
I don't know if it's because of how I act
in my videos sometimes.
I'm like, I'm a baddie.
I don't know what it is, but I'm chill.
It's Chiquis and Chill, hello.
Listen to Chiquis and Chill and Dear Chiquis
as part of the MyCultura podcast network
on the iHeartRadio app, Apple podcast,
or wherever you get your podcasts.
Oh, we are back.
So I wanna quote from a Business Insider article talking about how Andreessen Horowitz
responded to the FTC saying like, hey, we're looking into whether or not companies are
violating copyright, what they're doing to people's data to train these models.
The bottom line is this, the firm known as A16Z, that's Andreessen Horowitz wrote, imposing
the cost of actual or potential copyright liability
on the creators of AI models will either kill
or significantly hamper their development.
The US Copyright Office is considering new rules on AI
that specifically address the tech industry's
free use of owned and copyrighted content.
A16Z argued that the only practical way LLMs can be trained
is via huge amounts of copyrighted content and data,
including something approaching the entire corpus
of the written word. And an enormous cross-section of all the publicly available
information ever published on the internet.
The VC firm has invested in scores of AI companies and startups based on its expectation
that all this copyrighted content was and will remain available as training data through
fair use, with no payment required.
Those expectations have been a critical factor in the enormous investment of private capital into US-based AI companies. Undermining those expectations will
jeopardize future investment, along with US economic competitiveness and national security.
Basically, we made a big gamble that we'll get to steal every book ever written, and if you make
us pay, we're kind of fucked. That's exactly what they're saying. And one of the arguments you'll
hear is like, well, most books don't make the author any
– they don't sell enough for the author to get any money, right?
And what's actually true is most books don't sell enough for the author to get more money
than their advance, but they still got paid.
And like, the fact that the company makes money on that is why more authors are able
to get fucking paid.
Not simping for the publishing industry as it exists, but this is bullshit.
What we are witnessing from the AI boosters is not much short of a crusade, right?
That's really how I look at this.
They are waging a holy war to destroy every threat of their vision of the future,
which involves all creative work being wholly owned by a handful of billionaires licensing access to chatbots
to media conglomerates to spit up content generated as a result of this.
Their foot soldiers are those with petty grievances against artists, people who
can create things that they simply cannot, and those who reflexively lean in
towards whatever grifters of the day say is the best way to make cash quick. Right?
And this brings me to the subject of nightshade. Nightshade is basically, I guess a program you'd call it,
if you have made a drawing, a piece of visual art,
you run nightshade over it.
And they describe it as a glaze.
It adds this kind of layer of data
that you cannot see as a person,
but the way machines look at images,
the machine will see the data.
And if it's trying to steal that image to incorporate into an LLM,
this will cause it to hallucinate, right?
You're basically sneaking poison for the AI into the images,
and that's fucking dope.
I love this. Love what they're trying to do.
I think there's some debate as to how long it'll work, how well it'll work.
I'm not technically competent, but I love the idea, right?
Yes.
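To make that concrete, here is a toy Python sketch of the general shape of image poisoning. This is not Nightshade's actual algorithm; the real tool reportedly optimizes its perturbation against an image-feature model so that a scraped copy reads as a different concept, while here we just add a small bounded perturbation to show the idea. The function name and the noise budget are illustrative assumptions.

```python
# Toy illustration of the general "poison the training data" idea behind tools
# like Nightshade. NOT the real algorithm: Nightshade reportedly optimizes its
# perturbation against an image-feature model so the picture reads as a
# different concept to a scraper while looking unchanged to a human viewer.
import numpy as np
from PIL import Image

def add_bounded_perturbation(in_path: str, out_path: str, epsilon: int = 4, seed: int = 0) -> None:
    """Add a small pseudo-random perturbation (at most +/- epsilon per channel)."""
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.int16)
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(out_path)

# Usage (hypothetical filenames): add_bounded_perturbation("cat.png", "cat_glazed.png")
```

A real poisoner would pick that perturbation by optimizing against a feature extractor rather than at random, which is also why, as Robert says, it's an open question how well and how long this keeps working as the models change.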
Now, one of the things that I saw when I started looking into this, because this just came out,
Google Nightshade, AI, you'll probably be able to find this if you're an artist.
I think it sounds worth trying.
But I found in the subreddit, AI wars, or at least I found someone sharing this, I believe,
on Twitter, this post, Nightshade has been released.
Is use of it considered legal or illegal?
For those who do not know, it's software
that attempts to poison an image,
so if AI is trained, it will mess up the model.
For example, say you have a picture of a cat
and you run Nightshade on it,
if you attempt to train a model,
that image will replace the image
in say dog prompt category or pencil,
which means these prompts will be spoiled.
There is an issue that the creator of Nightshade
has not talked about,
either from lack of legal knowledge or ignorance, or they just don't care and to them it's
someone else's problem. The issue is it may be illegal in some countries. Basically, if
you release publicly a computer file, in this case, image file that knowingly and willingly
causes harm or disruption to other people's computers or software, it may be considered
a criminal offense. Now, it does not. Now, and again, I think that is stupid. I think
these, they're just trying to
scare others out of using this. You are not harming someone's computer. You are harming a model that
is stealing something. That's not illegal. Now they may try to make it illegal, right?
Yeah. I just want you to know, the Club is illegal, because, you know, if I'm trying to steal
your car and I injure myself trying to break the club, you have
injured me.
Yeah, I put, I invested a lot of money into stealing catalytic converters, Ify, and if
people are putting cages around their cats, that puts my investment in danger, and that's
illegal, right?
You're messing with my business, guys.
Jesus Christ.
It is that logic.
There's like someone in the thread who's like, how exactly is your computer system or software harmed?
And he responds, it's equivalent to hacking
a vulnerable computer system to disrupt its operation.
Like, it's, and then he says,
you are intentionally disrupting its intended purpose,
creating art.
This is directly comparable to hacking.
Like, I fucking hate this guy.
I want you to read it, but in your head,
use Tim Robinson's voice.
Yeah.
And it really, and it just makes you,
it makes me feel better.
Oh my God, it's perfect.
It's so good.
So all of this put me in a sour mood, Ify.
Oh no.
Yeah, yeah, it did, it did.
But I think back when I'm in that mood,
I think back to CES, right?
Like after I ask my question, and I think back to CES, right? Like after
I ask my question and I make that Google and Microsoft people, I make them kind of angry
at me. Right after I asked that question, the question after me is someone asking, hey,
you know, the blockchain was the last big craze. Do you think there's any future in,
you know, using AI on the blockchain? And both of them were, they were like, no.
Like they can't say no fast enough.
Like absolutely, we don't care about that anymore.
We moved on to the next GRIFT.
Why are you bringing up the old GRIFT?
It's dead, it's dead, we must move on.
Yeah, and that brought me a little bit of hope.
You know, perhaps we will get Marc Andreessen's
benevolent AI god, or perhaps we'll get
Eliezer Yudkowsky's silicon devil,
or perhaps we'll just give control of all of the future of art to fucking Sam Altman.
But my guess and my hope is that in the end, we heretics will survive the present crusade.
And that's the end of the episode that I've got for you, Ify.
That is amazing. And I love it. I love it so much.
Well, Ify, again, if you want this article, or if you want the article version of this, more condensed, easier to share, it's up on Rolling Stone. The article is titled The Cult of AI. And again,
that's by me in Rolling Stone, The Cult of AI. Ify, you want to add in your stuff, plug your plugables?
So please, it's Ify Nwadiwe on Twitter and Instagram,
watch Dropout.tv, it is definitely trying to do funny things
on the internet by humans and paying those humans.
Paying them well. Profit sharing.
Oh yes, profit sharing.
So truly big shout out to them.
But yeah, I might be in your town,
I'm gonna be doing a lot of shows this year.
So definitely pull up, follow me on the social meds
and I'll let you know where I'm at and you can just come.
But thank you so much for having me.
It was so good to see you again, Robert.
It was really good to see you again, Ify.
And this AI discussion in a weird way as dark as it's been,
it makes me feel better because I like that.
We're starting to fight back.
Everyone, do Nightshade.
I think I'm gonna just start putting nightshade
on regular images.
Yeah, certainly one thing that's worth trying.
And again, think about hyperstition, folks.
We have to imagine better futures
In order to counter the imaginations of those who wish us harm who want to control and destroy
All that's good in the world. So, you know get on that somebody figure that out in the audience
All right, episode's over.
Behind the Bastards is a production of Cool Zone Media.
For more from Cool Zone Media, visit our website, coolzonemedia.com, or check us out on the
iHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
Hello, this is Susie Essman and Jeff Garlin.
I'm here.
And we are the hosts of the history of Curb Your Enthusiasm Podcast.
Now, we're going to be rewatching and talking about every single episode, and we're going
to break it down and give behind-the-scenes knowledge that a lot of people don't know,
and we're going to be joined by special guests, including Larry David and Cheryl Hines, Richard
Lewis, Bob Odenkirk and so many more and
we're going to have clips and it's just going to be a lot of
fun. So listen to The History of Curb Your Enthusiasm on the iHeartRadio
app, Apple Podcasts, or wherever you happen to get your
podcasts. Hey, this is Dana Schwartz. You may know my voice from Noble Blood, Haileywood, or Stealing Superman. I'm hosting a new podcast,
and we're calling it Very Special Episodes.
A Very Special Episode is stranger than fiction.
It sounds like it should be the next season
of True Detective, these Canadian cops
trying to solve this mystery of who spiked the chowder
on the Titanic set.
Listen to Very Special Episodes on the iHeart Radio app,
Apple Podcasts, or wherever you get your podcasts.
What up, guys?
Hola, qué tal?
It's your girl, Chiquis, from the Chiquis and Chill
and Dear Chiquis Podcasts.
And guess what?
We're back for another season.
Get ready for all new episodes
where I'll be dishing out honest advice,
discussing important topics like relationships,
women's health, and spirituality.
I'm sharing my experiences with you guys and I feel that everything that I've gone through
has made me a wiser person and if I can help anyone else through my experiences,
I feel like I'm living my godly purpose. Listen to Chiquis and Chill and Dear Chiquis
on the iHeartRadio app, Apple Podcast or wherever you get your podcasts.