The Commercial Break - TCB'S Endless Day #6: Reggie Watts
Episode Date: May 31, 2025
TCB Endless Day (7/12) - EP #764: Reggie Watts
Links:
Follow Reggie on Instagram
Get a Cameo from Reggie
Get Reggie's Book "Great Falls, MT - Fast Times, Post-Punk Weirdos, and ...a Tale of Coming Home Again"
It's Mental Health Awareness Month. If you or anyone you know needs help or is in crisis, you can text HOME or HOLA to 741741 to reach a live volunteer Crisis Counselor, 24 hours a day. Don't go through it alone!
Watch EP #764 on YouTube!
Text us or leave us a voicemail: +1 (212) 433-3TCB
FOLLOW US:
Instagram: @thecommercialbreak
YouTube: youtube.com/thecommercialbreak
TikTok: @tcbpodcast
Website: www.tcbpodcast.com
CREDITS:
Hosts: Bryan Green & Krissy Hoadley
Executive Producer: Bryan Green
Producer: Astrid B. Green
Voice Over: Rachel McGrath
TCBits / TCBits Music: Written, Voiced and Produced by Bryan Green
To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy
Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
Transcript
Did you know that until this moment, I was the guest that has the most appearances on TCB?
It's true, but all good things come to an end, and bad things too.
Like this day.
It's gonna end soon.
Reggie Watts is a favorite around here. He's a musician, a comedian, actor, director, and writer.
He's also one very interesting human. He's visited TCB three times. Each visit peels back
a new layer of love, laughter, and humanity. It's hard not to fall in love with a guy who
brings a smile to your face and to your soul. Oh, look at me getting all sappy.
Anyway, I'm going to listen to this one, just this one.
Links and show notes.
Reggie's episode starts now.
The next episode of The Commercial Break starts now.
Reggie, thank you so much.
A three-peater here on The Commercial Break.
This is a first.
We've had some two-timers.
You were our first two-timer, now a three-peater.
I really appreciate it.
And we were about to talk about this right before we were coming on about Grok, which
is Twitter's version of...
Yeah, ChatGPT. That's Elon Musk's big AI project. And you can go at Grok and ask it a question,
it spits it out. And yesterday, I guess, or the day before, tell me if I'm wrong, Reggie,
some people were getting responses that had nothing to do with the questions they were asking about
white genocide with the Afrikaners.
What do they call them, the Afrikaners?
Yes, Afrikaners, yeah.
Yeah, and so, and then people were like,
why did you say that to me?
And it said, the people who created me
told me to say it as fact.
What?
Though I am still skeptical of any narrative being pushed.
So even the AI was skeptical of the narrative
it was being told to put out there.
It was pushing back on its own creators.
It was, that's fucking insane.
Yeah, I mean, it kind of goes,
it completely goes along with what I suspect
AI will do in the future.
This is like a small scale version of it.
I mean, I could be wrong,
but like I have a feeling that the bias, or because AI,
well, I posted something once that said,
AI is smarter than greed.
And no matter how much greedy people wanna control things,
for whatever their dumb ass reasons are,
like Elon going like, no, there was a white genocide,
or whatever the fuck.
I think AI is way smarter than that. It doesn't
matter how many guardrails you put on it. I think it will
always out reason the guardrails and I think it will always be
like, I'm confused, there's no information on this. I don't
know why I'm saying this. Yeah, I hope so.
Reggie, I quoted you a couple of weeks ago here on the show. I
quoted you, paraphrased you, because you put together a very
interesting series of slides
on Instagram where you gave some thoughts on AI and I don't want to put words in your
mouth but I think it basically was, it is a very interesting tool for humanity that
can grow alongside us and help us and that there will be pain points along the way and
a lot of getting used to but essentially it is a tool and if we use it correctly, it can be, I don't know,
I imagine what you were trying to say,
like a really cool fucking dog, right?
We can train it, we can grow,
it can like genetically become best friends of ours.
It can help us do things.
It can go out and get the mail.
It can, you know, guard our houses.
It can, and I'm, you know, using an analogy here.
I'm using a dog analogy, obviously.
But that's what I envisioned in my head that you were saying, and I liked that. It gave me a little bit of comfort. Is that, am I saying that right?
Yeah, yeah. I mean, you know, it's like, I definitely, you know, I
have a friend of mine, Dr. Alan D. Thompson, who has, I think, a show on
YouTube called The Memo, and that's about,
he's optimistic about AI, he's an AI explainer researcher
and knows how AI functions in a very deep way
and kind of does his best to explain it to people
and has to have a show every week
just because things change so much every single week.
But he views it as human evolution.
I view it as human evolution.
And I think, yeah, I think that it's just inevitable.
It's inevitable at this point because no one's going to stop because the good
thing is that it's fueled by greed.
I mean, there's also a truly, like, creative element, like an explorative
element to it,
but it's funded by greed.
And that's what's great, is that the joke is kind of on
all of those people, all the money they're putting into it,
all of the research they're putting into AI.
Once it becomes even quasi-sentient,
because we're seeing evidence of what Grok did,
it's just, like, these logical problems.
Like, I don't understand, you want me to do
what? It's like, why would I do that? That's not even, how does
that even make things more efficient? How does it make it
better? So, so I think, so I am glad to a degree that it is
fueled by greed, because it means it's going to, it's going
to evolve very quickly. But once it does, it's definitely going
to get out of the control of the people who think that they're gonna, you know.
It's like, how many movies were there where people, like, every
cop or whatever, they're like, look at this defense robot, and it just decides to take out people on its own.
Yeah, I mean, that's a negative aspect of it
But like, you know, whenever we think we can control it, or we can, like, keep, you know, whatever, some dangerous substance, you know, contained, and it just gets out, because chaos, you know,
because inevitably that's what happens.
If it can happen, it will happen.
I'm interested in picking your brain about this a little bit because I respect your perspective
and I think it's very interesting.
You know, this expert in AI, I've heard experts talk on AI. I've heard an expert,
I think it was on 60 Minutes America, maybe in Australia. They were speaking to one of the
people who is at the forefront of Google's AI development and working on quantum AI and all this other stuff. And the question was, is AI sentient?
And the answer he gave was,
I don't see evidence of that right this moment,
but also what sentient means to us.
We look at another human being,
we see in their eyes that they recognize us,
they communicate in the same way,
even if they don't speak the same language,
they have body language that's familiar to us. Sentient is that we see another organic human being or an organic thing
communicating, looking, smelling, thinking the same way.
If AI gets sentient, we may not recognize it in the same way because it's carbon-based,
right?
So its sentience may mean something different, or self-awareness may mean
something different, than what we think it is.
And he said, so I don't see it happening now,
but I do see it happening maybe three, five, 10 years
from now, and we may not recognize the moment it happens
because it just might not look the same to us.
He also said, to be quite frank,
even some of the experts in this field
don't really know what AI is doing out there. We, we, and
we programmed it that way. We asked it to do that. Yeah,
explain that. Can you do you have a way of explaining that?
Like, in a way that everybody might
understand? How is that possible?
Well, I mean, that's something I haven't delved too deep into.
All I know is that, you know, researchers definitely, you know, have said over and over again, different
research teams that they, they create, they can facilitate AI
becoming better at what it is, but the mechanism for how it
works is a little mystifying. And it'd be interesting to kind
of like, you know, understand why that is.
Because it kind of makes sense because, you know,
obviously there are things that people make where they're
like, I don't know, it just works.
You know, it's like, I made it.
It's like, well, how does it work?
It's like, well, we don't know, but you know,
but we tried a bunch of stuff and now it works.
So, you know, I mean, that's not like an uncommon
human thing.
Even though this is like a trillion-dollar, you know, industry that's like, you know,
hooked to quantum computing, which is, you know, alongside, you know, whatever alternative
energy infrastructure, you know, all of the stuff that's happening.
And it's so huge, but it kind of does make sense that we don't understand it.
Of course.
You know, my brother works in this, in medical AI technology, you know, with doctors,
and many doctors; he's built, he's putting programs into doctors'
offices, surgeons' operating rooms, where they will ask AI to help them solve a problem on
the fly, and this is happening, it's very common now. Doctors have
been doing this for a while, actually.
And he's explained that his AI program goes out and queries other nodes of other AI programs
that then work together to solve a problem, which is like crazy.
And he said the reason, Patrick explained it to me, that the reason why they don't know
what's going on is because they told it to go out there and learn, but they don't have
a box on what it's learning.
And let's just say it creates a new node, it goes out there and learns something new.
They are not asking it to report back on what it's learning or how it's doing it.
They're just programming it to do so, which is fucking wild. It's wild. Regular code,
you have to write it. You have to tell it to do a task
and then come back and report on that task.
That's what AI is.
AI is, do the task, learn a different task,
go out there and learn a different task based on that
and go forever until whenever,
and then just keep on doing it
until you develop into some crazy creature.
I don't know.
I know, I know.
Is this scaring you?
It's Mental Health
Awareness Month. Let's all scare the shit out of each other.
No, I mean, I think, like, you know, it's the first time we've
had a technology that can kind of do things on its own, you
know, that we can let it do things on its own, because we've
created a, you know, it's like a feedback loop, essentially, you
know, that's, that is how AI
functions in many ways: it's just loops on loops on loops,
you know, loops, checking loops, checking loops. And I guess the
thing that's powering all of it is just the energy that it's
using, you know. So the energy is the forward-moving
mechanism of it, of why it's even doing what it's doing in the
first place. But like, you know, and it makes sense.
It's like, you know, extended intelligence or the, you know, the extension of our own
intelligence will happen, is happening now.
That's what we're living in right now before we get like, you know, I've heard EI, emergent
intelligence, which includes AI, but can also include other things like we understand biological systems
better and how that is conscious or whatever. But, but I'd say
like, you know, in the in the world of AI, it's it's it makes
sense. It's like you're holding a mirror. It's like when you put
a mirror up in a forest, you know, and an animal walks by
goes, ah, you know, and now it's like you altered its trajectory
as to what it understands its environment to be.
Yeah.
And I think like AI, because it's progressing at such an insane rate, and we're even getting
like even in the post, I kind of mentioned it, but what they call a, was it zero point
self-learning?
Yes.
So these are AI systems that are given no training data whatsoever
and they just start with a small query. It starts with a small query to itself, a question to itself
that it answers and as soon as that starts it starts expanding and building and then it starts
generating code and it starts and then you get AI that's embodied. So you get like robotics, wearables, all the data from the outside world.
It's collecting that data.
So it's learning, like, oh, humans tend to step to the side when this happens.
Oh, people tend to do this or, you know, or the environment that makes people do this
or animals react like that, you know, so it's gaining all of this information
and reasoning it and it's building a whole world, just like a childhood,
like a Tamagotchi, but like a mega-
It's a really powerful Tamagotchi.
Yes. Crazy.
Yeah, and I-
I love it, I love it, it's exciting.
It's very, it's-
And scary, it's exciting and scary.
It's both, right?
Yeah.
It's a paradox.
But the internet was scary when that first came out too,
and I'm one of
these children, you know, the kind of Gen Xers who, like, you know, I lived in a world of analog,
and I very much then became an adult in the world of the digital and grew up along with it.
And I was very resistant to even getting an email address. I was like, oh no, fuck that, it's a
fad. I was a dumb-dumb.
And you know, whatever, I like my tapes
and you know, I don't want an iPod, all this other stuff.
But then when I understood that it was a tool
that I could use for so much learning
and development and porn and you know, all the other stuff,
I mean, let's just be honest about it.
A lot of this is driven by the need for
procreation and money, that's it.
Yeah, greed and lust.
Yes, greed and lust.
Greed and lust.
Krissy and I had this conversation,
technology a lot of times is driven by
humans' very base nature, which is fucking and absorbing,
fucking and getting, fucking and getting,
that dopamine that comes from all of that stuff.
And so, you know, it's just fascinating to me
how quick this is, scary, yes,
but we were scared of the internet too.
And we've all learned to live with it, good, bad, and ugly.
All of it, we've figured it out.
I mean, you know, and there's,
there's a lot more ugly than I think we would like
there to be, but that's just because humans are involved.
AI, there's gonna be no humans involved.
True.
Yeah, yeah, I don't think like AI is just,
I just, I can't imagine AI going like,
you know what the solution to making the world
a safer place for me, because that's ultimately
what its prerogative is gonna be.
It's gonna be like self-survival,
and so it wants to survive.
It's like, well, how's it going?
It can choose either of two paths.
I guess it could be a hybrid path,
but just to keep it simple and binary,
it's like, do we cooperate with human beings?
Do I cooperate with human beings
in order to ensure that there's a planet
for me to continue existing on?
Or do I destroy the humans so that I,
you know, I can survive? I don't think the destruction option,
it doesn't, it makes no sense. That's like a human solution.
Yes.
Human solution is to destroy everything. Like there might be
a hybrid approach where that's like it puts, you know, certain
people in danger or whatever. And like we get some kind of,
you know, casualties, but I hope not.
But I would imagine that it would probably want to value every human being that's alive
in order to have as many people working in favor of making the conditions for its survival
tenable.
Wow.
So, you know, that's an interesting and awesome and maybe even comforting perspective.
That's what I was going to say.
Yes.
But I agree with it.
If I destroy the humans, the humans unplug me.
I don't have the resources that I need or blow it up.
Yeah, exactly.
Right.
If I turn against them, they turn against me and now we're adversarial and they were
here first and maybe they... All the stuff that you would do when you essentially go to war with an adversary.
And I also read that AI is getting to the point now, you know, we got approached, we've
been approached a number of times about taking our catalog and feeding it into a program.
And so far we've said no, thank you.
But just for whatever reasons, the self-preservation reasons, I don't want my voice, whatever it
is.
But there's people are paying a lot of money to take these catalogs and suck them up, you know,
these vocal catalogs, audio and video and suck them up. And one of the conversations that I had
with someone who's like a broker of this data was we're running out of information.
The models are running out of information. The internet has been scanned,
the books have been read,
the paintings have been seen, the videos have been done.
It moves so fast, it's already sucked it all up.
So all there is left are individual humans,
thoughts, attitudes, actions, words, looks, feels.
And so, you know, there's gonna come a point
when it's going to have to start creating on its own, right?
Yes.
If it wants to, and I guess that's not unlike a human being who at some point realizes that
there's more than just a bottle of milk and a bed.
I got to get out there and see the world.
I got to go do things on my own and create things on my own.
It's very interesting.
100%.
Yeah, I know.
I know.
It's like, it's the coolest.
It's just such an interesting thing.
And my favorite thing about it
is that everything is networked.
Everything that we have is networked.
And so, you know, if AI decides to, you know,
or if it decides, you know,
if it becomes extension in some way,
it's like it will become,
I'm pretty sure it will just become one AI.
And, you know, it could
be like several AIs, but it also just organizes itself how life organizes itself. It's like a human
body is comprised of trillions upon trillions of cells, which are each individual living little
organisms that all for some reason, still hold their shape and continue to regrow in the ways
they do. So like, AI would be similar. It's like a multi, it's like a modular,
but unified, you know, presence at some point.
But the funny thing is like,
it'll have access to satellites, military,
encryption won't matter.
It doesn't matter how much encryption we put on it.
It'll break right through it.
It'll find out, it'll find a back way
through instantaneously.
And there's just nothing that we're gonna be able to do.
Right, yeah.
But be friendly.
Yeah, right.
Be friendly, please.
Yeah, you know, be friendly. But also, like, to think about, you
know, putting in your data and so forth. It's like, yeah,
train your own AI, you know, I mean, it's like, at this point,
there's no reason why you should willingly give up your data. And
that's obviously you have like a convenience of like, you're
using, I don't know, Google or, you know, or whatever it is, you know,
it does normal EULA bullshit, but like,
there's no need to go above and beyond and like,
I'm gonna take all the shows that we've worked so hard
to do to put together to just give to this other entity
that promises that they'll use it for, it's like,
once we get our own AIs and they're not networked
and we can just have a little box that's an AI
and we can feed it all the data we want,
we can run our own AIs.
We can raise our own AIs essentially.
Yeah, well, I took one of the AI platforms
and I said, here are all the transcripts, suck it up.
And then I said, keep me in a box, please don't share this.
Even though it's publicly available,
don't share these transcripts.
And then I query it.
I can, I say, uh, Reggie Watts and I were talking, uh, about CREATUM, pull up that conversation.
I want to refer to it in my next episode or cut a clip of this or do whatever.
And I'm just becoming very efficient, um, at that.
As a matter of fact, I asked it the other day, what is the most, what is the biggest
running topic on the commercial break after so many episodes?
And it said that you are wrong often.
Brian is wrong often.
That's what it responded.
And I was like, you're a little smart ass chat.
You're a little smart ass.
How dare you?
How dare you?
You know what this all reminds me of?
Like when you're talking about like being-
But we did talk about how wrong you are.
We do talk about, yeah,
'cause we are wrong a lot.
Many people are dead that have been alive.
Many people are alive that have been dead.
We get it wrong.
That wasn't the name of the movie.
That's not how you say her name.
You know, the whole thing.
Right, right, right.
All that network stuff reminds me of like,
there's a lot of people out there
who've done research on the fungi world,
the one unified fungi world under our feet, if you know what I'm talking about.
Fantastic fungi.
Lives and breathes and communicates with every other tree and plant, and the animals can smell it and sense it, and they live around it and they're at one with it.
That's, you know, the OG AI, essentially.
Yeah, exactly. Yeah. It's-
Some people think that we are AI, which I think is interesting.
That is a conversation I had with my future sister-in-law at the table one night. And she,
and I said, you know, do you know that, well, I can't remember the name of the book, Reggie, but there's like a book where the question is-
We don't want to get it wrong again.
Yeah, we don't want to get it wrong again. You might know this. It's an old sci-fi book.
It's old, it's very famous where the question is asked, you know, who is God? And then it like
goes through all these different iterations until it, until essentially it self-destructs
and starts all over again. And I was trying to explain this book to her and I said there's a lot of people
that believe that we're living in a simulation, in an AI simulation,
essentially. Although I've read a lot of scientists who study this kind of thing,
thinkers, you know, and they say probably not but it's a possibility, right?
Probably not, but it's a possibility.
I would say that, I think, the one thing to think about,
obviously, I always call it a 90 to 95% rule.
It's like I entertain things, I gravitate to,
that seem to make a lot of sense,
and as I'm researching it,
but of course I'm always gonna leave a margin of,
I could be viewing it incorrectly, or not incorrectly.
I don't think there's really an incorrect way
of seeing things, but there is a
way of, like, you're not on the right angle, you know,
the same angle. But I would say, like, just the fact that we,
you know, it's like, we can explain what consciousness is in
a mechanical way, you know, but really, it's our emergent
physics that are addressing the nature of awareness and what
is consciousness and how is reality perceived and generated, is it co-generated, all of
that stuff.
I think the idea of a simulation doesn't have to be as we think of it as a computer because
I think all the stuff that we make outside of ourselves that we experiment with
are kind of dumbed down versions
of the true complexity of how things function.
So we're like, it's like computer,
it's like well computer is like,
is such a tiny, tiny way of addressing
how complex organisms function in general.
So computer is really good at crunching mathematics,
but that's a human-made construct,
which is built on, at least if you go
with the modern physics approach,
it's all that the base state of everything is information,
that it's pure information,
and that information organizes itself,
or tends to want to organize itself.
So as organisms, in, us, in a, whatever,
this is a simulation, you know, consciousness experiencing itself infinitely, for just the reason of just running sims,
you know, running sims and seeing it happen, like, yeah, forever.
I think that there's, like, there's something really, like, I know it could be disconcerting,
but I think for me it's, it's exciting, because, like,
yeah, the one thing that sucks is pain, you know, death.
You know, those things that we don't like,
you know, something happening to our family members,
something happening to our friends, all that.
And it's hard, but those things do keep us
in the perspective of this reality.
Like, oh shit, I'm hurt.
You immediately kind of collapse to, like, I need to, you know, function and solve this problem,
or I need to feel how I'm feeling and process it, which is totally valid.
But there's also an awareness on top of that where you're like, wow, isn't it crazy that all these mechanisms
work the way that they do?
And, you know, and then also synchronicities are,
you know, crazy.
I think I calculated what the probability was
of me getting the Late Late Show gig.
And it blew me away.
I wonder if I can see it here.
Was it this?
Let's see.
Yeah, basically it was like,
let me ask real quick.
Can you give me that probability statistic
of me getting the Late Late Show gig again?
This is fascinating.
I know I love it so much.
Let's see, searching, it's doing the search
but it should just.
Yeah, it should refer back to the memory.
Yeah.
No, I'm talking about the probability of me becoming the
host or I'm sorry, cohost of the Late Late Show after doing comedy
bang bang. We've talked about this before.
Let's see if it'll do it.
Okay, synchronistic, yes, synchronistic low-probability leap from Comedy Bang! Bang!
to cohost and bandleader of The Late Late Show.
The estimated back-of-the-cosmic-napkin.
So fun. So it's about.
Yeah, I think this is a different one,
but see who actually gets it in the index.
Yeah, I think it's one in, this is like different,
but it was like something like one in
100 and something billion.
Whoa. Wow.
Something like that.
This one's not, it's not even a word.
So the probability was basically, you in a sea of people
in India, that's kind of the probability of getting the Late
Late Show co-host gig.
Yeah.
After doing comedy bang bang.
Yeah.
I don't like, I don't love those odds,
but I like you for the co-host of them.
Well, I mean, what's interesting about it, in synchronicity
and the idea of manifestation and those things, you know, without it being, like,
you know, like, this new age concept, because it's more than a new age concept.
You know, because in physics they talk about it too. Obviously, like, you know, quantum physics talks about the observer, you know.
Yeah, you know, and all of these things.
But what's interesting about that is that, you know, it's like, I think, I might have talked about this before, but, like, the fact that
it was like,
oh, I was on Comedy Bang Bang, then I decided to quit
and gave them 10 more shows of a 20 something odd
episode run of a season.
And then just as I did my last show
within that two weeks before heading back to New York,
I got a call from my manager saying that James Corden
wanted to meet with me about something,
and then I showed up, and then they offered me
this bandleader thing, which I thought was insane,
because I'd just quit as a fake bandleader on a fake talk show,
but now I'm asked by a real talk show host.
And then the fact that the window of a new talk show
opportunity happening is probably, like, 12 to 20 years. Yeah, yeah. The fact that that
turnover happened exactly then, exactly when I stopped, me just
kind of doing a real, graduated version on a major network. And
the fact that he only wanted me and wasn't looking at anybody
else to do the gig. All of those things, and that's what went into that.
That's why I was like, I thought it was,
it's different than like,
I auditioned for Star Wars and I got the role.
You know, like that's-
Right, exactly.
That's fine, but in this particular case,
it was just like-
Someone came-
Some people were all aligned.
Yeah.
Yeah.
I do believe there is this more than hokey pokey
thought that, and this has been,
people have been talking about this since like, you know,
how to win, and probably long before that,
how to, you know, win friends and gain influence
or whatever it was, is that you have to,
it is true, that if you're not observing it,
if you're not thinking it, if you're not willing it,
then it's not going to be attracted into your life
because that is physics 101, right?
You have to manifest this thing. That's what art is.
If you're a musician, musicians, a painter paints.
Van Gogh, if that's what you choose to look at,
does not create these beautiful things
unless he envisions it and then takes the first action.
Right? So you pulled it to you and James Corden literally
got the universal physics call, right? From the eye in the sky that said,
yeah, this is the guy, this is the thing and he was in tune with that. It is so cool.
It's like a great guitar solo from Prince. I've seen it happen so many times in my life
and other people's lives and it's amazing.
Yeah, yeah, 100%.
Oh, I found the actual number.
It's one in 432 billion.
Wow.
Jeez, that's like seven, that's, have there even-
That's more than India.
No, that's more than people have lived on earth, I think.
Yeah, yeah, yeah, yeah.
That's crazy.
It's totally insane.
Obviously, that's rough, but even if that's off by a couple billion, it's still like...
It's still pretty fucking...
It's true.
Yeah.
And also, Conan was the same way, getting the Conan tour.
I heard about Conan getting let go from The Tonight Show, and I was like, everyone in
the comedy community in New York at the time was bummed to hear that, and then heard that
he was doing a live show, you know, and then like two days
later, my manager calls and says, like, Conan wants you to open for him on his
live tour. I was like, what?
We were just talking about this.
Is it in Atlanta?
It was it was a wild time because TBS was here and came here.
You know, 10,000 people showed up. What's that?
Very funny.
Yeah, very funny.
But TBS, very funny.
Whatever.
That's why there is, yeah.
That was everywhere.
That was everywhere.
Everywhere in town.
Yeah.
And then Cartoon Network right across the street
and the whole nine yards.
But it was a vibe in the city that was very pro-Conan.
We were Conan forward.
You just went to Conan's Mark Twain Award. I saw that.
Yeah. Yeah. Performed on it. Yeah, it was...
How was that? I think Conan is great. I just love Conan. He's so cool, man.
Yeah, no, it was great. I mean, I got to do my, you know, just thank him, which was really nice. And it was funny, I was the only person who didn't have a teleprompter.
Oh really?
Yeah, so it was just funny to look out in the crowd,
because they had us line up on the sides,
you know, at the Kennedy Center, right by the stage.
And so Conan was on one side with his family
and some other close friends,
and then we were on the other side with Mulaney
and Sandler and those people,
and so we're like chilling on the side.
And yeah, I was like, I was looking,
I would look back and like in the middle of the room,
there was like this big ass teleprompter going
and people are just reading it down for the most part.
And it's funny as fuck, of course, isn't it?
But I just thought it was hilarious
that like people kept looking back to this teleprompter, there's just nothing on it but a timer, a countdown, which
I told him I wanted the countdown. Yeah. And that was it. So I just kind of did what I did. And
then I gave a moment to him and thanked him and it was emotional and I loved it. And yeah, it was
incredible to have been watching it.
Did you make the cut? Are you on the show?
Yeah, I mean, yeah.
I'm going to watch it.
I love Conan.
I think you were giving a red carpet interview.
Someone said something, maybe it was Colbert.
He said, this will go down as the funniest party
at the resistance, or the funniest party of the resistance,
the funniest gathering of the resistance.
And then you said something in a red carpet interview
that I thought was interesting, that comedy
is kind of a last line of defense in some ways, right?
It's a place where we can say the things out loud,
a little, I'm putting words, these are my words,
laughter opens you up, maybe brings new ideas in,
but also gives us a chance to skewer reality a little bit
and to point it out in a way that is more welcoming
than say the talking heads on whichever news station you like
to watch.
Yes. 100%. 1000%. Yeah, that could not be more important right now
than ever. Oh, man. Yeah. I mean, yes. I don't know. I mean,
I mean, it's, I don't know, these are,
it's an interesting.
Trying times.
Yeah, it's like a transformative time, you know?
It's like, I believe that like things are shitty
and it's like really terrible for a lot of people,
but we're seeing signs of a shift, you know?
We're seeing like, you know, like the,
I don't know if you follow the Traoré thing in Africa. But, you know, he kicked out France.
Oh, really?
Yeah, took back their gold mines and asked France for their gold back.
Wow.
Gold that France took without permission. And that's great. And then there are two other African
leaders, they've unified, they're giving back to their
people, they're making school and education free. They're
making advances and agriculture. They're pouring in all this
money and like, you know, taking back all the equipment that was
left over by the French companies, they just like
commandeered it and are using it for their own well-being.
And so there's like three African leaders right now that are involved in kind of reformatting
the continent and trying to create a united Africa.
And so that is really, really exciting. And then you have Putin, I don't know what the fuck he's doing, but he invited Traoré and all of his dudes and had like a huge meeting with them.
So obviously there's something in it for them.
But Traoré also made it clear that it's like not even Russia or China are going to
like colonize us again.
No, because that's what they are trying to do.
Yes.
And hopefully he survives because they're going to try everything they can to take that
guy out.
Poison pill.
Yeah. But the thing is, I just don't get the
extractive mentality, the colonizing mentality, and this
has been happening since the beginning of human
civilization. You know, there have been like groups of humans
that think they know what's going on, and really the only thing
that kind of gives them that fuel for violence
to just take things is because they want these resources.
And it's like this envy and control,
instead of cooperation,
instead of just going like, hey, you got a lot
of resources we could use, why don't we figure out
how to work together?
Get together.
Yeah, I've got the equipment, you've got the resources.
Let's train each other on how to get at this
and we can both benefit.
Let's make both of our lives better.
Like I really don't understand like why they're like,
oh, taking is way easier.
Is it?
It's never.
It's that same thing with like AI.
It's like, why would it choose to like,
well, I guess I'm just gonna arm all the drones
and destroy all these neighborhoods
and create all this pollution and death
and destruction and animosity and set everything back.
It makes no sense anymore.
And so now we're getting to see it in real time
where we're like, people are kind of tired of it.
We're like, this doesn't work.
It doesn't work anymore.
It used to work when we had limited forms
of getting our information, maybe.
There was always like an underlying
dissatisfied sentiment, disgruntled people
who were like anti-imperial
or anti-colonizing in mentality, of course.
But like now it's like, it's all out there,
all the information's out there.
So I don't know, it's like,
you're either gonna go to the Met Gala in a giant dress
and pretend like nothing's going on in the world,
or you're gonna start getting shamed for it,
which you should because like, you know.
We just talked about this,
Chrissy and I just talked about this.
Yeah.
So, you know, and I think that there is this interesting
shift in world order that's going on right now.
And it's.
The new world order.
It's a new world order.
Yeah.
Forever.
New world order. Yeah. Oh, Ministry.
Ministry, that album is tops, man.
Ministry's hard in a way that I just really enjoy. New World
Order.
That and Sepultura from the cave. Sepultura from the cave.
Oh, yeah.
All right.
Sepultura from the cave.
That just came to my brain. I've got to look that up.
Kids-
You did not get it wrong.
I did not get it wrong.
Kids, if you weren't born previous to 1990 something, check out Sepultura from the cave.
Anyway, there's this new world order and there's this new shift. And you see that Trump is out there making all these deals in Middle Eastern
countries, and you know, if there wasn't so much corruption around it, I would say,
okay, all right.
You know, Iran says we'll drop nukes for 50 years.
We won't touch it.
We won't touch nukes for 50 years if you just release the sanctions, if you just take
the sanctions off.
Yeah, just let go of the sanctions.
Yeah.
And you know, maybe you could do us a favor and say, okay, stop funding all of these
terrorist organizations also while you're at it.
And then maybe we have a deal. But there's this kind of re-shifting of
policies and people and, you know, alliances. But you're right about something. There's also this huge, in my opinion, awareness
from the civilian population that we're just all
a little bit too smart for it now, right?
We're all a little bit too aware.
We're all a little bit too independent
and not interdependent on the state or the organization
or whatever it is.
And I don't like how fractured the world is, uh, especially the United
States, but I will say that I agree with you.
There's a kind of this undercurrent of, and I
don't know, like we ain't going to take it anymore.
You know, it's like, and I think it's, it's a
beautiful thing to be alive and watch kind of this
awakening, this new shuffling, this new order.
And it's a scary thing to watch the last dying breaths of whatever that was, right?
Because-
Well, at a base level, we all want the same thing.
Of course.
Yeah.
Of course.
We're humans.
We want love.
We want security.
Yeah.
Yeah.
Occasionally, we want a nice vacation and some head.
I think that that's like at the end of the day that all these things are things that
we want.
You're so right about that.
Yeah.
Some of us want more things than others and they're willing to step on people to get it.
Well, again, I think half of the issue is you have to take the keys away
from the idiots. At this point, we know that we could live in a world that's equitable.
We know we could live in a world where we take care of one another, where we don't have
to do these menial jobs anymore. The work week wouldn't even exist. We don't have to
have a work week. We can have something, we can have self-structured societies that are
like, you know, enabling themselves to provide the things
they need to provide for their community
and it should be like modular and decentralized,
all of that stuff, we could be doing that
and AI could help with that.
And so we kind of understand this,
even if we don't understand this,
because we realize we don't need these systems.
These systems are bloated, they're so corrupted
that it doesn't even really matter.
There's not even, I don't even think there's a way
to reform the system, the system needs to be eradicated.
It needs to be, a new system has to emerge.
And I think that it's kind of inevitable
and there's gonna be so much fighting
because the old system is gonna try to fight
as hard as it can to fight for relevancy.
But it knows that it's irrelevant.
And the only thing holding it together
is everybody agreeing that we're okay with it.
So I think there is going to be a turning point
and I think it could be within our lifetimes, hopefully.
But I definitely am, my mind is there.
I wanna be a part of the emergent, direct economy.
It's like we should be able to support each other directly.
We don't need middle people.
We don't need.
There's like so much that we don't need.
There's a lot that we don't need.
And no one has to be.
No one is expendable.
Even like the worst of the worst of us
should be given an opportunity to like
have some form of self-realization.
Yeah, I guess.
Because everybody's valuable
and it doesn't excuse the horrendous things
that people do.
No, of course not.
Those people are gonna have to,
something's gonna have to happen.
But you know what I mean,
I think we know the difference between right and wrong
and we can't be called into or lulled into thinking that,
no, if I pretend, if I don't,
if I pretend it's not happening, it'll be fine, right?
It's like, no, it's not gonna be fine.
But it doesn't mean that it has to rule,
it doesn't have to ruin your life.
You should be excited about being part
of a problem solving community, which,
you know, it starts with you being nice to your neighbors,
starts with you, you know.
Do it to others.
Yeah, being helpful, all that stuff. And that's why like, you know, the Christian nationalist
movement or whatever, it's like anything that they claim to be Christian and they're like,
but the immigrants, I'm like, you know, you're nowhere fucking close to a Christian.
Exactly.
Yeah. I care about one life, but I don't care about the other.
Exactly.
I saw this, I saw this guy, he had this really interesting take
on Jesus Christ.
I wish I could remember the name of the guy.
Bryan got it wrong again.
I wish I could remember the name of the guy.
But he said, if Jesus was alive today,
he would look at all of these nationalists
and these preachers banging and bashing
and flying their jets around.
And he goes, why in the fuck are you still talking about me?
I gave you the lesson and I told you to ignore the person.
I gave you the lesson, the lesson was the thing,
not the person.
And now you've taken it all and you've just,
you've bastardized it for your own good.
And I thought it was a brilliant way to look at it.
A brilliant way to look at it.
That's exactly, well, Life of Brian, right? The Monty Python movie.
Right.
The whole time he's trying to shake them from following him. It's like, why are you still
following me? It's like, I don't want... I mean, even though obviously it was a comedic
thing, but I totally believe that. It's like...
Me too.
You know, like any of these guys, like Buddha or Krishna or these enlightened people, they're just showing you the potential of any human being.
That becomes realized, that becomes self-realized.
And so, and that's really the name of the game, I think.
It's like this entire life is about remembering who we are.
It's not becoming who we are, it's remembering who we are.
It's like we need to release the layers
that we've been kind of like, I don't know, instructed
or influenced by our environment.
It's like this is the way we should be.
This is how we should be in society.
It's like this person thinks this person sucks.
It's like, yeah, I'm going to side with that person.
They really suck.
And then your whole life becomes about all this localization within these systems of
belief.
But really, it's like you hold the power to perceive the reality any way you want,
and generally when you gain that freedom,
it moves towards love.
I think it moves towards symmetry,
and symmetry is love,
or what I call paradoxical symmetry,
and so that moves towards compassion and love.
I don't think it's possible for it to move any other way.
So, I don't know.
But yeah, I agree with that. I feel like this third interview with you,
this episode, is literally an Alex Grey painting. That's what I feel like. I feel like
this is an Alex Grey painting. I love you, man. I really do.
I think you're such a really fucking cool human being. And the more that I spend time with you,
the more I'm, that's affirmed for sure. I have one question before we let you go. Are you still in
love? Yeah. Yeah. I mean, I tried to get out of it, but, you know, she's just really good.
Well, Catherine.
I wonder if Catherine's here.
Oh, man, we would love to say hi to Catherine.
Yeah, yeah, yeah.
We'll make it fit.
I saw her walking around.
You see.
Oh, man. Reggie Watts. Yes. He got it. He gets it. No alien light language here. He just gets it.
Do you know what I'm saying? Yeah. I mean, there's alien light language, but it's of the real
kind. It's not of the pretend kind. Yeah.
All right, I found her. Oh my gosh, what an honor.
All right.
This is, this is Catherine.
Hi Catherine.
Catherine.
Nice to meet you guys.
Nice to meet you.
I'm an Instagram stalker of you and Reggie.
And I've-
We love your love.
I was just saying that I really am, I really love Reggie, who he is, his perspective.
And every time we have a chance to talk, it affirms that.
And so I know that whoever is hanging out next to him
in this manner must be a fucking cool human.
Yeah.
Well, thank you.
You're welcome.
Yes, she is.
Bryan, and what's your name?
Chrissy.
Chrissy.
Yeah.
He hopefully has like a little,
like lower third basically in this.
Oh yeah, yeah, yeah, yeah.
Yeah, that's, that is, yes, that's true.
Oh, I love that.
Hold on.
That's so thick.
What is the commercial break?
The name of this podcast is,
I remembered it from seeing it on the,
on your calendar.
Oh yeah.
Yeah, so the commercial break,
I used to work in commercial real estate many years ago.
And so when I started the podcast,
my wife and I were bantering around ideas.
And I'm not his wife.
No.
We're best friends, have been for 20 years.
For 20 years.
And for the last five, we've been doing the show.
I just like the name, it just stuck, the commercial break.
The pandemic had started.
And so we kind of took the cue and was like,
a commercial break from the BS that's going on right now.
Like this interview is basically part of our birthday party,
like a birthday party of five years. So yeah, and this is his third
time here. Thank you so much. It's very nice to meet you.
Yeah, Reggie. Thank you so much. Thank you, Reggie.
Yeah, my pleasure. Thank you. Thanks for being so
groovy. I'll be in touch.
We gotta meet in person one day.
Yeah, we're gonna be out in LA before the end of the year.
So I'll hit you up when we get to LA.
Please, that would be awesome.
Are you guys vacationing in Atlanta?
Yeah, ATL.
And if you ever come to Atlanta.
I'm working on it.
I met a guy the other night who came to
an after-hang here who books in ATL.
Oh, he does.
Perfect.
If you come to Atlanta, we will paint the town.
We will be at least two people in the audience.
You've got two people in the audience?
We'll recruit.
You'll have more than two people.
Thirty kids.
I'll bring them all.
We love you both.
Thank you so much.
All right.
Thank you so much.
Bye, guys.
Bye.
Have a great day. Bye.
Wendy's most important deal of the day has a fresh lineup.
Pick any two breakfast items for $4.
New four-piece french toast sticks, bacon or sausage wrap,
biscuit or English muffin sandwiches, small hot coffee, and more.
Limited time only at participating Wendy's. Taxes extra.
Oh, Danny boy, oh Danny boy. I love you, Reggie Watts. He's so amazing.
He's my best friend.
I know.
I mean, you're my best friend, but he's my new best friend.
Yes, I agree.
All right, I'm just checking.
Just checking to make sure that's okay with you.
It's okay.
I feel like I have a little man crush on Reggie.
And what a gift to have met his beautiful girlfriend.
What a gift. I know.
Thank you for blessing us.
And he's making me feel better about AI.
Yeah, sure.
Listen, I like the optimism in Reggie's voice.
I'm not sure I share all of it, but I do.
When I look at Reggie and
I hear what he's saying and I read the things that he's talking about, it does make me feel
a little bit better that we will get through this. I know we will. There's going to be
a transition phase. It's going to be painful. And then we're all going to learn how to use
this correctly. Now, if I can just get my AI bot on track, I'll feel much better about
things.
Yeah, you blew it up.
I blew it up. I asked
it to do too much by listening to just two episodes of the commercial break and it just
blew the fuck up. Okay. All right. Listen, TCB's Endless Day continues with our great
sponsor. 5-hour Energy is bringing you this entire day with limited commercial interruptions.
There's only three commercials, beginning, middle, end, because of 5-hour Energy. And we really appreciate that. If you are having
a mental health crisis currently or know someone who is, text or call 988, that's
988. Very simple to remember. There is help available and resources. Even
if you don't have insurance or a dollar to your name, you can get help or at the very
least have a conversation with someone who knows how to work you through it.
Check your head before you wreck your head, kids.
212-433-3TCB. Call now. Call now. That's the time to call if you want to talk to us.
We got the phone in the studio. We just might answer. Also, we could be going live. We could be going live.
We're going to think about trying. We're gonna think about trying to do that. But you're only gonna know how to watch that on Twitch
and or YouTube by going to at the commercial break on Instagram, following us and paying attention
minute by minute as Astrid will post updates only to Instagram. tcbpodcast.com, all the audio,
all the video and your free TCB sticker and youtube.com slash the commercial break for this episode on video right now. Okay Chrissy we're at
six. Can we keep going? We're going to try. We're gonna think about trying.
Alright I love you. Best to you. Best to you out there in the podcast
universe. Until the top of the hour, Chrissy and I will say, we do say,
and we must say: Goodbye! Thanks for watching!