Breaking Points with Krystal and Saagar - 7/22/23: Marc Andreessen On Why AI Will Save The World, Scott Galloway Predicts Trump Won't Run, Anxiety Driving Political Identity
Episode Date: July 22, 2023. This week we bring you a discussion with tech entrepreneur Marc Andreessen about his recent essay "Why AI Will Save the World." We also have Krystal and Saagar looking at a recent podcast where Scott Galloway predicts Trump won't run in 2024 due to the threat of prison, and at studies showing that political identities are potentially correlated with anxiety. To become a Breaking Points Premium Member and watch/listen to the show uncut and 1 hour early visit: https://breakingpoints.supercast.com/ Merch Store: https://shop.breakingpoints.com/ Learn more about your ad choices. Visit megaphone.fm/adchoices. See omnystudio.com/listener for privacy information.
Transcript
Hey, guys.
Ready or not, 2024 is here, and we here at Breaking Points are already thinking of ways we can up our game for this critical election.
We rely on our premium subs to expand coverage, upgrade the studio, add staff, give you guys the best independent coverage that is possible.
If you like what we're all about, it just means the absolute world to have your support. But enough with that. Let's get to the show.
Joining us now is venture capitalist Marc Andreessen.
He's out with a stunning new essay,
"Why AI Will Save the World." Counterintuitive, or is it?
So Marc, it's great to see you.
Good morning.
Marc, you go through a couple of the myths,
and I actually just thought, as a fun interview,
we have a general audience.
Some of these people may not all be familiar with your work.
You were just on the Joe Rogan Experience,
but the most common ones, I thought you did a fantastic job of breaking these
down. The first and foremost is, will AI kill us all? Why won't it? AI will not, in fact, kill us
all. It's a very exciting new technology. It's a technology. We build it. We control it. You know,
it runs on computers that we own. If we don't like what it does,
we turn those computers off.
That part is just, I think, straightforward engineering.
One of the reasons why I think it's important for people to understand what you're talking about,
about the level of control, is that there's been a lot of, I don't want to say fear-mongering,
but maybe there is, about the singularity, about the ability to pass the Turing test and all this.
As someone who was a technologist there from the very beginning, from so much of the internet
and all that, why do you think that that is something that we shouldn't be concerned about?
Yeah, so you may know we Californians are famous for our cults.
We love our cults.
We have produced many thousands of them over the decades and continue to do so.
And so there's this concept in religious studies of a millenarian cult, sort of an end-of-the-world cult.
And the way that these start is with sort of the idea of creating heaven on earth, which is actually a term you may be familiar with.
It's an old Bill Buckley term, "immanentizing the eschaton," which is sort of bringing about heaven on earth.
This is the idea of the singularity, right?
So techies who get into this stuff
kind of come up with this idea of heaven on earth
through the application of technology.
But of course, the flip side of heaven on earth
is creating hell on earth, right?
And so these sort of utopian cults
tend to transform over time
into being these sort of apocalyptic cults.
And you sort of hear this term,
"summoning the demon."
And then things go horribly wrong.
So these are just like ideas
that are kind of very deeply rooted
in kind of Western theology, Western philosophy.
And whatever new is happening in the world,
we tend to kind of build up these cults.
Maybe I'm, I don't know, I'm a simpler guy.
I'm an engineer.
You know, I got where I am by writing software.
Writing software is hard.
You know, lots of, like, practical aspects involved
in getting any of this stuff to work.
I mean, the funniest idea right now is you can't even,
today you can't even buy chips to do AI.
Like, there's this massive, like, global chip shortage for AI chips.
And so I have these kind of fantasies that there's a baby, you know,
malevolent artificial intelligence running in a lab somewhere
that has, like, a huge order out to NVIDIA. And it's just like incredibly
frustrated that it can't take over the world because it can't get the chips. That's well
said. Yeah. It turns out that the limiting factor here is actual factories, not the idea itself,
which is kind of amazing. Let's also talk about AI ruining our society. As you said,
hate speech, misinformation. This is why we need
regulation. We need it now. We need to shut down. The government's got to get involved.
What do you make of that argument there? Yeah. So after we get through the apocalyptic cult,
then we get to sort of the mode basically, you know, of any sort of significant new technology,
basically where it threatens existing power structures, right? So it sort of threatens
the hierarchy of who's in charge and who gets to decide, you know, kind of what people
think. And of course, you know, the internet is, you know, you guys are an example of this,
like the internet has had a shattering effect on the traditional, you know, authority of people
who used to be in a position to be able to gatekeep access to information and used to be able to,
you know, determine what people say. You know, there was once upon a time, if it was said on the
major three TV networks, that's what everybody believed.
And then, you know, guys like you come along and kind of, you know, blow that up.
And so, you know, look, we're on the tail end now of, you know, 15 years of like,
very dramatic change. But by the way, on both sides of the political aisle,
you know, in terms of like, who gets to basically decide what truth is. And of course,
the people who used to decide what truth is are extremely upset about that. And of course,
a lot of like ordinary people who now have a chance to learn alternate points of view and
get information they would not have had otherwise are obviously very excited about that. I think
this concern is like another turn on that, right? Because, you know, people are going in the future,
people are going to be talking to AI, learning from AI, you know, getting information from AI, you know, at least as much as they do
that with other people. And so what the AI says is going to have a big impact on society. I think
that's going to be an overwhelmingly positive impact, because I think people are going to have
access to a lot more information. It's going to be like the internet on steroids. But if you're a
traditional gatekeeper, you don't like this at all. And so that's when you bring out the,
you know, the sort of scare factors like hate speech and misinformation. Yeah, smart. And one of the questions, though,
here is about development. And this is one of the only ones that I still
have some questions around, which is: given that we have the existing players, Google, Facebook,
the bigger corporations involved, should we worry then about monopolies being the most
transformative in this space and carrying over
existing policy, or will the technology itself be able to trump those concerns?
Yeah, so that's a very open, live question right now. And I can kind of argue all sides of that
question. I don't know the answer. You know, this is a very unusual situation. You know,
usually new technologies get, you know, kind of a running start of a decade or so before they become sort of super political and people start to lobby for regulatory capture.
This technology is unusual in that the major players are already in Washington and they're already lobbying aggressively for regulatory capture.
And, you know, I said in the piece and I've said other places like the big companies here are trying to form a cartel.
You know, they're trying to basically get the status that the big banks have in financial services.
Right. They're trying to basically get protected, you know, as institutions that are too
big to fail. And they're trying to basically get a regulatory wall established so that startups
can't compete with them. And look, they have an early head start on that. They've made excellent
progress. They've used these scare messages like end of the world and hate speech and all the rest,
you know, kind of as levers to try to get this kind of regulatory capture. You know, I and others
have been pointing out
that this is like a transparent grab for regulatory capture.
You know, it's ironic that I feel like I have to encourage Washington
to not, like, trust big tech.
But in fact, they should not,
in this case, trust big tech,
and they should let the technology run.
Why should they let the technology run? Competition.
You know, basically the big companies should be subject to competition from startups and from open source. Startups and open
source should be subject to competition from big companies. And of course, that is the time-honored
way to discipline companies, is to make sure that they're subject to actual competition.
Yes, I think that's an important point. You touched on AI taking all of our jobs in the
essay. There's been some talk here about,
well, it turns out that white-collar jobs are a little bit more at risk, even after decades of
self-driving car discourse. But even that, Marc, you kind of try to make the case here that taking
all of our jobs and all that is the doom that people have been crying about with new technology
for a long time. Yuval Noah Harari
and many others have been like, no, no, no, no.
This time is different.
So why is it not different?
Yeah, so if you look historically, basically this is the cry.
This is sort of the paranoid cry on the economic front
that's going to come up basically every new wave of technology for 300 years.
And the cry is always the same cry.
It's based on this fallacy in economics called the lump of labor fallacy,
which is, by the way, at the heart of Marxism. One of the reasons why Marxism doesn't
work is because this fallacy is false. And the fallacy basically is that there's a fixed
amount of labor, and either humans are going to do the labor and have jobs,
or machines are going to do the labor and then humans won't have jobs. And so, you know,
the proposal then always is that new technology will lead to the immiseration of human beings. Of
course, what happens in practice is the opposite of that.
New technology drives economic growth.
New technology drives job creation.
New technology drives wage growth.
The sectors of the economy with higher levels of technological development show higher job
growth and higher wage growth.
And the reason for that is very simple, which is technology is a lever on human effort.
If you take a human being and you add technology to what they do, they become more productive, right? They generate more output with the same
amount of human effort that leads to higher wages, that leads to, you know, basically growth in the
economy. If AI, in fact, is allowed to run through the economy, I have no doubt that the result is
going to be massive economic growth, massive job growth, and massive wage growth. I think actually
the bigger threat is AI won't be allowed to run. It'll be outlawed in some form. And I'll just give you a couple examples of that, which is,
you know, will the medical boards allow there to be AI doctors? You know, will the bar associations
allow there to be AI lawyers? You know, I think it's actually equally likely we'll see the opposite
problem, which is AI will not be allowed to actually run through the economy. And as a
consequence, it will actually have less economic impact even than I would hope. I hope that's not the case, but I do worry about that.
Right. Well, I mean, you got classic job preservation there going on, which actually,
though, does lead to the next kind of myth that you talk about here about AI leading to crippling
inequality. So the obvious answer is, look, you're right, Mark, it'll unleash all this economic
growth, but all of the gains will aggregate to the top. You don't need that many employees actually involved in these companies, and they will
capture the trillions of dollars in value and a new underclass will form. That's why we need
UBI. So see how quickly I got there? In terms of this one, what's your view?
Yeah. So this is sort of, again, this is sort of, right, this is a fallacy right at the heart
of Marxism, right? And the way I always kind of teach my friends on this is it's like if the answer is UBI,
then the question was communism.
It's amazing how quickly sometimes people get to like,
oh, we just need to give everybody money.
It's like, okay, that sounds familiar.
So the reason that doesn't happen, the reason it doesn't happen, is because of something people miss.
And this is, again, a fallacy that's repeated over and over again:
technology democratizes.
And with AI, it's actually striking that it's so obvious that that's already happening.
You and I and anybody else in the world today can get access to and use in our daily lives the very best available AI.
Like anybody watching this can use the same, you know, I cannot pay a million dollars and get a better AI than anybody watching this can.
And the way that you do that, if you want that, is, you know,
you can go on ChatGPT for $20 a month, you get the good stuff.
Or you can go to Microsoft Bing, which is a version of GPT-4,
or you can go to the Google Bard, which is also getting quite good.
Those are actually free, right?
So Microsoft and Google are already bringing this stuff to market for free.
Why are they doing that?
It's because they're trying to use it, right, to boost their existing search businesses, whatever. That's not your problem. The fact is they're giving it for
free. And so the best AI is already available to anybody who wants it. Over 100 million people are
already using it in their daily lives. That number is going to ramp very fast from here.
And so you'll have it, I'll have it, everybody else will have it. Everybody will have it in
their lives. Everybody will have it in the workplace. Everybody will be able to build
businesses based on it. Everybody will be able to upgrade any aspect of how they operate
based on it. And so it democratizes, and the reason it democratizes is actually sort of
an Adam Smith refutation of Marxism. The reason it democratizes is because
these companies, companies like Microsoft and Google and OpenAI, want to become big,
important companies, right? The way to be the biggest possible vendor of technology
is to provide that technology to the largest possible market,
and the largest possible market is everybody on the planet.
And so basic commercial self-interest actually leads to the opposite result,
which is everybody gets the technology.
That happened, obviously, with the smartphone.
It happened, obviously, with the internet.
It happened with the microchip.
It happened with the car.
It's happened with every other new form of technology,
and that's what's happening here. Marc, one of the points that you often make,
which I think is very important, is that right now, the industries that are actually driving
the most inequality are housing, education, and healthcare, all of which are the most resistant,
actually, to technology. As you alluded to earlier, lawyers being able to effectively
keep a moat around their fees, doctors
in the same way, even if technology exists, which could quite easily give the consumer
a much better product.
Could you touch on that, though, about how tech has actually not been allowed?
And not even just tech, but really like competition has not been allowed to penetrate the regulatory
moat around these three sectors, which are increasingly driving the vast majority of inequality in U.S. society? Yeah, so we live in a bifurcated economy by sector.
And the way I describe this is there's what I call fast sectors, which are the sectors that
are allowed to incorporate new technology. And these are sectors like consumer electronics,
television sets, right, media, software, right? Actually, by the way, food and drink, interestingly,
where you're allowed to have basically
very rapid technological evolution.
You're allowed to have free market capitalism.
You're allowed to have tons of competition,
tons of new entrants, tons of startups.
Then you've got what I call the slow sectors.
And the slow sectors are the sectors
where the government is very involved in them.
And the government establishes
all these regulatory barriers and constraints
on what companies can do.
And the result of that is rapidly escalating prices with very little technological change.
And you mentioned it, but the three big slow sectors are housing, education, and healthcare.
And by the way, you could put a couple other sectors in there. And you could say also,
by the way, government is in there generally. Government agencies are also very resistant to
new technology.
And then I would just say also like the entire field of law,
essentially like, so the entire field of like administration
of the economy, which of course has been, you know,
growing tremendously, also is very resistant
to new technology.
And so you have this like bizarre situation.
If you're trying to like raise a family in the US right now,
you have this bizarre situation, which is you can buy,
you know, a television set that covers your entire wall, right,
in spectacular, you know, 4K, 8K visual splendor
to watch Netflix. You can buy that for like 300 bucks. But if you want to send your kid to college
for four years, it's $300,000, right? And those prices bifurcate further every single day,
right? And actually, my analysis is this is why it feels like our
politics have gone so wild on both the left and the right. I think this is the underlying reason
why there's so much populism on both sides of the aisle, is because the experience of just the
typical American voter citizen is that the American dream as defined by the very basics of
buying and living in a house, you know, sending your kids to college, and having excellent
healthcare, the prices of those things are running completely out of control. Yes. This is 100% a
function of government involvement. The government in all three of those sectors subsidizes demand
while restricting supply. And so those are sectors that have a cartel-like structure,
right? It's virtually impossible to start a new university. It's virtually impossible to build
houses lots of places. It's virtually impossible to start a new hospital or get a new doctor
license. And so the government basically has a set of policies in place that restrict the adoption
of technology and drive up prices in those sectors. I think they should stop doing that. You know,
one of the things we're trying to do in our business is inject more technology into those
sectors, but it is very hard because the barriers there to new technology are very high.
Yes, they're outrageous. And finally,
actually, one point that you touch on is about national competition vis-à-vis China. There have been some arguments here about the Pentagon and whether we should have completely state-subsidized
AI in the way that the Chinese companies do it. You've argued here in the essay that a robust
private sector is actually enough in order to, quote unquote, beat them. Given, though, that there are massive investments inside of China on quantum
computing and more, why is it still enough for our private sector to be able to come out on top
eventually? Yeah, so first of all, this is sort of an incongruity in how Washington's dealing with
AI. And this is something I experience a lot, which is I talk to people in Washington on Tuesday,
right, and they're, you know, very worried about all the arguments
that we've already discussed, you know, sort of internal to the U.S.
I talked to the same people on Thursday,
and they're very worried about China.
And all of a sudden, they're like, wow, we in the private sector
need to like team up and advance AI as fast as we can
because we have to compete with China.
And we might be in a future war with China.
And that war might be won or lost on the basis of the application of AI.
And so there is this weird thing, of course,
where, you know, societies kind of turn on themselves
when there's no external threat.
The Chinese external threat,
maybe it's showing up at a good time,
kind of sort of weirdly,
not in a way that we would prefer,
but maybe it's going to get us
a little bit more focused on things that matter.
Look, so both the Chinese state
and the American state have decided
that AI and autonomy are the future of warfare.
China has been very public on this.
The great thing about China is they just publish their plans.
They just tell you what they're doing.
It's very straightforward how they communicate.
And then, look, in the U.S., the Pentagon, several years ago now,
basically announced and decided that what they called the third offset,
which is sort of the third big national strategy for technology advancement in military affairs,
was going to be AI and autonomy.
And so they've actually already been
kind of marching down this path.
And so it is pretty clear
that both the major powers in the world
kind of believe the future of warfare
is going to be based around AI.
In China, they have a system
and their system has their public sector
and their private sector
completely intertwined. The private sector in China is owned by the public sector. It's owned
by the Communist Party. And so they have the advantage that they can directly control the
entire productive capability of their system against their military needs, which is what
they're doing. We don't have that. We have, at least in theory, a free market system.
The advantage of the free market system historically
is a much faster rate of innovation.
If you get all these companies actually competing with each other,
you get all these new entrepreneurs that we back competing with each other,
competing with the big companies,
you get a much faster rate of innovation.
Historically, the U.S. system has led to a much faster rate of innovation
than the centralized Soviet system did or the centralized Chinese system has.
So my argument is just very straightforward,
which is we should unleash the animal spirits of our private sector and our defense sector in the same way we did during the Cold War. We should have an absolute determination
to just completely swamp the Chinese centralized system in terms of the quality of our AI. And then
in any future military conflict, we should be set up to just beat them decisively right out of the
gate with superior technology. Yes. And then my last question, Marc,
you're one of the best-read people I know. What are some great books that people can read
to prepare for this transition on a social level, a technological level and a political level?
Yeah. So, I mean, there's tons. I mean, my number one recommendation on the shifts in technology,
there's a great short book written by an MIT professor 50 years ago called Men,
Machines, and Modern Times.
And it goes through this sort of cycle of adoption. Like, what
happens when a new technology arrives that sort of threatens to,
you know, kind of upend the established order.
The book is interesting, actually,
because it has a specific focus
on historical precedents and military technology.
And so it has this incredible story of the first naval gun that was able to
automatically compensate for the roll of ships.
Yes.
You know, it was like 100 years ago, and that changed naval warfare forever.
And the level of institutional resistance from both the U.S. and the U.K. navies to adopting this technology was, like, you know, completely wild.
And so anyway, it's a very good book on the societal impact of new technologies.
I also can't help but recommend,
on the fiction side,
I'm reading with my kid right now,
Hitchhiker's Guide to the Galaxy.
And I bring that up because Douglas Adams,
the author of that,
had maybe the sort of funniest explanation
of how new technologies are received by society.
He said, if you're below the age of 15
and new technology is just obviously
the way the world works, why wouldn't it be?
Which, by the way, is how my eight-year-old kind of views AI.
He's like, yeah, of course, the computer answers questions.
Like, why wouldn't the computer answer questions?
And then Adams says,
if you're between the ages of 15 and 35,
the new technology is extremely exciting,
and you might be able to make a career in it.
And if you're above the age of 35,
the new technology is unholy and scary,
and it's going to destroy society.
Right? And so I think his framework actually is also exactly what's
applying to AI right now. All right. Well, we'll have links to those down in the description,
Marc. We really appreciate your time. I think the audience will get a lot out of this.
Awesome. Thank you, man. Thanks, man.
We're always looking for a good take out there. This one is genuinely stunning. Scott Galloway of Pivot fame, co-host of the podcast with Kara Swisher, usually confined to the tech world and to, like, cringe liberal politics, has decided to wade into the Republican primary.
Here's his prediction.
I think President Trump is not going to run for president under the auspices of a plea deal.
Where is that coming from? No one says that stuff.
Actually, Governor Christie said it, or at least a component of it.
He's going to be on the stage if Trump shows up.
He qualified thanks to my donation to him.
I can't believe I gave him money.
I actually think Governor Christie is going to surpass DeSantis and be the number two, but I don't think, I don't think it'll matter. I think
Trump, I, I don't understand and can't empathize with President Trump, but I know how old rich men
think. He has a very nice life, and his life can be going back to golf and sycophants and having sex
with porn stars, which I think is a good thing.
I'm not being cynical.
I would like to do more of that at some point in my life.
Okay.
And, and, or.
I'm going to hit you when this is over, just so you know.
Go ahead.
Or, or he can live under the threat of prison.
Okay, that's certainly a take.
Here's why I think Scott's wrong.
If that was true, then he would have not run again in 2024.
Everything that he said, I've heard that, by the way, from a lot of people who knew Trump.
They're like, Trump likes to golf.
He doesn't like to do work.
All that's true.
Guess what?
He could do all that as president and still get all of the attention that he's always craved.
So I think that, given the threat of legal indictment, being president is the best possible scenario, because it shields you legally from a lot of this. It gives you legal immunity in many of these cases and gives you the best political case to wage in order to keep you out of a prison cell.
Okay.
So I agree with you, but I'm going to make the other case just to make it interesting. So if he didn't run for president, then he wouldn't,
like, the crimes were already committed. The writing was on the wall that he was likely to
face indictment in a variety of cases with regard to January 6th and the documents and the thing in
New York as well. So since that writing was already on the wall, if he doesn't run for president,
then he doesn't have any leverage to try to secure the plea deal that Scott Galloway is floating here
and which others have floated as well, which is like, listen, you agree to not run for president,
you go back to your private life, and this all goes away. So it would make sense in that context
of like, I got to hold the threat of me being president again over their head so I can secure
some sort of a deal to make sure that my ass does not end up in prison. And then the fallback plan is to actually win the presidency again, and then he
can make his own problems go away. The reason that I don't think that this is correct is,
if they strike this kind of a plea deal with Trump, it really validates the case that this
all was political to start with, that the only thing they cared about was keeping him out of
the White House to begin with.
And the lawyers that I've talked to,
because some of the mechanics of this,
I don't have full insight into,
the lawyers I've talked to really think
that this is a very unlikely outcome
because you're offering a political solution
to what is ultimately a legal problem,
and those two things don't normally mix.
So that's why I, you know,
that's why I don't see it happening, frankly.
But hey, it's interesting.
Yeah, look.
What do you think of the Chris Christie take?
Oh, totally wrong.
I mean, look, no offense, Scott,
but like this is just not your wheelhouse really at all.
I mean, we've covered, I think here's the problem for him.
Scott genuinely is like a resistance liberal
and he hates Trump
and he wants to see somebody like Christie ascend.
But you and I are familiar with how actual GOP primary voters think and act, and they love Trump.
And the thing that they hate the most about Chris Christie is that he's anti-Trump.
Chris Christie, if he was running on literally anything else, probably would be doing fine. If you are somebody who is genuinely anti-Trump, the best lesson you can
learn is to be like a Nancy Mace or a, what was the guy who got elected governor of Georgia?
Brian Kemp. Brian Kemp. If you want to be like Brian Kemp in those two instances, what do you
do? You run against the left and you just kind of hope that nobody really notices that you do have
differences with Trump. That's not the Christie tactic. There's no way you can be number two like
that. Not whenever.
That's like running against Ronald Reagan in 1980.
It's just not going to happen.
I do think.
I agree.
He's not going to be the nominee.
I don't know if he could pull into second.
Although, actually, I did see, I mentioned this,
but there was a New Hampshire poll that has him actually tied with DeSantis
for second place in New Hampshire.
I do think that there is something
just on a basic human level
that is compelling about the fact
that he feels like, just like Trump does,
he feels like he's saying the things
that he really thinks.
Whereas DeSantis, Tim Scott, Nikki Haley,
all this other cast of characters,
it's so naked that they're posturing,
they're trying to get the right answer,
they're using whatever their little consultants
and focus groups told them to say.
And like, obviously Chris Christie has his own advisors
and is doing his own version of that game,
but he has much more of that feel of like,
listen, I know this is uncomfortable and unpopular.
I know the audience that I'm speaking to.
I don't care, I'm just gonna say it anyway.
There is something compelling about that.
Is it win the GOP nomination compelling?
No.
Is it do a little bit better than I think people are giving him credit for?
Yeah, I could see that.
I could see that.
All right.
Well, maybe.
We'll find out.
Maybe I look like an idiot.
You can roll my clip and say I'm the actual cringe person who doesn't know what I'm talking about.
That's always fun.
So we'll see you guys later.
So some fascinating charts were posted to Twitter showing increasing political polarization around some key issues of just like basic self-esteem, basic self-worth. And these things
seem to be splitting, like there seems to be more of an ideological divide in these feelings over
time. Let's go and put this up on the screen so you can get a sense of what we're talking about here. So you've got some questions like,
on the whole, I'm satisfied with myself. So you have a spike in Democrats who disagree with that
sentiment over Republicans. How satisfied are you with your life as a whole these days? You,
again, have Democrats really dissatisfied with their lives
and an increasing divide there. Same thing with life often seems meaningless. I enjoy life as
much as anyone. This one has potentially the biggest spike here, which I actually thought
was kind of revealing. The future often seems hopeless. You've got nearly 40 percent of Democrats
who agree with that. Again, that has really spiked very recently in recent years, whereas the Republican number has shown a recent spike, but not nearly as much as the Democratic number.
And that's one of the biggest divides.
So kind of fascinating that there seems to be a real split in assessments of the future, in assessments of personal satisfaction, how I feel about myself, how I feel about my life, how I feel about the future, etc.
Between liberals and conservatives.
I do wonder how much of this is generational, though.
Just because so many young people are liberal, they are going to self-select for more doomerism.
And, I mean, you can't blame them on one hand. So,
but I do think there is something to this. You can see it in the attitudes around COVID. You can see
it around safetyism. You can see it around optimism, around the feeling of the country.
In some cases, you know, I'm on various different sides of those. So I do tend to understand, but I think the one that is very sad
is life often feels meaningless and a huge spike in the agreed number for self-identified liberals.
And I enjoy life as much as anyone with the disagree also spiking as well. So in general,
I think this is probably skewed most by generational gaps and by wealth. But there is definitely something like,
I'm not gonna say spiritual, I don't know, what is it?
I don't know what the word is, epistemological
to something about what's going on here.
Yeah, I think the generational point is well taken.
I do think it's worth noting on almost all of these metrics, even though the
liberal numbers go up at a faster clip, the conservative numbers are rising as well. So you
do have increasing levels of like personal dissatisfaction and pessimism rising among the
population. And in some ways, I think that's justified. In a lot of ways, I think that's
justified. You know, the one about the future often seems hopeless. That one really struck me because, listen, if you, you know, if
you're a young person that's looking at, you know, the fact that the air is like unbreathable from
wildfires and there's increasing, you know, extreme temperatures and climate catastrophes,
and it just seems like there's no end in sight, and this is going to continue to spiral out of control. Yeah, it's pretty justified to have some negative feelings
around that. And then in terms of personal material conditions, if you are a person who
benefited from growing up and starting your career at a certain age in America, where now you've got
a home, and you feel pretty solid, and you feel pretty stable, and you've got a a nice little nest egg and you're of that generation, yeah, you're probably going to feel
pretty good about things, at least better about things than young people who are like, I'm never
going to be able to buy a house. I'm never going to be able to have a family. I'm never going to
be able to have kids. I'm never going to be able to have some basic measure of stability. And rather
than feeling like, yeah, but things will get better over time. It just seems like we're facing down a landscape of decline. You know, to bring it back to politics, because that's what we do here.
Just the specter of we're really going to have Donald Trump and Joe Biden as like the best we
can do in politics. I just think that's so symbolic of this sense of national decline,
which pervades people's overall feelings of optimism and also, you know, can can
bleed into their own personal feelings of how their life is likely to go over the coming years.
Yeah, I think that's true. I think a lot of it focuses on future conditions. And that's where
I would, you know, the most immediate ones that you can try is cost of living and, you know,
not even educational attainment, but at least the feeling that you can
ascend to a higher social class
if you work hard enough.
That's really, you know, that's all the basic
American contract that was there.
It's exactly why also that those numbers
were much more equalized in the past.
So I do think this is a very generational chart.
And I think that's actually even more unfortunate
than when you divide it by partisanship.
Although I do think there is something,
even amongst people who are of the same age,
in terms of some sorting on these issues. Which is not good, by the way, for people to have
totally different worldviews. I think there's always been a divide between liberals and
conservatives. Conservatives tend to, like, rank higher on levels of happiness. And so there's
always been a bit of a split here. But the fact that you have these generational divides coming
to the fore, I think is very noteworthy.
Agreed. All right. We'll see you guys later.