Theories of Everything with Curt Jaimungal - What's Wrong With (Fundamental) Physics? | Sabine Hossenfelder
Episode Date: December 7, 2024

Head over to https://www.masterclass.com/theories for the current offer. MasterClass always has great offers during the holidays, sometimes up to as much as 50% off.

In today's episode of Theories of Everything, Curt Jaimungal speaks with physicist Sabine Hossenfelder to cover what's truly wrong with fundamental physics. Together, they uncover why long-standing problems linger, why essential data remain elusive, and how systemic pressures are stifling meaningful breakthroughs.

Links Mentioned:
- Existential Physics: A Scientist's Guide to Life's Biggest Questions: https://amzn.to/3BeOyML
- Lost in Math: How Beauty Leads Physics Astray: https://amzn.to/3OL4GbV
- Sean Carroll's TOE Episode: https://youtu.be/9AoRxtYZrZo
- Peter Woit's TOE 1st Episode: https://www.youtube.com/watch?v=9z3JYb_g2Qs
- Peter Woit's TOE 2nd Episode: https://www.youtube.com/watch?v=TTSeqsCgxj8
- Sabine's Crisis in Science Series:
  https://www.youtube.com/watch?v=KW4yBSV4U38
  https://www.youtube.com/watch?v=gMOjD_Lt8qY
  https://www.youtube.com/watch?v=HQVF0Yu7X24
  https://www.youtube.com/watch?v=cBIvSGLkwJY
  https://www.youtube.com/watch?v=KBT9vFrV6yQ
  https://www.youtube.com/watch?v=LKiBlGDfRU8

Timestamps:
00:00 - Introduction to the Physics Crisis
03:29 - The Role of Experiment in Physics
06:21 - Internal Contradictions in Quantum Gravity
08:21 - Progress in Theoretical Physics
11:01 - Serendipity and Discovery in Research
12:09 - The Role of Funding in Physics
15:33 - Overproduction of Models in Academia
18:16 - Focus on Solving Inconsistencies
19:51 - The Crisis in Science
32:32 - Overhyping Research Possibilities
37:27 - Mistrust in Science and Academia
42:08 - Humor in Science Communication
57:46 - Addressing Problems in Academia
58:56 - The Scientific Underground and Job Market
1:02:29 - Academic Exodus
1:05:42 - Critique and Counterpoints
1:07:28 - The Irony of Theory Development
1:12:46 - The Scientific Underground
1:15:58 - The Crisis of Scientific Progress
1:28:31 - Challenges of Quantum Gravity
1:31:59 - The Special Issues Dilemma
1:46:49 - The Future of Scientific Discovery
1:54:22 - Envisioning a New Scientific Ecosystem
1:57:30 - A Call for Collaboration

As a listener of TOE you can get a special 20% off discount to The Economist and all it has to offer! Visit https://www.economist.com/toe

New Substack! Follow my personal writings and EARLY ACCESS episodes here: https://curtjaimungal.substack.com

TOE'S TOP LINKS:
- Enjoy TOE on Spotify! https://tinyurl.com/SpotifyTOE
- Become a YouTube Member Here: https://www.youtube.com/channel/UCdWIQh9DGG6uhJk8eyIFl1w/join
- Support TOE on Patreon: https://patreon.com/curtjaimungal (early access to ad-free audio episodes!)
- Twitter: https://twitter.com/TOEwithCurt
- Discord Invite: https://discord.com/invite/kBcnfNVwqs
- Subreddit r/TheoriesOfEverything: https://reddit.com/r/theoriesofeverything

#science #podcast #physics #theoreticalphysics

Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
As a creator, I understand the importance of having the right tools to support your business growth.
Prior to using Shopify, it was far more complicated and convoluted.
There were different platforms, different systems, none of them meshed well together.
However, once we made that switch to Shopify, everything changed.
What I like best about Shopify is how seamless the entire process is from managing products
to tracking sales.
It's so much easier now and it's streamlined our operations considerably.
If you're serious about upgrading your business, get the same checkout we use with Shopify.
Sign up for your $1 per month trial period at Shopify.com slash theories, all lowercase.
Go to Shopify.com slash theories to upgrade your selling today.
That's Shopify.com slash theories.
Recently, there's been a huge hubbub and I'm sure you're aware regarding the purported
crisis in physics.
I had an interview with Sean Carroll and he defended academia.
His defense though wasn't just out of allegiance to academia. To me it seemed a counterpoise to the growing societal
mistrust in science and he sees that as a detrimental trend. Today I would like us to focus on
not only the stagnation in fundamental physics, which is a specific claim, but perhaps the crisis
in physics in
general, maybe even science as a whole.
And I'd like to hear what you have to say.
I'll bring up objections as they occur to me, perhaps other people's objections as
well.
And I want to hear your objections to those objections.
And we can even get into the issues of causes and solutions.
Okay, well, why don't you distinguish the crisis in fundamental physics versus the crisis
in physics, and then we can get to science afterward.
And also, what is meant by crisis?
Because I made a claim of stagnation to Sean Carroll, which is different from a crisis.
Right.
So actually, when I give talks about this, this is always the first thing that I say: I don't like talking about a crisis.
I talk about stagnation because it's clearer what it means.
A lot of people actually think if we had a real crisis, that would be a good thing because
a crisis is an opportunity for a breakthrough or something so we would know what to hit
on, but that's not what it is. So to me, I think calling it a stagnation is much more accurate because that's what
I see.
We're just not going anywhere.
We're just like on a treadmill, pretending we're running but not making progress. So I actually can't remember if I've ever talked about,
you know, the crisis in physics overall,
because I don't really know a lot about exactly
what's going on in physics overall.
It's just such a huge field with so many sub-disciplines.
So I tend to focus on what I call the foundations of physics, where we
think about the most fundamental questions, because that's what I know about. And so the
issue there is that we haven't really made any progress on answering the big open questions ever since they occurred, like a century ago.
And then we made progress until the Standard Model was completed sometime in the 1970s,
depending on how you count. And then you could say for the next five, 10, 15 years or something,
people were just trying out different things and they expected it to go
somewhere. And at some point, you know, sometime in the mid 1980s, maybe 1990s, it just went wrong
because what happened was that they started doing the same thing over and over again.
And it just didn't work and they never revised those methods. So we still don't know what dark matter is made of, if it's made of anything.
We still haven't figured out how to quantize gravity.
We still don't really understand how quantum mechanics works.
So we haven't really found either a better theory or some phenomenon that we could actually
hit on.
How are people supposed to resolve some of these foundational problems, such as quantum gravity, as you mentioned, or dark matter,
without experiment to guide or to test between different theories?
Because there are different proposed solutions to what dark matter is,
as well as how to quantize gravity.
It's just that they lie outside the experimental range.
So would you say that's the theorist's fault, the experimentalist's fault, no one's fault?
Yeah, that's a very good question. So actually, strictly speaking, one shouldn't lump all these problems together
because each problem is its own problem. And dark matter is quite different from the other problems that I mentioned because in this case we actually
do have some experimental data. So we know something isn't
adding up, quite literally actually. If we just take the
matter that we know of that we have in the standard model and
we put it out there, it just doesn't properly work. And so
introducing some sort of dark stuff is a fix that you can do.
But for one thing, it doesn't always work the way that it should, so it brings up
some other problems.
And also it opens the question like, what is it made of?
Is it made of anything?
And then there's this competing theory, modified Newtonian dynamics or modified gravity more generally, and we
still haven't really figured out how to rule one out or confirm the other.
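For reference, here is a minimal, textbook-style sketch of the rotation-curve version of "something isn't adding up"; the notation and limits are standard material added for illustration, not taken from the conversation.

```latex
% Illustrative sketch: Newtonian gravity with only the visible (baryonic) mass
% predicts orbital speeds that fall off outside the luminous disk,
\[
  v_{\text{pred}}^{2}(r) \;=\; \frac{G\,M_{\text{baryonic}}(r)}{r}
  \quad\Longrightarrow\quad v_{\text{pred}} \propto r^{-1/2}
  \ \text{for } r \gg r_{\text{disk}},
\]
% whereas observed galactic rotation curves stay roughly flat,
% v_obs(r) ~ const. Reconciling the two means either adding unseen mass
% (dark matter) or modifying the dynamics (modified gravity / MOND).
```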
And then these other questions which we have like quantum gravity is often portrayed as
a problem of inconsistency, and we can talk about what exactly this means.
Quantum mechanics, I've personally argued it's also a problem of inconsistency, but
I think if you ask other people in the field, they would object that this is the case.
And then there are other problems that people like to talk about, which I actually think
are not problems, which are some misgivings about the Standard Model for example, like the strong CP problem, or that it's supposedly unnatural, like why is
the Higgs mass so small, the hierarchy problem as it's sometimes called. And there are a
list of other issues that people have like the baryon asymmetry and stuff like that,
where I would say these are not good problems because it's not clear that they actually require a solution.
So you can say, I don't like that the standard model just has these three generations and we
have no deeper explanation for it, but maybe that's just how nature is. Maybe nature just
has three generations and there's no deeper explanation.
So I think it's not a good problem to work on because we have nothing really to start
with.
Like there's no problem that's actually in need of solving, if you see what I mean.
It's not the case with quantum gravity where you can say, okay, actually, if we just take
general relativity and we combine it with the quantum field theory, it doesn't work.
We don't know what to do.
So there's a real need to actually develop new mathematics.
And I think that's what makes this a promising avenue.
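For readers who want the inconsistency stated more explicitly, here is one standard, textbook-level way to put it; this is an illustrative aside, not Hossenfelder's own formulation.

```latex
% The semiclassical hybrid couples classical geometry to a quantum source:
\[
  G_{\mu\nu} \;=\; 8\pi G \,\langle \hat{T}_{\mu\nu} \rangle ,
\]
% which becomes ambiguous, e.g., for matter in a superposition of locations.
% And quantizing gravity perturbatively like the other forces fails because
% Newton's constant has negative mass dimension, [G] = (mass)^{-2} in units
% with hbar = c = 1, so the effective expansion parameter G E^2 grows with
% energy and ever more divergences appear: the theory is not renormalizable.
```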
Just for people who are unaware of some of the problems or so-called problems, one is about CP violation. There's a parameter called theta, which looks to be zero or very close to zero. And physicists don't like unexplained small values; they like to have some function that has a minimum at that small value so they can say, okay, that's the reason for the small value. And you would say those are pseudo problems, because maybe that's just how nature is. So let's talk about actual problems, which I
believe you put into two categories. One of inconsistency with data, so perhaps
the muon experiment would be that or the data from it. And then the internal
contradictions. So let's stick with internal contradictions. Quantum gravity
is one and the projection axiom of quantum mechanics
is another. Is there another or is it just those two for the internal contradictions?
Well, there are more technical issues related to quantum field theory, but it gets very mathy very quickly. So maybe let's leave those aside, but yeah, there are some other mathematical issues that people have, like for example, I mean,
you can ask like, is quantum field theory actually a well-defined framework to
begin with?
Um, which is, is, is a very philosophical question to ask, you know, if it works,
do we have to worry whether it's well-defined?
Oh, interesting. That's super interesting. I didn't know that. I thought you were going in the other direction, of progress towards well-defining QFT.
This is what I said I don't want to talk about. So yeah, because I don't really know; axiomatic quantum field theory is not something that I've myself worked on. So I know it exists, I know there are problems that people work on, but I don't really know a lot about it. So it's, you know, it's not a good topic to talk about because I'd just be talking random nonsense.
I see. I see. So something I was going to say was, okay, would you consider soft theorems and asymptotic symmetries as progress in physics or progress in double copy relations like gauge gravity dualities or gauge gravity relations?
Well, this is all very interesting in the sense that it helps us to better understand the theories that we already have, but they don't really help us.
You know, it's no, it's no new physics in the, in the sense that it's not a
fundamentally new phenomenon that has been predicted.
It doesn't really, at least I don't see how it helps us answering these
questions, like the big questions that we just talked about.
So, so of course, you know, people who work in the field, they see a lot of progress because
they write a lot of papers, they try very hard to understand the mathematics.
And I'm not saying that this is all useless, you know, especially I recently talked about
this in a video, the amplituhedron stuff.
So that's a very interesting development.
And the entire AdS/CFT stuff is also pretty cool.
They're learning lots about the mathematics of gauge theories, how quantum field theory
actually works.
There are many more structures than we thought there were. But, you know, when it comes to those big fundamental questions that we started out talking about, at least right now, I don't really see what we've learned from it about this.
So there's a long story about Nima Arkani-Hamed and, you know, the demise of spacetime. It's doomed, right? This is like, "spacetime is doomed," that's the catchphrase.
That might be it, like, so maybe this is actually the way to do it. So maybe we have to describe quantum physics in terms of these amplituhedra and that will give us a clue for how to deal with quantum gravity.
But at least at the moment, I don't see it happening.
Yes.
And you also see that as an argument from serendipity, that hopefully if we explore this, we're going to stumble upon a solution. But there are multiple different avenues we could have gone down, and we can't constantly justify our explorations by saying we'll stumble upon something that'll be useful.
So, yeah, well, the issue is, serendipitous discoveries have
happened and they'll probably continue to happen.
Um, but the question is like, how likely is it?
So, uh, I don't have a big problem with people like, uh, actually I don't have
a problem with most people, period.
But, you know, research avenues like the amplituhedron or asymptotic safety and other very math-heavy stuff might teach us something.
And yes, you know, something surprising might come out of it.
But I don't think it's, you know, just historically, it's not been a super productive
strategy and historically people have also had the advantage of having some data to go
by.
So you already raised this question, like, so this is like the big issue, like we don't
have any data to guide us to a theory of quantum gravity.
And so that just leaves a lot of space for playing around with the mathematics.
And you can overdo this, you know, there's a lot of mathematics that you can guess, which is why we have all these different approaches to quantum gravity.
Right. And so everyone has their own idea about what's the right thing to do.
That's what this channel is about.
Yeah.
So, so what, what do we do, Curt?
Right. So do we just make videos about everything and then at the end of the day, we have a vote about which is the nicest one?
I think that's not, that's not a good way to do it.
Well, for me, the amplituhedron didn't impress me.
It was interesting, but it was specifically for a certain type of quantum field theory, N = 4 super Yang-Mills, planar as well.
But then it's been generalized recently to shapeology, I believe.
I believe it's shapeology.
So now it can calculate without supersymmetry, for phi cubed theory, I believe.
But anyway, the point is that you would classify that as cool, but not progress.
Well, you know, in some sense it's progress, but it's not progress on the questions that we talked about.
Right? So at least I don't see how it's the case.
So we can have an endless discussion about what exactly do we mean by progress?
Right?
Is it progress to write a paper?
Yeah, well, so I think personally what I'm talking about is making progress on finding
some sort of new physics.
And this, to my mind, is better understanding the theories that we already have.
And yes, you can call this progress in some sense, but it's, you know, it's not personally
what I'm interested in.
And honestly, I think it's not what most people are interested in when they, when they talk
about the foundations of physics, you know, and then you come back and say, yeah, but
actually, you know, we have calculated some diagram to three more digits or something, and it's like, oh yeah, cool, nice.
So there's a great diagram in one of your videos and I'll overlay it on screen.
I'm going to make it simpler than it is: you show the standard model's prediction along the x-axis. Let's just imagine the standard model's predictions are a straight line instead of a curve, and then deviations from that are what people predict. But we've only tested from zero up until here, and so far the standard model is correct and its predictions continue along the x-axis. But then people's theories all diverge from there, and then they say, look, I'm falsifiable, because at some other energy range you can either prove me to be correct or incorrect. And then it's generally shown that the standard model is correct.
So we do have a variety of different theories that make predictions, but they also lie outside the experimental range.
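To make the described figure concrete, here is a small, purely illustrative Python sketch of that kind of plot: a flat Standard Model baseline, a shaded tested region, and hypothetical models that only start to deviate beyond it. All numbers are invented for illustration; nothing here reproduces the actual figure from the video.

```python
import numpy as np
import matplotlib.pyplot as plt

# Energy axis in arbitrary units; pretend experiments have probed up to E = 4.
E = np.linspace(0.0, 10.0, 500)
E_tested = 4.0

fig, ax = plt.subplots()
ax.axvspan(0.0, E_tested, color="gray", alpha=0.2, label="experimentally tested")
ax.plot(E, np.zeros_like(E), "k-", lw=2, label="Standard Model (flat for illustration)")

rng = np.random.default_rng(0)
for i in range(6):  # a handful of hypothetical beyond-SM models
    onset = E_tested + 3.0 * rng.random()     # each deviates only above the tested range
    amplitude = 0.3 + rng.random()
    deviation = np.where(E > onset, amplitude * (E - onset) ** 2, 0.0)
    ax.plot(E, deviation if i % 2 else -deviation, alpha=0.6)

ax.set_xlabel("energy (arbitrary units)")
ax.set_ylabel("deviation from SM prediction")
ax.legend()
plt.show()
```

Each hypothetical curve is "falsifiable" in principle, which is exactly the point being made: the deviations only show up beyond the range experiments have reached.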
So outside of dark matter, even though we don't know if dark matter is a particle, like
no one's detected a dark matter particle.
So we just have the distribution of dark matter.
My issue is it's so difficult to have progress in physics without experiment
guiding us.
Then the other question is, okay, well look, if progress is difficult, why are we writing
all these papers?
And so I assume that that's part of your critique.
And I wonder if embedded in that critique is, and taxpayers are paying for this.
And then I wonder how much of your issue with the paper production, academia as a whole,
would go away if it wasn't taxpayer funded?
Well, we still wouldn't make progress.
So you know, just because you stop paying people doesn't help us understand how quantum
gravity works.
But yeah, so I think it's kind of a double combination.
You know, I'd say, well, you
know, if they're not actually making progress, then what are we paying them for?
So, so it's like a double insult.
It's not only that you think they should actually make a little more effort, let me put it that
way.
But actually, we also have to pay for it.
So but yeah, just, you know, just stopping paying them isn't going to help much.
At least I don't think so.
We have to think about some other way to do it.
So I really think it's a systemic problem.
And I also think that it's very field dependent
exactly how these problems manifest.
But at least in the foundations of physics, we're seeing this overproduction of predictive
models that are being ruled out over and over again.
What I've tried to convey is that it's a methodological problem that physicists even think this should work.
And I really think it's some sort of misunderstanding of the philosophy of science.
They think that just because you can write it down in mathematics and it's falsifiable,
because in principle you can make a measurement and
rule it out, it's good science.
And I've tried to convey to them, that's not right.
For it to actually be a scientifically useful hypothesis, it's got to solve some problem.
The only thing you do is that you add something on top of the stuff we already
have but we don't actually need this for anything. Just on a purely mathematical level, it's
fairly obvious why this doesn't work because there are infinitely many of these possible
predictions that you can make, so the probability for any one of them to be right is zero. So I'm not surprised that it doesn't work. And as you know, in my book,
I've argued that if you look at the history of physics, the theoretical breakthroughs
came from solving problems of inconsistency. And so this is just an observation,
which I believe to be correct.
I don't actually know why this is the case,
but it seems to work.
And so this is why I've been saying
we should focus on solving inconsistencies,
like for example, in quantum gravity, quantum mechanics,
and forget about these pseudo problems
like the hierarchy problem, strong CP problem and so on.
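For concreteness, here is a brief, standard statement of the two "pseudo problems" just named; the formulas and rough numbers are textbook material added for orientation, not the speakers' words.

```latex
% Strong CP: QCD permits the term
\[
  \mathcal{L}_{\theta} \;=\; \theta\,\frac{g_s^{2}}{32\pi^{2}}\,
  G^{a}_{\mu\nu}\tilde{G}^{a\,\mu\nu},
\]
% and neutron electric-dipole-moment limits force |theta| to be tiny
% (roughly below 10^{-10}). The "problem" is only that no accepted mechanism,
% such as a potential with a minimum at theta = 0 as in axion models,
% explains why it is so small.
%
% Hierarchy / naturalness: quantum corrections to the Higgs mass scale roughly as
\[
  \delta m_{H}^{2} \;\sim\; \frac{\lambda^{2}}{16\pi^{2}}\,\Lambda^{2},
\]
% so if the cutoff Lambda lies far above the TeV scale, the observed
% m_H ~ 125 GeV looks finely tuned, unless, as argued above, nature
% simply is that way and no explanation is required.
```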
Looking for the perfect holiday gift?
Masterclass has been my go-to recommendation for something truly meaningful.
Imagine giving your loved ones the opportunity to learn directly from world-renowned instructors.
With Masterclass, your friends and family can learn from the best to become their best.
It's not just another streaming platform, it's a treasure trove of inspiration and personal
growth.
Whether it's learning critical thinking from Noam Chomsky to gaining practical problem-solving
skills with Bill Nye, or exploring the richness of history with Doris Goodwin, or my personal
favorite which is learning mathematical tricks or techniques from Terry Tao.
There's something for everyone.
Another one that I enjoyed was Chris Voss,
a former FBI negotiator who teaches communication strategy.
It's been a game changer to me
ever since I read about Chris Voss from his book,
and it was surprising to see him here on MasterClass,
a pleasant surprise.
MasterClass makes learning convenient and accessible,
whether you're on your phone or
your laptop or TV or just listening in audio mode.
And the impact is real.
88% of members say Masterclass has made a positive difference in their life.
For me, it's an incredible way that I discover, sometimes even rediscover, learning.
Every class that I take helps me feel more confident, even inspired.
And I've received great feedback from friends that I've recommended it to.
There's no risk. It's backed by a 30-day money-back guarantee.
Right now, they're offering an extremely wonderful holiday deal,
with some memberships discounted by up to 50%.
Head to masterclass.com slash theories and give the gift of unlimited learning.
That's masterclass.com slash theories, your gateway to unforgettable learning experiences.
Well, I imagine your critique would also apply to the various theories of quantum gravity that exist, like loop or string, because string people would say, what we're doing is, well, we've solved one of the major difficulties in combining general relativity with QFT. Namely, you can't go all the way down to zero distance because you have a minimum string length, and you get gravitons popping out of the theory without putting gravitons into the theory. Isn't that progress? And I imagine you would either say maybe at first, or you'd say no wholesale. So what do you say to that? What do you say
to the numerous approaches to quantum gravity?
Yeah, well, consistency is all well and fine, but it's not the entire story. It can't be
because there are always multiple ways to resolve mathematical inconsistencies. This
is why we have these different approaches to quantum gravity. String theory, and we can talk about which one, they all have problems,
let me put it this way, and there's a lot of argument about which one is better and
which problems are more severe. So we have string theory to do quantum gravity, asymptotically
safe gravity, and then there are some lesser known approaches like
causal dynamical triangulation. So honestly, I don't know what happened to this. I haven't heard
anything about this for some while. Maybe causal sets, which now seem to have somewhat of a
revival in this hypergraph stuff, which I find very interesting. But so in the end, you know, if
you have a theory, you still need to go and test it.
And I think the possibility, actually the need to test quantum gravity has been neglected for a long time
because physicists just thought, well, it won't ever be testable anyway,
which raises the interesting question: if that's what they thought, why were they working on it to begin with? Because then it's not really science.
In any case, the interesting thing is that in the past 10 years, this discussion has
entirely shifted to, yeah, we might actually be able to test it.
I think the reason that this shift happened is that for a long time when people talked about testing
quantum gravity, they had in mind what's called the strong gravity range, where you would
actually see, is it strings or is it loops? So you'd be able to tell apart the details of whatever's going on at high energies. But actually, you should also see quantum
gravitational effects in the weak field limit, which we can
test in the laboratory in principle. We're not quite there
yet, but maybe we'll get there in a decade or two. And I can
talk more about this, but I think that's a very interesting shift because
people are now actually thinking about how to make predictions for experiments that might
actually happen at some point.
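As one concrete order-of-magnitude example of the kind of weak-field, laboratory-scale estimate this shift involves, here is a sketch of the gravitationally induced entanglement idea; the formula and the plugged-in numbers are illustrative assumptions added here, not a description of any specific proposal discussed in the episode.

```latex
% If two masses m, a distance d apart, are each held in a spatial superposition
% for a time t, their Newtonian interaction imprints a branch-dependent phase of order
\[
  \phi \;\sim\; \frac{G\, m^{2}\, t}{\hbar\, d}.
\]
% Detecting the resulting entanglement would show the gravitational field itself
% was in a superposition: a weak-field but genuinely quantum-gravitational effect.
% Illustrative numbers, m ~ 10^{-14} kg, d ~ 10^{-4} m, t ~ 1 s, give phi of
% order one; the hard part is creating and holding such massive superpositions,
% which is why "a decade or two" is the kind of timescale quoted.
```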
Yeah, I actually, I spoke to Ivette Fuentes about her proposal for how to test quantum
gravity in the lab and then
same with Chiara Marletto. There are different proposals that are being
articulated. Now, you said solving internal inconsistencies is one thing, but again, there are a variety of them, so it seems to me that's not a sufficient condition; it's a necessary condition. What else would be a condition then?
If you were the grant body, Sabine's the grant body, and people are coming to you with proposals
for you to give some money to someone in the theoretical physics end, what are you looking
for?
You know, honestly, I would just tell them, don't ask me.
Like, I think it's not a good procedure.
We should go above this in some smart way.
I mean, so the most obvious thing that I always tell people is learn from your mistakes.
And this is exactly why I am so annoyed
about what's going on in the foundations of physics,
because that's not what they're doing.
Like, they're just making the same mistakes over and over again.
And so, and this is also the reason why we see so little novelty because they're just,
you know, continuing to repeat the same stuff in, you know, slightly different variations,
but there's no big change.
And then also this entire societal problem where people who try something
different have trouble getting funded.
So, yeah, so I guess this would be one of the major points.
I think there needs to be, if you want people to start something really new, they need more time
than you have on most of the current grants that you get, which are typically two or three years, which in practice means that you have to think about applying for
the next grant the moment you start working on the current one.
And it just makes people very, very risk averse because, you know, you have this pressure at your back that you need to bring in money again soon. So you need to produce results, like, reliably
several papers a year or something like that. Otherwise it doesn't look good. So you can't just say, I'm going to think about this big question for the next five
years and see what comes out of it. It's not possible. And of course, you know, you don't
want to hire like 10,000 people who just sit around and think big thoughts. That doesn't
make sense either. But I think you need a few people at least being able to do this. And it looks to me like we just have too few people who can do it.
And it's becoming fewer and fewer instead of more and more.
And the other issue which I've talked about in my first book quite extensively is the issue of self-reinforcing bubbles that you get in research, of which
I think string theory is a very good example, string theory and also supersymmetry, you
know, all these beyond the standard model predictions that they made for the LHC which
didn't work out. Because what happens is that once you've done your PhD in a particular area,
you get a postdoc in it, you apply for grants on it,
at some point it becomes basically impossible to change topic
because no one will give you money, no one will hire you,
because it would take like a year or maybe two years to learn something new.
So I think a very simple thing we could do to prevent this self-reinforcement (because people can't get out of the field, they're constantly forced to tell themselves that this is what they want to do and to attract more funding to the same thing) is to just give people a chance to start something new without the pressure of having to produce papers immediately.
Like, so some kind of, you know, refocusing grant,
or maybe that's not a good word, you know,
redirection grant.
Yeah.
Interesting, okay.
So it sounds like two solutions, okay.
One problem that people point to is the reliance on grants, but you're saying it's not just a reliance on grants, it's a reliance on these short-term grants, short-term being two to five years, because in order to make large progress, you need much longer timescales.
So an over-reliance on short-term grants, and that the grants don't pay you to switch topics, and switching topics may be necessary.
So we have sour grapes, which is that if you're trying to achieve something and then you fail,
you look at that and then you start to scorn that.
But we have the opposite, which is if you have something, you start to adore it.
In string theory, you start to adore it and start to think that it's what's required to
bring you forward or the field forward.
Correct my incorrectness here.
I'm pretty sure that's part of it, well, so I don't know, like, I haven't psychoanalyzed the entire string theorist community.
Like, so I mean, I've interacted with some of them and I think that most of them, they actually believe in what they work on.
So I think it's genuine and I think it's mostly because the people who are not actually convinced
that this is a good thing to do, they just leave because it's not that the working conditions are
so great. So you know, you need to be, you have to have enough motivation to take some pain,
basically, to stay in academia.
And so I think most of these people, they're generally motivated, but much of it comes
from the social reinforcement.
Like they live in a community where people constantly tell each other that this is the
right thing to do.
And especially in the string theory community, this was at least 20 years ago when I had
more to do with them, it was quite extreme.
Like they rarely ever talk to someone who was not a string theorist.
And so it's actually, again, this is just my impression, it's actually become much better
since there's more of the AdS/CFT stuff, because that ties into other research areas like condensed matter physics, heavy-ion physics, and that sort of stuff.
So it's less isolated now.
Yeah, that was the point that I brought up to Sean Carroll.
Cause Sean was saying, well, I was suggesting to Sean that he was
misrepresenting the views that are saying that there's a crisis in physics.
Because the people who are saying that are
actually making an extremely specific claim about theoretical physics and
fundamental laws not just physics as a whole and he started spouting off
engineering feats like look at topological phases of matter or
condensed matter physics and so on and so on and I was saying okay well but
that's not what your opponents mean and then I was saying, okay, well, but that's not what your opponents mean. And then I was saying that if there is a crisis and not just a stagnation, it's a combination
of a stagnation and a silence with the silence being that in the string community, say, they
don't listen to theories that are outside of it.
In fact, they don't know them, and then they'll say that it's the only game in town. Then when I speak to string theorists and I bring up alternative theories, well, they couldn't recapitulate them in a manner that the theorists would agree with.
So when I, like 20 years ago, when I actually worked on quantum gravity, mostly on the phenomenology
of quantum gravity, so I've always been interested in how can you test this stuff.
And you can try to squeeze some predictions out of string theory
phenomenology or you can take loop quantum gravity or you can take other approaches.
So this is why I've interacted with people who've worked on all kinds of different stuff. And it's
certainly true that usually they didn't know what the other people were even talking about,
like they had a lot of prejudices. And sometimes, you know, it was a little bit silly.
So I don't really know what's happened in the past 15 years
because I just stopped working on this.
But as I said, it's like my impression is just vaguely,
it's like the entire loop quantum gravity stuff has
totally spun off into its own direction. Like, so no one has any idea what these people are doing. They just, you
know, they just do their stuff. And then you have the string theorists, which, and then
the entire community split up into two camps, basically. So there's one, this is now the
much smaller camp, are those which think that string theory
is like a theory of everything, and it will give us the correct theory of quantum gravity
and all that kind of stuff.
And then you have those people who think of string theory as a useful tool to better understand
quantum field theory.
And this is where most of the AdS/CFT stuff now lives. It's a tool to better understand strange metals.
This is a typical selling point that people like to raise.
And it's possible.
Generally I think that the possibilities have been hugely overhyped, but I would probably say this about pretty
much everything. So it's not specific to string theory. But yeah, I mean, so this is not specific,
I think, to this particular research direction. You see this in a lot of other areas too,
that people are kind of forced to over-specialize and they fall apart into different camps.
And they have to have some reason for why their stuff is better than that of the other
people.
And often they do this by just ignoring what the other people do.
You also see this in astrophysics in this debate between dark matter and modified gravity.
It's actually quite shocking how many of the people who are totally
opposed to modified gravity have no idea how it actually works. Like, so I've seen this
with my own eyes. They're just like, no, we know that dark matter is the thing. And then
they point to something totally silly, like the Bullet Cluster, you know, which, you know, the Bullet Cluster is like this one observation, which is neither
here nor there.
Like, so it's generally hard to explain with anything, it seems to be a statistical outlier, whatever.
But they have like this catchy image, you know, with the blue and red.
You should show this image while I fumble around here, otherwise it doesn't make any sense.
And they say, this proves it.
If you look at the details, it's
just not true. And so I always find this a little bit distressing that actual astrophysicists
who should know this stuff, like who should actually look at the details would look at
this. Like, is it actually true? Does it actually rule out modified gravity? Because it doesn't. You could as easily spin a story
that says it rules out dark matter.
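Since the complaint is that critics often do not know how modified gravity actually works, here is the standard one-line version of MOND for reference; this summary is added for orientation and is not a claim about which side of the dispute is right.

```latex
% MOND modifies the dynamics at very low accelerations:
\[
  \mu\!\left(\frac{a}{a_{0}}\right)\, a \;=\; a_{\mathrm N},
  \qquad a_{0} \approx 1.2\times10^{-10}\ \mathrm{m\,s^{-2}},
\]
% where a_N is the Newtonian prediction and mu(x) -> 1 for x >> 1,
% mu(x) -> x for x << 1. In the deep-MOND limit a = sqrt(a_N a_0),
% which for a point mass M gives flat rotation curves with
\[
  v^{4} \;=\; G\,M\,a_{0},
\]
% i.e. the baryonic Tully-Fisher relation. Whether observations like the
% Bullet Cluster or the CMB peaks rule this out is exactly the dispute
% described in the conversation.
```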
And there are other stories
like with the cosmic microwave background.
And this is all stuff that Stacy McGaugh has been going on about forever, how actually the peak of the CMB actually ruled out dark matter.
If you look at the predictions, I looked at this at some point, it's actually true.
You know, they made a prediction which actually disagreed with what was observed, but they
did not conclude that dark matter has been ruled out.
Instead they, you know, they fumbled around with the theory.
And so, and the distressing thing about this is that if you talk to astrophysicists, they
will not know
this stuff.
So it's not in the code of their group, basically.
So it just becomes forgotten.
And this is what I find terribly distressing.
It's one thing to draw different conclusions from the data, that's fine.
But just not even know what's going on elsewhere.
That's not good.
That's a serious community problem.
So then do you see another grant being,
hey, know your competitors,
or do you think that's just,
that's part of your job as a physicist?
Yeah.
And so I think that physicists,
not just physicists, like this is an overall science thing,
they need to be more conscious about these social dynamics that force them into some corner where some things can just
become totally forgotten or they're so convinced that something is actually correct, they never
themselves go and check it. And so this is, I find this hugely worrisome. And this is also the reason,
by the way, why we had all these wrong predictions for what would happen at the LHC. You know,
all this stuff about supersymmetry and dark matter particles were supposed to show up in extra
dimensions and gravitons and black holes and whatnot. Well, you already alluded to this with the theta parameter.
They were all predictions based on this idea of naturalness, which is not a scientific
criterion.
It's ultimately, if you look at it, it's an argument from beauty.
And I've tried to talk to people in this community, like before the LHC even turned on.
So I was trying to figure out, why are you using this?
Why do you believe in this?
And they insisted it's a mathematical criterion.
And you can still see this, like, on the arXiv today.
People still use these arguments.
And I think this is just a fundamental misunderstanding of what's going on.
So they've grown up learning that this is how you construct
a theory, that this is a good criterion. And they never questioned this, where I would
say, well, you should, you know, you should not just trust these other people, what they
say.
And so, yes, so in the beginning, you asked this question about trust in science, and this is like one
of the key reasons why I have difficulties trusting scientists, because I've seen this
going wrong in my own community.
This is a great point to talk about the general mistrust in science, but also your issues
with science in general.
So let's get to the so-called crisis in science.
I believe, let me double check if you have a YouTube title.
Well, you could tell me if you have a YouTube video
called the crisis in science
and it wasn't referencing someone else,
like you're actually stating there's a crisis in science.
Yeah, yeah, I've certainly stated this,
but I can't remember what the title is.
Sorry, because I change the titles frequently, so I have no idea what the current title is; you'd have to look it up.
Okay, so let's give some behind the scenes to people who aren't YouTubers and don't
know about testing thumbnails and testing titles, and then they just see something like
the crisis in physics, let's just imagine you came up with a thumbnail like that, or
your team did, and then they say, Sabine, what you're doing is click bait
and you're the one contributing to the distrust in science
with such inflammatory remarks.
So what do you say to that?
Give people some behind the scenes.
Look, people used to make fun of how bad my thumbnails are.
So if they now say your thumbnails are click bait,
I'm like, yeah, finally figured out how to do them.
But somewhat more seriously, I think most of the people who complain about titles and clickbait and, you know, YouTubers are just doing this for the clicks and whatever, they don't know the first
thing about how YouTube works. So if it was that easy to do a clickbait video and get rich with it, everyone would be doing
it.
So, yeah, I mean, as you already alluded to, so typically we write down a whole set of
titles, we produce a whole set of thumbnails, which luckily now you can automatically test
at least three of them.
If you have more, you have to iterate it a few times.
And the thumbnail and title you see at the end is the one that works best.
You know what's great about ambition? You can't see it.
Some things look ambitious, but looks can be deceiving. For example, a runner could be training
for a marathon or they could be late for the bus. You never know. Ambition is on the inside. So that thing you love, keep doing it. Drive your
ambition. Mitsubishi Motors.
And so that's how it works. And yeah, I mean, so of course, everyone,
I think on YouTube tries to find a way to communicate the message of the video to their
audience in the best way.
And typically I just try to find something, you know, sometimes that's a really dumb way
to summarize what's in the video.
So in this video I explain XYZ because I found that the titles that really don't work
are those which misrepresent the content of the video.
Often that's not deliberate,
but I look at it later and I'm like, I think people will think it's about something else than what it actually is about.
And it's surprisingly difficult. I mean, there's a reason why big newspapers have headline writers, because that's not simple. So, you know, I know this now sounds like I'm apologizing for something. I'm just trying to say, you know, it's not as easy as people make it sound.
I usually just try to be really to the point.
Uh, and I guess that's kind of my trademark.
You know, I just say things as they are, and some people find it offensive.
Um, I suppose that's just how I am.
Have you always been that way?
Yeah, pretty much.
You know, a lot of people blame it on me being German.
Maybe that's part.
I think you blame it on yourself being German.
Yeah.
I mean, to some part, I mean, I've heard this, I've heard this a lot.
Like, this is a stereotype.
Like Germans are, this is what I've been told.
Okay.
So Germans are very direct and they often come across as impolite or unfriendly.
And if that's what people think, then I guess that's true.
I'm not deliberately trying to be impolite or unfriendly.
And I think most Germans wouldn't see me that way.
But yeah, I mean, so I guess I've always been more the kind of person who said,
I think that's bullshit. And then I just say that it's bullshit. Yeah. And I guess some people find
this appealing somehow. Yeah. I think that's an inveterate personality trait. I think that's trait
agreeableness in the big five model of personality, which has a heavy genetic component.
So maybe there is some truth about Germans though.
I need to see a population study.
Definitely got it from my mother.
Okay.
Now, what about humor?
Yeah, what about it?
Your videos have plenty of jokes.
Are those your writing?
Is it your team's writing? Is that something
that runs in the family? Is it something that comes naturally to you?
No, I actually do them myself. I've tried to outsource them to other people, but it
doesn't work very well. I tend not to find other people's jokes funny. There's also the issue
that I try really hard to make my jokes kind of intelligent, if that makes sense.
So they're actually often about the science.
Some of them are insider jokes that I think most people won't really understand.
So it's really hard to leave this to someone else.
So I've been doing it myself.
Some days I feel more funny than other days. You know, sometimes it's really difficult to come up with some joke, which is also one
of the reasons why I used to make these phone jokes, you know, that the phone would ring
and it would be the president and I would be like, oh yes, we will adjust the fine-structure constant immediately, or something.
And the thing is, like, it took me a really, really long time to come up with these jokes.
I now have much more respect for stand-up comedians and people who are professionally funny,
because it's just so difficult. At some point also you don't want to repeat yourself too much. So this is basically why I, you know, every once in a while I do it, but not
as regularly as I used to.
I used to do stand-up when I was 18. And what you do, yeah, what you do is you get an open mic, your five minutes in the open mic. And I told myself, every time I go to an open mic, I'm going to come up with a new five minutes, and it was so much work. I did extremely well the first few times, and then I bombed on my fourth or fifth or sixth time, and I remember that crushed me, because I did so well the first few times and I just had this pride and this overconfidence. And then I stopped doing it for a year afterward because I was so traumatized from that.
So when did you start doing it again?
Well, afterward I continued doing it for a while, and then it petered off because I became a filmmaker and I used my math skills for that, and then became a podcaster using the math and the physics from my undergrad.
Okay, that's pretty cool.
I would actually like, yeah, I'd be very interested in hearing some of this stuff. But yeah, so yeah, no, think about it. I'm serious.
It's embarrassing.
Well, it can't be more embarrassing than my stupid jokes. But yeah, so if you talk about
science, you know, if I do a video about, I don't know, the amplituhedron or something, there's this additional problem that people need to understand it's a joke. Like, so
if you go to a comedy show, you know you're supposed to laugh.
Yes, yes.
So I kind of have to, I always have to make sure that people actually know it's a joke
and now you're supposed to laugh.
Well, part of that could also be the editing.
So when you're about to launch into a joke, the camera zooms in or your team cuts in.
We do this, right, right.
And you know, I have kind of this joke voice, you know, with kind of signals to people,
now that's a joke, you can laugh now.
But it doesn't always work.
You know, some jokes just, I think, just fall flat.
People don't really see what's funny about it or something.
So yeah, it's much more difficult than most people think, I guess.
Do your kids find you funny?
So, I don't know.
Do you have children?
No.
So the weird thing...
Yeah, well, that's a start.
That's supposedly the precursor.
The thing about children is that up to a certain age, they don't understand jokes.
And I found this to be, it's kind of bizarre because I constantly make little jokes, you
know, little ironic, you know, statements about, you know, something like, oh, really,
I would never have thought about this, you know, if I drop something, right?
And you have this five year old who says,
mommy, you just dropped an egg.
And I'm like, oh really?
You don't say, you know, this sort of thing.
Children don't understand that.
It's a joke.
And so they'll take it seriously.
And then they will point out, yes, there's an egg.
And you're like, yes, yes, yes.
And so at a certain point, they start understanding this like around the age of
nine or 10 or something. And mine are now in the age where they start making their own jokes.
So I find this very interesting. So it's like, you know, they're kind of trivial jokes at this
point. Sarcasm is particularly simple, I think.
I actually try to not use it all that much because it's kind of too simple.
And then, you know, people.
You mean to say in your videos.
Yeah, it comes across.
So I've actually, of course I've tried ChatGPT to get it to write jokes, but most of what ChatGPT generates are sarcastic remarks.
And they always sound more or less the same.
So it becomes really boring.
Um, but yeah, I guess that's like the simplest thing to do. And, you know, I've got like three, four books titled How to Be Funny or The Most Common Types of Humor and stuff like that.
So that's hilarious. Oh, that's hilarious.
Yeah, that's hilarious, isn't it?
That's so funny. Yes, yes, you're studying it. It's like, it's bad.
Yeah, go on. I'm a scientist, right?
Yeah. This is Sabine. I'm sorry, Sabina. I should be saying Sabinuk, right?
Yeah, yeah. But don't worry about it.
All right. Sabina, one of the reasons I got into comedy, I never said a single funny thing until I
was 18.
And the reason was I loved Seinfeld, I loved watching, I loved any comedy, but I never
thought I could just produce comedy.
And then I thought, okay, it turns out Seinfeld himself, Jerry Seinfeld, didn't say anything
funny at least not to his family until he was 22.
He just decided to be a comedian once he graduated.
And I remember thinking, why can't I study how
comedy works in the same way that I study a math problem?
I was taking real analysis when I was 17 and I
remember some of those problems were extremely difficult.
I'm thinking, okay, why can't I just treat jokes like that?
I started writing out jokes and I started finding the formulas.
Like I mentioned, when I went on stage,
I did extremely well the first few times.
The guy thought that I've been doing it for months.
So this analytic quality of analyzing jokes does work.
Yeah.
Yeah, I've noticed the same.
I think it works up to a point because I look at the really good comedians.
It's a lot to do with the way of presenting things.
It's a lot about the acting, you know, they're just funny in themselves.
And I'm just, I'm a bad actor. Like I just can't do it. I think it's like, you know,
expressively, I'm not funny. So this is why the sort of jokes that I make, they're kind of not funny funny, in a sense.
You know, they're funny because they're not funny.
Yeah.
Yeah, so I try to do what I can.
How did we end up talking about jokes?
You were what?
I was so rationalistic when I was 18 that I remember saying, if I'm going to come up
with a joke, it has to be such that a robot could read this joke in the same intonation and it would get a laugh, that it's not going
to get a laugh because I'm doing something physical with my body or I have the correct
confidence.
And I didn't like Kramer from Seinfeld because of that.
So I remember thinking that about slapstick, and I wanted to be like Seinfeld.
I looked up to Seinfeld and he said he analyzes jokes with a scalpel.
He picks them apart and down to the syllable.
And I like that.
I like playing with the words.
But anyhow, when you and I met at the Institute of Art and Ideas, you talked about how you tried to get people to write your scripts before, you tried to outsource some of it, and with people who are researchers, not just script
writers.
But it didn't work.
Can you talk about that?
Yeah.
So it was not a 100% failure.
I've worked together with some very good people who saved me a lot of time by doing the research, you know, going through the literature, finding
out what are the key references, reading them, what are the key points in the references.
But I've found it very hard to find, basically impossible, to find someone who's both good
at doing the research and actually writing a script for YouTube.
Right. And so in the end, I always ended up doing most of it myself. I also had a few script writers who
unfortunately brought in some mistakes that I only caught in many cases coincidentally,
because it happened to be something that I knew and then I became suspicious and I started looking up all this other stuff.
And so I've cycled through a lot of people very quickly.
So one thing, for example, which has driven me nuts is that I've always told people, like,
so I'm a little bit over organized, so if you start working with me, you get a long
sheet which says what you're supposed to do and not supposed to do because I don't like to repeat myself.
And so one of the key things is always, don't trust secondhand references,
like never ever. You know, a website that says, you know, there's been a report and the report
says XYZ, you have to look at the report and you wouldn't believe how often you look
at the original source and it actually didn't say this thing.
And I still, you know, some people, you know, it doesn't matter how often you tell them,
don't do this, they'll still do it.
And they think they'll get away with it.
And unfortunately, in the end, I'm the one who's responsible for it.
Like, so, and I always think that I should check everything, but of course I don't.
And I don't always notice, and this is why shit happens.
In the end, I'm the one who gets blamed, right?
So now I mostly, I pretty much do everything myself at the moment, which is also not a good arrangement.
This is why I mostly talk about stuff that I know about myself, at least a little bit.
So here's something else.
We tell our kids that we're not supposed to judge a book by its cover.
We even claim to be living by that philosophy ourselves.
You're just supposed to review the content itself, not the packaging of the content.
It's on the inside that counts.
However, people will at the same time claim that they dislike the YouTube titles.
So why?
If the content is there, then why do you care?
Is it that we're not supposed to judge a book by its cover, but please do judge it by its
thumbnail?
Or are we not supposed to withhold judgment until we understand the substance or content?
Like, which is it?
And another one that I see is you're supposed to judge people by their own merits.
But now it seems like people are judging others based on who does their audience comprise.
So in other words, it's like you're not supposed to judge a book by its cover,
but you can by its sales demographics.
Yeah.
Well, so first of all, I don't really see this a lot in my comments, and also like generally, like on YouTube at least. You know, there are weird things going on on X, formerly Twitter, as you've probably noticed, like, it's a very strange place and it's becoming stranger every day.
But at least for what I see in my YouTube comments, to the extent that I can read them,
like I mean, I get thousands of comments a day, so I can't really read them all.
I try to read as many as I can.
They're mostly from normal people, I would say.
They're interested in physics.
The interested layperson is how I would describe most of the audience of my channel.
Many of them are also students of one field or another,
typically physics and associated disciplines.
And of course, many of them,
and I know that you get this too,
are independent researchers.
They work on their own stuff.
Right.
And they have their own ideas about everything.
And it's fine with me.
So I don't really see a big problem with this. And then you get what you call the science deniers, who are looking for a reason to dismiss some scientific finding that they don't like. Very often that's climate science, because it doesn't fit with their political views or something
like that.
And they'll jump on everything which proves their point that you can't trust scientists.
The thing is that I understand where these people are coming from.
As I said, it's like, so what I've seen going on in my own field basically destroyed my trust in scientists, and I haven't gotten it back because they haven't done anything to fix the problem.
I have no reason to think that the same problem does not also exist in other disciplines,
which is why I've looked very closely at what they do in
climate science. And I've actually been to some conferences and I've talked to climate scientists.
I've interacted with a bunch of them, which is why I can now confidently say, no, climate change is
not a hoax. It's a real thing. And climate science, the community has its problems, but the problem
is not that they're making up climate change. My approach to this science denier problem
is to take them seriously because there was an origin for their mistrust and to try and
address this, like, you know, on a substantial basis to look at the science and say, no,
this is why the evidence is sound.
And so people still have this mistrust of the institution of science.
And I think the only way that we can address this problem is to actually make it better.
So in other words, you're counter to the people who are saying like, look Sabine with your
scintillating rhetoric, people love the word rhetoric, your scintillating thumbnails or
what have you, that you're contributing to the distrust or mistrust of science.
You're like: no. Firstly, there are some people who will always abstract away and decontextualize something and use it as ammunition for whatever their cause is, so that will always be the case. But secondly, you're bringing up issues that aren't talked about, and now scientists are talking about them, people in academia are talking about them. Many people in academia agree with you, by the way.
And so you're contributing to the public trust of science,
or at least you're trying to bring that trust back by strengthening science.
Is that a fair recap?
Yeah, I think it's a mistake to try and sweep these problems under the rug,
because at least I think they're totally obvious.
So here's an interesting thing that a lot of people in academia like to forget.
I think it's something like 90% of people who do a PhD or who do something in academia
leave.
So there are a lot of people out there who have first-hand experience with academia who no longer work in academia.
And those people know perfectly well what's going on.
And I actually know from the feedback that I get that a lot of the people who are very
concerned about what's going on in academia, they know what they're talking about.
They've seen it with their own eyes.
So it's not as easy as saying that all climate deniers, you know, they're all somewhat weird in the head or something.
So, uh, I think this is a serious problem.
We need to do something about it.
And the first thing you need to do is to acknowledge that you have a problem.
So I was surprised when you said earlier
that there aren't enough people
that are thinking
about the foundational issues because to me there's an overproduction of these people.
And in my estimation from what I see from being on the inside, formerly on the inside
of academia and also from speaking to people who are on the inside and just seeing this
whole trend, it seems like we're producing many, many physicists, or people with physics skill sets and people with mathematics skill sets, and then they don't have a place to go. So I'll read a recent tweet. Someone
said, me after finishing my PhD, surely I'll be able to find some jobs in my field. No. Next. Okay,
how about getting access to the online journals so I can continue my research? No. Okay, at least I
can still upload my work onto free online pre-print repositories, right?
Also no.
And then you even commented this really shouldn't happen.
I feel we need to create a scientific underground.
So do you also see that there's an overproduction of scientists or you don't see that?
And I want to know more about this scientific underground as well.
So the overproduction of PhDs, like this is a longstanding problem that has been discussed
forward and backward.
And the, like, I think most people who I've talked to about this agree that this is driven
by the need for cheap researchers.
So a lot of this bringing in grants depends on how quickly you can churn out papers.
And the more students and the more young postdocs you have that you can put on cheap positions,
the better.
And you see this in a lot of institutions; I've seen it with my own eyes, and actually I was one of them: you hire students and young postdocs, you put them on these super cheap jobs, they produce their three papers, you kick them out.
But for most of them, there are no positions to land on.
There are just not sufficiently many jobs.
And the reason this doesn't change is that the people higher up in the hierarchy, like
the professors, especially the younger ones, they need these people to produce all these papers.
So, and I think this is why this isn't changing. I don't really know what to do
about it. I'm just telling you, I think that that's what's going on. I remember
this tweet and the reason I thought it was interesting is that this is a problem I've seen with a lot of people who I know who've
left academia, mostly voluntarily, and they went on to take some other job, something
that would feed the family more reliably, basically.
But they don't lose their interest from one day to the next.
In many cases, they have unfinished research, or maybe they have some other research things that they want to work on.
And you know, they do it on the weekend, you know, out of passion.
But then they have the problem that since they're no longer affiliated with some institution,
some journals won't even accept their submissions, which
I think is just totally crazy.
Why do you need a university affiliation or something to submit a paper to a journal that's
supposed to be peer reviewed?
That doesn't make any sense.
You can also ask if they want to go to a conference, they can't apply for any funding or something
because they don't have this affiliation. And then there is also the community problem, right? You're
no longer really tied into the community and you know, you fall a little bit out of touch.
And as I said earlier, just looking at the numbers, I don't really have the exact numbers on top of my head,
but I'm pretty sure like the majority of people who have something to do with academia at
some point leave eventually.
So this is why I have this tweet about the scientific underground, which is loosely speaking
how I've come to call it, you know, it's all those people who want
to do this research work, but who are not affiliated with a university or some other
research institute.
But so what do they do?
You know, how do they organize themselves?
Why can these people not apply for travel grants?
Why can they not submit papers?
That doesn't make any sense.
And so this seems to me, if I had more time,
maybe I would do something about it.
It can't be so difficult to fund some sort of community
where people can get together
and try to find a way to solve this problem.
But yeah, I think it's a growing problem.
I see more and more people in that position who are frustrated about it because they've
written a paper.
And let me be clear, I'm not talking about some people with their revolutionary new quantum
mechanics.
It's typically, it's like super technical stuff, some sort of data analysis
or, you know, something that came out of their PhD.
It's all print out from chat.
Yeah, yeah, right.
Or actually this tweet from the guy: what he wanted to do was post his PhD thesis on the arXiv. And you know, he's already got his PhD, right? So how bad can it be? Why can't they just let him post his PhD thesis? This is crazy. And of course, I understand the arXiv has a lot of problems with finding people to moderate the papers, so there are things going wrong there. But I have even less understanding for this when it comes to journals that are actually making money with this kind of stuff.
Yeah, I'm extremely interested in independent scholarship.
So while the Theories of Everything audience has a large amount of academic researchers
who watch the show, I also, I want to contribute to academia in a way that's outside the academy.
And I see this as a large gap in a way that supplements the academy rather than opposes
it or overthrows it or what have you.
I see the universities as doing something necessary and I'd like to help that and see
where its problems are and just fill in the gaps.
But I'm not entirely sure how to do that.
So anytime, Sabine, when you have ideas, or anyone from the audience, feel free to contact me. I want to bring up something you said, well, I want to bring up a
critique that I've heard thrown at you that I think is unfair.
And then I want to bring up something that I think you said that was unfair in one of
your videos.
Okay.
So a critique that is thrown at you, which is something you just said, which
is, look, you outlined a problem.
Then you said, I don't know how to solve it.
And some people are like, look, she's just bringing up problems.
What are the solutions?
I think that's quite unfair in the scientific method.
Your propositions just live and die on their own as to
whether they're true or false.
You can state a problem without knowing a solution.
So in the early 2000s, when Lee Smolin came out with his book against string theory,
the string theorists criticized him. They were saying, look,
Lee, you're just trying to promote your own theory, namely loop quantum gravity.
Peter Woit at the same time had his own criticisms of string theory,
and then the string theorists came to him and said, look, well,
you're just criticizing without putting up any alternative.
Put up or shut up is something that people say.
I don't think that's fair.
I think a criticism should stand on its own.
It's not whether there's a solution attached to it.
Then something that you said that I think was unfair: Peter Woit now does have his
own alternative.
And I remember you in a video and I don't recall the video, so I apologize, but
you were lamenting some science problem.
And then you were saying the only people who are talking about this are me, Peter Woit and Eric, and those two have their own solutions that they're trying to
peddle. And I was like, okay, but that doesn't minimize what they're saying.
Now that was an offhanded remark in just a single video and you have thousands of videos
and so it's unfair of me to even bring this up because if you find, if you examine anyone
who has such a wealth of content, you'll find different sentences here and there to nitpick.
So I don't want this to be that I'm ill-natured or perverse in my carping.
But I'd just like you to expand on that, to tell me what your thoughts are on what I just said.
I can totally tell you. So first of all, it was mostly meant as a joke. I know both Eric and Peter Woit. And what I had in mind when I said this was that we had a meeting, and this is a really long story, which was
Eric Weinstein's idea.
And we all sat together with some other people, including Garrett Lisi, and it was about
theories of everything.
Like officially kind of the topic was my book, my book Lost in Math, in which I explained
how you should not go about theory development.
So my entire point is like you have to have a concrete problem to solve, like you have to try to find an inconsistency to resolve and then you have to try to find a way to experimentally test it.
So this is my summary. And of course you can disagree with that and maybe have a better way to do it. But the irony of the whole thing was that I ended up with these people who do exactly what I said we should not do, right? Like Garrett Lisi, Eric Weinstein, Peter Woit, this is exactly the thing that they do. And so I find this, you know, it has a certain irony to it.
You know, I don't, you know, I understand that.
When was this meeting?
Ah, Jesus, it must have been before Corona. Sorry, this is how the Germans call it. COVID.
I don't know why the Germans stuck with this, calling it Corona, but yeah, so I think it's
funny because, you know, I've tried to get people to, to understand my criticism and
even the ones who are sympathetic to it don't want to actually use it.
So I'm like, okay, all right, you know, whatever.
So what can I say?
About when you said that you have investigated climate change, and you've talked to climate scientists: it's called Gell-Mann amnesia, I believe. Actually, no, it's not Gell-Mann amnesia in this case. It's the opposite, where you've noticed how false scientific reporting can be in physics, but then you've actually investigated it in other domains, and so now you're just not a fan of not only science journalism, but maybe also scientists.
What if someone's like, okay, that's similar to a claim of racism, where you've interacted with a few women, or a few minorities, or a few of whatever, or white people, and then you've had bad experiences, and so you generalize to all white people, or to all males, or what have you, rather than keeping the claims specific and not abstracting away?
So how do you disentangle critiquing science as a whole, or scientists as a whole, from just the specific claims that you've investigated?
I think we're conflating two different things now.
So the one is my trust issue where I say,
I don't trust scientists because of what I've said.
And the other is what's wrong with science overall.
So I think when it comes to my trust issue,
it's just a matter of, I want to see proof.
I'm not saying necessarily there is something wrong in these other disciplines,
but I want to know a reason why I should trust those people. So I think the comparison is not quite adequate. Actually, I'm exactly trying not to fall for the Gell-Mann amnesia issue of saying, okay, it's only a problem in this particular discipline, because I think it's a systemic problem.
So I have no reason to assume that it doesn't also exist elsewhere.
So when it comes to the issue with science overall, I'm more relying on data, which I
talked about in a recent video.
So I've, for a long time... I think this also has some connections that go back to Eric Weinstein, who I've known for a long time.
So it's a really long story.
In any case, so I've been very interested in the economic impact of technology.
And there's a long background story about how progress seems to be slowing down, which
is what Tyler Cowen, I think is his name. I think that's how it's pronounced.
Yes, right.
Yeah, I see.
The economist.
Yeah, yeah.
And other people have been working on for a long time,
you know, how do you actually quantify the impact
of technology on our society?
And at least my understanding of what they found
is that progress has been slowing down.
There are a couple of different ways to look at it.
So actually, Tyler himself seems to have recently said there are some indications that the trend has been reversing, and other people have disagreed with that. So, you know, it's its own research field.
I see.
And I have also been very interested in what's called scientometrics, or bibliometric analysis, of what you can squeeze out of the scientific literature.
And I actually had a research project on this.
So this is another long story that I don't want to get into.
So you don't want to get into it, or you want to?
No, I don't want to get into it, because then we'll still be talking, you know, tomorrow morning.
Okay.
But so I know a little bit about bibliometric analysis and I've worked on this myself. There have been numerous papers saying that there are indications that scientific
progress is actually slowing down. You can always ask exactly how did they measure scientific
progress? Everyone does it in their own way. This is basically an entire discipline in itself, where they make up new measures.
And so the reason why I find it convincing is that it doesn't really matter exactly how they did it.
They always found more or less the same thing, which is that scientific progress has been slowing down since the 1960s, 1970s,
which is also consistent with this economic analysis.
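To give a concrete sense of what such bibliometric measures look like, here is a minimal sketch in Python of one widely used example, the CD or "disruption" index of Funk and Owen-Smith; the toy citation data and the simplified conventions here are invented for illustration and are not something discussed in the episode.

```python
# Minimal sketch of the CD ("disruption") index used in some progress-is-slowing studies.
# Toy data only; real analyses run this over large citation databases with time windows.

def cd_index(focal, references, citations):
    """Average of +1 for each later paper citing `focal` but none of its references,
    -1 for each paper citing both, and 0 for papers citing only the references."""
    refs = set(references.get(focal, []))
    cites_focal = {p for p, cited in citations.items() if focal in cited}
    cites_refs = {p for p, cited in citations.items() if refs & set(cited)}
    pool = cites_focal | cites_refs          # papers citing the focal work or its sources
    if not pool:
        return None                          # undefined without any forward citations
    score = sum((1 if p in cites_focal else 0) * (1 - 2 * (p in cites_refs)) for p in pool)
    return score / len(pool)

# Toy example: C cites A alone (+1), D cites A and A's reference R (-1), E cites only R (0).
references = {"A": ["R"]}
citations = {"C": ["A"], "D": ["A", "R"], "E": ["R"]}
print(cd_index("A", references, citations))  # -> 0.0
```

The details differ from paper to paper, which is exactly the point made above: the measures vary, but the downward trend they report is broadly the same.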
And then there is like the more, the narrative side of this story, which is what John Horgan
wrote about in his book, The End of Science, which is exactly what we started from.
Like this impression, we're not really making progress on these big open questions. Like they've been open for a long time
and nothing seems to be happening.
So John Horgan thinks that we've just reached
the end of science, which is why his book is titled that way.
And I think that that's a little bit too extreme,
but I think his observation is basically correct.
And he's actually written a new preface for the book at the 20 year anniversary,
which was just a couple of years ago, in which he says,
well, you know, I've re-investigated the points that are made in the book,
and they basically turned out to be correct.
You know, we haven't figured out what consciousness is.
We still don't know what dark matter is. We still don't know what quantum gravity is.
We're still talking vague words about what complexity is and stuff like this.
And I think it's basically correct.
And we could talk about the objections to this, but they're always more or less
the same.
I find it a little bit tiring, you must get the same thing on
social media that there are always the same objections that come from people who have
never heard of the topic before and you just end up endlessly repeating the same things
over and over again. So in this case, like the typical objection is that as science gets
more mature, it becomes more difficult to make progress, the progress slows down.
And what you can hold against this is that, on the other hand, the number of scientists is exponentially increasing, which already makes this explanation a little bit suspect.
And then the other issue is that different disciplines have different ages. If you look at something like medicine and physics, for example, physics is much older. I mean, we can debate exactly when medicine actually started, because there was a lot of hocus pocus in the early days. But serious medicine is a much younger discipline than physics, which is basically thousands of years old, especially astronomy.
What sense does it make that progress in all of these disciplines would slow down at the
same time if it's just something to do with the nature of knowledge itself?
It's not what it looks like.
It looks like it's a systemic problem with the way that we organize research.
And I understand that this is not a watertight argument,
but I think it's quite plausible.
And at the very least, you'd think it's something to take seriously,
that we have to change something about the way that we organize science.
Hmm.
Okay.
So then it's not an argument that the foundations haven't changed.
It's more about questions have remained open and not answered in quite some time.
And it's been that way for a variety of fields around the same time.
Yeah.
So that's entirely right. So my biggest problem is not that these questions have remained open for such a long time, but
that we're not getting anywhere.
So we're just doing the same thing over again, and it seems to be a systemic problem in all
fields.
So it just feels to me like something isn't quite right.
Does it make sense?
Yeah, the only field I don't see this in, where I don't see someone saying that there's a crisis, crisis in physics, crisis in ecology, everyone's in a crisis, is computer science.
Yeah, right.
And the foundations of computer science haven't changed since the 40s,
since even before physics, since Turing and Church.
Now, there are unanswered questions about
the limits of computability and complexity
issues, but those are like machine code questions that are close to the base, but they don't
question the base.
Yeah.
So, actually, I think one could make a fair point that pretty much all the progress that
we see in science overall has actually been driven by maybe not exactly computer science, but computer
science in the general sense extended to technology, technological applications, data analysis,
and so on and so forth.
I mean, certainly like in physics itself, I mean, I remember when I was a student, if you had an integral that you could not solve
that was not in a table, that was the end of your research.
You were like, okay, that's it.
Can't solve this integral.
Like, and this is just, this problem has just totally evaporated, right?
You're like, okay, we can't solve this integral.
We put it into a computer, problem solved. And this is such a trivial thing almost, how computing power
has made such a big impact in physics.
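Her integral example is easy to make concrete. Below is a minimal sketch using SciPy's standard quadrature routine; the integrand is an arbitrary illustration with no convenient entry in a table of integrals, not anything from the episode.

```python
# The "can't find it in a table" situation: hand the integral to a numerical routine.
# The integrand is an arbitrary example chosen purely for illustration.
import numpy as np
from scipy.integrate import quad

def integrand(x):
    return np.exp(-x**2) * np.log(1.0 + x**2)

value, abs_error = quad(integrand, 0.0, np.inf)
print(f"integral ~ {value:.6f}  (estimated error {abs_error:.1e})")
```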
And it's certainly also true in other disciplines. If you look at medicine, just data analysis, if you look at MRI, fMRI, all this stuff, you need to analyze this data, you need computers, right?
So, I think this is what it's driven by.
But then you can also ask, well, what made this progress in computer science possible?
And I would say, well, ultimately, it's all physics, right?
You need to know what are semiconductors, how do they work, what's a band gap, and all
this kind of stuff.
And you know, I know there are people who are arguing like, this is not like, historically
it's not how it happened.
It was more an accidental thing.
So people were tinkering around with stuff here and there.
It's not like they sat there and developed a theory for semiconductors and then they
build it.
This is not what happened.
But I still think like if you look at it from a constructive point of view in the sense
of how does science build up, it goes back to our understanding of physics, of materials,
of electricity and so on.
So this is where it comes from.
And so what we're doing right now is that we're trying to get out of the science that
we have already discovered what we possibly can.
So we're making computers smaller and smaller and there's a lot left to do.
Actually John Horgan makes it very clear in his book that this is not
what he's talking about.
So there's a lot of stuff that you can build up on the science that we know already, for
example, in computing and robotics and so on, which kind of seems to be the next big
thing that will come knocking at the door, seems to be a robot.
And so there's a lot of research left to do, but what he's concerned with and what I'm
also concerned with are these big new discoveries about the world.
And so, of course, I'm, as a physicist, I'm very opinionated about these things, but I think that physics and especially the foundations
of physics are a super important discipline because it's where we get the deepest insights
about nature.
So, if we don't make progress in the foundations of physics, then sooner or later, progress
in all other disciplines is also going to run dry.
And so, this sounds a little bit bombastic and there are a lot of people who would disagree
with this.
You probably know this entire argument about what do we actually mean by fundamental?
Is biology any less fundamental than physics?
I would say yes, but biologists might disagree. In any case, I think that if we really
want to make progress as a society as a whole,
we need to make progress at the foundations of physics
in the sense of actually discovering some new phenomenon.
Maybe understanding quantum mechanics.
Maybe it's quantum gravity, though I personally think this is somewhat unlikely.
But anything, something really.
Yes. So you don't see your work on superdeterminism as just contributing to the useless papers that are produced?
No, of course not. And even if that was the case, I'd never admit it. No, but look,
we already talked about this. Like I overthink everything. If I make a joke, I read like three books and I analyze joke structures, as you say,
and I think about the subtext and I try to find the most punchy phrase and which word
comes in which order, how do I start?
Oh, that's interesting.
Great.
Right.
And so this is also more or less how I've approached the question of what I do my research on.
This is why I wrote my first book.
It was trying to figure out what should I spend my time on because there's only so much
time in my life.
So what's the most promising thing to work on?
And I've tried to take my own advice very seriously, maybe not totally successful, but
at least I've tried.
And so I've tried to not work on the problems that I think are pseudo problems,
but I've tried to focus on the problems that I said are real problems where time is well spent.
Now the issue is that I couldn't get funding to work on those.
And this is why I'm now in the position that I am, where I'm funding my own research by making videos on YouTube.
And I rely a lot on people supporting me, like people on Patreon, people who join my channel,
who I'm super, super grateful for, because otherwise it wouldn't be possible. And so the reason I ended up working on the superdeterminism stuff is that I arrived at the conclusion that the best way to make progress is to figure out what happens in a measurement in quantum mechanics. And why this particular problem? Because we know it's something that is in a measurable range. Like, it's literally a measurement process.
Like it happens in the laboratory all the time,
and it quite plausibly has a relevance for practical applications.
Exactly because it's in a parameter range that is easily experimentally accessible,
that we actually already access with devices that
we build.
Do you have any new ideas about quantum gravity?
About quantum gravity?
No, because I've been focusing on this measurement issue.
So I've eventually, well, kind of, yes, but so I know this now sounds very confusing.
So I've for some while suspected that maybe the two problems are actually related, partly
because as you know, Penrose has this idea that actually the collapse of the wave function
is somehow caused by gravity.
And I'm not super happy with this particular model of how it works.
And so I've tried to find a different way to combine these ideas.
And I've actually like in the past month, I've actually made some progress, which I
think is pretty exciting.
Great.
I don't know how much you want to hear about it because it will get to a little bit.
Yeah, I would love to hear about it, especially if you're in collaboration with Tim Palmer
or if it's different than this.
No, he doesn't like it.
So I don't know.
Well, so we talk about it and he gives me good feedback, but I think he doesn't really
like the idea.
Okay.
So it goes back to a paper which I wrote like 10 years ago or something, which is kind of
vaguely called a possibility to solve the problem with quantizing gravity or something.
What I was trying to point out in this paper is that, just on purely logical grounds, there is a possibility to solve the problems with quantizing gravity that no one has previously talked about. All these problems which we have if we perturbatively quantize gravity, that the theory produces all these infinities, that it's not renormalizable, and all these problems for which we supposedly need string theory or loop quantum gravity, they appear at really high energies, which we haven't tested.
And so in particular, we don't know that quantum physics itself actually works the same in this energy range as it does at low energies. So the idea is: what we can do is that if we go to these high energies where perturbatively quantized gravity becomes problematic, we just say quantum effects go away. So we go back to classical physics. And so you see then
these two problems suddenly become the same problem. That's the reason we can't
quantize gravity
is that we don't understand
where quantum effects go in a measurement.
Yes.
And once you make this connection,
you have a method of quantifying
where the deviations from quantum mechanics should occur
because now you can estimate that it should come in at the size of the quantum gravitational effects that you don't want to happen, if that makes sense. So you say it has to happen in this range for those problematic quantum gravitational contributions to go away.
And so even if you don't exactly know how it works, like what's the exact mechanism that makes it go away, it gives you a way to estimate the range in which it should happen.
And I've done this and now the depressing part of the story is that I
couldn't think of any experiment that would actually access this range.
So, which is unfortunate.
Right.
So you're part of the problem.
Well, well, you see what I could do now is I could come up with a reason for why this is accessible
with the next experiment that I'm going to build somewhere.
And that's exactly what I don't want to do.
And so now I have this problem.
Okay, so I finally have found a way to do it.
I found a way to make this estimate.
Um, but, um, it's not particularly interesting because you can't test it.
So now what do I do?
So I've decided for myself, I'm going to write it up anyway, uh, and put it out
there because maybe someone else can think of a way to test it.
Um, but yes, so that's the current status.
So I'm confused about the proposal.
So you're saying that at high energies, quantum effects may diminish
because we have some problems with quantum gravity only at high energies.
Like there's low energy combinations of GR with QFT.
Okay.
And then you're making a connection between that and the measurement
problem, but the measurement problem occurs at low energies as well, no?
Yes, that's exactly the right question to ask because when I was talking about high energy,
that was a very vague way to explain the idea. So the question is exactly what is the quantity that you need to compute?
I see.
And energy, so first of all, normally when people talk about the high-energy expansion, it's not actually energy, it's something like momentum transfer, so it has the scale of an energy. But you want to use something that's actually Lorentz invariant, so energy is already bad. Then you can think about, should we use mass? But mass doesn't really make sense either.
And so I've eventually come up with a way to do it in the path integral because that fits
very well together with the paper that I wrote several years ago.
And you can then estimate the contribution from this quantum mechanical term in the path
integral, which we know what it looks like. So this is, again, the nice thing about it: we know what this contribution looks like relative to the normal term without the quantum gravitational contribution.
You can ask, well, when does the crossover happen?
Like when does this term have approximately the same size
as this other term?
And so it's more complicated than energy.
It's something like a space-time integral over the stress-energy-momentum-tensor coupled
to the metric, if that makes sense.
Something like that.
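For readers who want the gist in symbols, here is one schematic way to write the crossover condition she seems to be describing; the notation and normalization are my own guess for illustration, not her actual calculation.

```latex
% Schematic only: a guess at the kind of crossover estimate described above, not the
% actual expression from the paper being discussed. S_matter is the ordinary quantum
% (matter) contribution to the path integral; S_grav is the leading coupling of the
% metric perturbation h_{\mu\nu} to the stress-energy tensor T^{\mu\nu}.
\[
  S_{\mathrm{matter}} = \int \mathrm{d}^4x \,\sqrt{-g}\,\mathcal{L}_{\mathrm{matter}},
  \qquad
  S_{\mathrm{grav}} \sim \tfrac{1}{2}\int \mathrm{d}^4x \, h_{\mu\nu}\, T^{\mu\nu} .
\]
% Deviations from ordinary quantum mechanics would then be expected roughly where the
% two contributions to the phase e^{iS/\hbar} become comparable:
\[
  \bigl|S_{\mathrm{grav}}\bigr| \;\gtrsim\; \bigl|S_{\mathrm{matter}}\bigr| ,
\]
% which is the sense in which a space-time integral over the stress-energy tensor
% coupled to the metric sets the scale of the estimate.
```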
I'd be remiss if I didn't mention the profusion of papers produced, which was covered by The Economist. They have an
article explaining this issue simply and concisely called, Scientific Publishers are producing
more papers than ever. They confirm what you've been saying, at least from a financial incentive
standpoint. They're saying that the reasons are based on business models or subscription
fees. However, they brought up special issues as another reason.
And I haven't heard you mention that before.
Well, yeah, I mean, the special issue issue is also something that I've talked about.
So basically the issue is that publishers make money by selling stuff.
They sell subscriptions to journals or books or special issues.
But they don't want to spend the money to produce the content.
And so they've brought to life these special issues, where they recruit what's called a
guest editor.
And the guest editor invites people to write contributions to the special issue on some topic.
And then they publish this special issue and they basically sell it to libraries.
And I think it's a, so first of all, I don't want to diss all special issues.
It started from a good idea, I think, to give some space for rapidly emerging fields where you don't
have a lot of people who know each other.
So you want to collect all these people in one issue so a new field has something to
start from.
It's the same idea as with the rapid response workshops
that they do in some places, if you've ever heard of this. There's a new issue that is
emerging in the scientific community. You want to give people some networking opportunities.
They need to get to know each other. They need to develop their new language. And so the special issues came out of the same idea.
But then what's happened is that some certain publishers have abused this to cheaply produce
content and then make money by selling it.
And this has certainly contributed to the number of publications rising.
I haven't read the article in The Economist, but I think that's correct.
I'm not actually sure that this is like the biggest issue.
So this has been going on for at least, I don't know, since before COVID.
They still don't get it right.
And so what I've seen in the past three years or something is that everyone knows like this
is basically, you know, it's not really sound stuff to do and more and more people just
don't want to have anything to do with it.
So I would expect that this is actually already declining at this point.
So my ending question is a meta question, given that you're much more skilled at this
than myself.
What should we title this video?
Oh dear.
Sabine Hossenfelder talks a lot of bullshit.
I'm pretty sure of this.
Sabine's drool.
Yeah, but I have to warn you what's going to happen: you're going to get flagged by YouTube for mild profanity in the title.
Oh, interesting.
Okay.
Yeah, I don't know.
I guess pick the most interesting one.
I'm not sure if it's good to have my name in the title, if that is more attractive or
more repulsive.
But I guess there's no way around it, right?
I mean, otherwise, how are
people supposed to know what's in the video? So, okay, one that occurs to me is the stagnation
in physics rather than the crisis in physics. Now, if I wanted to be more galvanic, I could
put the crisis in science. But what would you say? I may just take what you're going to say here and
put it on and people will see. Oh dear, you're putting me on the spot.
Yeah, well, the issue, the potential issue with this title is that it must have appeared
like two dozen times already, you know, this idea with the crisis in science, the crisis in physics.
Maybe use something like, you know, related but not exactly the same in terms of words, like maybe what's
going wrong in physics.
Maybe I have already used this as a title.
So you better take this.
Yeah, I don't want to compete with the great Sabine.
Yeah, well, you get the idea.
Just don't use exactly the same words that other people have already
used because people might think, oh, well, it's just the same stuff all over again.
Why string theory is correct.
Yeah, but then people will feel misled, right?
No, no, no, of course.
Okay, it was a joke.
Okay, right.
Yeah. First of all, by the way, that's a joke. And then secondly, my definition of clickbait is when it's not just something that piques your interest, but something that doesn't deliver.
Yeah.
Right.
Exactly.
Right.
Honestly, I think when people say that my videos are clickbait, what they're referring
to is the stupid photos of me, the facial expression on the thumbnail, for which I have a very simple explanation: I don't know what else to put on the thumbnail.
You know, you must know this problem.
You talk about super abstract stuff.
What the hell do you put on the thumbnail other than yourself?
And this is why I have collected 100 photos of me making silly faces, just so there's something
to put there.
So the way that it works, just if you want some behind the scenes for our channel, is that the full podcasts, the ones that are one to three hours long, they have my attention on the title and the thumbnail text.
There's two different types of titles for people who are unfamiliar.
There's the YouTube title and then there's what we call the thumb text.
So we play around with different combinations and we test some,
but it does have my personal touch.
But then we also take out clips from those because not everyone has
the stamina or want to watch a five-hour podcast.
So we take out 10 to 15 minute clips.
And that's a marketer. I have a marketing guy, a fantastic marketing guy, who's in charge of daily clips.
And he does that.
And I tell him, just don't, as long as it doesn't say Penrose reveals his theory. Like, the word reveal would get more clicks.
We've tested that, but we've said, okay, no, that's too much.
The hidden nature of quantum, hidden is another one that we removed, even though hidden actually works.
We just tested a title called the hidden potential of the brain, and it works better than the
potential of the brain. But I and my, even the marketer himself didn't like that. And
so we just went back to the potential of the brain. But anyhow, he's in charge of the daily
clip titles and thumb texts. Whereas the full podcasts get much more of this, the fact that I'm even asking
you, getting your actual opinion, it has my fingers in it. Yeah, so you must also have tried this, like there are a variety of AI plugins that you can use to generate titles, and they're so hilariously bad. Like, hidden is one of those words, and reveals, and, you know, blasting the myth of the something
or other with lots of capitals.
You don't want to miss it.
Yeah, yeah, exactly.
Right.
Yeah, yeah.
Yeah.
It gets cringy to me.
Maybe they would work.
Yeah.
I can't even get myself like, so this is something I've discussed with my agency.
Like there are a lot of people who put capital words in the title and I've tried this a few
times, but every time I look at it, I'm like, I just can't do it.
And I remove it again.
Yeah.
Okay.
So we still don't know what you're going to call the video.
Yeah.
So I may say what's wrong with physics or something like that.
I think that what you said is correct.
And I'm going to blame you.
So if I get criticized for being clickbait, I'm going to respond to each comment.
Yeah.
Yeah.
Yeah.
You know, I find it interesting that the word clickbait is so overused at this point that it's pretty much meaningless. I remember a time when clickbait was a title that you actually, you
had to click on it to find out what it even was about, right?
So it's like stereotypically this thing, uh: she woke up on her wedding day.
You won't believe what happened next.
Like, so this is like the stereotypical clickbait type.
This grandmother in Utah found that toothpaste does dot dot dot.
Exactly that sort of thing.
And so now we have like some sort of clickbaitish title that people don't even recognize as clickbait.
So like, for example, there was a video titled something like The Man Who Killed the Most People in History. And maybe you know whose video this was. But you have to click on the stupid video to figure out that it was about the guy who put lead in gasoline or something.
Oh, okay.
Right? So, and I would call this, this is like stereotypical clickbait because you have to click on it
to know what it even is about.
But I think people don't register this stuff as clickbait, which I find kind of interesting.
So it's just an observation that I think that...
I wouldn't call that clickbait.
The reason why is that, to me, the content has to have a mismatch between itself and the title.
So in that case it sounded like the person delivered.
They must have some reason for suggesting that person killed the most people.
I thought it was going to be Stalin or Hitler or someone who dropped the bombs.
No, no. I didn't really look into it. It's quite, I guess it's probably correct.
Sure, sure, sure. I understand.
So you see, I wouldn't use the word click bait in that sense, because the examples that
we just mentioned, like the stereotypical clickbait, you will really love these five everyday hacks or whatever.
They actually do deliver the stuff.
You just, you have to click on it to figure out that it's some super dumb stuff.
You know, yeah.
Yes, yes, yes.
So, whatever.
Okay.
All right.
Sounds like a great place to stop.
Lovely to talk to you.
Yeah, lovely to talk to you.
And it was great meeting you in person.
Oh yeah.
Yeah, yeah.
We should do it again.
Don't go anywhere just yet. Now I have a recap of today's episode brought to you by The Economist.
Just as The Economist brings clarity to complex concepts, we're doing the same with our new
AI-powered episode recap. Here's a concise summary of the key insights from today's podcast.
Welcome to our deep dive into this fascinating conversation. It's
between Curt Jaimungal, host of Theories of Everything, and physicist Sabine Hossenfelder. Yeah. And you know, you've probably heard people talking about a
stagnation in physics these days and that's what they tackled head-on. Right.
We've got their conversation as our source material here. And it's really packed with insights about where physics stands today and what the future may hold.
What I think is so interesting is,
he really dives deep into the mathematical side of things.
He actually kind of pushes back on some of Sabine's points
while agreeing with others.
So it's this really cool dynamic.
Yeah, they start right off the bat tackling,
is physics in a state of crisis or is it stagnation?
And Sabine, as you probably know,
prefers the term stagnation.
Stagnation, right.
And she uses this cool analogy of physicists
running on a treadmill, lots of effort, lots of papers,
lots of conferences and things, but no real movement forward.
And she's talking specifically about the foundations
of physics.
You know, those big questions that we've been puzzled by
for decades now.
Dark matter, quantum gravity,
the interpretation of quantum mechanics.
So she's saying it's not that physicists aren't working hard,
it's that the current approaches or methods
just aren't yielding those breakthroughs.
Yeah, I think that's really the core of her argument: that despite decades of research, we haven't had any significant shift in our understanding of these fundamental mysteries.
But Curt, who's no stranger to complex theories himself,
offers a slightly different perspective.
He does, he's very much interested
in the mathematical intricacies
of some areas of physics that Sabine considers less fruitful.
Things like soft theorems or asymptotic symmetries
or gauge gravity dualities.
And he acknowledges that those haven't led
to new fundamental physics yet,
but he sees this mathematical elegance
as potentially a sign of something deeper.
So they're both wrestling with this question of progress, but they're coming at it from
slightly different angles.
Yeah, I think that's a great way to put it.
Yeah, and this leads to this really interesting discussion about, you know, what even constitutes
a real problem in physics.
And Sabine argues that some problems might actually be pseudo problems.
Right.
Meaning nature might just be that way.
Right.
Even if it doesn't fit our current theoretical frameworks.
Yeah.
So could you give an example of that?
Sure.
So she points to the strong CP problem.
Okay.
Where there's this certain kind of symmetry violation that should exist theoretically,
but it doesn't seem to show up in our observations of the universe.
And she says, you know,
maybe this isn't really a problem to be solved,
maybe this is just a feature of the universe.
She's challenging this idea
that we need to always strive
for these really neat, elegant explanations.
I think that's right.
Yeah, she pushes back against this notion of naturalness,
which has been a guiding principle
in theoretical physics
for a long time.
But as she points out, naturalness has led to a lot of incorrect predictions.
Things like the expectation of supersymmetry at the Large Hadron Collider, which hasn't
panned out.
Right.
So it's almost like she's saying our sense of aesthetics or our desire for things to
be a certain way might be clouding our judgment.
Yeah, I think that's a great way to put it.
When it comes to understanding the universe.
Yeah, she argues that sometimes what seems unnatural to us
might just be the way that nature operates.
And this ties into her broader critique
about the way that research is conducted today.
Oh right, the systemic problems with academia.
Let's unpack that a bit.
One of her main concerns is this over-reliance on short-term grants, which forces researchers
into this kind of publish or perish mentality where they're constantly chasing funding,
turning out papers instead of pursuing these long-term risky projects that might lead to
really groundbreaking discoveries.
So it's almost like the system itself discourages boldness.
Yeah, that's what she's arguing.
And it creates these sort of self-reinforcing bubbles where researchers get stuck in specific
subfields, because that's where the funding is.
That's what's expected for career advancement.
Right.
So what's the alternative?
Does she offer any solutions?
She brings up this idea of a scientific underground.
Okay.
Which is this community of researchers
that are operating outside of traditional academia
and are free to pursue their passions
without the constraints of, you know,
grant deadlines and institutional pressures.
Oh wow, that's exciting.
It's almost like this band of rebels pushing the boundaries.
She definitely sees it as a potential source
of fresh ideas and new innovative approaches.
But she also acknowledges that these independent researchers
face many challenges, particularly the lack of access
to resources and community support.
She shares a story about a researcher who
couldn't even post their PhD thesis on the arXiv,
which is usually a pretty standard practice.
Yeah.
And that really highlights the barriers that they face.
Absolutely and this leads into another really key point
in their conversation, which is this issue
of public trust in science.
Yeah.
And Sabine argues that the systemic problems
within academia.
Right. The sense that things are kind of stuck,
actually contributes to the growing distrust
in scientific expertise.
I think that's a really fascinating connection.
Yeah, and so she's saying by acknowledging these problems,
being upfront about them,
the scientific community can start to rebuild that trust.
That's exactly what she's arguing,
this idea that transparency and a willingness
to acknowledge limitations are crucial
for regaining that public confidence.
It's not about pretending everything's perfect.
Right.
But it's about showing that the process is robust enough
to handle criticism and self-correction.
And this is where they bring in a broader perspective
and they start talking about whether this slow down
in progress is actually happening across multiple fields,
not just physics.
So they discuss insights from economics
and this field of scientometrics,
which studies the patterns of scientific publishing
and discovery.
They even reference work by Tyler Cowen and John Horgan,
who have argued that we are seeing a general slowdown
in scientific progress across the board.
So it's not just physicists who are feeling this.
It seems not to be just them.
This stagnation.
And what's really interesting is they push back
against this common counter argument
that progress naturally slows down as fields mature.
They point out that we're seeing this slowdown
across disciplines of varying ages.
And we can't ignore the fact that the number
of scientists worldwide has been increasing exponentially.
So if it were simply a matter of scientific maturity,
wouldn't we expect to see at least some fields
still making rapid progress?
Right, so it's not just that we plucked
all the low-hanging fruit.
Yeah, and they seem to point towards
the structure of scientific institutions themselves
as a potential factor.
Right, and this brings us back to these issues
that we talked about.
Exactly.
The publish or perish, the short-term grants.
The over-specialization.
Yeah, and so it's like we've inadvertently built a system
that discourages the very creativity
that drives scientific progress.
Exactly. And that's why both Curt and Sabine see this conversation as so crucial.
They believe that acknowledging these systemic challenges is the first step towards finding
solutions. It's about recognizing that the way we organize and fund the scientific research
has a profound impact on the types of questions we ask.
Right.
And the types of discoveries we make.
It makes you wonder, do we need a fundamental rethink of how we approach science?
Yeah.
You know, a shift away from this hyper-competitive, short-term focus
Yeah.
towards a more collaborative, long-term vision.
It's a challenging prospect,
but it's one that I think they both believe
is worth taking on.
Absolutely.
They express hope that by bringing these issues to light,
they can spark new conversations
and inspire the next generation of scientists
to think differently about how we pursue
knowledge and understanding.
So it's not about giving up on science.
It's about being, you know, realistic.
It's about recognizing the limitations.
Recognizing the limitations.
And being willing to imagine these new possibilities.
Exactly.
It's about embracing that spirit of inquiry and exploration
that lies at the heart of science.
This drive to ask big questions, to challenge assumptions,
and to really push the boundaries of what we know.
And you know, maybe as Sabine suggests,
part of the solution lies in fostering a community.
Absolutely.
One that values a wider range of perspectives,
both within traditional institutions and you know.
And that scientific underground.
And that scientific underground.
It's a really inspiring vision.
It is. It's a really inspiring vision. It is.
It's a future where scientific progress is driven
not by, you know, competition and individual ambition.
Right.
But by collaboration, curiosity,
and the shared commitment to unraveling
the mysteries of the universe.
They talked about this publish or perish culture.
Yeah.
And how it can sometimes prioritize quantity over quality.
It's a real dilemma. On the one hand, we want researchers to share their findings widely.
Okay.
But on the other hand, we don't want to incentivize rushed or superficial work.
Yeah, it's this tough balance.
It's a delicate balance.
Yeah.
And they both highlight the rise of special issues in journals
as a particularly concerning trend.
Yeah. And while these special issues can sometimes serve a legitimate purpose,
like showcasing a rapidly developing field,
they've also been used by some publishers to just churn out content quickly and boost profits.
So there's a potential conflict of interest there.
That's the concern.
If publishers are prioritizing profit over scientific rigor, that could have a detrimental effect on the quality of research.
And they both see this as part of this larger systemic problem within academia, this pressure
to publish as much as possible, often at the expense of deep, thoughtful inquiry.
And it's not just about the quality of research.
Right.
It also ties back into public trust.
Exactly.
If people feel that the research is being driven by
financial motives rather than genuine pursuit of knowledge,
it can erode their confidence.
Exactly. That's why Sabine emphasizes transparency
and accountability within the scientific community.
Okay.
She believes that acknowledging these systemic issues
is crucial for rebuilding public trust
and demonstrating that the scientific process,
despite its flaws, is still the best way
to understand the world.
Absolutely, it's about recognizing that science
is a human endeavor with all the complexities
and imperfections that come with that.
Absolutely, and having those honest conversations about the challenges and limitations
both within the scientific community and with the public.
Speaking of challenges, let's go back to this idea of a scientific underground.
It's a compelling vision, these researchers operating outside of traditional institutions,
free from the pressures of academia.
But as you pointed out, there are these hurdles.
There are significant hurdles.
To overcome.
Independent researchers often lack access
to resources, funding, even just a supportive community.
It seems like a catch-22.
You need resources and support to conduct research,
but you need to have already done research
to get those resources and support.
Exactly, and this is where they see the need
for a fundamental shift in how we think about
and support science.
They believe we need pathways for people to engage
in scientific exploration,
regardless of their institutional affiliations.
So it's about fostering a vibrant scientific ecosystem.
Precisely. One that values a wider range of perspectives and approaches. And that
leads us to the final part of our deep dive. Okay. Where we'll delve into the
broader implications of this slowdown. Yeah. And explore what the future might
hold for scientific discovery and innovation. Welcome back to our deep dive
into this really insightful conversation with Curt Jaimungal and Sabine Hossenfelder.
We've covered a lot of ground from super determinism to the challenges in scientific publishing
today.
But I think now let's tackle this big question of the potential slowdown in scientific progress.
It's a big one.
And it's something that they explored in depth, you know, drawing on insights from economics, history, and the study of scientific publications themselves. And they paint this picture that's really thought-provoking, but also a bit unsettling. It is unsettling, because we tend to think of science as this ever-advancing force, constantly pushing the boundaries of knowledge, right? What if we've hit a plateau?
What if the pace of groundbreaking discoveries
is actually slowing down?
It's a question that has been raised by others, Tyler Cowen, John Horgan, who we mentioned earlier.
And what I find interesting is
Curt and Sabine don't dismiss this idea.
They engage with it thoughtfully.
They bring in data that suggests a potential decline
in groundbreaking discoveries since the mid-20th century.
That's pretty wild. And they also challenge that easy explanation, right? This idea that
progress naturally slows down as fields mature. Exactly. They point out that this slowdown seems
to be happening across disciplines regardless of their age. And we can't ignore the fact that the
number of scientists worldwide has been growing exponentially. So if it were simply a matter of scientific maturity, wouldn't we expect at least some
fields to still be making rapid progress?
It seems like there's something else going on.
And they point towards the structure of scientific institutions themselves as a potential factor.
Yeah.
They bring us back to the issues that we talked about earlier.
The publish or perish pressure, the short-term grant cycles, the over-specialization within fields.
It's as if we've created a system that rewards incremental advancements over bold, risky leaps
into the unknown.
Right.
It's almost like we've accidentally built a system that stifles creativity, which is
supposed to be the engine of science.
That's exactly right. And that's precisely why both Curt and Sabine see this conversation as so crucial.
They believe that acknowledging these systemic challenges is the first step towards finding solutions.
You know, it's about recognizing that the way we organize and fund scientific research
has a profound impact on the kinds of questions we ask and the types of discoveries that we make. It makes you wonder, do we need a fundamental rethink?
Like, how do we approach science?
A shift away from this hyper competitive short-term focus
and toward a more collaborative long-term vision.
It's a tough challenge for sure,
but it's one that Curt and Sabine believe
is worth taking on.
They express hope that by bringing these issues to light,
they can spark new conversations
and inspire the next generation of scientists
to think differently about how we pursue knowledge
and how we pursue understanding.
So it's not about giving up on science
or being disillusioned?
No, it's about being realistic.
It's about recognizing the limitations
of our current systems
and being willing to imagine new possibilities.
It's about embracing that spirit of inquiry and exploration that lies at the heart of
science. The drive to ask these big questions, to challenge assumptions, and to push the
boundaries of what we know and what we understand. Absolutely. An ecosystem that values a wider range
of perspectives and approaches, both within traditional institutions and within that
scientific underground that we talked about earlier. It's a really inspiring vision, a future where
scientific progress is driven not by competition and individual ambition, but by collaboration,
curiosity, and a shared commitment to unraveling the mysteries of the universe.
Yeah, that's a great point to end on, I think. We've covered so much in this deep dive, questions about stagnation,
this mind-bending world of super-determinism,
the challenges facing scientific publishing,
and ultimately the very nature
of scientific progress itself.
And that's what's so great about this conversation
between Curt and Sabine.
It's really sparked so many new ideas
and new lines of inquiry.
We've only just scratched the surface here, though.
Yeah. You know, we've offered a glimpse into this really fascinating conversation that is full of insights and provocations.
Now it's your turn to ponder these ideas, to explore these questions further, and maybe even challenge some of your own assumptions about the world and how we understand it.
Yeah, keep that scientific spirit alive.
Keep asking those big questions.
Stay curious and never stop exploring.
New update!
Started a Substack.
Writings on there are currently about language and ill-defined concepts as well as some other
mathematical details.
Much more being written there.
This is content that isn't anywhere else.
It's not on Theories of Everything.
It's not on Patreon. Also, full transcripts will be placed there at some
point in the future. Several people ask me, hey Curt, you've spoken to so many people
in the fields of theoretical physics, philosophy and consciousness. What are your thoughts?
While I remain impartial in interviews, this Substack is a way to peer into my present
deliberations on these topics.
Also, thank you to our partner, The Economist.
Firstly, thank you for watching, thank you for listening. If you haven't subscribed or
clicked that like button, now is the time to do so. Why? Because each subscribe, each like,
helps YouTube push this content to more people
like yourself, plus it helps out Curt directly, aka me.
I also found out last year that external links count plenty toward the algorithm, which means
that whenever you share on Twitter, say on Facebook or even on Reddit, etc., it shows
YouTube, hey, people are talking about this content outside of YouTube, which
in turn greatly aids the distribution on YouTube.
Thirdly, there's a remarkably active Discord and subreddit for Theories of Everything,
where people explicate TOEs, they disagree respectfully about theories, and build as
a community our own TOE.
Links to both are in the description.
Fourthly, you should know this podcast is on iTunes, it's on Spotify, it's on all of
the audio platforms.
All you have to do is type in theories of everything and you'll find it.
Personally, I gain from rewatching lectures and podcasts.
I also read in the comments that, hey, TOE listeners also gain from replaying.
So how about instead you re-listen on those platforms like iTunes, Spotify, Google Podcasts,
whichever podcast catcher you use.
And finally, if you'd like to support more conversations like this, more content like
this, then do consider visiting patreon.com slash curtjaimungal and donating with whatever
you like.
There's also PayPal, there's also crypto, there's also just joining on YouTube.
Again, keep in mind, it's support from the sponsors and you that allows me to work on TOE full time.
You also get early access to ad-free episodes, whether it's audio or video.
It's audio in the case of Patreon, video in the case of YouTube. For instance, this episode that you're listening to right now was
released a few days earlier. Every dollar helps far more than you think. Either way, your viewership is generosity enough.
Thank you so much.