Theories of Everything with Curt Jaimungal - Wayne Myrvold: A 2 Hour Deep Dive Into Entropy
Episode Date: September 29, 2025

In this episode, we dive deep with philosopher of physics Wayne Myrvold to puncture entropy clichés and reframe thermodynamics as a resource theory. He argues the "entropy always increases" slogan is a consequence, not the law itself, and shows that Clausius's entropy is defined only given the second law, while Gibbs vs. Boltzmann entropies answer different questions ("which entropy?"). We tour Maxwell's demon, Landauer erasure, available energy/Helmholtz free energy, and why, once fluctuations matter, Carnot efficiency is only a statistical bound. Along the way: macrostates vs. microstates, why "disorder" misleads, ergodicity's limited relevance, whether the universe is an isolated system, heat death as resource exhaustion, and how collapse theories would rewrite the story. We even touch QFT/stat-mech pedagogy and career advice. If you're curious about what entropy really is, and how information, agency, and objectives change the answer, this one's for you.

Join My New Substack (Personal Writings): https://curtjaimungal.substack.com

Timestamps:
- 00:00 - Is Entropy in the System or in Our Minds?
- 07:12 - The Original Thermodynamics: A 'Resource Theory' of Heat and Power
- 18:24 - The Second Law Doesn't Assume Entropy; Entropy Requires the Second Law
- 30:58 - From Caloric Fluid to Molecular Motion: The Historical View of Entropy
- 39:04 - Maxwell's Revelation: Why the Second Law Can't Be an Absolute Truth
- 48:11 - Information as a Resource: How Knowledge Can Seemingly Defeat Entropy
- 1:00:53 - Boltzmann vs. Gibbs: The Objective vs. Subjective Views of Entropy
- 1:10:24 - Maxwell's Demon and Landauer's Principle: The Physical Cost of Information
- 1:25:02 - The Inevitable "Heat Death" of the Universe?
- 1:30:19 - The Fallacy of Equating Entropy with "Disorder"
- 1:35:21 - The Ergodic Hypothesis: A Foundational, Yet Possibly Irrelevant, Concept
- 1:43:52 - Why Statistical Mechanics May Be on Shaky Ground (Like QFT)
- 1:50:50 - A Professor's Advice: Don't Jump on the Research Bandwagon

Links Mentioned:
- Ted Jacobson [TOE]: https://youtu.be/3mhctWlXyV8
- Concerning Several Conveniently Applicable Forms For The Main Equations Of The Mechanical Heat Theory [Paper]: https://web.lemoyne.edu/giunta/Clausius1865.pdf
- John Norton [TOE]: https://youtu.be/Tghl6aS5A3M
- On An Absolute Thermometric Scale [Paper]: https://sites.pitt.edu/~jdnorton/teaching/2559_Therm_Stat_Mech/docs/Thomson_1848.pdf
- Carnot Efficiency: https://energyeducation.ca/encyclopedia/Carnot_efficiency
- Maxwell's Talk On Molecules: https://victorianweb.org/science/maxwell/molecules.html
- Helmholtz Free Energy: https://www.sciencedirect.com/topics/engineering/helmholtz-free-energy
- Boltzmann Entropy: https://chem.libretexts.org/Courses/Western_Washington_University/Biophysical_Chemistry_(Smirnov_and_McCarty)/01%3A_Biochemical_Thermodynamics/1.05%3A_The_Boltzmann_Distribution_and_the_Statistical_Definition_of_Entropy
- Maxwell's Letter: https://cudl.lib.cam.ac.uk/view/PH-CAVENDISH-P-00092/1
- Landauer's Principle: https://www.sciencedirect.com/science/article/abs/pii/S135521980300039X
- On A Universal Tendency In Nature To The Dissipation Of Mechanical Energy: https://www.tandfonline.com/doi/abs/10.1080/14786445208647126
- Neil Turok [TOE]: https://youtu.be/zNZCa1pVE20
- Roger Penrose [TOE]: https://youtu.be/sGm505TFMbU
- Sean Carroll [TOE]: https://youtu.be/9AoRxtYZrZo
- Understanding The Infinite [Book]: https://www.amazon.com/Understanding-Infinite-Shaughan-Lavine/dp/0674921178
- Classical Electrodynamics [Book]: https://www.amazon.com/Classical-Electrodynamics-John-David-Jackson/dp/1119770769
- Why Information Is Entropy [YouTube]: https://youtu.be/8Uilw9t-syQ

Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
You know what's better than being clean-shaven?
A clean cloud storage that doesn't drain your wallet every month for files that you forgot existed.
Now, here's the physics.
Your cloud storage follows entropy, which means it tends to increase across time,
random screenshots, duplicate PDFs, that 4-gigabyte video from 2019.
Clean My Mac's new cloud cleanup feature tackles this directly.
It scans iCloud, OneDrive, and Google Drive locally on your Mac. Your data never leaves your device. This identifies the space wasters
which eat your storage budget. Think of it as a dimensional reduction for your digital life.
You can evict files from the cloud while keeping local copies. You can remove them entirely.
This feature maps your storage topology showing exactly where those gigabytes hide.
For someone processing research papers, podcast recordings, and reference materials constantly,
cloud bloat becomes expensive quite quickly.
This tool lets you reclaim control
without manually sorting through thousands of files.
Get tidy today with a seven-day free trial.
Use my code theories for 20% off CleanMyMac.
That's coupon code theories for 20% off.
Even though a lot of physicists will say
the second law says the total entropy is never decreasing,
that can't actually be the second law;
it can only be a consequence of the second law.
This is a two-hour deep dive into entropy and the second law.
Most talks on this subject are 10 minutes long,
but today, Professor Wayne Myrvold gives a tour de force,
explaining entropy from multiple angles, dispelling myths,
and even the stunning realization that the second law
is opposite to what you think.
You will get both answers from perfectly competent physicists,
and each side will be absolutely certain that that is the right answer.
Questions explored are, why is entropy not the same as disorder?
What do popular accounts and even undergraduate texts on entropy and thermodynamics
consistently get incorrect?
Is the universe subject to the second law?
Can you break these supposed entropic limits and why quantum mechanics changes everything?
There's plenty of confusion and puzzles about entropy.
today I would like to talk about exactly what entropy is. And one of the questions that gives conflicting
answers is this: imagine in front of you, you have some physical system. I believe this comes from
Shelly Goldstein. Let's just say, to give an example, a glass of water, for instance.
And its physical state is clearly not completely known to you. But then something else appears
to you. It can be an intelligent person, an angel, or what have you. It gives you a much better
approximation of this glass of water than what you had before. Then the question is:
has the system, the glass of water, has its entropy decreased? So there are, broadly speaking,
two answers one can give. One is, yes, obviously, because entropy has to do with the information
of the system. So if you've gained information, the entropy has changed. Then the other is,
that's absurd. The entropy has something objective to do with the system. So why would your
information about the system change its entropy? Take it away.
Yep. Yeah, absolutely. And you will get both answers from perfectly competent physicists, and they will be absolutely certain that that is the right answer.
So, I mean, I think that the best way to start in thinking about that is: you posed the question as, what is entropy?
And I think that's a bit of a misleading question, because entropy is one of those words that is used in different senses.
and we're used to that.
It's not unusual.
Like if you open up a dictionary and at a random page
and point at any word at random,
there'll probably be like two or three or maybe four different definitions.
Very often in science, people will coin a new technical term
because they want something to have a precise, well-agreed-upon meaning.
And that's actually what Clausius did back in 1865, I think it was.
He thought, okay, here's this important quantity that I and others have been batting around in thermodynamics.
It's important enough it needs a dignified name.
And his rationale was, look, everybody studies the dead languages, right?
We all know Greek and Latin.
And so we don't want something like from English or German or Italian because then it becomes sort of
nationalistic words, so let's coin something from a Greek word. So he coined the word entropy from a
Greek word for transformation, and he deliberately coined it to sound kind of like energy because
it's a closely related concept. If I had my way, we would respect Clausius and we would only use
the word entropy in exactly the same sense that Clausius defined it. Okay, and that would be what
everyone means by entropy.
But historically, that's not what happened.
There have been a number of different quantities that people call entropy.
And they're all related, and they're all related in some sense to thermodynamic entropy.
But they're just different things.
If someone asks you the question, has entropy decreased, I think the natural question is, well, which entropy?
So in other words, there should be entropy sub one, entropy sub two, entropy sub three.
And when someone says, well, what is the entropy of the system, you say, okay, are you referring
to sub one, sub two, sub three?
Yeah, exactly, exactly.
Or, I mean, if I had my way,
people would just have coined different words
for these different kinds of things.
But there's a reason why
some people say, well, of course
entropy has to be
an intrinsic
property of a system,
because, you know, this is physics after all.
You know, we're not doing psychology.
It's not in, you know, we're not studying
people's information. We're studying physical properties of physical systems, which, you know,
they have no matter what anybody knows about them or thinks about them. Like, if I ask, okay, what's the mass of this cup? It would seem
absurd to say, well, how much do you know about the cup? The mass of the cup is something that
the cup has (talking about the rest mass, because sometimes people will
talk about relativistic mass and talk about that as observer-dependent).
But, okay, what's the rest mass of this cup?
Then, yeah, that's the property of the cup.
And it doesn't matter what anyone thinks about it.
And if you think that thermodynamics is a science like that,
it is just studying the physical properties of things,
then it seems absurd that one of its central concepts,
entropy would be something that would be defined,
relative to a state of information.
And I think that at bottom, the fact that people are inclined to think that different
notions of entropy are obviously the right one, and different answers to this question
are obviously the right answer, is that, even though this gets completely blurred in the textbook
tradition, there are actually different conceptions about what the science of thermodynamics
is all about.
Okay, so look, in the second law, it stated that entropy doesn't decrease.
Yeah.
Oh, yes.
Yeah, your caveat's closed system or isolated system and so on.
Right, right, yeah.
Okay.
Then there's a formula for entropy.
Are you saying that even here, there should be sub one and sub two?
No, actually, if you look, these different notions of entropy are actually defined differently.
And actually, if you look at different textbooks, when they introduce the concept of entropy,
they will sometimes give very different definitions.
And
so maybe I should just talk about
what Clausius was doing,
because that's one of the
definitions that's out there, right?
Sure.
So Clausius
was
working in the 1850s, 1860s.
Those are the early days of what we
now call
thermodynamics. And it was Kelvin who gave the science that name. And I think a lot of people
actually misunderstand what that word means, thermodynamics. Because in physics these days, when you
talk about dynamics, you usually mean the laws of evolution. Like, you know, the dynamical laws
to govern the behavior of systems. And that's actually not what Kelvin meant when he, he
decided to call this
emerging science
thermodynamics.
This was, as I said, back in the days when everybody
studied Greek in school
and it's
formed from
two Greek words, the words
for heat and for power.
And thermodynamics has
its roots in the study
of
how you can get
useful mechanical work
out of heat. Like it really, ultimately
has its roots in Carnot's study of heat engines, the efficiency of heat engines. If you think of that
as what thermodynamics is about, physicists these days have a word for a theory like that. It's a resource
theory. And this comes out of quantum information theory. So this field of quantum information theory, which really got going a
couple of decades ago, includes quantum communication
and cryptography and stuff like that. And they were asking questions like, you know, if you've got
two agents who have access to certain resources, what can they do with those? So, for example,
these agents are always called Alice and Bob, by the way. You know, for example, if Alice and Bob
want to send a secure signal that an eavesdropper could not, as a matter of physical principle, eavesdrop on,
what can they do? Can they do it if they have a certain amount of shared entanglement?
So it's using physics in the sense that quantum physics is telling you how physical systems
are going to respond to certain operations and stuff like that. But the questions you're asking
are really questions not within physics proper. They're questions about what agents with a certain kind of means of
manipulating a system and certain resources can do to achieve certain goals. Why is that not in
physics proper? Because when I say physics proper, people usually, what I mean is what
physicists usually think, but physics is about the properties of physical systems, period.
Right. And so these goals of these
agents aren't a matter of physics.
These are something that you're adding on.
Hmm.
Right.
So there's one question: what do things do? A sort of question of what things do under certain circumstances.
But if I'm setting certain goals, like...
Is an agent itself a part of a physical system?
The agents ourselves are physical systems.
Right. But physics studies physical systems in certain respects.
So I'm a physical system, right? You're a physical system. You have thoughts and beliefs.
Thank you. It's the nicest thing anyone's ever said about me.
Some might disagree. Some might disagree and say you're not just a physical system. You're a combination of a physical system plus an immaterial mind. But I actually think that we are all physical systems.
And, you know, we, as physical systems, have, you know, thoughts and desires and hopes and dreams and stuff like that, but if a scientist is studying my thoughts, that scientist is doing psychology and not doing physics.
Okay.
Yeah. So you're just studying different aspects of things. And if I bring in things like, here's the game that Alice and Bob are going to play, and here's how we're going to score them, and then you give them certain resources, and physics tells you what the highest possible score is, but basically you're not doing the sorts of thing you usually find in physics textbooks if you're talking about goals and scoring and things like that. That's what I mean.
Got it. Yeah, yeah, okay, good.
And I think this will become important for thermodynamics.
So with people who are doing quantum information theory, they said, okay, what we're doing is this is a resource theory.
And then some of the same methods ended up, when people started doing quantum thermodynamics, a lot of people started thinking of this as a resource theory.
So if I give you certain resources and you've got a certain task like lifting a weight or something like that
and maybe you've got some kind of system and you've got a heat bath at a certain temperature,
you know, what's the most work you can get out of it?
So, like, how high can you lift the weight?
And there are, well, a lot of people working in quantum thermodynamics these days think,
okay, what we're doing is a resource theory, similar to and in some sense modeled on quantum information theory.
Okay.
And that's basically how the founders of thermodynamics... it wasn't quantum,
but that's basically how the founders of thermodynamics thought of thermodynamics.
It's a study of: given certain physical resources, like heat baths and, you know, things like that,
how can I exploit these resources to do work, you know, like drive a car or something like that, or raise something.
So let me see if I could summarize this. So Clausius, who coined
the term entropy, was thinking of it in terms of a resource theory. Now, a resource theory is
what can I do with these resources? And often in thermodynamics, when you take an introductory
course, you speak either of the first or second lecture about pistons. So given this system,
can I move a piston?
So they were thinking practically.
Yeah, there's a sort of interesting tradeoff
between practical concerns and theoretical concerns,
because these questions were initially raised
by practical concerns, but then they sort of took on a life of their own.
And you can see that already in Carnot's work.
So Carnot wrote this little pamphlet called
Reflections on the Motive Power of Heat.
And he was actually responding to some issues, and this is something that his father, Lazare Carnot, had done some work on, that was going on at the time, which is: if you've got a heat engine, and usually that was you've got some kind of gas in a chamber and you heat it up and it drives a piston, right?
is it more efficient if you use a more volatile substance?
So does it matter whether you're using air or steam or, say, alcohol or ether or something like that?
Are you going to get more work out of the same amount of heat?
And this was actually a practical matter because some people were thinking,
okay, yeah, let's use alcohol or ether.
And you can kind of imagine what happens because these things not only expand a lot,
faster when they're heated up than air does.
They're also highly flammable.
And it's kind of dangerous to have these things around fire.
And so one of the questions that Carnot was asking is,
well, does it matter what the working substance is?
Does it matter what gas you have in the piston?
And he argued that actually the maximum of efficiency,
if I have two heat sources at different temperatures,
the maximum efficiency of an engine running between them is independent of what the stuff is
that you're using in the gas. So that had its roots in practical concerns, but Carnot considered
what we now call thermodynamically reversible processes. And a thermodynamically reversible
process involves only exchanging heat between two things
at the same temperature, and so what you're doing is you're actually expanding
the gas very, very slowly, and then, when you're dumping heat out, you're compressing it
very, very slowly. And, of course, actual engines are not anywhere close to thermodynamically
reversible, because what we're actually concerned with is not just efficiency, but also power.
right, how much work we're getting per unit time, right?
Sure.
You know, if someone tries to sell you a car and says, okay, this has an amazing gas efficiency,
but you can only go, you know, five kilometers an hour, right?
Right.
We're not buying it, literally, right?
And so, even though thermodynamics grew out of a study of the efficiency
of engines, a lot of the actual theoretical work, the things you can actually prove things about,
involves consideration of thermodynamically reversible processes. And in the real world, there actually
are no thermodynamically reversible processes. I noticed that you interviewed John Norton a while
back. I'm sure he emphasized that point. Yes. Right. But we can approximate
thermodynamically reversible processes,
they just have to go very, very slowly, right?
And for actual machines, we're not interested in things
that work very, very slowly.
So you can have a resource theory
that's more motivated by abstract concerns, but you could actually be
considering situations that are very far from
realistic ones.
Talk about how the second law assumes a certain definition of entropy.
Maybe it shouldn't even be called the second law.
Good.
I'm really glad you asked me that, because I think it's the other way around.
When you ask people, what's the second law of thermodynamics?
You know, go ask people in the street or something.
They'll say, you know, they'll, a lot of them will say, things like what you said,
that the entropy of an isolated closed system always increases.
Now, the interesting thing is,
if what you mean is thermodynamic entropy,
as Clausius defined it,
that actually is not right for an important reason.
Even though Clausius himself, sort of tongue-in-cheek,
at the end of one of his papers said,
we can express the first and second laws as,
or he says, if we may be permitted
to talk about the total energy of the universe and the total entropy of the universe,
we can express the first and second laws as: the total
energy of the universe is constant, and the total entropy of the universe strives to a maximum.
That's actually not an official statement of the second law, and there's a very good reason.
Because you said the second law of thermodynamics presupposes a certain notion of entropy.
It is actually the reverse.
Clausius's definition of entropy presupposes the second law.
Hmm, okay, how so?
Okay, so one way of expressing the second law would be,
suppose I've got some kind of system and it's easiest to imagine a gas with a piston,
and it goes around in a cycle in the sense that it comes back to the same thermodynamic state that it started in.
And people were thinking about these cycles because they're thinking about heat engines.
So what a heat engine typically does is you've got some working substance gas.
You heat it up, it drives a piston out, and then you either expel the substance or cool it down and compress it and push the piston in, and then you're ready to start it again.
So the engine itself is working in a cycle.
So as far as I've got a gas in a box, and I can change its temperature, and I can – I can –
I've been using Claude to prep for theories of everything episodes, and it's fundamentally
changed how I approach topics. When I'm exploring, say, gauge theory or consciousness prior to
interviewing someone like Roger Penrose, I demand of an LLM that it can actually engage with
the mathematics and philosophy and physics, etc., at the technical level that these conversations
require. I also like how extremely fast it is to answer. I like Claude's personality. I like
its intelligence. I also use Claude Code on a daily basis and have been for the past few months.
It's a game changer for developers. It works directly in your terminal and understands your
entire code base and handles complex engineering tasks. I also used Claude, the web version,
live, during this podcast with Eva Miranda here. Oh my God. This is fantastic. That's actually a feature
named artifacts and none of the other LLM providers have something that even comes close to it. And no coding
is required. I just describe what I want, and it spits out what I'm looking for. It's the interactive
version of, hey, you put words to what I was thinking. Instead, this one's more like you put
implementation into what I was thinking. That's extremely powerful. Use promo code
theories of everything, all one word, capitalized. Ready to tackle bigger problems? Sign up for
Claude today and get 50% off Claude Pro, which includes access to Claude Code, when you use
my link: claude.ai slash theories of everything.
I can draw heat in or push heat out and I can move the piston back and forth,
etc. I'm going to say this a little bit differently because I'm going to
introduce the concept of entropy without using the word and just see if you notice where
it comes in. Okay. Okay. So imagine you've got, say, for example, a gas in a container and
there's a piston you can move around and you can put it next to a heat source and maybe expand it and
use that to move a weight or something. And so suppose I start with it at a certain temperature and pressure
and a certain volume, the piston's a certain place. And I slowly expand it in a thermodynamically
reversible sense and I'm raising a weight and it's connected to a heat source, so heat is going into it.
And then I hand it to you, Curt, and say,
I want you to put it back to the original state.
Okay.
Now, here's what Clausius's claim is.
It's a consequence of the second law of thermodynamics:
you can't do that without expelling any heat at all from the system.
Right.
Now, one thing I can do, if the original process was thermodynamically reversible,
is I can just do that original process in reverse,
compress the gas back to its original volume,
expelling exactly the same amount of heat
into the same reservoir I got it from at the same temperature.
And that might be the best you can do
if you only have one heat source or sink
at one particular temperature.
But what we really want to do is not lower the weight
as much as we raised it.
And so Clausius says, hey, look,
if you've got another heat bath at a lower temperature,
what you can do is you can expel a smaller amount of heat
at a lower temperature
and get the thing back to its original state that way.
So he introduced this notion of the equivalence value of heat.
Heat transferred between a system in a reversible process and a heat bath
is, in a certain sense, worth more for what you want to do
if it's at a lower temperature than a higher temperature.
Because if I want to restore the initial state,
I can either use a large quantity of heat at a high temperature
or less heat at a lower temperature.
Okay.
And so that's what he calls the equivalence value of a quantity of heat.
It's a function not just of the amount of heat,
but of the temperature at which it's being transferred.
And in fact, as Kelvin realized,
I can define a temperature scale,
which we call the Kelvin scale or absolute temperature scale,
so that the equivalence value of a quantity of heat at a given temperature
is just inversely proportional to the temperature.
Just define the temperature scale that way.
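To put that in symbols (a compact paraphrase using the standard textbook convention, not a formula quoted in the conversation): for a quantity of heat Q exchanged reversibly at absolute temperature T, the equivalence value is

```latex
N \;=\; \frac{Q}{T}
```

and the Kelvin scale is defined precisely so that this value comes out inversely proportional to the temperature at which the heat is exchanged.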
Mm-hmm.
Okay.
So one statement of the second law is that if I take something around a cycle
and there's heat being exchanged at various temperatures,
and I add up all the equivalence values of those heats,
it can't be greater than zero.
It's less than or equal to zero.
And if it's a reversible cycle,
then it's equal to zero.
The sum total of all those equivalence values is equal to zero.
Yes.
Okay.
Okay.
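Written out, the statement just agreed to is the standard Clausius inequality (with the usual sign convention that δQ counts heat absorbed by the system as positive):

```latex
\oint \frac{\delta Q}{T} \;\le\; 0 ,
\qquad \text{with equality for a reversible cycle.}
```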
Now I can define entropy.
Because it's a consequence of that statement that if I have two different thermodynamic states
and I go from one to the other in a reversible process,
the total of the equivalence values over that process
is not going to depend on which reversible process it is.
So suppose there's more than one reversible process
that gets me from state A to state B.
If I go via one reversible process
and add up all the equivalence values of heat,
it'll be the same as via another process, and the argument is, if these are reversible processes...
Let me see. So you require the second law, if my understanding is correct,
in order to make thermodynamic entropy a well-defined quantity.
Exactly, yeah. Yeah, exactly.
So the definition of thermodynamic entropy is:
if I want to know the entropy difference between two states,
then cook up any reversible process to connect those two states and just add up the
equivalence values of heat transferred in that process. And it's a consequence of the second law
that it doesn't matter which reversible process I use. So when someone says, I've broken the
second law, can that statement even be made? Or are you saying that you have to assume it in order
to define the entropy? And so how are you going to break the second law?
Yeah, so if you try to express the second law as:
in any process, the total entropy of an isolated system never decreases,
then that's actually incorrect, because if you could break the second law,
then you wouldn't have a well-defined thermodynamic entropy.
Interesting.
Yeah.
So, I mean, this is a point I think that a lot of people miss.
It's actually a consequence.
What I'm saying about the definition of thermodynamic entropy is not
anything radical; it's like standard textbooks.
Very often what people do in textbooks is they give a statement of the second law.
Something like Clausius's version is: there's no process whose net effect is nothing other
than moving heat from a cold body to a hot body.
And then that's one version of the second law, and I express it without mentioning entropy.
Yes.
And then they say, given the second law, we can define thermodynamic entropy using the standard definition:
you just take any reversible process that connects two states, add up the equivalence values of heat along that process,
and it's going to be the same.
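That standard definition, in symbols (a sketch; the second law is what guarantees the integral is path-independent over reversible processes):

```latex
S(B) - S(A) \;=\; \int_{A}^{B} \frac{\delta Q_{\mathrm{rev}}}{T} ,
\qquad \text{the same for every reversible path from } A \text{ to } B .
```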
So, even though a lot of people will say the second law is that the total entropy of a system that's isolated is never decreasing,
if you mean thermodynamic entropy, that can't actually be the second law; it can only be a consequence of the second law.
But if you actually break the second law, like if I could have a process that had no other effect than to
move heat from a cold body to a hot body, then thermodynamic entropy just wouldn't be well defined.
Wayne, let me ask you something. So you're extremely historically informed. So historically speaking,
how did people think about entropy? What I mean to say is, we can make an analogy with heat.
So heat, we now think of as having to do with the motion of molecules. Okay, it's something about
the motion of molecules. Now temperature is also something about the motion of molecules. But prior,
100 years ago, 200 years ago,
it was thought of as some form of fluid.
Okay, so how did people use to think about entropy?
Did they think of it as a quantity-like temperature?
It was something abstract.
Did they think of it like a fluid?
Did they think of it like something else?
What was their mental model for entropy?
That's a very good question.
Because, as a matter of fact,
Carnot, when he wrote this book,
he was thinking of heat as a fluid they called caloric,
which was conserved and flowed between bodies.
And even Kelvin, when he was writing his first papers
on what we now call the Kelvin scale,
was thinking of it that way.
But what happened was shortly after that,
and this happened during Carnot's lifetime,
Jule did his experiments on what people called
the mechanical equivalent of heat.
It was, and it was, you know,
the basic idea is,
I do a certain amount of work.
I can generate a certain amount of heat,
and you can measure the work in terms of, say, foot pounds,
and you measure the heat using a calorimeter.
How much will this warm a given sample of water?
And Joule decided there's a mechanical equivalent of heat,
that there's an equivalence between work measured in energy terms like foot-pounds and things like that
and heat measured in calories,
and you can convert them,
and also that there's no limit to how much heat you can generate
if you just keep doing enough work.
And this had a precursor, which was Count Rumford
doing experiments with grinding out cannon bores.
Not cannonballs, cannon bores?
Like, if you're grinding away...
Oh, okay
Yeah, so what people had noticed is...
I don't know what that is. What's a cannon bore?
Okay, so how do you make a cannon?
I don't know how people make cannons these days,
but the way people make cannons back then
is you would make a cylinder of iron or steel,
and then you'd drill a hole in it.
So the cannon bore is the path that the projectile goes through.
And, right?
And unsurprisingly, if you're drilling away the piece of metal, right,
it gets hot, right?
So the process involves, I think, horses driving this, you know, drill bit.
Bore.
Yeah, right, yeah.
Horses driving this drill bit, and everything gets hot, so you have to cool it off with water.
And Count Rumford did experiments, and he got convinced that if you've got enough horsepower,
there's no limit to how much heat you can generate.
And that didn't fit well with the idea that caloric heat is this fluid that you're squeezing out of the substance.
Because if there's a finite amount of heat in any given substance, eventually you think you would run out and not be able to generate any more heat by rubbing it.
Right. Like you can't sweat infinitely.
Yeah, exactly. You can't sweat infinitely. Perfect. That's a wonderful analogy. Yes. Eventually you get dehydrated, right?
And so, yeah, so what happened was people, largely due to Joule's experiments on the mechanical equivalent of heat,
actually became convinced that heat was a form of energy similar to the mechanical energy of things moving around.
And that got known, that became known as the kinetic theory of heat.
And along with that is this picture of gases, for example, being full of molecules bouncing around,
and when they're hotter, they're moving faster.
Now, the interesting thing is, even though Carnot, when he initiated what we now call the theory of thermodynamics,
and Kelvin in his very first papers on this, even though they were thinking of heat as this conserved fluid,
very quickly the people who are working on thermodynamics got converted to the kinetic theory of heat.
And so a lot of the same people who are developing what we now call thermodynamics
were also working on the mechanical theory, the kinetic theory of heat.
Or Clausius called it the mechanical theory of heat.
And so Clausius was thinking of heat as involving molecules bouncing around.
And so that raises the question of how entropy might actually be realized
in terms of what's going on with the molecules.
Right. And Clausius had some ideas about that, which are mostly forgotten because none of them were very satisfactory.
But the interesting thing is that I can develop the science of thermodynamics independently of the molecular hypothesis.
Like, I can talk about work being done at the macroscopic level.
I can talk about heat being exchanged without really being committed to what was happening at the microphysical level there.
Yes.
And so Maxwell, who is one of these people who was at the same time participating in the
development of what we now call thermodynamics and participating in the development of what
we now call statistical mechanics, and of course these are very often together in the same
textbook these days, Maxwell said, well, thermodynamics is the study of the thermal and
dynamical properties of matter without recourse to any hypotheses about the molecular
constitution of matter. So according to Maxwell, as long as I'm doing thermodynamics, I should be
kind of neutral about whether or not matter is composed of molecules. I can just talk about
heat and work as different kinds of energy exchange without being committed to how it's
realized in the microphysics.
And some thermodynamics textbooks actually do that.
Like some thermodynamics textbooks, or actually sometimes you have a thermodynamics course
and then a statistical mechanics course.
And the thermodynamics course can actually be completely independent of talk about molecules.
Yes.
Right.
But in other books, and these usually have the title of thermal physics rather than thermodynamics,
the two go hand in hand.
I actually think there's something to be said for the kind of old-fashioned way of thinking about it that's still in some textbooks, which is:
let's talk about thermodynamics as a science of the exchange of heat and energy, and try to express the basic principles of thermodynamics
independently of any particular theories about the molecular structure of matter.
And then, you know, you can say, okay, once we acknowledge that matter has a molecular structure,
how does it have to be modified?
Hmm, okay.
Why do you prefer that?
Well, because it highlights a difference between two different forms,
of the Second Law of Thermodynamics.
And here's why.
So one consequence of the second law of thermodynamics
is if I have a heat engine operating between two heat source and things
to different temperatures, there's a maximum efficiency.
So the efficiency is if I pull out of quantity of heat,
how much can I convert to work can I get out,
What you want to do is get as much work out as possible
and then dump as small quantity as possible
of heat back into the lower temperature thing.
And one consequence of second law of thermodynamics
is given any two temperatures,
there's a maximum efficiency of a heat engine
operating between heat source and sink at those temperatures.
And that's known as the Carnot efficiency,
the Carnot bound on efficiency.
Okay.
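For reference, the bound being named, in its standard form (T_h and T_c are the absolute temperatures of the hot source and cold sink):

```latex
\eta \;=\; \frac{W}{Q_h} \;\le\; \eta_{\mathrm{Carnot}} \;=\; 1 - \frac{T_c}{T_h} .
```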
Now, Maxwell, I think, was the first...
And so one way of stating the second law of thermodynamics is:
no heat engine is going to have an efficiency that exceeds the Carnot bound.
All right.
Okay.
As Maxwell was the first to articulate, clearly,
if the kinetic theory of heat is right,
that actually can't be strictly true.
Why?
And yeah, that's of course the right question. Why? Right? Because if the molecular theory, the kinetic theory, is right, there's going to be a certain kind of unpredictability about how much work you're going to actually get. Because these molecules are bouncing around more or less at random. And the pressure of the gas on the piston is a matter of the molecules hitting the piston and bouncing off.
And on a fine enough scale, that force per unit area is going to be fluctuating
because the molecules are bouncing around more or less at random.
Yes.
And it could happen that you just happen to get lucky that during the time you're expanding the piston,
more molecules than usual or maybe with a higher average energy than usual
happen to hit the piston and you get more work out than you would have expected.
Right.
Right. If you say to a physicist that, okay, that shows that the second law of thermodynamics can't be strictly true, they'll go, nonsense, right? Because what you can't do is completely rely on that, right? Because it could happen that I get less than the Carnot efficiency, right? Because I'm... Yes.
And so what physicists these days accept is, okay, at a fine enough scale, there are going to be these fluctuations in the amount of work you get. But if you do it again and again, on average, you're not going to be reliably able to exceed the Carnot bound.
It's a statistical law.
It's a statistical law. Yes. Perfect. Yes.
And so I'm flummoxed as to why you like the thermodynamics one, because I personally don't like the thermodynamics view. I very much like statistical physics, but disliked thermodynamics. And so even the definition of entropy as a weighted sum of logs of probabilities, that makes intuitive sense to me. I can derive something with that. And I can make sense of picturing billiard balls bouncing around. But this dQ over T and Carnot engines... maybe it's because I don't build things with my hands, I think.
Yeah. Maybe that's the delineation.
But you're also a... I don't know. I don't, I don't build things with my hands, so if I would exceed the Carnot efficiency, I would do something dangerous by accident, right?
You know how in physics we like to reduce something that's complex into something more elegant and more efficient, something simpler? For instance, it turns out you can do that with your dinner. HelloFresh sends you exactly the ingredients you need. They're pre-measured, they're pre-portioned, so you don't have to deal with this superposition of do I have too much cilantro versus not enough cilantro, or whatever you have collapsing in your kitchen every night.
They've just done their largest menu refresh yet with literally 100 different recipes each
week. There's vegetarian options, calories smart ones, protein heavy, my personal favorite.
Then there's a new steak and seafood options at no extra cost. All the meals take approximately
15 minutes to a half hour. They've actually tripled their seafood offerings recently and
added more veggie-packed recipes. Each one has two or more vegetables now. I've been using
it myself. It's the one experiment in my life that's always yielded reproducible results. It's
delicious. It's easy. It saves me from having to live on just black coffee while editing episodes
at 1 a.m. Personally, my favorite part is that it's an activity that I can do with my wife.
Thus, it not only serves as dinner, but as a bonding exercise. The best way to cook just got better.
go to hellofresh.com slash theories of everything 10 FM to get 10 free meals plus a free item for
life. That's one per box with active subscription. Free meals applied as discounts on the first box,
new subscribers only varies by plan. That's hellofresh.com slash theories of everything 10 FM to get
10 free meals plus a free item for life.
Yeah, I don't build things with my hands, but I do actually have heat engines that other people have built, on my shelf back there.
So here's why I think it's important to distinguish. So there's thermodynamic entropy as Clausius defined it. And it's totally independent of any hypotheses about molecular structure, and its definition requires, it presupposes for its definition, the
second law of thermodynamics in one of its formulations. So if the second law of thermodynamics
is not right, then there is no such quantity as entropy as
Clausius defined it, because it's just not going to be true anymore that it doesn't matter
which reversible process you pick to go from A to B to define your entropy. Okay, here's why I think
it's important to realize, to make clear, that's what thermodynamic entropy, as
Clausius defined it, is.
statistical mechanics, the second law of thermodynamics as originally conceived actually isn't
quite right. And it has to be replaced by, as you say, a statistical law. Yes. And it is the
statistical version, not the original version of the second law of thermodynamics,
that physicists these days accept.
And then you use words like tendency.
Yeah, right. And yeah, so
and Maxwell
himself
said, and I think he was the first one
to express it this way, the second
law of thermodynamics is a statistical
regularity.
And this is at a time, this is like the middle of the 19th century, actually he said this
in 1878. But this is a time when people were getting really impressed by statistical
regularities, because it was early in the 19th century that people really started
gathering statistics about populations and noticing that there were these regularities, say the
number of murders per capita in Paris year after year. And that's interesting, that's interesting
because these are statistical regularities
that you can depend on year after year,
that are averages over aggregates
of individually unpredictable events, right?
Like, you know, if people could predict
exactly when and where a murder would take place,
then it wouldn't take place, right?
Right.
Yeah, so,
So, yeah, so this is Maxwell saying the second law really is just a statistical regularity.
And it's similar to the statistical regularities that the statisticians
who are out there gathering data about populations are finding.
And he actually gave a talk on molecules to the British Association for the Advancement
of Science, which had only recently created a section for statistics,
and the hard scientists were kind of looking down on the social sciences.
And he gave this talk on molecules saying,
we the physicists have adopted methods from the statisticians,
because we're taking averages over a large number of things.
Interesting.
Yeah.
So, yeah, the version of the second law that most people accept these days is some kind of probabilistic or statistically qualified thing.
The way that Szilard put it in the 1920s is, you know, someone who is trying to exceed the Carnot bound on efficiency is kind of like a gambler who's trying to break the bank at a casino.
You might occasionally have wins and losses,
but there's no way you're going to be able to win reliably on average.
And there are theorems coming from probability theory
about the impossibility of doing that,
because the expectation value of your winnings
is always negative if the casino is doing things right.
And the law of large numbers says that your winnings
per game are going to, with high probability,
get closer and closer to the expectation value.
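In symbols, the gambler analogy is just the law of large numbers (my gloss, with X_i the winnings on the i-th play):

```latex
\mathbb{E}[X_i] < 0
\quad\Longrightarrow\quad
\frac{1}{n}\sum_{i=1}^{n} X_i \;\longrightarrow\; \mathbb{E}[X_i] < 0
\quad \text{in probability as } n \to \infty ,
```

so however lucky individual plays may be, the average winnings per game end up negative with high probability.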
So Szilard, you see, is saying, you know, the second law should be thought of as a similar thing.
This actually brings in a connection with information if you're still thinking about thermodynamics as some kind of a resource theory.
Because if you've got, so let me go back to a, I mean, suppose you've got a fluctuating pressure on your piston.
If you knew when those fluctuations were going to happen,
like if you could reliably say,
I'm going to pull the piston out
only when the pressure is momentarily higher than average,
then you could violate even the statistical version of the second law.
And think of it this way.
Rather than a piston, you know, imagine you've got a box with a gas in it
and there's a partition down the middle
and a little hole in the middle
so the gas can go through, right?
Okay, so there's going to be continually small fluctuations in the number of molecules on each side as they go back and forth, and there's going to be continuous small fluctuations in the pressure.
And suppose it's been doing this for a while, and then we close the hole, and you're fairly certain that the pressure is greater on one side than other, right?
Well, if you know which side the pressure is greater on,
then you could exploit that to,
you know, slightly push a piston out.
And if you reliably knew, you know,
if you can do this again and again and reliably know
where the greater pressure was,
you could violate even a statistical version of the second law.
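A standard back-of-the-envelope for what that exploitation would be worth (not a figure quoted in the conversation): if you know the gas is confined to one half of a box of volume V, letting it push a piston isothermally back out to the full volume extracts at most

```latex
W_{\max} \;=\; N k_B T \ln\!\frac{V}{V/2} \;=\; N k_B T \ln 2 ,
```

which for a single molecule (N = 1) is the familiar k_B T ln 2 of the Szilard engine; guess the wrong side and you end up lowering the weight instead.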
Okay, so here's the, let's go back to the ancient question.
So suppose you've got a box of gas in front of you
and it initially starts out with the same pressure on both sides
and then it fluctuates a bit and there's a higher pressure on one side
than another and you close the hole.
So it's stuck like that.
Yeah.
Has the entropy decreased?
Well, if you do the standard calculations of entropy,
it doesn't matter which side the pressure is higher on,
because you can just do the standard calculation,
and if you've got a box with higher pressure on one side
and lower pressure on the other side,
it has a lower entropy than a gas with the same pressure on both sides, right?
And so if you're thinking of
entropy as something that's supposed to be a property of the gas itself,
then, yeah, the entropy is lower.
It has decreased, right?
And you'll say that even when you...
Sorry, you mean the property of the physical system, not the gas,
like not the molecules, individual molecules of the gas,
but the physical system, the whole system.
The physical system is the whole gas, you know, the whole thing.
Yes, okay.
So the physical system is the gas in the box, right?
Got it.
All right.
And it's a standard calculation: if I tell you,
here's a box with two chambers,
and, let's say it's the same temperature
on both sides, you've got this much
gas at this much pressure on this
side, and this much gas at this
pressure on that side, then
the maximum of entropy is when the two
pressures are the same.
Right?
Standard calculation that you learn to do
in your intro thermodynamics
courses. Okay.
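The standard calculation being referred to, sketched with the ideal-gas entropy (only the terms that depend on the split are shown; temperature fixed, the two chambers each of volume V/2, and N_L + N_R = N):

```latex
S(N_L, N_R) \;=\; k_B N_L \ln\!\frac{V/2}{N_L} \;+\; k_B N_R \ln\!\frac{V/2}{N_R} \;+\; \text{(split-independent terms)},
\qquad
\frac{\partial S}{\partial N_L} \;=\; k_B \ln\frac{N_R}{N_L} \;=\; 0
\;\Longrightarrow\; N_L = N_R ,
```

and the stationary point is a maximum, so equal numbers, which at equal temperature and volume means equal pressures, maximize the entropy.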
Okay, so when you have a spontaneous
fluctuation, one thing you can
say is, well, look,
what's happening is the entropy is actually
spontaneously fluctuating
around some kind of
mean value, right?
And so there are actually
spontaneous decreases of entropy.
However, here's another way to phrase your question.
And I'm going to phrase it not in terms of entropy,
but in terms of what Kelvin and Maxwell called available energy.
So here's a question about available energy.
Available energy is: imagine I've put some physical system in front of you,
and you've got a heat bath at some fixed temperature.
And I task you with taking it, in the state it's in,
and trying to get as much work out of it as you can,
but I'm going to specify the state you have to leave the system in at the end.
Okay.
And the available energy is a measure of how much work you can get out.
And it's equivalent to what we now call the Helmholtz free energy.
Right, okay.
Which is total internal energy minus temperature times the entropy.
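In the notation just used, the standard relation is (F the Helmholtz free energy, U the internal energy, T the heat-bath temperature, S the entropy):

```latex
F \;=\; U - T S ,
\qquad
W_{\max} \;=\; -\Delta F \;=\; F_{\mathrm{initial}} - F_{\mathrm{final}} ,
```

the maximum work extractable when the system exchanges heat only with the bath at temperature T and the final state is specified.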
If a gas spontaneously fluctuates to a situation where there's more pressure on one side
and we close the hole, has the available energy increased?
I'd say yes.
You'd say yes.
Because you can now use it to do something.
Okay, so how are you going to use it to do something?
If all the molecules are now on one side...
Or just more of them, yeah.
Okay, more of them.
Then you can place something here,
and it will spontaneously start to push this guy to the left.
Which side are you going to place it on? Which side are you going to do that on?
What do you mean?
So if all molecules are one side,
then you can just have a piston on that side.
So we have the gas and, you know, let's just say all the molecules on one side, okay?
So they're either all in the left side or on the right side.
Yeah.
And, okay.
So what do you do to get work out of it?
You place the piston at the middle point that divides it, and then you watch the piston go.
Right. So suppose you want to raise a... So if I just let the piston go, then I haven't gotten useful
work. Suppose I want to raise a weight.
Uh-huh.
What do I do?
I don't know.
Well, here's the thing. If I hook the thing up, if I attach a weight to a piston, and I want to raise the weight,
so say I've got the piston, I've got a string on a pulley,
and you've got a weight that can go up or down, right?
You know, that string and pulley can be either on the left side or the right side of the piston.
And if I don't know which side of the box the gas is in...
Oh, sorry, I didn't realize that you don't know which side.
Yeah, all I said was it's either in the left side or the right side, right?
I see. Okay.
Okay, so if you don't know, then, you know, if you guess right, you might say, okay, I'm going to guess,
and I'm going to put the weight on this side of the piston, and you could end up raising it, right?
But if you guessed wrong, you could end up lowering the weight.
Yes.
Yes, so this is why, according to some people, it makes sense
to have entropy be a function not only of the physical state of a system,
but of what somebody knows about it.
Because...
If entropy is supposed to have the connection with available energy that I just said,
that is, if available energy has to do with how much work you can get out of a system,
then that depends not only on the physical state of the system,
but on the means of manipulating the system available to you and what you know about it.
Uh-huh.
And what I just said, you know, is non-controversial.
If the question is, how much work can I get out of a system?
That depends on my means of manipulating the system and what I know about the system.
This is a non-controversial thing.
And if you want a notion of entropy to have this connection to available energy,
then it makes sense to have a notion of entropy which is relative to means of manipulation available
and a state of information about the system.
And in my experience, I can say that to people
who initially say to the angel question,
oh, of course not, entropy is patently a property of the physical system alone,
it's got nothing to do with what you might know about it.
If I say, oh, well, if I think of thermodynamics as a resource theory,
a theory about what agents with certain goals and certain means of manipulation
can do with systems, and I want this notion of available energy to be a measure of how much
work you can get out of a system, then clearly available energy can be relative to means
of manipulation and knowledge about the system. Like, it matters what you know, because if you have
to do different things to the system to get work out of it, depending on which side of the
box the molecules are on, then information is a resource, right? And so there's a perfectly
acceptable notion, and it doesn't matter
whether you call it entropy or not, but there's this notion that has that connection to available
energy. And what are we going to call it? Well, if you don't like to call that entropy,
you know, make up a new word for it. But it is actually very closely related to the concept
that Clausius coined the term entropy for. And the difference between this and
the Clausius notion of entropy is: in traditional thermodynamics, you know, thermodynamics 1.0,
what people were doing in the 19th century,
they always assumed that even if matter is made up
of lots of molecules and there are these fluctuations
at the molecular level,
we're dealing with them in bulk
and any fluctuations are going to be negligible.
Things might as well be predictable.
And so we can just assume we know what's going to happen
as a result of our manipulations.
When you start getting down to the molecular level,
and this is what the people who are working in quantum thermodynamics
are doing, they're saying, okay, you know, we're working at a level where these molecular fluctuations
aren't negligible. And what you really want for a notion of entropy is something
that's relative, say, to certain physical characteristics, but also, you know, the system might be
in this state or in that state, and what I can do with it, how it's going to respond to
my manipulations, can depend on what state it is in.
So it can actually be relative to some, say, probability distribution
over possible states.
And then what you get is a second law of thermodynamics
as a sort of statistical average.
Now, what happens in textbooks these days is...
there's basically... even in statistical mechanics textbooks.
So there are textbooks that basically take that kind of approach,
and whether you notice it or not,
entropy is actually defined in terms of a probability distribution,
which can represent a state of information about the system.
And then the other way of doing it,
and this is what you were alluding to earlier,
is where you say, well, look, given any constraints on the system,
there'll be a set of available states,
there'll be a certain number of possible states available to it,
and the entropy is just proportional to the logarithm
of the number of possible states, right?
And so that is what's often called
Boltzmann entropy.
And the other one where entropy is defined
in terms of probability distribution
is often called Gibbs entropy.
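For reference, the two definitions just contrasted, in their standard forms (k_B is Boltzmann's constant, W the number of microstates compatible with the macrostate, and the p_i a probability distribution over states):

```latex
S_{\mathrm{Boltzmann}} \;=\; k_B \ln W ,
\qquad
S_{\mathrm{Gibbs}} \;=\; -\,k_B \sum_i p_i \ln p_i .
```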
And they both have a purpose,
but they have different uses. They're different things, and they have
different uses. So if I give you a box and say, with probability one half all the
molecules are on this side, and with probability one half on the other side, that will have a certain
Gibbs entropy associated with it, which will have something to do
with what you can do with it,
what work you can get out of it.
And that will vary with what those probabilities are.
Like, if you have a very
high probability that they're on the left side,
that is more valuable to you as a resource
for getting work out of than if
it's 50-50, right?
Yes.
Right.
And so if I'm trying to have a notion of entropy
that is an indication of how valuable the thing is as a resource
for getting work out of, it makes total
sense for it to be relative to, say, some
probability distribution. However,
you know, people will also
say: if it's either all on this side or all
on that side, you calculate
the Boltzmann entropy if it's on this side,
you calculate the Boltzmann entropy
if it's on the other side, and it's the same.
Ah, I see.
And that's also correct, right?
So the Boltzmann entropy doesn't depend on what you know about the system.
The Gibbs entropy does.
They serve different purposes, just different concepts.
And what happens when people get into arguments about whether or not it makes sense
for entropy to be relative to a state of information
is that they have in mind different concepts
of entropy, which are perfectly well-defined, but for different purposes.
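As a rough illustration of the contrast (a toy sketch in Python with made-up numbers, not anything worked through in the conversation): the Boltzmann entropy below depends only on which macrostate the gas is in, while the Gibbs entropy depends on the probability distribution you assign over the alternatives.

```python
from math import comb, log

def boltzmann_entropy(n_left, n_total):
    """S_B = ln W (taking k_B = 1), where W counts the microstates in the
    macrostate 'n_left of the n_total molecules are in the left half'."""
    return log(comb(n_total, n_left))

def gibbs_entropy(probabilities):
    """S_G = -sum_i p_i ln p_i over whatever distribution you assign."""
    return -sum(p * log(p) for p in probabilities if p > 0)

N = 4  # toy number of molecules

# The case from the conversation: "all on the left" and "all on the right"
# have the same Boltzmann entropy (here both are ln 1 = 0) ...
print(boltzmann_entropy(N, N), boltzmann_entropy(0, N))

# ... while the Gibbs entropy tracks the probabilities you assign to the two
# alternatives, dropping as the box becomes a more reliable resource for work.
print(gibbs_entropy([0.5, 0.5]))    # ln 2 ~ 0.693
print(gibbs_entropy([0.99, 0.01]))  # ~ 0.056
```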
Okay. When I said probability here, the person who's listening may be thinking,
a probability of what? And then what we didn't say much about, maybe you mentioned it once or twice,
but not much, is microstate versus macrostate. So it's the probability of a certain macrostate.
What is a macro state? It's seen as a count of microstates. What is a micro state? Well,
when people say, what is the physical system? Most of the time on this channel, when we're speaking
about quote unquote fundamental physics, we're thinking of a microstate. So a macro state is then what?
Like, what the heck defines a macro state?
Is it just us as people, we say, this is something we care about more?
So we're going to call this a macro state.
Yeah, so that's a good question.
And, in fact, you'll find different answers in different textbooks.
Because the people who want entropy, statistical mechanical entropy, to be a property of the system by itself,
they usually mean Boltzmann entropy, right?
But the Boltzmann entropy, what you do, the first step you do is you partition the
set of possible microstates into macrostates, and you say whatever microstate it's in,
it's going to be in some macro state, and some macro states correspond to a bigger range of
possible microstates than others, and the macrostates which
correspond to a bigger range of microstates have a higher entropy than the ones that
correspond to a narrower range of microstates.
And so the entropy doesn't change with every change of microstate: if the microstate changes within
a macrostate, the entropy doesn't change, but if it crosses from one macrostate to another,
then the Boltzmann entropy changes.
But the entropy isn't a property of the microstate alone,
because it requires this division into macrostates,
which isn't there in the fundamental physics.
Yeah, so technically speaking,
any given microstate has entropy zero.
Well, if you're talking about Boltzmann entropy, right?
Then, in order to define Boltzmann entropy,
I first have to partition the possible states into macro states.
Right.
Right. Okay.
However, if I tell you,
if I'm going to do a really, really fine partition, right?
You say, my partition is such that
every microstate is in a different element of the partition.
I'm going to tell you exactly what the microstate is, right?
Then, yeah, every one of those microstates will have zero entropy, right?
Yeah.
But that would be kind of useless.
That would be totally useless, right?
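A minimal sketch of that coarse-graining step, assuming a toy system of N two-sided molecules (my own illustration, not the speaker's): macrostates that cover more microstates get a higher Boltzmann entropy, and the finest possible partition makes every entropy ln 1 = 0.

```python
from collections import Counter
from itertools import product
from math import log

# Toy coarse-graining: every microstate is an assignment of N molecules to the
# Left or Right half; the macrostate is just how many sit on the left.
N = 6
microstates = list(product("LR", repeat=N))          # all 2**N microstates

def macrostate(micro):
    return micro.count("L")                          # coarse-grained description

sizes = Counter(macrostate(m) for m in microstates)  # W for each macrostate
for n_left, W in sorted(sizes.items()):
    print(f"{n_left} on the left: W = {W:2d}, S_B = ln W = {log(W):.3f}")

# The finest partition puts each microstate in its own cell, so W = 1 everywhere
# and every Boltzmann entropy is ln 1 = 0: true, but useless.
```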
So I think that even the people who are saying,
no, entropy can't depend on us.
It can't depend on what we know about it.
It can't depend on how we can manipulate it.
If what they're using as their notion of entropy is Boltzmann entropy,
it starts with a division,
dividing up the set of possible states into macrostates.
And you asked exactly the right question:
well, what is a macrostate?
Okay.
Now, one thing that people often say is,
look, there's certain variables that we're going to be measuring, macro variables. And our measuring
instruments are going to have a certain range of precision, and a macrostate is a set of
microstates that are indistinguishable according to the measurements that we're going to do.
And then it's not there in the fundamental physics, because it's relative to
some set of measurements, some set of instruments,
some measurement precisions.
And I think that that's perfectly fine.
And then saying, okay, well,
we're not doing fundamental physics
when we're talking about entropy.
I think that's perfectly fine.
It bothers people because entropy increase
is supposed to be one of the fundamental laws of the universe
and is not supposed to depend on things
that aren't there in the fundamental physics.
But, you know, that just might be the right answer, right?
Like, thermodynamics is not a fundamental theory, right?
Now, another thing you could say is, well, what really matters is, you know,
if I'm thinking about this as a resource theory, a distinction between macrostates...
well, I'm going to pay attention to distinctions if they make a difference to what I can do with them,
not if they don't.
So if I tell you how many molecules there are on one side of the box
and how many molecules on the other,
okay, that's really useful to know,
because I can use that to expand one side or another.
That's good to know.
If all I've got is a piston that can expand
things in bulk, and I don't have the means to manipulate things at the micro level,
you know, you tell me anything beyond that, you tell me the exact microstate,
it doesn't affect what I can do with it, right? It doesn't affect what I can get out of it.
Yeah. And this is something that a lot of people misunderstand
about Maxwell's demon example. The demon example was meant to illustrate the
dependence of thermodynamic
concepts like entropy
on the
means of manipulation available.
Mm-hmm.
So the first appearance of what we now call Maxwell's demon was in a letter from Maxwell
to his friend Peter Guthrie Tait, who was writing a sketch of thermodynamics.
And Maxwell says, you know, here's how you might pick a hole in the second law, because he's saying that the second law, if the kinetic theory of gases is true, needs to be modified. And, you know, imagine some little being that could manipulate molecules individually, or imagine that it's got a little trap door in between the compartments of the...
I'll place a video on screen about this.
Yeah, imagine that you've got gas in a box divided into two compartments,
and there's a little trap door, and the demon can manipulate the door
and let the faster molecules go one way and the slower molecules go another way.
Well, that demon could, without expenditure of work,
or with minimal expenditure of work,
create big pressure and temperature differences that we as macroscopic beings could then exploit.
And the moral of the story, according to Maxwell, is that the second law of thermodynamics
is a statistical generalization, which is applicable only to situations where you're dealing
with large numbers of molecules in bulk.
And when he says statistical generalization, he's expecting his readers to be familiar with
the sorts of statistical generalizations that the social sciences were coming up with, right?
Like numbers of murders per capita per year, say.
And if you think about it, there actually is a nice analogy.
So if you keep sort of the macro-level conditions the same,
the broad-scale socioeconomic conditions the same,
then plausibly you're going to get a fairly stable number of murders per capita per year,
year after year, in a given situation.
But, you know, imagine a team of psychologists going in and talking to people, if they had the ability to identify people who were at risk for committing murder or something like that, and talk to them and deal with them.
If they could deal with people on an individual scale, then you might be able to change that murders-per-capita rate.
And so what Maxwell's saying is, this demon would be able to do what is at present impossible for us,
because we do not have the ability to manipulate things at the molecular level.
The way he put it made it clear that he didn't think there's any fundamental law of physics that would
prevent further technological developments from getting to the point where we could do this.
Yes.
Now, as a matter of fact, he didn't really see this, but if you now include the demon and make the demon operate in a cycle,
so the demon, whatever it does, has to reset itself at the end of each iteration of whatever it's doing,
then it actually is a consequence of the laws of physics,
both classical and quantum,
that on average the demon can't break the second law of thermodynamics.
In the classical case,
you take the demon plus the whole system as an isolated system.
Yes, yes.
If the demon can operate in
a cycle while reliably
putting all the molecules in the left side of the box,
that is incompatible with
Hamiltonian evolution, which conserves phase
space volume. You'd be able to take
a system and actually
reduce the volume of phase space occupied by the system.
And there's something similar in quantum mechanics:
if you've got the whole
system undergoing isolated evolution, then you can't take something
that's initially spread out over a big subspace of Hilbert space and put it into a small
subspace.
And so actually, Maxwell didn't realize it, but if you require the demon to act in
a cycle, there are theorems, to the effect, in both classical and quantum
mechanics, that the demon cannot
reliably and continually
do this.
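For reference, the two standard results being gestured at can be written schematically like this (textbook statements, not the speaker's own formulation):

```latex
% Classical: Hamiltonian flow preserves phase-space volume (Liouville's theorem),
% so an isolated demon-plus-gas system cannot reliably shrink the occupied region.
\frac{\partial \rho}{\partial t} + \{\rho, H\} = 0
\quad\Longrightarrow\quad
\operatorname{Vol}\big(\Gamma(t)\big) = \operatorname{Vol}\big(\Gamma(0)\big)

% Quantum: unitary evolution preserves norms and inner products, so it cannot map
% a large subspace of Hilbert space into a strictly smaller one.
\lVert U(t)\,\psi \rVert = \lVert \psi \rVert \quad \text{for every state } \psi
```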
Precisely, what do you mean when you say that the demon
acts in a cycle? The demon has
to
end up in the same physical
state it started out with.
Okay, why do you have to do that?
I imagine that, look, if the demon
has a brain and is
opening and closing this door, then the brain
changes. Yeah.
Yeah, right.
So here's what people were thinking,
and, you know, you're right to talk about the brain,
and, you know, people have given simple models of this
as, like, not, you know, not a creature with a brain,
but maybe a little device with, like, a memory device
or something like that, yeah.
Yeah.
So the idea is that if the demon has some kind of memory storage,
and if it never, ever erases it,
and it always remembers what it did on previous iterations and never ever erases anything,
eventually it's just going to run out of memory.
So it can't keep on doing this forever and ever and ever.
And if it has to act in a cycle, if it has to eventually erase the memory,
then there's actually an entropy cost
associated with erasing the memory, and that's sometimes known as Landauer's principle.
Mm-hmm, yeah.
And it really is just basically a consequence of what I just said,
that if you require the demon to act in a cycle,
then it can't consistently or reliably violate the second law.
So if I don't
require the demon to act in a cycle,
then, yeah, what it can do is,
okay, think about that blank memory as a resource.
And it's doing this,
and eventually it uses up that resource
and hands you this box with a higher pressure
on one side than the other and says, okay, good,
now you can use that
to raise a piston or something like that.
Okay, all you did was you took a resource
and you converted it to another resource.
You didn't actually violate the second law.
Yes, yes, I see. Okay.
So you actually have to think about that memory reserve,
that blank memory reserve, as having an entropy of its own.
So a memory which is just blank, or maybe full of all zeros,
on this view, has a lower entropy than a memory that's randomly populated by ones and zeros.
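A back-of-the-envelope illustration of that erasure cost, using the standard Landauer bound of k_B T ln 2 per erased bit (the numbers here are just for orientation):

```python
import math

# Landauer's principle (standard statement): erasing one bit of memory at
# temperature T dissipates at least k_B * T * ln 2 of heat into the environment.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # roughly room temperature, K

per_bit = k_B * T * math.log(2)
print(f"Minimum erasure cost per bit at {T:.0f} K: {per_bit:.3e} J")   # ~2.9e-21 J

# Even erasing a gigabyte of demon memory costs a minuscule (but strictly
# positive) amount, which is what blocks the demon once it must reset itself.
print(f"Minimum cost to erase 1 GB: {8e9 * per_bit:.3e} J")            # ~2.3e-11 J
```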
Okay, so let me see if I got this.
There are two cases.
Either it operates on a cycle or it doesn't.
If it doesn't, it's going to use up a
resource, in which case you still have a resource theory. If it does operate on a cycle,
then fundamentally you'd be shrinking phase space. Now, I know that most physical systems
shrink phase space because there's some friction and so on, but at a fundamental level,
you don't shrink phase space. Sorry, you don't shrink the volume you initially started with
in phase space. Absolutely. Yeah. So when you've got a dissipative system,
like something with friction, right? Then you write down equations of motion and it will go
from everything in this original region of phase space to a smaller one, right?
But that's because we're not actually thinking of that system as an isolated system.
Like, it's in contact with something that's a source of friction, right?
And if you include everything, like the pendulum that's going back and forth
and whatever medium it is that's the source of friction, and you think of all that as undergoing
isolated evolution,
and you think, okay, ordinary Hamiltonian dynamics is going to apply,
then that system as a whole is not going to shrink phase space.
What's happening is, as the pendulum is damped
and it occupies a smaller and smaller region of phase space,
it's heating up the environment, warming it up,
and basically it's transferring energy to the environment,
increasing its entropy.
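A small sketch of that contrast, assuming a linear damped-versus-undamped oscillator (a toy example of my own choosing, not the speaker's): with friction in the equations of motion, phase-space area contracts; drop the friction term and the Hamiltonian flow preserves it.

```python
import numpy as np
from scipy.linalg import expm

# For a linear flow dz/dt = A z, an initial phase-space area is rescaled by
# det(exp(A t)) = exp(trace(A) * t) after time t.
omega, gamma, t = 1.0, 0.5, 4.0

A_damped   = np.array([[0.0, 1.0], [-omega**2, -gamma]])  # pendulum with friction
A_isolated = np.array([[0.0, 1.0], [-omega**2,  0.0]])    # frictionless (Hamiltonian)

print(np.linalg.det(expm(A_damped * t)))    # exp(-gamma*t) ~ 0.135: area shrinks
print(np.linalg.det(expm(A_isolated * t)))  # ~ 1.0: area preserved
```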
Yeah. So I have some funny questions for you.
Yeah. Okay. All right. Good. I can't promise the answers are going to be fun.
Okay. Well, you're an expert in quantum mechanics and quantum field theory. And I'd like to talk to you about that next time in person because you live actually close. So hopefully we get to meet up shortly.
For my colleagues in Europe, London, Ontario, and Toronto, Ontario count as close. My colleagues in the Netherlands always find it funny when I say things like that.
So there's a Heisenberg cut that's often referenced when it comes to the measurement problem.
Now, is the partitioning of macrostates, is there an analogy of that as to what we think of as a macro state versus a microstate and the Heisenberg cut for the measurement problem or are these two separate issues?
That's a good question.
And I will really have to think about that. I sort of see what you're getting at;
prima facie, there might be a connection.
But I'm not seeing exactly what the connection might be.
And it's not obviously wrong.
So I would have to think about that.
Yeah.
There might be, actually.
Yeah.
My second funny question is: is the universe
an isolated system?
Is the universe an isolated system?
Can we even talk about the universe as a whole?
Presumably, yes, if you actually mean,
yes, everything in the universe is...
If you literally mean universe is everything there is,
then...
It seems tautologically the case.
Right. However, another question is,
does the universe as a whole
obey the sorts of laws
that we usually think of as
applicable to isolated systems?
And here's why this is a
genuine question:
it might be that, okay,
for relatively small systems
that we can actually isolate,
the physics we apply to isolated systems
applies, but when things are big enough,
that doesn't actually apply.
And what I mean by that is
In quantum mechanics, usually, when you're asking, is the universe an isolated system,
what you usually mean is: does it evolve according to whatever the appropriate analog of the
Schrödinger equation is? That can be represented by a family of unitary operators,
and that preserves Hilbert space norm, and so it can't start out in, say, a small
subspace of Hilbert space and evolve into a bigger one, right? But people who take dynamical collapse
theories seriously think that this isolated Schrödinger evolution that we apply to the sorts
of systems in the lab that we isolate is actually an idealization and not quite right.
And that if you have systems that meet certain criteria, either they have enough particles
or they involve displacements of large masses,
then actually the physical law is a different one,
one that isn't the law that we usually apply to isolated systems
and that, in fact, mathematically mimics the sort of laws
that we use for systems that aren't isolated.
As you know, because I know you've talked to people about this,
there are dynamical collapse theories, right,
which modify the usual Schrödinger equation.
And basically, the origin of those was
people studying the evolution of non-isolated systems
and saying, okay, here's what happens to the state of the system
if it's, say, in contact with a heat bath or something like that,
and then saying, well, let's just imagine that something of this form
or a similar form actually is the fundamental law.
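Schematically, and only as the standard textbook forms (the specific collapse models fill in particular choices of the operators below):

```latex
% Isolated (unitary) Schroedinger evolution: norm-preserving and reversible.
i\hbar\,\frac{d}{dt}\lvert\psi\rangle = H\lvert\psi\rangle,
\qquad \lVert\psi(t)\rVert = \lVert\psi(0)\rVert

% Lindblad-form master equation: the generic form used for a system coupled to an
% environment; dynamical-collapse models promote an equation of this general shape
% to a fundamental law, with particular choices of the operators L_k.
\frac{d\rho}{dt} = -\frac{i}{\hbar}[H,\rho]
  + \sum_k \Big( L_k \rho L_k^{\dagger}
  - \tfrac{1}{2}\big\{ L_k^{\dagger} L_k,\ \rho \big\} \Big)
```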
In which case, if that's right, it'll still be a tautology that the universe is isolated.
But it might be that the way the universe as a whole evolves is as if it's continually
being monitored by some external measurer.
Yeah.
Yeah.
Okay. My other funny question is: it's often said that at some point we'll reach the heat
death of the universe, and that we won't be able to do anything, even if we suppose that
we're around, or whatever our descendants are. Now, do you imagine that to be the case? Because
if we're thinking in terms of a resource theory, then I could imagine that there would be certain
questions that would be more important to us, that would be different for whatever our
descendants are; maybe they are able to utilize a system with more precision.
Absolutely. Absolutely. The way we actually calculate entropy, and this is something
that's not often emphasized in textbooks, is relative to a certain set of parameters that we
think we're interested in or that we're going to manipulate. For example, if I want to
calculate the entropy of some standard volume of gas, the question is, well,
do I count samples of gas with different isotope ratios differently?
And as long as I'm only dealing with things chemically,
it doesn't matter how much of my oxygen is one isotope and how much is another.
If I'm dealing with nuclear reactions, or if I've got some way to separate out things according to their mass,
then that actually might be something,
and so I might want to include that in the entropy calculations.
So there's a sense in which entropy is relative to, you know, what it is that you're going to manipulate.
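As a standard illustration of that relativity, the ideal entropy of mixing is the term that gets included or dropped depending on whether the isotopes are distinguishable by anything you can do:

```latex
% Ideal entropy of mixing for mole fractions x_i of the isotopes (standard formula):
\Delta S_{\mathrm{mix}} = -\,n R \sum_i x_i \ln x_i
% Ordinary chemistry cannot tell the isotopes apart, so this term is left out of the
% bookkeeping; once you can separate them by mass, it belongs in the entropy.
```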
However, when people are talking about the heat death, the point is: no matter what it is that you want to do, the natural tendency for things, left to themselves, is towards a diminished ability to do that, right?
So eventually, and it was really in the 19th century
that people started talking about the heat death of the universe.
Kelvin himself wrote an article called
On a Universal Tendency in Nature to the Dissipation of Mechanical Energy.
And so what is going to happen, if things just keep going,
is the sun's going to burn out and, okay,
pretty much anything that we want to do here on Earth,
like no matter what your goals are
or your means of manipulation,
it involves some kind of entropy difference
that ultimately traces back to the fact that on one side of us
we've got this high-temperature source of low entropy energy,
and on the other side we've got this low-temperature empty space
that we can radiate stuff back to.
And, okay, eventually that's going to run out
even if you have more subtle ways of manipulating things
eventually everything's going to decay into black holes
and no matter what your goals are
and no matter what means of manipulation you have,
eventually things are going to run out
and just stay that way forever,
unless Roger Penrose is right about his conformal
cyclic cosmology, in which,
after that happens,
things get restarted.
But,
so that's true.
And, honestly,
we're talking
absurdly long time scales,
like billions and billions of years.
So,
I think we should be more worried
about what things are going to be like
for people on Earth in the next few centuries.
You know,
that is something that we can do something about,
and what's going to happen
on the time scale of millions and billions of years,
that's actually hard for us to wrap our heads around.
So, like, some people find this,
you know, heat death of the universe,
sort of depressing, and
sometimes people even say,
okay, this makes everything meaningless.
Well, you know,
you've
known most of your life
you're going to die eventually, right?
And you've got a certain limited amount of time
to do stuff with, right?
And do what you can with it while you have it.
You know, make the best of the time that you have, right?
And that applies on the human time scale,
and I think we can say the same thing, you know,
supposing the human species is only going to be around for another million years.
Well, I would say to that species, you know,
make the best of what you can with the time you have, right?
So actually, I don't find it depressing.
I mean, I do get occasionally, like, you know,
everyone has to deal with the fact that you and everyone else that you know is mortal
and you have a finite lifespan.
I do, you know, it's hard not to have feelings about that.
I find it difficult to actually have any emotions or feelings
about what's going to happen in a billion years.
Yeah.
Disorder.
Disorder.
Entropy, yeah.
Disorder is a word that we haven't said.
Yeah.
And it's something that many people in the popular press
when speaking about entropy make an equivalence
between entropy and disorder.
What's wrong with that?
You have to be careful about what you mean by order and disorder, right?
And so there's a real sense in which,
if I've got a box of gas,
and there's a partition and all the gas is on one side
and none on the other,
that is a more ordered state than if the partition is out and there's gas all over the place.
And one way of thinking about that is that if the gas is indeed allowed to roam freely,
you know, it could spontaneously end up on one side of the box,
but that corresponds to a very, very small region of its phase space.
And so the idea is that there are certain kinds of states that we find to be ordered,
and those are just a small percentage of all the possible states, right?
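To put a rough number on "very, very small" (a back-of-the-envelope sketch, with a mole of gas picked just for illustration):

```python
from math import log

# If each of N molecules is independently equally likely to be in either half of
# the box, the fraction of microstates with ALL of them on one side is (1/2)**N,
# and letting the gas spread over both halves raises the entropy by N * k_B * ln 2.
k_B = 1.380649e-23   # J/K
N = 6.022e23         # about a mole of gas, chosen just for illustration

log10_fraction = -N * log(2) / log(10)
print(f"log10 of the all-on-one-side fraction: {log10_fraction:.2e}")  # ~ -1.8e23
print(f"Entropy gain on free expansion: {N * k_B * log(2):.2f} J/K")   # ~ 5.76 J/K
```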
And there is a sense in which entropy is a measure of disorder. So what you're doing,
let's say I'm generating heat by friction, like I'm grinding
this cannon bore, right?
There's a sense in which I've got some regular, ordered motion.
I've got this thing going around like that,
and I'm taking energy from that ordered motion,
and I'm transferring it to the iron in the cannon,
which is manifested as
higgledy-piggledy motion of the molecules.
I think what's not right about that
is that not everything that we would
intuitively think of as a distinction between order and disorder
actually corresponds to a distinction in energy.
Sorry, I mean entropy, right.
Yeah, so if...
The easiest way that I think about it is a coffee cup,
and initially it's just black coffee, let's say.
And then you pour some milk, and then there's all this turbulence,
and you'd say, oh, that's extremely disordered, and so you stir it,
and then you're like, oh, wow, now it's extremely ordered,
but it actually has the highest entropy.
Yeah, right, yeah.
So, I mean, that's a really good example, because there are some things that seem more disordered to us that are actually lower entropy.
You've got, you know, before the milk... it's actually better if it's cream, because the cream can take some time to disperse, right?
So if I take some thick cream and put it in the coffee cup, I might
have these swirls of cream in there, right?
And that seems, you know, very turbulent and disordered.
And then it settles down to a situation where the cream is evenly distributed.
And that is a higher entropy state than the intermediate state, but it seems to us
like a simpler state, more ordered.
And that's why it's important to think actually in terms of
order and disorder on the molecular level.
And also, not everything that we think of
as more or less ordered
really corresponds
to entropy differences.
So when gravity
comes into play,
the natural tendency,
if I've got a bunch of gas
spread out in interstellar space,
the natural tendency
is for it to gravitationally clump together.
And so the clumpy...
Yes, yeah.
Right?
And so a bunch of gas uniformly spread
out, which clumps together and forms a star, that's actually an entropy increasing process,
even though intuitively you might think the end state is more ordered than the initial state.
And so, thinking in a rough and ready way, there is a sense in which
molecular disorder and entropy go together. But it's not a reliable guide. And I think what sometimes
people have in mind when they say
order and disorder is actually something a bit different,
which is what people sometimes call complexity.
So Sean Carroll was here sometime last year,
and he was talking about origins of complexity,
and what people who study complexity,
and that's another notion that's hard to make really precise,
tend to say is that
neither the minimum nor the maximum
entropy states are the most complex.
There's a sense in...
Right, okay.
Right. Yeah.
Something we didn't speak about
that comes to mind is ergodicity.
Ergodicity, yeah.
So are the laws of physics ergodic?
Is that a well-defined statement?
And also, please define what ergodic means.
Yeah, so classically, ergodicity
pertains to a system
confined to a finite region of a state space
and undergoing isolated evolution,
and to be ergodic means that you can
take virtually any initial condition you want
and take any finite region of phase space,
and eventually that initial condition ends up
in that region of phase space.
And for actual physical systems,
it's very difficult to decide.
Like, if I hand you a law of dynamics and say, is this ergodic or not,
it's actually a mathematically very difficult problem to decide, right?
The laws of physics deep down, we know, aren't classical.
In quantum mechanics, basically, that definition of ergodicity just
doesn't really apply.
There are things that are called quantum ergodicity theorems
that basically have the effect that any state can be approached as closely as you like.
Do the actual laws of physics,
like if I actually took, say, a box of gas
and somehow or other isolated it
and let it go according to ordinary quantum evolution...
There is a sense in which
something like ergodicity applies,
in that if you look at sufficiently long time averages,
then the amount of time it will spend in
a given subspace will be, for almost all
initial states, proportional to the dimension of the subspace.
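For reference, the classical statement being described is the standard one that long-time averages match phase-space averages:

```latex
% Classical ergodicity (standard statement): for almost every initial condition on
% the energy surface, infinite-time averages equal phase-space averages, so the
% fraction of time spent in a region is proportional to its measure.
\lim_{T\to\infty}\frac{1}{T}\int_0^T f\big(x(t)\big)\,dt
  \;=\; \frac{1}{\mu(\Gamma_E)}\int_{\Gamma_E} f\,d\mu
  \qquad \text{for almost all } x_0 \in \Gamma_E
```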
And something of that sort of flavor often comes in when people are
trying to prove equilibration results.
So, you know, what we haven't talked about is this sort of process where you leave
something alone,
it starts out from a far-from-equilibrium state, and then it goes to an
equilibrium state. And that's sometimes
called the minus first law of
thermodynamics. Right, right.
Yeah, we haven't talked about that.
And one of the reasons we haven't talked about that
is that
it's hard to say anything really precise about it,
because there are various results, and
it's not always clear.
You get nice clean mathematical
results whose physical
significance for actual systems is a bit
obscure. And then you get sort of
plausibility
arguments for
actual physical systems.
And I actually think
that ergodicity
in any sense
which really has to do
with infinite long-term
average behavior
really isn't the right question.
Because what I want to know,
if I pour the milk
in the coffee cup,
is not what it's going to do
on average
if you left it
alone, isolated, for all eternity,
but what it's
going to do in the next few minutes.
You want to know actually what's going to happen in finite timescales.
So statistical mechanics textbooks are divided on whether ergodicity is actually important
for statistical mechanics.
Some will say, okay, this ergodic hypothesis is at the root of statistical mechanics.
The hypothesis is that actual systems are ergodic.
And then others will say,
oh, there's all this
really nice mathematical work
having to do with ergodicity,
and it's completely irrelevant
to statistical mechanics.
Okay, here, with the cream and the coffee cup,
we only have to wait a few minutes.
And so it's not infinite,
it's not T goes to infinity.
However, Nima Arkani-Hamed
also talks about how, with particle physics,
particle physics occurs at the boundary.
Why? Because in the math
you're scattering from minus infinity to plus infinity.
And yes, it takes place in just a few milliseconds or a few seconds or what have you.
But for the calculations, we just use infinity.
Now, he seems to be using the opposite argument that you just used.
So would you be able to convince him?
No, no, no, Nima.
It's actually not happening at infinity.
It's not at the boundary.
Well, yeah, so when he says it's happening at infinity,
I think one thing you have to realize is that when physicists say infinity,
what they often mean is not literally infinity, but large enough that it doesn't really matter how big it is.
There's a nice book called Understanding the Infinite by a philosopher named Shaughan Lavine.
And he introduces what he calls the theory of zillions, and a zillion is a technical term.
I see.
A zillion is a number that's so big, it doesn't really matter how big it is.
Okay.
So it's context-dependent, right?
And if you think about it, that's sort of how we use the word.
Like, you know, if someone says, Wayne, you go to conferences so much, why don't you just buy a Learjet, right?
And I would say, well, you know, that costs like a zillion dollars, right?
I have no idea what a Learjet costs, right?
But I do know that whatever that cost is is so far beyond my own financial resources.
It doesn't really matter exactly how big it is, right?
And one of my colleagues said, you know, we have to realize in quantum field theory, asymptotic infinity is like five meters.
Because what you do, when you're doing these scattering experiments, right, what you're doing is there's a relatively small scattering region, right?
And far enough from that scattering region, the field is effectively free, right?
And so you're basically treating the in and out states as if they're free fields.
You're doing your calculation, and you're calculating scattering cross sections, et cetera, for effectively
free fields. And really, you know, you'll say "at infinity," and mathematically you might
take the limit as things go to infinity, because that's a nice clean result,
but what you really mean is: this is a good approximation far enough from the scattering region that the interactions can be neglected.
And I think that that's what he means, when he says the interesting stuff happens at infinity, right?
And so with something like that, if I've got an interaction and I have a sense of how fast the interaction falls off with distance,
I can get a sense of how far I have to be from the scattering region
to say, okay, these are effectively free fields, right?
What we want from equilibration results
is some result about, okay, how long do I have to wait
until I say, okay, yeah, we're effectively at infinity
because the thing has equilibrated.
And that's what you're trying to get out of the equilibration results,
and it's not as simple as in the scattering case,
because in the scattering case you've got these distance-dependent forces,
and you know how fast they drop off.
And what you're trying to find out in the equilibration case is, well,
how fast does something equilibrate?
How fast does it get to the point where I can basically ignore the fact
that it was out of equilibrium at the beginning?
Okay, I have another funny question.
Yes, okay.
So Nathan Seiberg said that one of the ways,
one indicator, for quantum field theory being on shaky foundations, or not firm
foundations, is that we teach quantum field theory differently. So almost no textbook on quantum field
theory is the same, and almost no course is the same. So some person may say, let's start with
scalar fields and then add interactions. And then some person may say, well, let's start with all
free fields and then add interactions, and others use a functional approach, and so on and so on.
So that's not a controversial statement,
that QFT isn't on firm foundations.
But what I'm wondering is, do you personally, Wayne,
have in mind a field or subfield,
some particular subject in physics, that other people think,
no, no, this is well understood, but you think,
actually, there is trouble here?
Well, I think that statistical mechanics is a case in point,
because
textbooks are written
to give the impression
that we understand everything
and this is all worked out,
and if you actually
go from one textbook to another,
you'll find very different approaches.
Like, in quantum field theory
everyone knows that there are different approaches,
and in statistical mechanics
it's sort of swept under the rug.
And so that is one case
where I think there are real questions
about the rationale
for certain kinds of methods
that got kind of swept under the rug.
And the thing about thermodynamics is,
even though we don't think of it as fundamental,
cutting-edge science, and it's got its roots in 19th-century physics,
different thermodynamics textbooks will take very different approaches.
And I think the root of that is
that there are sort of different conceptions about what
thermodynamics is supposed to be. So one conception is what I call the resource-
theoretic conception, where it really is about what you can do with various things. But what
people usually want from a thermodynamics textbook, especially a chemical one, preparing
people for doing chemical thermodynamics, is to figure out what the equilibrium states
of a system are. And those are the ones that maximize entropy.
And a textbook with that orientation will tend to minimize talk of manipulations and doing work and things like that,
and treat entropy as if it's simply a property of matter, like mass and other things.
I think in a lot of areas, actually the textbook tradition will sometimes obscure different ways of thinking about the theory.
And so the question is, what areas are there where things are sort of settled, where everyone agrees on how to do this?
I guess classical electrodynamics; every single textbook in existence is a copy of J.D. Jackson.
Jackson's book.
Yeah.
Yeah.
And I think,
yeah, that's plausible, because there is
this theory that we call classical
electrodynamics that we think has been superseded
by quantum electrodynamics.
So we can all agree on what classical electrodynamics is,
because it's, in a sense, a closed book.
And quantum field theory is a continuing area of active research.
And, yeah, so one of the reasons for the difference of approaches in quantum field theory is we just don't have as good a mathematical grip on the theory as we do in other areas of physics, right?
You know, you write down a
Lagrangian, the standard model
Lagrangian, and
is this a well-defined thing?
You put in a cutoff, and
if you let the cutoff go
to infinity, you have
blow-ups, which you have
certain techniques for regulating.
Is that telling us that
the theory we wrote down actually isn't
well-defined at all
energies, or are these cutoffs that we're introducing just a calculational tool for getting at
the consequences of a well-defined theory?
I think that, as far as I understand it, and there are people who are much more on top of the literature,
I actually think that that's still more or less an open question.
I think the standard view is it doesn't really matter whether the theory you're writing
down is well-defined at all energies.
because we think that it's an effective field theory valid at certain energies,
and we don't know what's going on beyond those energies.
But you don't buy that answer, or what?
I think that's right.
I think that's right.
So that's why we can get away with actually not knowing the answer to
whether the theory we write down is actually well-defined at all energies.
If that's your attitude, then it doesn't really matter whether it is.
What's a lesson you learned too late?
What is a lesson I learned too late...
And I'm assuming you want to know about, you know,
physics and philosophy of physics, and not about
personal life.
So, okay, this idea, what I've been saying about thermodynamics,
that there are two different conceptions of what the theory is:
there's the resource-theoretic conception and this other conception
according to which it's more like mainstream physics.
That took me a surprising amount of time to actually get clear about in my own head.
But now I think it should be one of the first things that anyone says when they're talking about thermodynamics.
And as you know, I've given talks several times with the title, A Tale of Two Sciences, Both Called Thermodynamics.
And, yeah, it's only in relatively recent years that I thought, okay, that's the way I should
be thinking about that.
Now, many people watch this channel who are researchers in math, physics, computer science,
and adjacent fields, and philosophers, and there are also lay people who watch.
So I'm curious what advice you have that you give to your graduate students, but also advice
that may apply to this wide gamut of people that watch.
Okay, I would say, here's advice I give to my grad students, and it would
apply to researchers in any field if they're just starting out or something like that.
When you're choosing what things to work on,
what you should not do is look around and say, okay, what's the hot topic, what's the popular
thing, and jump on the current bandwagon. And that's for two reasons. If you're doing something
because you think it's popular
and you're not personally interested in it.
Well, if you're not interested in your work,
there's just no way you're going to get anyone else
interested in your work, right?
And also, if you're jumping on a bandwagon
and then you're applying for jobs
and you're submitting your things you've written
or you're submitting parts of your dissertation
for publication,
my experience as an editor,
I was editor of a philosophy of physics
journal for a number of years,
my experience as an editor is: when we get a paper which is the nth minor addition to a well-worn topic,
the threshold for that being worth publishing is very, very high, because, right, in deciding whether to publish,
even if what you're saying is correct, one of the things you're asking is, okay, if this is going to take up journal space,
is this actually a significant advance over what's out there in the literature?
And if something's a hot topic,
then people are going to get tired of it fairly quickly.
So do something you're interested in,
but don't choose something so narrow
that only three people are going to have any idea
of what you're talking about. So there's sort of a happy medium
between choosing a research topic that some people
are going to have some knowledge about
and jumping on the bandwagon and doing what everyone else is
doing. Yeah. Okay, so let's imagine you were speaking to your PhD and postdoc students
who want to get a job in the field. I imagine that at some point they have to maybe not jump on
a bandwagon, but hitch a ride occasionally because don't you have to get grants, don't you have to
be marketable? Okay. So how do you navigate that? So my teacher, Abner Shimony, I had the
honor of working with him at Boston University when I was a grad student. One thing he would say is,
just as Aristotle taught us that ethical virtues are means between opposing vices.
Intellectual virtues are also means between opposing vices.
And I think when you're choosing a research area, there's two opposing vices.
One is choosing something in such a narrow area that only three people in the world
are going to have any idea what you're talking about.
And I think the other vice is jumping on a bandwagon and doing what everyone else is doing.
And I think the reason I mentioned that other vice first is that I think there's a mistaken impression out there that that's what you're supposed to do.
That's what you should be doing, right?
And I think that that's a mistake.
And this is based on my experience as an editor, and also on talking to other people in the field who edit journals, and also on my experience on panels that adjudicate grants and things like that.
Sure, like if someone reads something and says,
I have no idea what this is about, okay, that's a tough sell, right?
But if you've got a dozen grant applications in front of you,
and ten of them are minor variants on the same thing,
and what they're going to do is at best
make a minor advance in a well-worn field,
and another one is an interesting and promising research project that is worth doing but relatively unexplored,
that's actually going to count in favor of the one that's worth doing and relatively unexplored.
There's this idea, which I think is simply false, that there's a sort of groupthink where everyone only wants to give grants to what they think everyone else is doing.
I think that that's just false.
So, and in terms of getting jobs, let me tell you, this is a true story.
Many, many years ago, I was on a, we were hiring at the University of Western Ontario,
and I was on the hiring committee, and we had a job ad, which was fairly broad.
And what had happened is in one of the areas of...
in one of the areas of specialization that was included in the job ad,
a big name philosopher had recently published a book that was getting a lot of attention.
And what happened was everybody in the world did a grad seminar on this book.
And I was sitting there reading these applications.
This was back in the days when people actually sent us paper applications,
and there's a file box with all the applications in it,
and you take it into your office after hours and you go
through them, right?
And I was reading the writing sample of one candidate.
And I read the opening paragraph, and I'm going,
didn't I just read this?
I'm going, oh, my God, one of our applicants
has plagiarized the writing sample from another applicant.
And then I went back and got that other file,
and they were, in fact, different.
But the opening paragraphs were almost word for word the same,
because this was the issue that everyone was talking
about and there was a very standard way of setting up the issue. I see. So if you want people to
actually confuse your writing sample with someone else's, then jump on a bandwagon. Interesting.
So this also applies to film and business in general. You don't want to be in the red
ocean, as it's said, the contested waters. You want to be in the blue ocean. I'm not familiar with
that terminology, but I'll believe you. Yeah, right. You just referenced that
you could give me a personal lesson, but instead you'd give a lesson that applies to philosophers and physicists.
And that made me curious.
What would be the personal lesson that you learned too late?
And something not trivial, like, oh, I learned to, I should double bag my groceries.
Um, let's see.
A lesson from my personal life that I think I learned too late:
if there are toxic people in your life, avoid them.
Be around people that you're comfortable around and you feel good around,
and try to minimize your contact with the toxic people.
Thank you so much for spending so much time with me.
Well, thank you. I've really enjoyed this.
Well, it's been two hours. It feels like it just flew by.
Yes, that's always a great sign.
In fact, in Harry Potter, I think there were sands of time.
And Harry asks the professor, what is this?
Because it was a different type of sand.
And the professor said, it stands still when the conversation's engaging.
Well, that's good.
Actually, I have another life lesson, which I heard early in my life,
but I am to this day not particularly good at applying.
And this is an interview that I heard on the radio as a teenager with David Lee Roth,
who was big at the time when I was a teenager.
And he said, here's my life lesson.
Don't sweat the little shit.
And it's all little shit.
Interesting.
I mean, I don't think it's actually true that it's all little shit,
but I think
"don't sweat the little shit" is something
a lot of us have difficulty applying,
that we end up fussing too much about things that in the long run
aren't really important.
Professor, thank you.
Well, thank you very much.
Thank you very much.
I really enjoyed this,
and I hope out of this mess you and your editor can put together something reasonable.
All right. Thank you.