The Peter Attia Drive - #167 - Gary Taubes: Bad science and challenging the conventional wisdom of obesity
Episode Date: June 28, 2021. Gary Taubes is an investigative science and health journalist and a best-selling author. In this podcast, Gary explains how he developed a healthy skepticism for science as he was transitioning from being a physics major to beginning as a science journalist. He talks about how he was particularly drawn to sussing out “pathologic science,” telling the stories behind his books on the discovery of the W and Z bosons and cold fusion, emphasizing the need for researchers to perform a thorough background analysis. Gary then describes how his work came to focus on public health, nutrition, and obesity. He provides a great historic overview of obesity research and provides his explanation for why the conventional wisdom today is incorrect. We discuss: Gary’s background in science and journalism, and developing a healthy skepticism for science [2:20]; Gary’s boxing experience, and the challenge of appreciating behavioral risk [8:40]; How Gary developed his writing skills, and what the best science writers do well [16:45]; Example of how science can go wrong, and the story behind Gary’s first book, Nobel Dreams [25:15]; Theoretical vs. experimental physicists: The important differentiation and the relationship between the two [36:00]; Pathological science: research tainted by unconscious bias or subjective effects [40:30]; Reflecting on the aftermath of writing Nobel Dreams and the legacy of Carlo Rubbia [49:45]; Scientific fraud: The story of the cold fusion experiments at Georgia Tech and the subject of Gary’s book, Bad Science [53:45]; Problems with epidemiology, history of the scientific method, and the conflict of public health science [1:09:00]; Gary’s first foray into the bad science of nutrition [1:26:45]; Research implicating insulin’s role in obesity, and the story behind what led to Gary’s book, Good Calories, Bad Calories [1:36:15]; The history of obesity research, dietary fat, and fat metabolism [1:46:00]; The evolving understanding of the role of fat metabolism in obesity and weight gain [1:55:15]; Mutant mice experiments giving way to competing theories about obesity [2:04:00]; How Gary thinks about the findings that do not support his alternative hypothesis about obesity [2:08:00]; Challenges with addressing the obesity and diabetes epidemics, palatability and convenience of food, and other hypotheses [2:14:45]; Challenging the energy balance hypothesis, and the difficulty of doing good nutrition studies [2:25:00]; and more. Learn more: https://peterattiamd.com/ Show notes page for this episode: https://peterattiamd.com/GaryTaubes Subscribe to receive exclusive subscriber-only content: https://peterattiamd.com/subscribe/ Sign up to receive Peter's email newsletter: https://peterattiamd.com/newsletter/ Connect with Peter on Facebook | Twitter | Instagram.
Transcript
Hey everyone, welcome to the Drive Podcast. I'm your host, Peter Attia. This podcast, my
website, and my weekly newsletter all focus on the goal of translating the science
of longevity into something accessible for everyone. Our goal is to provide the best
content in health and wellness. Full stop. And we've assembled a great team of analysts to make this happen.
If you enjoy this podcast, we've created a membership program that brings you far more
in-depth content if you want to take your knowledge of this space to the next level.
At the end of this episode, I'll explain what those benefits are, or if you want to learn
more now, head over to peterattiamd.com/subscribe.
Now without further delay, here's today's episode.
My guest this week is Gary Taubes. Gary's an investigative science and health journalist. He's
the author of multiple books, including The Case for Keto, The Case Against Sugar, Why We Get Fat
and What to Do About It, Good Calories, Bad Calories, Bad Science, and Nobel Dreams.
Most of you who recognize Gary's name
will undoubtedly recognize him from his work in nutrition,
and that's covered obviously in the most recent of his books.
However, we spend the first part of this discussion,
really talking about his background,
how he got into journalism as a physics major,
and how he spent the first really decade and a half of his career
writing about science and how he was particularly drawn to bad science or what would be called pathologic
science. Gary almost stumbled into health sciences in the late 90s and really he's never looked back
since. In this podcast, we probably spend about two-thirds of it covering his career up until that point and about the last third getting into this
Gary provides a great history of the overview of obesity
research and
provides his explanation for why the conventional wisdom today is
Incorrect whether or not you come away from this episode believing that Gary is right or wrong
I hope you do come away from this episode, believing that Gary is right or wrong, I hope you do come away from this episode at least acknowledging that there perhaps remains some
uncertainty with respect to our understanding of obesity and its causes.
So without further delay, please join my conversation with Gary Taubes.
Hey Gary, how you doing?
I'm good Peter, how are you?
I'm good as well.
I feel like it's been a very long time.
I don't know the last time I actually saw you.
It's been how many years?
Probably three or four years ago? San Francisco, at the offices.
Yeah, God, that's maybe four or five years ago now.
Who knows?
We were certainly younger.
Speak for yourself, come on.
Oh, yeah, that's true.
You may be aging the other way.
I forget we're in the room.
There's so much to talk about.
I want to start with some of the stuff that probably a lot of people forget about you,
which is long before you were writing about nutrition, you were writing about other disciplines of science.
And I'm curious how you got interested in this whole thing.
So remind me what you studied in
college. You went to Harvard. And did you study physics? Was that your undergrad? Yeah, I was a
physics major. I actually started off as an astrophysics major and then got a C minus in quantum physics.
And my advisor suggested I change my major. So I switched to applied physics and I spent my senior year taking writing courses
for the most part. But yeah. What drew you to physics in the first place? Were you a good student
in high school? Yeah, well, you've got to be a good student in high school to get into Harvard.
I read science fiction books as a kid. This was the 60s. You know, we all wanted to be astronauts.
Now our kids want to be billionaires back then they wanted to be astronauts. So I read science fiction books, I studied astrophysics, I have an older brother as you
know who's very smart and was studying physics and I was competing with him.
So that's what I did.
Problem is, my brain couldn't wrap itself around the concept of a Hamiltonian; it was beyond my
ability to comprehend.
So that was the end of that.
So the whole time you're studying physics,
you're thinking, I want to now apply this.
And I think you went to Stanford there after
to do a master's in aerospace, if my memory serves me correctly.
I did.
And I was still thinking astronaut, weirdly enough.
At Stanford, I actually lived with a few naval pilots in the same housing,
you know, apartment building, and I realized, one, that there really wasn't any need for a 6-foot-2,
210-pound astronaut when you could send up a 5-foot-9, 150-pound astronaut who was in better shape
and more fuel efficient, and, two, that I wouldn't survive very well in any kind of military
hierarchy that required blind obedience to superiors.
So I started aiming toward journalism from there.
Where was that seed planted of that kind of insatiable skepticism and a refusal to bend to authority?
On one level, I once had an editor at Discover Magazine
who asked me sarcastically or facetiously whether my brother
was as, quote, well adjusted as I was.
And my response was I didn't have to grow up in his shadow.
My youngest son now is 12 years old,
and he's basketball obsessed, and he plays
with the local AAU basketball team.
And I noticed that the best players
are the ones who have older brothers who play.
They're the scrappiest, they're the toughest,
they refuse to give up a rebound.
And I think because they've been playing
with their older brothers their whole life,
I mean, the ones that can't compete quit
and go into another sport, and the ones that can
sort of refuse to give in.
And so on some level, that probably started my skepticism. But then, you know,
I go to journalism school, and I want to be an investigative journalist.
I had read All the President's Men by Woodward and Bernstein. I was fascinated by it.
The only jobs I could get out of journalism school were two semi-offers. One was at the Dallas Morning News, and in 1979 I wasn't willing to move
to Dallas. The other was at CNN in Atlanta where they didn't allow cigarette smoking in
the newsroom and I was a smoker. So I stayed in New York and took a job with Discover Magazine
and maybe two or three years in, I did a piece,
reported an article on the Shroud of Turin. So at the time, researchers from
Los Alamos were taking this high-tech imaging equipment to Turin to study the
Shroud. So this is supposedly the burial shroud of
Christ, but it appeared in the historical record right around the 11th or 12th century,
when there's a big market in sort of fake religious artifacts.
So there's a strong prior belief that this is a fake religious artifact,
but the Los Alamos researchers take their high-tech imaging equipment to Turin, and they get permission from the church to do it,
and they say they can't understand how this was made,
even though it's also been carbon dated to the 11th or 12th century. I remember a Sunday evening
I was living in my studio apartment in New York. I was reading the reports that they had published, and I forget exactly what it was,
but it struck me as profoundly
unsupported by the data, and I actually called the researcher in Los Alamos at home
to say, how could you possibly interpret this
from what you did?
And it was just the step one in realizing,
and maybe because I had gone to Harvard
and I had gone to Stanford,
and I knew a lot of people who went into science.
Personally, there's nothing fundamentally different
about somebody who goes into science
and somebody who goes into journalism,
other than we have sort of different mechanisms
of wanting to understand what truth is.
What was his response?
I don't remember.
I don't remember, but he
certainly wasn't able to satisfactorily defend
the interpretation.
It was also an interesting observation
about negative results, which comes up all the time, when somebody gets a negative
result, when they're unable
to determine how the shroud was created. Does that mean that the shroud was created by supernatural
means that are beyond the ability of the equipment to detect? Or does it mean that the equipment
that they used is simply inadequate for doing the job?
And in effect almost 40 years later we're having the same debate online this week about the cause
of obesity. Are you doing the right experiments? Have you refuted the hypothesis or are the experiments
flawed? And that's always a fundamental issue in science. Did you know the harm of cigarette smoking when you were a smoker?
It's a good question.
My mother died of lung cancer and she was a smoker, but that was about 10 years after I started.
Uh, and what's interesting?
So I started smoking 1978 when I went out to Stanford for graduate school.
I was depressed.
The sunshine in Palo Alto wasn't helping you.
You missed those dark Boston winters.
You know, it's interesting.
I mean, this is getting deeply into my psychopathology here.
I've suffered mild depression
my whole life, as you know. You go from Harvard, where I played football,
I'm well liked.
I've built a sort of four year career at the school.
You think of yourself in a particular way,
and then you go someplace completely new
where nobody knows you.
I didn't handle that well.
So I started smoking too.
I don't know, maybe to spite the kids
who seem so happy and complacent in the Stanford sunshine
when I was definitely not happy and complacent,
and then it took me 20 years to quit.
I don't know if children,
and we discussed this when we were doing NuSI,
I don't know if kids ever think in terms
of the things they do being dangerous.
We both did some very dangerous things
when we were younger.
We were both boxers.
I wanna actually talk about your boxing history
because the article you wrote in Playboy is such a great one.
But I don't know if I've told you this story,
but the entire time I was boxing, I never thought it was dangerous. I had this true mental block that was I don't get hit.
So it's not dangerous. Like nothing's happening to me. I'm evading and I'm hitting and da-da-da-da-da-da-da-da.
Which of course was categorically untrue, because my style of boxing was actually to get hit, tire out opponents, and eventually
overcome them due to superior conditioning.
But it never seemed to hurt.
And it wasn't until I suffered a very, very severe concussion when I was 20.
In this, I still remember it very well.
I was actually hospitalized for two days, had a CT scan, had significant bruising, had
significant cerebral contusions,
and had a headache that lasted for three months.
This was the hardest I've ever had my bell rung.
And it was only at that point that I thought,
oh my God, you actually get hit in this sport.
And by that point, I'd already decided
I didn't wanna be a professional boxer.
That was the irony of it.
It was I was already in college,
and I was now only boxing as something fun to do.
But it literally took that event for me to go, this sport is crazy.
And from that point on, I would look at it very differently.
But while I was doing it, I never thought that.
I only got into it competitively after my best friend got killed boxing.
Yeah, I mean, there's a lot of things.
I was a football player in high school and in college.
And when you're a football player in high school and college back then, getting your bell rung
just was what it was. I mean, I clearly had at least one concussion in high school football.
My best friend in college, who was actually a lot of kids' best friend, a very charming,
wonderful young man of Puerto Rican descent from New York, was an amateur boxer in New York.
He went back to boxing after his football career came to an end due to
injuries, and our senior year, in his first fight, he got killed in the ring.
He was knocked out, his head hit the mat.
We were in Lowell, Massachusetts watching the fight at the time, and he never woke up.
My memory is of them unplugging him a week later.
It did not stop me from then moving to New York when school was over.
Actually, three years later, and when I got to New York starting to box because I was bored.
Did you think about him?
When I did it, yeah, sure, but I never thought that what happened to him could happen to me
until I got knocked out in the golden gloves.
At which point there's this awareness when you get knocked
unconscious, when you wake up, that some people never wake up from that, that that might have been
the moment at which your candle was blown out, in effect, and there's no relighting it.
Talking about this is interesting, because a lot of what we do in life, and talk about when you're
talking about prevention of chronic illnesses and prevention of disease and prevention of
addiction, is getting young men and women who don't think and don't process
risk the same way we do as we get older. I always thought it was fascinating
that the less life we have left to live, the more risk averse we become. If
something bad happens, you're actually losing less of your life than you are
when you're younger. And I'm sure that behavioral psychologists like Kahneman and Tversky could say things
or do say things about that.
Tell me about the Playboy article that you wrote after that Golden Gloves tournament.
I know you still have a, at least the last time I was at your house, maybe eight years
ago, you actually had a framed photo of you lying prostrate in the ring.
You know, I still do what's in our coat entry way. I often thought of putting it up here
instead of this picture because I furdo it as my hubris protection. So while I was working
at Discover Magazine and I was boxing, so this was 1983-84.
By the way, not too long after Duck Who Kim was killed by Ray Mancini. I mean,
the early 80s was a very difficult period in boxing. There were a number of high-profile
deaths in the ring. Yeah, he was killed while I was boxing, and we actually did a story
on it that discover about what happens to the brain. A good friend of mine, a New York
Times reporter now named Denise Grady, was writing that story, and while I was boxing and getting ready to box in the golden gloves,
and she would come and then slip like articles under my door about whatever we were
calling the traumatic event back then.
Yeah, I was just a young man.
I was boxing, I had a friend, a good friend who was Norman Mailer's nephew,
and Norman Mailer had a group of young men,
not even that young, who met at a gym
that's no longer there on 14th Street, Manhattan,
and back when that was a very questionable part of town.
And we would go down, I think,
was every Saturday morning.
We would box for a few hours,
and then we would go out to lunch.
And yeah, from that, I got the idea,
why don't I box in the Golden Gloves and write about it.
So I originally was doing it for Sports Illustrated, but then I realized that I was technically too old
to do it. That I missed the cutoff by about two or three months, and that if I wanted to box I would
have to revise my birth certificate. And so when I realized that, I talked to the editors at Sports
Illustrated and I said, you do investigative journalism. We
were in the same building, Discover and Sports Illustrated, we're both
owned by Time Inc. at the time. I told the editors, you do, you know, hard-
hitting investigative journalism. If I want to continue to do this article, I'm
going to have to fake my age. And I think that's a problem. The editors said,
well, it is now that you told us. So we pulled it from Sports Illustrated. And one of the editors I worked with had a good
friend at Playboy. So I pitched it to Playboy, and my sister-in-law got angry at me for writing for
a magazine that doesn't portray women in the most ideal light, depending on how you look at it. And
she had a point, but I said some of the best writers in America write for Playboy.
And maybe later I'll have the platform
by which I can be too good for this approach.
Anyway, we did it.
I won my first fight against a policeman from Staten Island
who beat the crap out of me in the first round.
And then I realized that if I punched back,
it might help.
And I knocked him out in the second round,
and then the second fight was a week later, and I got knocked out in, I think, a minute
and thirty-seven seconds by the fellow who ended up winning our division, who was not particularly
good, but I had never really had enough time to work on defense.
You had been boxing for many years, competitively.
I had been doing this for about four months at any level
other than a Saturday morning hobby.
But it was a reminder. Some of us
grow up with this kind of James Bond complex,
I think it's fair to say, between the watches
and the fast cars and the archery and things like that;
you might have it also. We think we're so cool
we can do anything, and that was my reminder that I'm certainly not, and that there are things I should stay away
from if I want to have a long and healthy life.
That was the end of my boxing, and I never went back
into the ring after that.
What's the process like of learning to write? I was rereading an article yesterday
that you had recently written and don't get too flattered by this,
but I forgot how well you write.
I think I write reasonably well.
I've learned to write better over the years.
And I think a lot of that came from how often I used to write,
and how often you used to rip what I wrote apart.
Back when we were at NuSI,
I was doing so much of the writing
and you were doing so much of the editing
and revising of internal stuff that I would write.
And that process,
of course, I think, made me much better, and I can now see it when I look at the writing of other
people. But your writing today, I think, is really excellent. At least it's a style that I find
really good. You can write about science in a way that's very readable. How does one learn that
craft? When you show up at Columbia, which ostensibly is the best journalism school in the country, I assume
you're coming in incredibly raw. Did you believe you had some talent, some hidden talent for writing,
or is that not even a part of what they're selecting at a top journalism school? Is it more
the thought process they're selecting for, and the writing is the easy thing to teach on top?
Well, I thought I could write because I used to write letters to friends and girlfriends
that I thought were clever and they liked. So there was always a sense that somewhere inside me,
I was a writer. When I first started taking writing classes, it was actually at night school,
at Harvard. After I got my master's degree from Stanford, I went back to Boston, and I don't
even remember what I was doing for a living, but over the course of a semester, I took two writing courses, one taught by a science writer from the Boston Herald, who I later
worked with at Discover Magazine, and one taught by a fiction editor from the Atlantic
named Michael Curtis, who had been a roommate of Thomas Pynchon and Richard Fariña at Cornell
in the early 1960s. He was already a famous fiction editor at the Atlantic,
and the very first article I wrote for him, I could probably remember the subject if I had to,
but what I do remember is it came back with the word puerile written in one-inch-tall letters
on the front, underlined twice, and I had to look up puerile, which means childish and amateurish
and thoughtless.
Over the course of a semester, Michael Curtis started to hammer me into being a functional
writer by constant and relentless criticism of what I did.
And I appreciated it. In fact, the Boston Herald science editor was so nice, he would read
people's work and he believed you should never say anything negative
about their work, and I gained absolutely nothing from being in his class. He was a lovely man, but I don't think anyone else did either.
Except maybe an unjustified sense of their own talent.
Curtis hammered it into me. Then when I got to journalism school, it was just about basically teaching you to write and report. And so it's a steady learning process.
And then as a journalist, when I started to discover,
I started as a reporter.
So Time Inc. back then had reporters who
did the information gathering.
And then we would write files.
And the files would be given to the writers.
And the writers were true craftsmen.
And I started off, I don't know, the first story I
remember doing was about some type of particle accelerator
at Michigan State. I had friends who were writers who would read this draft
and relentlessly critique it until it was something that was worth submitting to the
editors, and then the editors would do the same thing before it would be published. And as I became
a writer and got older, I did that same thing for other reporters who came in under me. And there are a few very well-respected journalists who,
I think, benefited from me being relentlessly critical. And then the actual process of writing is
just, you know, I told you this years ago, Calvin Trillin, the famous New Yorker writer, had a phrase.
He called the first draft the vomit-out, where you just get everything down on paper
and it kind of relieves the pressure of the first draft. So you get everything down on paper,
or the computer, and then you just keep rewriting it and rewriting it and rewriting it until it reads
like something I would have wanted to read if it hadn't been written by me. And because I read a lot
and I read a lot of very talented authors. I know what good writing looks like and
feels like and you just continue to revise and edit until it gets there. It's...
Who's the best living science writer in your opinion? I don't think I could answer that question.
There are a lot of people who have a lot of different skills. It's like saying who's the best living
football player, or the best... my 12-year-old is always asking me who I think the five best basketball players are. You know, was Wilt Chamberlain
better than Steph Curry? Well, they were different.
There are science journalists.
Jesus, there's a young man at the Atlantic now whose name I'm going to forget, somebody who writes a
thoughtful, well-reported article on a different subject
every three or four days.
And I often read his stuff, and I think,
could I have done that when I was his age
and worked 70, 80 hours a week, possibly,
but I don't think so.
And I don't think I could have done it as well.
When I was growing up, the best out there was Jim Gleick,
who wrote, remember, Chaos, and
the Feynman biography, the name of which I forget at the moment. There was a whole
collection of science journalists who came of age in the
1980s, when science magazines had first started. There was a magazine called Science,
well, Science 80, then it was Science 81, Science 82.
There was Discover.
Some of those like Charles Mann and Steve Hall,
John Tierney.
A Commotion in the Blood.
A Commotion in the Blood, yeah.
Just exquisitely beautiful, thoughtful writers.
In fact, Charles Mann,
he's a friend, and Steve's
a friend, and Charles goes by Cam, and I can't read Cam's books because
the writing is so painfully beautiful that I feel inadequate when I read
them. It's very difficult. The same goes for Steve. I think what they do, they do much better
than I do. But we're now the senior figures in the field.
What is it about their work though?
You once told me that great science writing isn't really about the science.
What is it about Steve, for example, that you read and think he's doing that so well?
It's a combination of comprehensive rigorous reporting, so you know you have the story
although you can never really judge if somebody else has a story right unless they're writing
about your field, the prose style, the ability to tell a story
around a complicated subject.
I was gonna say, I just read The Ghost Map
by Steven Johnson, and have you ever read that?
No.
It's an extraordinary book.
I almost, I hope that's the best book he ever wrote,
because if he's written better books,
then I'm in the wrong field.
And I know many people say that anyway, but on one level, just the prose.
So the ability to write a sentence and a paragraph that other people want to read,
but then the story that you tell, and the digressions in the story,
and keeping it always interesting, and making it bigger than other people might see it
while simultaneously getting the details of the lives and the times.
And I'm sure every genre has its own challenges, but science has its challenge that you're inherently
writing about a subject that's very complex; it's hard for even the scientists
themselves to wrap their heads around. So I'm still going to have to think about who the best
science, who I think the best science writer is today if there is one.
And there's so many out there today.
The question you always ask is, what books are out there like The Ghost Map?
And again, on one level, it's about John Snow and the cholera epidemic in England, in, whatever, 1854, in London.
And how John Snow and his compatriots figured out that
cholera was not bad air but an infectious agent in the water supply. But then it's
about the birth of modern cities, about urbanization, about the problems that
cities faced coming together, about cities into the future and then
fundamentally about the scientific process, the challenge of an outsider doing
science, which might be one reason why it resonated so much with me.
You start with a small story and you turn it into something large and intricate and meaningful. An
extraordinary book.
So at what point during your time at Discover did you become interested in the subject
matter that would eventually become your first book?
Yeah, my background was in physics, so it was natural
for me to write about physics.
A Harvard physicist named Carlo Rubbia,
working at the European Center for Nuclear Research,
CERN, in Geneva, had a research collaboration
of 150 physicists, known as UA1, that
had discovered two fundamental particles that were sort of
two of the three last remaining, well, two of the four last remaining particles of the
standard model of physics. So in physics, electricity and magnetism had been united with the
weak force, the electroweak theory, and that was a great step forward in the 1960s and 70s, and by
1973, that theory had mostly been confirmed, and it predicted the existence of two particles
that transmit or carry the electroweak force.
These were the W and Z bosons.
And Rubbia, who was a very controversial physicist who taught at Harvard, had convinced Europe to, in effect, spend the money to build an accelerator that had the power to detect the W and Z particles.
And when they announced the discovery, we did an article on it at Discover. I forget, I must have written that. But Discover was part of Time Inc. Time has their Man of the
Year cover, right? So Discover had this concept of the Scientist of the Year, and because of this
great discovery, and everyone knew that Rubbia was going to get the Nobel Prize, Discover made
Rubbia the Scientist of the Year. So I get the flight to Geneva. Actually, I flew to Belgium, rented a BMW, drove through Paris, went to Geneva.
Time had a very good expense account back then.
My life has been downhill ever since.
Anyway, I spent time with Rubbia in Geneva.
I interviewed his friends.
I wrote him up as a profile for the Scientist of the Year. Time Inc. flew him into New York,
to the Explorers Club.
He was given the award.
And this was before Stockholm or after Stockholm.
This was before Stockholm.
Okay.
He was awarded the Nobel Prize, and it's always in the fall, so it was the fall
of 84.
85, yeah.
It's 85, okay.
They published the two bosons in like January or February of 83, is that right?
Or was it earlier 82?
Must have been, probably 83.
Okay.
So anyway, the point is,
not only does Rubbia get this award,
he gets a nice gold Rolex from the Explorers Club.
So he's very grateful to Gary Taubes for,
he thinks somehow,
and I probably was the one who suggested him
for Scientist of the Year.
So that April, which would be April of 84,
he comes to Washington,
there's the big physics meeting
every year in Washington, and Rubbia gives a presentation announcing that he's now on to the next step,
which is physics beyond the standard model. And he shows what they've got. And he shows a few events,
and we could talk about that. And the message is that it's the biggest discovery in physics
in 40 years, and all he has to do is turn on the accelerator,
which will happen in the autumn, gather more data, nail it down, and it's Nobel Prize number two.
And so I was at that talk so I came up to him afterwards. I said, look, it's, you know, this is
terrifically exciting and it's rare that anyone predicts a great discovery in advance. So can I
write a book about this? Maybe I can come to CERN, live with you guys, and document this as it happens?
And Rubbia liked me. He thought of me as his personal scribe. So he said yes, and I put together,
well, first I put together a proposal for the Atlantic that they rejected, because Charles Mann and his co-author at the time, Bob Crease, had just put their own proposal in to do a
piece on high energy physics.
So I was always following behind Mann and Crease. If you're going to lose out to
two writers, those are two very good writers to lose out to.
Anyway, so I pitched it as a book deal instead. Actually, I was approached by a publisher
who was publishing a book by Shelley Glashow, who's a Nobel laureate at Harvard, who I knew,
and they had asked Shelley who the good science writers were, and Shelley said, you could
get Gary Taubes, he's probably the best. So they approached me. I had this proposal from the Atlantic
that they rejected. I sent it to the publisher and they said they'd be willing to pay me to do the book.
So I left Discover Magazine and flew to Geneva and got a room at the hostel on the lab and was, today we'd say I was embedded with the physicists for the next nine months.
I thought I was going to write about a great discovery and pretty quickly realized that this was far less compelling.
The evidence was far less compelling than Rubbia had claimed in Washington, and there
was a very good story here, but it was probably not the story of a great discovery.
You know, when you ask what informed my approach to science and science journalism, that was
a learning experience, because there's 150 physicists on the experiment.
So the gist of it is, you've got, well, an atom smasher, a particle collider, so you're colliding
subatomic particles together at speeds as close as you can get them to the speed of light. The faster the particles go, the
greater the energy in the collisions, and then you're looking in the collisions for the signs, for particles that might be made that your theory of physics, the standard model, did not predict.
So in that sense, you're testing the standard model. And as the accelerators are able to put
more and more energy into these collisions, you're able to test your predictions of the standard model
further and further out. And then you've got to understand the predictions, which is part of what
we're dealing with here.
But so you've got this huge particle accelerator.
And if I remember correctly, it was maybe four miles
in circumference.
It straddles the French-Swiss border.
And at points on the accelerator
where you're colliding the two beams together,
you build a detector.
And a detector, these things are the size of a mansion,
three, four stories high.
That's where the money is. They're as expensive as the accelerator itself. They might cost
$50 million or $100 million. Today they cost billions; back then it was hundreds of millions.
And the detector, which I always thought of as kind of a camera that's created to photograph, to be able to detect the
passage or the emittance of all the various particles that will come out of these collisions.
So, you've got photon detectors to detect photons that come out. You've got muon detectors
to detect muons that come out. And so you collide it and of the billions of collisions
you're making, I don't know, every day,
you're looking for one collision that creates one particle
that your standard model doesn't predict.
And just to put that in context,
as you said, you could have billions of collisions per day,
but I think, if I recall, I mean,
Rubbia would say, look, you might get 20 a year that are relevant.
And so you just think about the signal to noise ratio there.
And again, I think just for folks to understand, you're trying to look at,
so you'd have a prediction of the mass effect of that collision.
So you have to have some conservation of mass, you have to have some
conservation of charge.
And you're trying to coalesce all
of these different things to look for discrepancies that are occurring at almost an infinitesimal
fraction of the number of collisions.
It's very hard to wrap your mind around.
That are predicted to be occurring, or not occurring.
That's right.
They may not even occur.
Yeah.
You've got the particles colliding.
Everything that comes out of that collision,
you have to detect.
Because if you miss something,
it's beginning to sound a lot like Calibration,
which is a subject we should get to later.
If you miss something,
then that collision is worthless to you.
And you're likely to be fooled.
Because you're likely to think some particle was created
that wasn't created because you do have to have conservation of energy.
So if a certain amount of energy goes off in one direction,
you better get the same amount of energy going off in the other direction.
And ideally, with every particle you detect and you detect the energy in the particles,
that's part of what the detector does.
And you recreate this collision.
You know that everything's balanced.
If it's not balanced, does that mean that your detector failed?
Or that a particle was created that your detector couldn't see because it's beyond the standard
model?
Now, you have to know your detector better than parts per billion issues because you have
to be able to estimate whether or not your detector failed.
So on some level, this idea that science is about signal to noise,
you have to be able to predict the noise almost perfectly,
to understand the noise almost perfectly.
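To make the balance idea described here concrete, below is a minimal toy sketch in Python. It only illustrates the logic of the argument; it is not the actual UA1 reconstruction, and the resolution model and the 3-sigma cut are made-up placeholders.

```python
# Toy illustration of the energy-balance check described above: add up what the
# detector saw in one collision and ask whether the "missing" part is small enough
# to blame on measurement error. A deliberately simplified sketch, not real analysis code.
import math

def missing_pt(particles):
    """particles: list of (px, py) transverse-momentum components, in GeV."""
    px = sum(p[0] for p in particles)
    py = sum(p[1] for p in particles)
    return math.hypot(px, py)  # size of the transverse imbalance

def interesting_event(particles, resolution_per_particle=0.5, n_sigma=3.0):
    # Crude noise model: per-particle measurement errors added in quadrature.
    expected_noise = resolution_per_particle * math.sqrt(len(particles))
    return missing_pt(particles) > n_sigma * expected_noise

# An event whose visible particles do not balance: either the detector missed
# something (a flaw you failed to model) or a particle it could not see was made.
event = [(30.0, 5.0), (-12.0, -4.0), (-6.0, -0.5)]
print(interesting_event(event))  # True
```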
And this is a sort of repeating theme in everything
I've written about, even if I don't phrase it like that.
And NUSI, we used to talk about signal to noise problems
all the time.
If you don't understand your background,
which in high- energy physics is the standard
model predictions of what you would expect to see if there's nothing new out there, and
what you would expect to see from the understood flaws of the equipment. So that's the issue,
the problem we're confronting. So you've got parts per billion effects that you're looking for,
or parts per trillion effects, and you have to understand the flaws in your detector to
better than that. And so, what's his name, Rumsfeld, what did he say? You've got your known unknowns
and your unknown unknowns, and you're constantly trying to figure out your unknown unknowns,
and you never fully figure out your unknown unknowns. So physicists, we read about this, say, there's a talk
recently of a new particle possibly being discovered. And physicists say, well, they haven't
discovered it yet, because it's only a three sigma effect. And a three sigma effect is what,
do you know this better than I do, a point zero zero zero something?
Probably 0.0015, or something like that, but our threshold would be 6 sigma to really say.
So the reason for the 6 sigma, actually 5 sigma, is because you want to make a lot of room for the unknown unknowns.
You want to say, this is how I perceive it, we can calculate that... the sigma is a calculation of the...
Standard deviation, the difference from the mean.
Yeah, but there's going to be stuff you don't know. There's going to be stuff that's going to
fool you that you don't know. So you want to have an event that's so absolutely fundamentally
frigging outside the range of what you think you know, that you can confidently assume that what
you don't know, the unknown unknowns, can't explain it either, and it's new physics.
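For reference on the sigma numbers being thrown around in this exchange, here is a quick back-of-the-envelope: the one-sided Gaussian tail probability at n standard deviations. This is just the standard arithmetic, nothing specific to the experiment being discussed.

```python
# One-sided Gaussian tail probability at n standard deviations, the arithmetic
# behind the "3 sigma" and "5 sigma" thresholds mentioned above.
import math

def one_sided_p(n_sigma):
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

for n in (3, 5):
    print(f"{n} sigma -> p about {one_sided_p(n):.2e}")
# 3 sigma -> p about 1.35e-03 (roughly the 0.0015 mentioned above)
# 5 sigma -> p about 2.87e-07
```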
Before you go further, there's one thing I want you
to explain to listeners, because it's a beautiful part
of physics, but it doesn't exist in every other field
of science, which is the relationship between the theoretical
and the experimental physicists.
I think it's worth pointing out here
how that is a collaborative nature in physics.
Yeah, physicists divides up into theory and experiment.
Okay, it wasn't always quite so clear, but even if you think of, you know, Einstein was
a theorist, Eddington, who did the seminal test of the theory of relativity, was an experimentalist.
So as physics evolves, you have people who wrestle with the theory, and then they have
to come up with a theory that makes predictions that can be
experimentally tested, and then you have experts who are experimentalists,
whose job is to do the experiment.
And then there's got to be crosstalk between the theorists and the experimentalists
because the experimentalists have to know precisely what they're looking for.
And they need the help of the theorists to tell them how this might manifest itself.
And the experimentalists have to understand their detectors, their experiments to this
sort of superhuman ability.
That's one of the defining characteristics
of an exquisite experimentalist, the awareness of how their equipment could fool them.
And so this is how physics has always developed.
And then you get to the point that the theorist's
job is to come up with a good theory
and make a prediction, and the experimentalist's
job is to test that. In fields like biology and medicine,
you don't have that division, which I think is a problem.
And we talked about this through our years.
You've got the theorists and the experimentalists
are merged into one person.
Everyone's a theorist.
Everyone should be capable
of doing the experiment, and I think they're
two entirely different skill sets.
And they both require such a fine level of detail
and thinking and rigor that I'm not sure that,
you know, after a certain point, like, say, from 1900 on,
I don't think one person could encompass both aspects of these
almost different endeavors.
So to get back to CERN, because this is where it's relevant, when I get to CERN, I move
into the hostel, I'm hanging out with the physicist.
Once Rubbia published the W and Z discoveries, everyone knew this was the place to be in
physics, even before that.
He collected a group of very ambitious young physicists, mostly men, from some of the best
physics schools in the world who came to work with him to analyze the data.
Were UA1 and UA2 both running at this point?
It's always both.
You always need both detectors because if you see something at one you still need independent
replication. Yeah, so if you don't see it in both, you've got a problem.
That's why even now the Large Hadron Collider at CERN has, I think, four or five different detectors.
It's not enough to see something in one detector
You always need independent replication and again, that's true of every science
So you have these young men who are doing the analysis. And they've moved in, they've come in late,
they've become sort of the chosen courtiers of Rubbia.
They hang with him, they eat with him,
they meet with him, they tell him what he wants to hear,
which is what's happening with the data.
And then you've got the physicists who actually built
the damn detector and the technicians who built the detector.
And these are, you know, I used to think of them as, they don't come from Harvard and Stanford, Oxford and Cambridge.
They work at, like, the red brick universities in the Midlands in the UK, and they've been there from the beginning.
They built this mansion-sized piece of $30 million equipment with their hands. It's not mass produced. They know
how it can fool them because they built it. So as soon as I get there, I've got these young
physicists who are new to the experiment who are all excited about this great discovery, and I'm
also hanging with these the old hands who built the equipment by hand, and they're basically poo-pooing
everything and saying this is crazy.
We cannot understand the flaws in our detectors well enough to make the kind of claims that
Rubia has been making.
Now are they voicing that?
Well, they're trying to, but the problem is you need a group leader who's willing to hear that.
So once Rubbia started talking as if he had made a discovery
in Washington in 1984, all he wants to
hear... he's gone out on a limb, right? I thought he only predicted it, I didn't
think he actually said it had happened. You know, you've probably read my book
more recently than I have. Even if you're claiming that it will happen, if you
think of it, there's this term that the physicists use, pathological science, which
is the science of things that aren't so, and my whole career on some level has been writing about this, right or wrong.
Now I think I understand pathological science and I've studied it as much as any human being
alive.
One of the defining characteristics of pathological science is people commit themselves
publicly to a result based on premature evidence. They haven't
locked it down. There's still a chance they could be wrong, and they don't understand the
likelihood of that chance. No one ever does until you get very good at this when you realize
that chance is enormous. Science is supposed to be hypothesis and test, right? The idea is
you're supposed to rigorously test your hypotheses and ideally you're trying
to prove that you're wrong.
Richard Feynman, first principle of science, said, you must not fool yourself and you're
the easiest person to fool.
So you get evidence that you've discovered something new, your assumption should be that
you're wrong, that somehow your equipment is fooling yourself and now you try and go
out to demonstrate that's true and you do everything you can to prove it.
And if you fail to find out how you're wrong, now you write a paper that says evidence
for the observation of Particle X question mark and you put together a presentation and
you go around, you start in your department at whatever university you're at and you give
the presentation to your colleagues and basically you're asking them to explain to
you how you fucked up, if you'll pardon the language, because surely you did.
And if they can't explain it to you, you march around the country or the world
giving this symposium. You still haven't published the paper, the evidence for
the observation of particle X question mark, because you're still working
under the assumption that you screwed up and you just don't understand either your equipment.
And as you're traveling around the country, people should be saying to you, well, what about
this?
Did you think of this?
Did you know that the, you know, the muon detectors, actually, there's a glitch in the detectors
that are made in Belgium versus the ones made in Switzerland and the ones made in Belgium
are going to spark once every seven and a half months.
Have you checked that?
And if you checked it, great; if you didn't,
you go back to CERN or wherever you are and you check it.
And so you keep giving this lecture.
And finally, at the end of, I don't know, 20 seminars,
nobody can explain to you how you fucked up.
Now you publish the paper with the question mark.
And what you're asking them to do is tell you
how you screwed up, because if you assume that you did
and then they tell you, you don't resist it. But if you claim that you didn't and you publish the paper first,
now what you're doing, instead of that exercise of trying to find out how you
assuredly screwed up, you're trying to collect evidence for how you got the right answer.
So you go from paying attention to the negative evidence to paying attention to the positive
evidence. Again, everyone does this. I'm accused of that all the time.
It's like the study comes out that confirms what I believe.
I don't question it the same way
that I will question a study that comes out that appears
to challenge it.
So once Rubbia had gone to the point where he was
predicting a discovery, all he wanted to hear from his
physicists was the evidence supporting that.
That's the first mistake along the way.
So the physicist who wanted to tell him about the problems with the detector,
and all the ways the detector could fool them, and what they weren't thinking about,
and what they were missing, if those people did their job, they were sort of seen as downers.
So after a while, what happens is you get the sort of polarization in the experiment
where the principal investigator tends to
selectively listen to the people who support what he wants to believe and
selectively resist the people who argue against it.
And then, as the experiment goes on, the more the people
who tell him what he wants to believe confirm his beliefs, the more that confirms his belief that he should not listen to the critics, the skeptics. And this happens in all of science. It's what we've
seen in an extreme way in COVID. It goes on in climate change, it goes on in nutrition and
chronic disease. Once you decide you know what the truth is, you tend to stop listening to the
people who disagree with you, and then you surround yourself with the people who agree.
This is also classic groupthink.
At what point did Rubbia realize you were no longer his scribe in residence during this
10 month period?
Probably when he read a draft of the book.
Carla was interesting because he had enormous self-confidence.
You have to picture these larger than life.
He's Italian. He's overweight. he's got this exuberance.
He's a Nobel Laureate.
He's a Nobel Laureate. I went to Stockholm with him.
You know, I was fitted for my tuxedo with Carlo.
I mean, that's how close we were.
Once I started realizing there were serious issues
with this experiment, I would challenge him on this.
So I didn't hide my
skepticism. I would say, well, you're saying this, but so-and-so from Edinburgh or
the College de France or the University of Pisa is telling me this. I wouldn't out them
like that, but this is what I'm hearing from your experimenters. And Carlo, my
take was that he was something of a pathological liar. So he defined truth as what was convenient for him that he could get people to believe.
So I would challenge him on an issue and say, well, you say this, but it's not what I'm
hearing from your researchers.
And he would say, yeah, well, that's a good point.
Then he would take a step back and tell me something else, hoping I would believe it.
If I challenged that, he would acknowledge it and tell me something else, and we would just keep going. It never got to the point where I
felt from him any awareness that what I was gonna write was not the story he
wanted to tell. And I think this is a classic problem with you know people of
his sort of quality and strength of ego which is he couldn't imagine that I
could see the universe differently than he did. I don't think he could
imagine that anyone could really see the universe differently than he did.
When I wrote the book, it was called Nobel Dreams, and it described what had happened.
He had an executive committee at the experiment made up of, you know,
a dozen or so physicists from all these collaborating institutions, and he asked,
he wanted to sue me, and he asked them to join in the suit so that it would have more power.
They refused, and what they didn't tell him is that most of them had read my book and
critiqued it in draft, as a favor to me. I didn't say that in the book because, had I said
that in the book, it would have created problems for them professionally. The book was accurate,
but he didn't see the universe that way.
Did you like the book itself?
Uh, the writing makes me cringe. I can see the book it could have been. You know, it came out when I was...
...still 29 years old, I guess. The story in it is interesting, but I can't read it. My 29-year-old prose still makes me cringe, so.
It came out in what year 87ish?
Maybe it was February 87, yeah. So I would have been not 29, but 30.
Did you go back to Discover at that point, or did you immediately jump into the investigative work
that would ultimately become the book Bad Science? No, I went back to Discover. I was freelancing for Discover. And actually,
it's interesting. When the book came out, it was February 1987. I was living in the Upper West
Side of Manhattan. My publisher's idea of getting me publicity was to get page six in the New
York Post, which is the gossip column. So I heard that page six was covering it. And it was,
I remember it was a dark snowy day in New York and dreary
and I went across the street and went down Columbus Avenue and there was scaffolding and
there was a newspaper stand and I bought the New York Post.
The headline of Page Six is "Egghead Squabble Over Nobel Prize," and in it this Nobel laureate,
Carlo Rubbia, is calling me, a 30-year-old writer in New York, an asshole. And I'm
thinking that my career, and, yeah, I'm never gonna eat lunch in this town again, and
my career in journalism is over. And as I continue to report stories, I start
talking to researchers about my book and Rubbia, and they want to know about Rubbia,
and they'll say to me, well, if you think Rubbia's bad, you should write about
so-and-so. And it turns out that there are characters like him in every area of science.
So we can talk about this pathological science. People tend to think of fraud, right?
And when you're committing fraud and science, you're knowingly trying to fool other people
by manipulating the evidence. But there are two ways you can manipulate the evidence.
You could create a signal, or you could do an inadequate job of studying your background.
Remember, it's a signal to noise problem. So you could fudge a signal or create a signal or move a signal or you could ignore the background.
And if you ignore the background, or do a poor job on the background, you'll still end up with a big signal to noise ratio.
You haven't technically committed fraud. You're not actively trying to fool anyone else, you're fooling yourself. And that's the kind of science Rubbia
did. He was ambitious, he was looking for discoveries, and doing a background analysis
is the hard, relentless, rigorous grunt work of science. It's endless and thankless, because
if you do it right, all you do is prove that you were wrong all along. Publicizing a signal and getting a Nobel Prize is the good part, right? How do you reconcile that dichotomy?
Which is this is a guy that's not committing fraud of course, but in your words
He's doing bad science because he's omitting to do the background check
But at the same time he is the guy through sheer personality and force of will that discovers
these two bosons that up until the discovery of the Higgs boson were the last two pieces of
this puzzle. And I don't think anybody's disputing the work that went into that Nobel Prize.
It's an unnerving revelation, right? It suggests that even at the uppermost echelon of science,
you can have people that are committing these omissions.
One of the messages is in my book.
I didn't go into this in great debt,
but the papers had won him the Nobel Prize,
and again, I'm remembering from almost 40 years ago now.
Those are probably wrong, too.
He rushed it, I think, it would be generous, yeah.
He rushed them, yeah.
Because he was competing, but the particles had to exist.
That discovery had
already been made, the neutral currents in 1973 or 1974, I think it was Fermilab. So those
particles had to exist, and they had to exist with the characteristics that he was looking
for. So when he published the discovery based on, I think it was seven events that had these
characteristics, no one asked were these really W particles.
In retrospect, most or all of them probably were detector flaws that they didn't understand yet,
that they still had to work out over the next year when they're trying to discover particles beyond the standard model.
This was the issue with Rubbia. The reason he got the award immediately is there was a political issue behind all this also, which is, since World War II, all the major
discoveries in physics had been made in the US. The US had spent the big money on
particle accelerators. So this was CERN's opportunity, Europe's opportunity, to
achieve relevance again. And the award was given almost instantaneously, because
this discovery basically redistributed the center of gravity of the physics world
back to, you know, if nothing else, the center of the Atlantic.
When I was reporting the book, one of the things I did was travel around the country, and actually around Europe, talking to other physicists about their work, their discoveries, and about Rubbia, and trying to rationalize what they
told me with who he was and who I thought he was. In defense of Rubbia,
none of this gets done without him.
No, I'm going to rephrase that.
None of it gets done as quickly without him.
It all gets done better without him.
There are people from the experiment whom Rubbia had ground up and chewed up and spit out, whose careers basically came to an end because of him. And I said this — I remember, in the introduction of the book, I said there were sort of three types of physicists in the world. There were those who were at Rubbia's level and told me he was brilliant. There were those who knew Rubbia well and said he was a very smart guy, but they wouldn't want to work for him. And then there were those who worked for him.
There were points in the course of the experiment when I would sit in on group meetings and watch some young French physicist be reduced to tears by Rubbia's bullying. And bullying can be very beneficial in science, because you're really trying to force someone to think critically and force any sloppiness out of them. But there were times I considered, like, look, just ask Rubbia to step outside. Don't write the book — punch him out. Would physics be the better for it? I decided physics might be, but I couldn't afford it. So, yeah, that's what you wrestle with as a young writer.
The reviewers who didn't like my book — which included Christopher Lehmann-Haupt at the New York Times, and Jeremy Bernstein, the New Yorker physics writer, writing, if I remember correctly, for the Boston Globe — thought I had no right to write the book I wrote, to take on a physicist of this level. I think it was Bernstein who said that the sort of sociology and character flaws of physicists at this level didn't interest him in the least; the only thing that's important is this glorious cathedral of knowledge. Which wasn't my interest.
So when did you happen upon Fleischmann and Pons?
I went back to Discover as a freelancer, right? I had a terrific contract with Discover. I loved living in Paris, where I wrote Nobel Dreams, so I moved back to Paris writing for Discover, and then Time Inc. sold Discover. Science magazines have trouble making money, because it turns out the only people who really read them — at least back then — are high school boys, and they don't spend a lot of money, so they don't support advertising revenue. So Time Inc. sold Discover to a low-budget publisher, and I lost my freelancing gig, and I had a choice of living in Paris and scraping by on freelance articles, or moving to the States.
I had friends in LA who were writing screenplays and making enormous sums of money, and they said, as has been said to screenwriters in the past: come to LA, write screenplays, get rich, do what you want. So I took on the book about this, and I needed it to supplement my income. I thought I could spend a year doing the book and make enough money to write screenplays for two years after that. And then, as is my wont, I got obsessed and ended up spending three years on the book and owed my father $40,000 when I was done.
I bonded with many writers over the years — young writers — over how much money we owed our parents by the time we were done with our first books. You know, it was another fascinating story. The so-called cold fusion: two chemists, Stan Pons from the University of Utah and his mentor, Martin Fleischmann from Southampton University in the UK, announced that they'd basically created nuclear fusion in a test tube. March 23rd, 1989. It's on the front page of the Wall Street Journal
within a day.
It's front page news all around the country.
The very first article I read in the Los Angeles Times mentioned
there was a competing group at Brigham Young University.
And as soon as I saw that there was a competing group,
I assumed they were wrong.
Because as you mentioned, with UA1, UA2,
you have two experiments competing with each other.
So the motivation is to establish reliable knowledge
beyond reasonable doubt.
That's what you're trying to do in science.
But if your motivation is also to beat the other people
to the reliable knowledge,
you're going to allow one motivation to compromise the other. And now the beyond-reasonable-doubt part is going to be tossed out in an effort to win. That was one of the fundamental problems at CERN with the work UA1 was doing. UA2 got everything right, because once they came in second, they were perfectly happy to do the good job I think they were doing all along.
On first principles, did you just think it was ridiculous based on your knowledge of
physics?
No, I didn't know enough to think that.
I just smelled bad science.
It's been a while since I read Bad Science, actually. I think that was actually the first one of yours I went back to read. But did they claim it was net energy positive?
Yeah. The idea is: how do you make nuclear fusion? Well, you put heavy water in a test tube with an electrode made of palladium, you plug it into the wall, and the deuterium gets sucked into the palladium. The idea is that it somehow gets compressed so much that it fuses, generating, ideally, neutrons — which are one sign of fusion — gamma rays, or some radioactivity, which would tell you that there was a nuclear reaction going on, if not at least a lot of heat.
So Pons and Fleischmann had had an explosion in their laboratory with one of these cold fusion cells.
Was it being run by Pons' son?
It might have been in their garage, I forget the details.
And when the cell exploded — right, if you're trying to create nuclear fusion in a test tube and your cell explodes, eureka. But even more important, it wasn't just that there was competition: there were physicists at Brigham Young, 30 or 40 miles down the highway from the University of Utah in Salt Lake City. The physicists at Brigham Young had studied something called muon-catalyzed fusion, which was a type of cold fusion of a sort, had been shown Pons and Fleischmann's proposal, and had started working on this after reading that proposal. So Pons and Fleischmann thought their discovery was being stolen from them by a physicist who should know enough to know whether it was right or not.
I discussed this at the very end of my research, when I spent about eight hours with the then-president of the University of Utah, a very thoughtful, wonderful man named Chase Peterson. He was explaining to me — we were talking about their thought process in going public with something that became this huge fiasco — and he said, yeah, nothing makes your girlfriend look more attractive than your best friend trying to steal her from you.
And so once they decided that these people at Brigham Young were trying to steal it from
them, they decided it must be right.
They had to put out a press release. They put out a press release, and this enormous phenomenon kicks in where everyone in the
world tries to replicate it because it holds the promise of infinite free energy.
It's the wealth of OPEC and anyone can study it.
It costs like $3,000 worth of materials
to put together a cold fusion cell.
So you have basically created a bad science generator.
When did it become abundantly clear to you
what was going on?
How long into your investigation?
This is a very different kind of story
because this is immediately front page news.
So when I jump into it, there are meetings,
there are conferences, there are other people discussing this.
I probably flew out to Utah within a week
of having that conversation with my editor.
And Utah is, of course, surrounded by journalists
who want to know if infinite free energy has been created.
Funny coincidence: the Wall Street Journal ran it on the front page. The author was a science writer for the Wall Street Journal named Jerry Bishop, and he sort of took it... So I write a book explaining why Jerry Bishop was sort of not thinking things through clearly when he put this on the front page. I moved back to New York in 1993 or so, onto a block on the Upper West Side of Manhattan where Jerry Bishop and I are looking into each other's windows from across the street.
Anyway, the world was finding out that this was wrong.
I'm following all the attempts to replicate it.
In a sense, this was how science is supposed to work, right? Somebody makes a claim of a discovery and they publish
the data necessary to replicate that discovery by independent laboratories.
And so independent laboratories go out and try to do that. And around the world,
chemists and physicists try to do that. And most people got the right answer,
which is: cold fusion doesn't exist. But a half dozen laboratories around the world, for reasons that tended to be different in each lab, got the wrong answer. We can talk about that — there were a lot of interesting reasons why. So the phenomenon keeps going because of the signal being generated from these six labs, while, again, all these other labs in the world are working out the background.
And some of the best scientists in the world get involved, because this is a major discovery if it's true. So one of the things it attracted was some of the very best physicists, nuclear physicists, and chemists in the world, and I had the opportunity to work closely with these people — basically to embed myself with them — while they tried to replicate the experiment and did, in effect, the background analysis that Pons and Fleischmann and Steve Jones of Brigham Young never had the time to do.
So now you get to find out all the ways your equipment could have fooled you — and it assuredly did — and that's science working the way it should work. And so by June of that year it was clear that this was wrong, that the whole Pons and Fleischmann thing was wrong. And then I continued reporting it for about nine months, to tie up all the loose ends and to explain some of the other positive results. But it was clear to the world within two to three months that this was just wrong. And then it faded from the papers.
And then you wrote a book about this.
Of course. So the book was really less about letting everybody know
this was wrong.
The purpose of the book was to explain how it became wrong,
right? How did this happen?
Yeah, I became obsessed with how this happened.
It still fascinates me.
So, you know, when I'm writing about nutrition and chronic disease, obesity, diabetes, the question is: how do people establish the conventional wisdom, the dogma, the ruling theory in any science, and on what evidence? And in this case, cold fusion — I wrote it as kind of a case study in bad science, something, in my fantasy, that every graduate student who goes into science would be required to read, so they could see all the different ways you're going to fool yourself, and the price of doing so, of claiming something to be true that then later turns out not to be.
What are the hallmarks of pathological science?
Again, there was a great paper written about this that was brought back to light after the cold fusion debacle. Irving Langmuir, the Nobel laureate chemist, gave a talk at IBM in 1957 on what he called pathological science, which is the science of things that aren't so. These are the things he said: it's not about fraud or manipulating the data; it's about the scientists not realizing how easily they can be fooled. And there are certain signs and symptoms. The effects tend to be at the very limit of the equipment's ability to detect them. There were four criteria that he listed.
I should remember them, but I can't.
Yeah, there was a sensitivity analysis in there as well.
Small perturbations.
But let me give you an example sort of how this plays out.
So you've got these test tubes, right?
And this was a common thing.
And they're supposedly generating excess heat.
But the physicists are saying if you've created nuclear fusion, you should be generating gamma rays and
neutrons. It should be radioactive. Chemists at the Georgia Tech Research Institute, which is attached to Georgia Tech, put together cold fusion cells. And they borrow neutron
detectors. In fact, you could think of them as Geiger counters
from the physics department.
And they've got these cold fusion cells
and they plug them in.
The cells are bubbling away
and they hold the detectors over the cold fusion cells
and the detectors go off.
And then they move the detector away
and the detector stops rattling
and then they bring it back over the cold fusion cell
and it starts detecting neutrons. Clearly. So they write up a paper, they have a press conference, and at the press conference one of the local physicists from Georgia Tech says to them: do you realize that these neutron detectors are humidity sensitive? Now, they're chemists. They've never worked with neutron detectors before. So this is at the press conference, and they go, tell us more about that. And he says, well, when you held it over the cell, the cell was bubbling. It's humid. That might have been why it went off, right?
They're doing what scientists are supposed to do, ideally, before publication.
So they go back to the lab, and now they do what they should have done all along: they do a control. The control is the same experiment without the deuterium — just do it with ordinary water. So now they hold the detector over the control and it goes off, and they realize within a day of their press conference — probably within 12 hours — that they've screwed up. And the reason they screwed up is that they, in effect, chose the wrong control. They thought the control was just not holding the detector over the cell, but they weren't controlling for the humidity.
It's back to understanding your background.
What are all the other possible causes of what we're seeing? We have to go through them methodically — and this is what takes time, time, time — and you need the help of every smart person you know to point out all the different things that you might not have thought of, that you now have to design an experiment to control for.
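A minimal Python sketch of the lesson in that story — the control has to hold everything constant except the one thing you think matters. The numbers and the humidity behavior below are invented for illustration; the toy "detector" here secretly responds to humidity, not to deuterium.

```python
import random

def detector_counts(over_cell: bool, cell_is_bubbling: bool) -> int:
    """Toy neutron 'detector' that is secretly humidity-sensitive:
    it rattles whenever it sits over any bubbling cell, deuterium or not."""
    baseline = random.randint(0, 2)                 # ambient noise
    humidity_artifact = 30 if (over_cell and cell_is_bubbling) else 0
    return baseline + humidity_artifact

# Wrong control: detector over the cell vs. detector moved away.
over_d2o_cell = detector_counts(over_cell=True,  cell_is_bubbling=True)
away_from_cell = detector_counts(over_cell=False, cell_is_bubbling=False)
print(over_d2o_cell, away_from_cell)   # big difference -> looks like "neutrons"

# Right control: identical bubbling cell, ordinary water instead of heavy water.
over_h2o_cell = detector_counts(over_cell=True, cell_is_bubbling=True)
print(over_d2o_cell, over_h2o_cell)    # comparable readings -> deuterium wasn't the cause
```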
So everywhere in the country, somebody did an experiment and claimed that they had seen
signs of cold fusion.
Ultimately, you could explain it away as some aspect of the background that they didn't
understand.
And it was always like that: it was chemists trying to do physics, or physicists trying to do chemistry. And often — well, if the cells are really creating nuclear fusion, they've got to be generating more heat than they're taking in from the wall. You know, they're plugged into the wall. So they have to generate more heat than they take in; expenditure has to be greater than intake.
It's a CICO thing here.
And the science of calorimetry is very complicated. There are people out there who spend their whole lives doing nothing but calorimetry — they were called calorimetrists in the chemical world — and they were capable of doing these experiments right. When they did the experiments, they saw no excess heat. When somebody who wasn't a professional calorimetrist did the experiment, they inevitably saw a little bit of excess heat, often because they didn't realize that there's some feedback going down the cord, so you're going to get energy coming back into the cell from the power source. If you haven't spent your whole life doing this, you don't know the details of the background.
How long did it take for Pons and Fleischmann to come to grips with their errors?
Well, I don't think they ever did.
One of them's not even alive anymore, right?
So he went to his grave believing.
I think they both might have passed away by now.
Fleischmann would have been at least 50 in 1989.
So, you know, it's funny. There's a section of the Twittersphere that likes to point out that I was asked once, at a debate in England, whether or not I would ever change my mind about the uselessness of the energy balance equation for obesity. And I said, no, I doubt it. So constantly I get these tweets directed at me where people say, look, why are you even debating with Taubes? He said he would never change his mind anyway.
And the reason I said that is because you have phenomena like Pons and Fleischmann. I've witnessed people live through this. It takes a kind of superhuman intellect to be able to say: something I believe beyond reasonable doubt, and have staked my reputation on, is actually wrong. It's always easier to believe that all the other experimenters screwed up — that people just didn't do the experiment right.
Now, I used to joke that even if cold fusion was real, someday — back when I made this joke, I was probably thinking of a year around 2021 — there would be the Stan Pons Memorial Cold Fusion nuclear plant, you know, on the coast between San Diego and LA, pumping out enough energy from three beakers of heavy water to power all of Southern California, and I'm going to be crawling around the outside of that perimeter fence with, you know, a beard, pieces of Dunkin' Donuts in the beard, and a tattered copy of Bad Science, looking for where the damn thing is plugged in. Because I know it's a con, right? I am never going to be able to change my mind and accept it. But then, on the other hand, my argument to the cold fusion people was: if you want us to believe it, great — cold fusion cells would make billions of dollars. The day that my neighbor buys a cold fusion-powered car, I'm going to have to start believing it, or I'm in trouble. Until then, I think an enormous amount of skepticism, if not outright closing my mind, is perhaps the correct thing to do.
Well, I do want to come back to the point about your flexibility, or lack thereof, with respect to current belief systems around nutrition. But let's now really pivot from a career that spans a decade and a half writing about physics and other areas of science to the pursuit of public health. How did your curiosity get piqued there? What took you into a new avenue of science?
The cold fusion book, Bad Science — I interviewed around 300 people for that book. Horace Freeland Judson, who was one of the great science writers of the late 20th century — he wrote a book about the history of the molecular biology revolution — anyway, he said I had done more research on the stupidest scientific subject than any human being alive. I was obsessed with how this had happened, how people made the mistakes, how the sociology worked. Again, I became obsessed with pathological science as a concept.
And when I was done with the book, I had a lot of friends in the physics community who really respected and appreciated what I did, and some of those friends were involved in this debate about whether or not electromagnetic fields from power lines can cause cancer. That involved, on one level, physics, because what we know about physics says they can't cause cancer — their wavelengths are too big to interact with something as small as cells. And on the other side you have epidemiology, the science of how disease spreads through populations, and researchers saying: we can measure the field strength around power lines and somehow make an association between that and the levels of cancer in people living near the power lines.
And so my physicist friends were horrified at the level of science they were seeing.
And when my book came out, they said, well, look, if you're interested in bad science,
maybe you should look at this stuff in public health,
because it's terrible. And so the very first article I did,
I did an article for the Atlantic on the power line cancer connection.
And it was dependent on this field of epidemiology,
which I had never paid any attention to, never had any reason to,
but I had spent the previous six, seven years
learning about how hard science is to get right, and being schooled and tutored and
lectured into believing that you have to be extraordinarily rigorous and methodical and relentless.
Or you'll get the wrong answer. And here was a field where basically people just said,
we're going to look at this, we're going to look at that, we're going to get an association,
we're going to assume it's causal. They considered doing the kind of rigorous testing of hypotheses to be a luxury they couldn't afford. So I wrote the Atlantic piece, and then I wrote a piece on epidemiology for the journal Science that was infamous. It used to be that if you wanted to publish critics of the field, first you published Alvan Feinstein, who was at Yale; then it was Feinstein and Taubes; and now it's Ioannidis. And Feinstein and Taubes have kind of been dwarfed by John Ioannidis' contribution. And Feinstein was a controversial character.
Both great pieces, by the way, which we'll be linking to in the show notes.
Thank you. So to Feinstein's point: you've got a field of science where you can't
do the experiments. So that's the first problem. That's a problem with every field where there's an
active controversy. If you can do the experiments easily — like cold fusion: I think I've discovered infinite free energy. Well, tell me how you did it. Let me see if I can do the same thing. And if I can't, we ought to talk. And if he can't, and she can't, and they can't, then you probably screwed up, dude.
If you can't do the experiment, then, depending on how interesting it is, you develop a field of science around basically a hypothesis.
And often, as more and more people see the same phenomenon, but without using the rigorous experimental techniques of an experimental science, they start to believe that the hypothesis is true. And again, it's what I described with Rubbia — this sort of groupthink phenomenon, where you collect the people who see the signal and you ignore the people who work on the background. In epidemiology, there was never any significant work on the background.
So you could think of epidemiology this way: you have a cohort of people — say the Nurses' Health Study, the most famous in the US, 110,000 nurses — and you give them questionnaires, food frequency questionnaires. You ask them what they're eating, they tell you what they're eating, and then you follow them and see who gets sick and who doesn't. Then you look at what the people who got sick tended to be eating versus what the people who didn't get sick tended to be eating, and you have an association between diet and disease. And there's no causal information in that association. But now you hypothesize that the association is causal: whatever they're eating or not eating causes the disease. And there's no way to rigorously test it. So the process of science breaks down.
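A minimal simulation of the point about confounding (all variable names and effect sizes below are made up for illustration): a third factor such as socioeconomic status can drive both what people report eating and their disease risk, producing a clear diet-disease association even though the diet itself does nothing.

```python
import random

random.seed(0)

def simulate_person():
    low_ses = random.random() < 0.5
    # SES drives both the "exposure" (what people report eating)
    # and the disease risk; the diet itself has no causal effect here.
    eats_cheap_food = random.random() < (0.7 if low_ses else 0.3)
    gets_disease = random.random() < (0.20 if low_ses else 0.10)
    return eats_cheap_food, gets_disease

cohort = [simulate_person() for _ in range(100_000)]

def disease_rate(exposed: bool) -> float:
    group = [disease for eats, disease in cohort if eats == exposed]
    return sum(group) / len(group)

# The exposed group shows a noticeably higher disease rate even though
# the food did nothing -- the association is driven entirely by SES.
print(f"disease rate, eats cheap food: {disease_rate(True):.3f}")
print(f"disease rate, doesn't eat it:  {disease_rate(False):.3f}")
```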
Let's stop for a second, Gary, and explain to people why the intuitive conclusion is not so,
because I think it's worth spending a second on the fact that all of this stuff that we talk about
in terms of the rigor of scientific thinking, it's a relatively recent phenomenon, it's less than
400 years old. All of the things you're describing are less than 400 years old, which means from
an evolutionary perspective, they're non-existent. It's not wired into our DNA.
I would just add that everything I'm describing is in — so, Francis Bacon, 401 years ago, published his Novum Organum, which is sort of the founding text of the scientific method. And everything's in there. I had to go back around 2001, 2002 — I read Bacon because I was thinking maybe I'm just crazy.
But if you think back to 4,000 years ago, 40,000 years ago, right? If you think back into the roots of our genes,
the ability to pattern recognize and make an inference that, by definition, is presumed to be causal from an association must have been a vital trait for our success.
I saw Larry doing X.
Larry has more mating opportunities than me.
Doing X must be the reason that Larry has
more mating opportunities than me.
That's true.
So you make the association and then you test it.
I'm going to do X and see if I get more mating.
And evolution is going to test it, right? Because if John now does X and it ends up getting him killed — but we didn't codify it. I think that's the difference. We didn't live long enough to see the result, because the arc of the information coming back is too long. So you're right: evolution did run the experiment, but at the individual level we never got to learn from it.
So, come along 401 years ago, and of course many steps along the way with the introduction of true randomization,
statistics, which became an important tool, etc.
You have this concept that is very foreign to people, and while anybody can say very glibly,
well, correlation doesn't equal causation, I think anybody has heard that and understands what it means.
I don't think it's entirely clear to people why control and randomization matter and how ubiquitous bias is.
So why is it that people living next to telephone poles — or power lines, rather — who have a higher likelihood of getting cancer are not getting it because of the power lines?
Let's understand our background, right? So the background is all the ways you can be fooled. Okay, so we see more cancers close to the power lines than we see far away. Can we explain that as something other than the power lines causing cancer? What else could explain this? That's the issue. And now you've opened up a door where we have an infinite number of possibilities. That's why, right now, the probability that your hypothesis is correct has gone down enormously, as soon as you accept the reality that there are an infinite number of other possibilities that you have to rule out. Some are very likely. You prefer to work through, you know, the first-order factors and the second-order factors, and as you go down, they become less and less likely.
And this is why, when we talked about physicists needing a five sigma effect — that's because they're going to say, well, we can rule out all the major factors, and we want to leave a lot of room for the fourth-, fifth-, sixth-order variables, because we can't get to everything.
So maybe people who live closer to power lines are poorer than people who live farther away from power lines. They might be of a slightly lower socioeconomic status, okay? Nobody wants to live next to power lines. They're unsightly, they make noise, so if you can afford not to, you don't. So maybe what you think is the power lines causing cancer is actually a result of socioeconomic status. I think a lot of what the epidemiologists publish are false positives that could be explained by socioeconomic status.
Now, the epidemiologists will tell you that they correct for that.
Don't they look at the household incomes?
Some look at household income, but there are major studies — the Nurses' Health Study, the most famous study in America — that never looked at household income. The point is, many don't. So, let's say you have to get people's health records, right? To know how many cancers they had. How did you get their health records? Well, we called people up, and we talked to them, and we asked them if they wanted to be enrolled in the study, and we get their health records. Well, the people we call — we've got to consider that we're calling them during the day. So maybe the people who answered the phone are the people who aren't working, right? So maybe there's a socioeconomic bias there, because the more affluent people, the people who actually have jobs, are not home to answer the phone during the day, right? Maybe. So there's a whole world of things like that.
Or maybe the people more likely to participate in the study are the ones who have worse outcomes worth reporting. They want to participate in the study because, God knows, little Jimmy had a brain cancer and they never understood why, and now you're giving them a reason. So they want to be a part of your study.
And, you know, this is the process that Rubbia didn't like to do — the background analysis. It's very hard to do, because each one of these factors is a hypothesis in and of itself. So you have to be able to test the hypothesis: maybe the biases in the phone sampling process caused what we see. How do we test that? You know, if you can't do an experiment, it's very hard to test.
And then there's this factor. We were talking about Nobel Dreams, my first book. So Rubbia and company are spending the fall and the winter of 1985-86 trying to understand their background. They understand more and more of their background, and they finally come to the revelation that they're probably not seeing anything, and other people are coming to the same conclusion too. And there's a meeting in the Aosta Valley in Northern Italy where they're going to discuss all of this. And I went to that meeting, and in the meeting an elderly Italian physicist — I forget his first name, his last name was Altarelli, a very cool guy —
Altarelli stands up in the meeting — Rubbia and everybody is there — and Altarelli begins to explain all the different ways that the equipment could have fooled Rubbia into thinking he had seen these events, every possible other explanation for the events. So maybe in this case the muon comes off and it gets lodged in the infinitesimal dividing line between the two detectors, and you don't see it. Maybe a neutrino comes off on this one and it bounces off the side of the detector, and you don't see it. And by the time he's done — the chapter in the book was called "The Altarelli Cocktail" — you come up with all these different ways: one of this, one of that. These are all the unknown unknowns.
So you can now imagine them, because physicists around the world have spent months trying to. And it's clear that if you begin to understand the background and all the unknown unknowns, you could explain everything Rubbia and company had seen. It's not a five sigma result; it's not even a one sigma result. That's where the discovery died — in that meeting. And all of a sudden — again, this is what programmed my thinking on cold fusion — all these different labs that had published positive results... I told you what happened at the Georgia Tech Research Institute. They all had a different explanation; there's another version of the Altarelli cocktail in each one. But the point was, whenever they saw something that confirmed their preconceptions, they accepted it.
Okay, so there's a selection bias in what you pay attention to,
and you're always going to pay less attention
to the things you don't want to see.
So in epidemiology, people might look for one explanation. Say I believe socioeconomic status probably explains a good deal of these phenomena, and people just don't measure it, or they don't measure it right. But even if it can only explain a quarter of the phenomenon, or a tenth, I can guarantee you there are probably nine other things that could cover the rest. The difference in epidemiology, right, is that you're looking for a 95% confidence level. You're willing to claim a causal effect based on an association that carries no causal information, and you're willing to claim causality based on two sigma — not five, not three, but a 95% confidence level.
When, if you actually try to think about it — if you said, let's treat this the way the physicists treat their science — we can go through and come up with so many possible alternative explanations that there's a very high likelihood that what you're claiming as a discovery is a false positive.
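To put rough numbers on the two-sigma-versus-five-sigma contrast Gary is drawing, here is a small Python sketch (a Gaussian, one-sided approximation; conventions differ between fields, so treat the exact figures as illustrative):

```python
from math import erfc, sqrt

def one_sided_p_value(sigma: float) -> float:
    """Probability that pure noise fluctuates at least `sigma` standard
    deviations above expectation (Gaussian approximation, one-sided)."""
    return 0.5 * erfc(sigma / sqrt(2))

for sigma in (2, 3, 5):
    p = one_sided_p_value(sigma)
    print(f"{sigma} sigma: p ≈ {p:.2e}  (about 1 in {1/p:,.0f})")

# 2 sigma: p ≈ 2.3e-02 -> roughly 1 in 44; epidemiology's ~95% confidence level
# 5 sigma: p ≈ 2.9e-07 -> roughly 1 in 3.5 million; the particle-physics standard
```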
In the case of that example,
what would be the alternative experiment?
How would we, if we're trying to answer the public health question, which is: it's inevitable that we have power lines, and we have to make an infrastructure decision — can they traverse the areas where people live? One would argue that it would take an enormous length of time to answer that question, which would require randomizing people independent of every variable imaginable, assuming you could get people willing to comport with such an experiment for a long enough period of time to see a result.
I mean, you can quickly convince yourself that it's simply impossible to do a true experiment
here, and yet an answer to that question should be known.
So what is good enough?
Well, and that's the debate.
This is what the field of chronic disease epidemiology
has been going through.
So infectious disease epidemiology,
like with COVID, is a very different issue.
You can, in effect, do randomized controlled trials, like you do with vaccines. Here, as you just explained, they're effectively impossible — certainly with something like power lines, because you're never going to be able to randomize people to live near power lines or away from power lines.
You know, I think they're just beyond the limits of science. So the question is how do you deal with that issue?
The conventional thinking has been to embrace this idea of prudent avoidance.
So, you know, we do that when we wear masks for COVID. Prudent avoidance says you create regulations to try and minimize whatever fields are being emitted from these power lines, so the power companies pay more. But avoiding a potential harm doesn't come without its own harms, as we know from the COVID debate about what happens with shutdowns. So you can say, well, we can regulate the power lines better, but that means the power companies are going to raise their rates. And now people are going to have to pay more for power, which means their socioeconomic status is effectively going to be lower.
There are positives and negatives, and these have become social value judgments more than
anything.
The question is also what's incumbent on the researchers. You know, every fan of science tends to be a fan of Richard Feynman's science, even if Feynman as an individual has apparently become far more conflicting. Feynman says — and I quote this in the beginning of Good Calories, Bad Calories — that what you need fundamentally in good science is this sort of bending over backwards to be honest about what you know and what you don't know. He says this in his 1974 Caltech commencement talk, where he talks about the difference between advertising and good science: it's incumbent upon the investigators to talk about all the limitations in their research.
And I think in an ideal article — you know, a published journal report — the limitations section would be longer than the results section. Because what we want to know is all the ways they could have fooled themselves, realizing that they very likely did, and then to see it discussed with complete and utter honesty and rigor.
And I have critics who would say I don't do that. Personally, I try to do it the best I possibly can, because I think that's the ethical and moral obligation of doing science, or of writing about science: not over-exaggerating. And if the scientists covering the power line controversy said, we think we know this, but these are the things we don't know — then nobody acts. That's the problem. So if they're right — like, we think cigarettes cause lung cancer, but we never did a randomized controlled trial. Now, the lung cancer effect is huge, that's like a six sigma effect, so even though there's no causal information, we can kind of believe it's causal, because we can't think of any alternative explanation for why smokers have such a hugely increased risk. If you're honest about what you see, that's fulfilling the obligation the scientist has. But now nobody changes their behavior. So if you're right — say power lines do cause cancer — people are going to get cancer because you haven't made your argument forcefully enough. And that's the conflict of public health science: how do we get people to change their behavior when we really don't know if they should? And this is an enormous problem.
It's the dietary salt, dietary fat.
It's where we're going to be going in the next couple of minutes here.
So you've struggled through power lines and cancer.
You've come to wrestle with the pitfalls of epidemiology.
What brings you into the den of nutrition?
I'm living in LA, Venice near the beach.
Doing freelance science journalism and working
on screenplays probably still, not making any money, so my friends were wrong about
at least my ability to get rich writing screenplays.
I call up my editor at science and I say, look, I got to pay the rent next month.
I need a story I could turn over quickly.
Do you have anything I could write that I'll generate a paycheck?
And so: researchers have just reported the first results on the DASH diet — Dietary Approaches to Stop Hypertension. It's a low fat, lots-of-fruits-and-vegetables dietary approach, and the paper's coming out in the New England Journal of Medicine. They have a pre-release copy, it's embargoed, so I should write about this.
And you've never written about nutrition and health science.
Never written about nutrition.
No.
Okay, so he's pretty desperate if he's giving you this because this is a little outside
of your wheelhouse, right?
Yeah, it's one page in the magazine.
It's not a lot going on.
So the way you do a one-page story, right: you call up the principal investigator. If the article hasn't been published yet, you interview the PI, and you ask the PI for the names of a couple of other people who know about the research and could comment. Three interviews for one page is doing your job. So it takes a morning to do the interviews and an afternoon to write the article. It's a day and a half. I get my rent money. What I didn't know about this article is that it had been leaked to Science in advance,
and it had been leaked with a list of researchers who I could interview.
This is what year?
This was 1998 or '99. My editor gives me the list. He doesn't tell me it was leaked — or if he did, I didn't pay attention. So I get the article, I call up the principal investigator, Larry Appel at Johns Hopkins. I interview him, I ask him for the names of people I could talk to, and I call up one of the people on the list. This is a former president of the American Heart Association, at the University of Alabama Birmingham, and this person tells me that they can't talk about the paper or they'll lose their funding. And I say, we're talking about a diet trial in the New England Journal of Medicine — nobody loses their funding for talking about that. And it's not going to come out until after the embargo anyway. The woman refuses to talk to me. She won't tell me anything. I said, look, if there's something wrong with this paper, let's go off the record, not for attribution, like Woodward and Bernstein in the garage with Deep Throat. Tell me what's wrong with this paper, what the issue is, because if you don't tell me, I'll never know and I'll report it incorrectly. She refuses even to do that.
So then I get off the phone with her. I call up one of the people that Larry Appel at Johns Hopkins has given me, and he's a researcher who starts yelling at me.
This guy is the grand old man of the field.
I don't know this.
And he starts yelling at me over the phone that there's no controversy over salt and blood
pressure.
And I say, I'm not calling about salt and blood pressure.
Professor, I'm calling about this diet trial that lowered blood pressure in the New England
Journal of Medicine.
And he continues to berate me that there's no controversy.
So I get off the phone with him,
and I call up my editor at Science.
I said I had an American Heart Association
former president refused to talk to me
because she would lose her funding she said if she did.
And then I had this other guy yelling at me
that there's no controversy over salt and blood pressure
when I'm not writing about salt and blood pressure.
There must be a controversy about salt and blood pressure that I know nothing about. So I'm going to write up this article, get my paycheck, and then, if you don't mind, I'm going to look into this assumed salt-blood pressure controversy and see what we're missing. And I spent the next nine months reporting it — I interviewed about 85 people for one magazine article. It turns out that it is one of the most vitriolic controversies in the history of medicine, and even though by 1998 we'd already been eating low-salt diets in America for 15 to 20 years, it was clear that the randomized controlled evidence never really supported the intervention, that it was backed up by a lot of bad epidemiology and by researchers assuming that associations were causal when they weren't — and they were questionable associations to begin with. And while I was doing that story, this fellow who was yelling at me — I liked to joke he sounded exactly like Walter Matthau over the telephone. By the way, he's still alive, he's about 101 years old, so while I'm running him down as a scientist, that's evidence that maybe he understands nutrition and diet far better than I do. Anyway, while I was interviewing him, this Walter Matthau character — you know, I had spent 10 years of my life studying bad science, and it was clear this guy was one of the worst scientists I'd ever interviewed in my life. In the bottom five, at least.
And I thought I had interviewed the worst.
He took credit not just for getting Americans to eat less salt, but for getting them to eat less fat as well — for the low fat diet we had all been on since 1984. So at one point I called up my editor and I said, look, this guy was involved in the fat controversy too, and there's got to be a story there. The message from Nobel Dreams and Bad Science was that bad scientists never get the right answer. It's just too hard to get the right answer for you to go in being sloppy and slipshod and lazy and ambitious and still get it right. Nature isn't that kind.
So I said, when I'm done writing about salt, I'm going to write about fat. I have no idea what the story is. I've been living in LA, eating my egg whites, probably on a 15% fat diet that Ornish would have been proud of. But if the dogma was based in any substantive way on this fellow's work, there's a story there.
And I spent a year writing that piece. I interviewed 145 researchers and administrators for one magazine article.
The big fat lie?
No, this was "The Soft Science of Dietary Fat." The salt story was called "The (Political) Science of Salt," political in parentheses. And the fat story was "The Soft Science of Dietary Fat," soft in parentheses. They both won National Association of Science Writers Science in Society awards.
After you wrote these two articles, one on the soft science of fat and the other on the political science of salt, what made you decide to go even further and write what would become perhaps the biggest and most controversial piece you'd written, at least in a newspaper? I believe it was for the Times Magazine, wasn't it?
Yeah, the New York Times Magazine.
I wanted to write a book when I was done with the two science articles on whatever was
happening in medical science that could lead to these kinds of mistakes.
Remember my obsession is pathological science.
The nutrition aspect of it is just an interesting vehicle
through which to explore it.
I realized that if I did a book then I would go broke.
I was married, I had responsibilities.
Remember I had come out $40,000 in debt
just doing the cold fusion book.
It was clear I didn't work fast and I didn't want to work fast, and that I would not be able to get an advance large enough to cover the time it would take to do the book.
So I was living in New York, I was having lunch once a month with an editor from the New York Times magazine,
because among other things, we shared an affinity for the same local French cafe in the village.
And we would talk about story ideas, and we decided it might be a good idea to ask the question: what caused the obesity epidemic? And I said to this guy, when I was reporting the dietary fat story for Science, I had met up with an administrator from the NIH who said, you know, it's interesting — when we told people to go on low fat diets in 1984, we really didn't have the evidence to support the heart disease connection (and the message of my story was that they never got that evidence), but we thought, if nothing else, we'd be telling people to avoid the densest calories in the diet. So if they avoided fat, they'd lose weight, and that would take care of obesity and overweight, the greatest risk factors for heart disease. And he said, lo and behold, now we have an obesity epidemic. Apparently people stopped eating fat, started eating more carbohydrates, and that got them fatter.
So I always had these two hypotheses for what caused the obesity epidemic, which you can see in the data begins somewhere between 1978 and 1991. And it coincides with two fundamental changes in the American diet. One was the embrace of the idea that a low fat diet is a heart-healthy diet. So carbohydrates in general go from being considered inherently fattening — which was sort of the conventional wisdom up until the 1960s — to being transformed into heart-healthy diet foods. And you may be too young to remember this, but we all stopped eating butter and started having pasta and bagels every day, and lo and behold everybody starts getting fatter. The other thing is high fructose corn syrup. It came in around 1970; in '77, '78, high fructose corn syrup 55 comes in, which can replace sugar in Coca-Cola and Pepsi, and by 1984 it saturates the beverage industry. And this coincides with the beginning of the epidemic,
and people like Michael Pollan had suggested that high fructose corn syrup was a cause of the epidemic. So we decided I would write an article about what might have caused the obesity epidemic — the epidemic was new enough then that people cared, and this would be an important story. And in the course of writing that article, I came upon five studies that had been finished but not yet published — they had been discussed at conferences, so I could discuss them in the article — which had compared the Atkins diet, a low carbohydrate, high fat, eat-as-much-as-you-want diet, to the kind of low fat, calorie restricted diet the American Heart Association was pushing.
And in all five trials, on the Atkins diet, not only did people lose more weight, but their heart disease risk factors improved. So remember, I'm coming at this having been programmed to think from a scientific perspective. I won't say that I'm coming to it as a scientist, because I know people don't like to hear that from a journalist. There are two fundamental hypotheses out there. One is that people get fat because they eat too much. And the other is that people get heart disease because they eat high fat foods. And now you have diet trials where you compare a high fat, eat-as-much-as-you-want diet to a low fat, calorie restricted diet. Your two hypotheses would predict that on the high fat diet, the Atkins diet, those patients would get fatter, because they could eat as much as they want — and clearly they got fat to begin with because they ate too much — and that they would have worse heart disease risk factors. And in both cases, the hypotheses failed to pan out. So from a scientific perspective, the first five clinical trials on low carb, high fat diets — actually it was the second through sixth; there was an earlier one that saw the same thing that no one discussed — refuted two of the fundamental hypotheses of modern nutrition science.
How did you find these?
I mean, I know one of them in there is the Minnesota Heart Study, which wasn't published until
1989, right?
Yeah, no, this is Eric Westman's first trial at Duke.
There was a group at the VA hospital in Pennsylvania that was doing a trial.
There were some researchers on Long Island who did a trial.
Why weren't they published?
They just hadn't been published yet. They had been finished, and the researchers were talking about them.
Then once I did my article, it became both more important for them to get published and, simultaneously, harder for them to get published, because nobody wanted to hear it. They wanted to see what these people really had, but not actually have to publish it.
But over the next two or three years, they all came out in the journals.
How did you sort of come to grips with this?
This is now challenging your own beliefs.
Presumably, you had believed that this low fat diet was healthier.
Had you ever struggled with your weight?
Yeah, I was a college football player.
So in college, I tried to be as heavy as humanly possible, eating constantly. I don't know if you did this in college, but we would go out at 11 o'clock at night to Elsie's, or a diner across the street from the dorm, and eat 1,000 or 1,500 calories' worth of the biggest sandwiches they had, and then go home and go to bed. I mean, that's what you did. Anyway, I could get up to 240. My boxing weight was 212, and once I turned 30 and simultaneously moved to California, I just started drifting upward. While I was reporting the Science article, I was simultaneously doing a piece for Discover Magazine on the mathematics of the stock market.
And so I was up at MIT interviewing an economist who ran the Laboratory for Financial Engineering up there about his research, trying to establish whether people like Warren Buffett are brilliant or just lucky — which is an interesting question, because you're looking for the signal of talent over the background of luck. And you don't, of course, fully understand your background.
So we got to talking about good science and bad science. I told him about the dietary fat article I was writing, and he said, oh, well, if you're doing a story on fat, you've got to try Atkins. He said his collaborator's father lost 200 pounds on Atkins. And this fellow at MIT is an Asian American, and he said he basically gave up white rice and lost 40 pounds.
At that point, I was unmarried.
My parents had passed away.
I had no children.
If I killed myself on Atkins, the only ones who might care were my cats.
And so I did Atkins as an experiment and lost 25 pounds in six weeks.
And then like everyone else, I sort of drifted off
the diet by the time I started this New York Times Magazine piece. If you go back and read the New York Times Magazine article, there's a line in it that says overweight, of course, is caused by taking in more calories than you expend — which today I think is both wrong and meaningless. Because of my research on the dietary fat story, I could already accept the idea that these diets would not raise the risk of heart disease. It was clear to me that the dietary fat story was uncompelling at best, and that there was no compelling evidence to avoid saturated fats.
The question of the fundamental cause of obesity and how best to regulate weight was something I didn't really understand when I did that article. But an exceedingly controversial article on the front page of the New York Times Magazine will indeed get you a large book advance if the subject is one of natural interest to readers. It did get me a large book advance. People like to say that that's the only reason I came to the controversial conclusions I did — because it would sell more books. They don't realize that a large advance in New York was enough to live on for four years, and the book took me five years. That's when I started the research for Good Calories, Bad Calories, and got obsessed with that story.
And how did your thinking on the nuance around this topic evolve during the writing of Good Calories, Bad Calories?
I mean, it's still evolving. That's what's so bizarre about this: I still wake up at three in the morning thinking, why did I say that 13 years ago?
So there are these two issues, right? Which — I mean, I don't think we'll have time to go into both of them — the one being the saturated fat issue, which is that dietary saturated fat drives atherosclerosis, and the other being the cause of obesity being calorie imbalance. And you kind of go after both of these in parallel, but they're really different. They're not mutually exclusive.
Right, but they're related.
They're very related, yes. And that's the issue. You know, the dietary heart story was: we get heart disease because of the saturated fat content of the diet elevating LDL cholesterol. And here a guy like Atkins comes along and says, well, look, don't eat carbohydrates. So if you want to minimize your risk of heart disease by the dietary fat story, you avoid fats, saturated fats, and you replace them with carbohydrates — back then; now they say monounsaturated or unsaturated fats. You're now eating a carbohydrate-rich diet, and in theory you're going to minimize your risk of heart disease.
The flip side was that carbohydrates are fattening. That's the simplest way to describe what's now called the carbohydrate-insulin model. For those of us who gain weight easily, it's the carbohydrates that do it, and we can't eat carbs. But if you don't eat carbs, what are you going to replace them with? Your choices are protein and fat, and protein usually comes with fat attached in real foods. So you're going to increase your fat consumption one way or the other, and now you're eating a diet that in theory is supposed to kill you. That was one of the many arguments against Atkins: it's going to trigger heart disease and you're going to drop dead. When I first tried it as an experiment, I kept waiting to drop dead, and I'm still waiting. I mean, I still have a piece of bacon and a pat of butter and I wait for my heart to blow up.
So one of the issues I had to deal with in the book is: how did we come to believe that dietary fat is the problem? So the first third of the book is the evolution of that idea — it's the soft science of dietary fat at book length. How did this theory evolve? What was the real data? What were the problems with the data? Why is it likely wrong?
And then the second part of the book: as I was doing the research, I realized that there had always been a competing hypothesis, which is that the chronic diseases that associate with modern diets and lifestyles are driven by the carbohydrate content of the diet. This started as a British hypothesis, with British nutritionists, and then it should have embraced, in the 1960s, the research by people like Gerald Reaven and others on insulin resistance. All of that targeted the quality and type of the carbohydrates we're eating — the glycemic index and the fructose content. And I didn't know anything about that when I started this research. So the first third of Good Calories, Bad Calories is the deconstruction of the fat hypothesis. The second third is its replacement with a carbohydrate-centric hypothesis. And then the third third of the book — the second half of the book, actually — is this question of obesity, because it's clear that whatever causes obesity is so closely associated with heart disease and diabetes, type 2 diabetes.
Now, whatever makes us fat also causes these chronic diseases that associate with it. And the question is: what makes us fat? Doing the research, I also came upon a whole other stream of research that had kind of been ignored. First, pre-World War II European researchers saying obesity is not an energy balance problem — it's not caused by people eating too much, it's a hormonal regulatory disorder. And then post-World War II, people studying primarily the science of hunger, saying, in effect, hunger is caused by fuel availability. If your liver senses that fuel is available, then you're not going to be hungry. And when your liver starts sensing that fuel is being crimped, then it'll release the inhibitions on food-seeking behavior. And that work, done by people in this field of physiological psychology, all implicated insulin as the primary hormone determining fuel availability.
And then there was also, and this I think is what gets me most, until 1930, virtually no
one studied fat tissue.
In all these sort of burgeoning fields of physiology and endocrinology, nobody really
cared.
The assumption was that it was inert, that it was there for padding and cushioning,
and if you starved people,
somehow the fat tissue would shrink.
They had no way to study fat
tissue or fat accumulation.
Then beginning in 1933, with the work
of this German emigre,
Schoenheimer, at Columbia,
you start having people studying fat tissue,
and they become aware that mobilization
and deposition of fat go on constantly.
Like even when you're in between meals, or you're starving, your body is still depositing calories
as fat and mobilizing calories from fat.
And through the 1940s and 50s, they begin to work out the details as more and more tools
become available to study this problem. And by the mid-1960s, you have a very well established science of what we could call intermediary
metabolism, which is what your body does with the foods, the proteins, fats, and carbohydrates
after you eat them.
And then what your body continues to do to make fuel available as necessary to your body.
And all of this is left out of the science of obesity.
As soon as the obesity researchers start saying,
we've demonstrated that obesity is caused by an energy
imbalance, and the polite way of saying that is
people eat too much.
That's a fact of life.
We can see it in our animal models,
because when we create an obese animal model,
it tends to be what they call hyperphagic,
which is very hungry.
So we have an association between hunger and obesity
and we assume causality.
And they just kind of ignore the entire science
of fat metabolism.
And even to this day, when I try to make that point,
it's excruciating; nobody wants to hear it.
Nobody wants to deal with it.
And if I had to prove it to an editor, I could.
Ideally, I would have the fact checker fly out to my office in Oakland and say, let's go
through these textbooks from 1965 onward, and I'll show you the complete and utter absence of any discussion of the science of fat metabolism, of fat storage, from the obesity discussion. And from
the 1960s onward, the obesity researchers, well, obesity
research in the 1960s is dominated by psychiatrists and psychologists trying to
get fat people to eat less. People like Mickey Stunkard, who was a wonderful man,
but his expertise was psychology. And this idea that obesity is a hormonal dysregulation of fat storage, of fatty acid oxidation and storage,
a fuel partitioning disorder, gets left out of the field.
Who do you most credit for that change, that new course in the 1930s or thereabouts?
Where did this idea come from that is the dominant point of view, with respect
to obesity being caused by caloric imbalance rather than being the result?
Because there has to be luck sometimes, right?
Like sometimes a dominant personality wins out or something like that.
You think of a field like diabetes, where Elliott Joslin opens the first dedicated
diabetes clinic in the US around 1900.
And by 1916 he's seen a thousand patients,
which is about 950 more diabetic patients
than any other doctor in America.
So he writes the first textbook in 1916,
and he does another edition in 1917 and 1923 and '28 and '33.
And we're now on, I think,
edition 14 of Joslin's Diabetes Mellitus.
Joslin became the determinant of truth, the arbiter of truth, in the diabetes world.
One researcher, one physician.
He was a wonderful physician, not a great scientist.
Because of his position in the field. This doesn't happen in other fields; I don't know if it
happened, but it didn't happen in physics, because there are too many bright people arguing about the data and not enough people outside care.
But in medicine, you get fields in which single individuals dominate, like the famous breast cancer surgeon from Hopkins.
Halsted.
Yeah. It's maybe a kind of unique phenomenon in science, in obesity research. There was no such thing as obesity research pre-1930.
When you go back
and look at the literature, there's half a dozen physicians
around the world who are writing articles about obesity,
often with some statement of causality.
And there's kind of two competing hypotheses.
One is fat people just eat too much.
And you know that's true because you have an association
between gluttony and obesity, right?
You watch Shakespeare. We know Falstaff was obese.
He had the zest for living. He ate people out of house and home.
So therefore you assume causality from an association; that's classic pathological science.
And the other is that people who suffer from obesity have some kind of hormonal disarray,
some kind of dysregulation,
where they're going to get fat regardless of how much they eat.
So in 1930, Louis Newburgh comes along.
He's a physician at the University of Michigan.
He does what he considers the first experimental test
of the hormonal regulatory disorder hypothesis,
that you get fat no matter how much you eat or how little.
He runs a test on six or seven people, and he discusses it at a medical meeting in Boston.
So this guy is now, we'll call him, if not the Newton of the field, then the Eddington of it.
You know, this is the name that, if obesity research were a functioning science, every obesity researcher would be insulted if you thought they didn't know who Louis Newburgh was.
He publishes a series of papers in 1930, 1931, based on the one experiment,
with like seven patients, where he basically starves lean people and patients with obesity and says they lose weight at roughly the same rate when starved;
therefore the obese people, I'm not making this up, got fat because they ate too much.
He saw this as refuting the conventional hormonal regulatory hypothesis. Nobody had ever done an
experiment in obesity before. So Newburgh's papers begin to be taken as gospel.
And even though there are European researchers,
particularly Julius Bauer, who is one of the pioneers
of the science of endocrinology at the University of Vienna
in Austria, and the Germans and Austrians
at this period of time are at the sort of apex,
the height, of medical science.
He's writing that this is naive and can't possibly be
right, based on a series of what I think are very logical and undeniable observations.
Even back then, as soon as Newburgh published his article, Bauer coauthors an article,
the first thing he ever writes in English, explaining why Newburgh's argument is naive,
this idea that people get fat because they take in more calories than they expend. In 1938, the first animal model of obesity is pioneered at the laboratory
of a fellow named Ranson, the leading neuroanatomist of the era, and by his
graduate student Albert Hetherington. They take what's called a stereotaxic
instrument, which you could use to direct a needle into the brain of an
animal. It had been used on dogs; Hetherington pioneers its use on rats, and he can reproducibly create obesity if they
lesion the ventromedial hypothalamus of the rat. Hetherington, he's a postdoc. There's a fellow
postdoc at the lab named John Brobeck, who in 1939 gets accepted at Yale for medical school. He's got his doctorate.
He goes back to New Haven.
He needs to get a job to help finance medical school.
He gets a job in a laboratory and he realizes
there's a stereotaxic instrument
and he can do the same experiments that Hetherington did.
And Brobeck starts doing the VMH lesion experiments at Yale.
And Brobeck creates, he believes he
discovers, what he calls hyperphagic obesity. So you lesion the VMH of the rat,
the rat gets obese and it gets crazy hungry, and so you now have the
association between hunger and obesity. I've read that the rats got so hungry
that even as you're taking the anesthesia off them, they're basically trying to eat as they're coming out of anesthesia.
I recently interviewed a researcher who did these experiments in the 70s.
He compared it to gasping for breath.
So like if you are drowning and you suddenly come up above the water and you're gasping,
the animals are eating like that.
And it's a fascinating observation, because you can explain it pretty easily. And in fact, if you don't allow the anesthesia to
wear off, if you allow the animal to eat while it's still under the influence of
the anesthesia, it might choke to death, and they often did. So the researchers had
to learn to give the animal a couple hours despite its evident
starvation. Brobeck has read Newburgh, okay? So Brobeck interprets this experiment as: I created
hyperphagia, extreme hunger, in the animals, that's the discovery, and the animal got obese; the fact that the animal got obese because it's hyperphagic is just boring. Animals
get fat because they eat too much. I know that because that's what Newburgh says. Back in
Chicago at Ranson's lab...
Let's pause for one sec, because that's a very important point. I think this gets to something
you've already talked about twice, but I want to make sure the reader doesn't miss the subtlety
of this, or the listener rather. The observation is undeniable. If you lesion the part of the brain
in question, the animal eats in an uncontrolled manner. However,
if your incoming hypothesis is that overeating leads to obesity, you will interpret it as:
lesioning that part of the brain leads to overeating. If your hypothesis is that dysregulated fat accumulation leads to obesity, you would interpret that finding as: lesioning that part of the brain
leads to
dysregulated fat accumulation, which then causes overeating. And that's a very subtle
difference, and I try to be fair to the people of the era, to see how easy it could be to make that mistake.
Well, and now let's put it in the context
of, remember, the cold fusion story.
What's your control?
So if your hypothesis is that eating too much
causes obesity in these animals when they're lesioned,
the proper control is to control for the eating too much.
So what I was gonna say: back in Chicago,
Hetherington and Ranson had animals
that got obese independent of the hyperphagia. So some of their animals got fat anyway. Actually,
some of Brobeck's animals also got fat, despite not eating any more than lean animals.
Did he ever pair-feed his animals with non-lesioned animals? Brobeck did. He did an experiment where he
pair-fed them with non-lesioned animals. And if I remember correctly, it was a dozen animals; nine of them did not get
fatter, but three of them did, despite being pair-fed. So
Brobeck, because he had a pre-existing hypothesis, assumed that there must have
been something wrong with these three animals, that he did the lesion
incorrectly, or maybe they were eating at different times. So he left them out of the analysis.
He mentions it in the paper, but they're the counterargument.
They're the evidence for the counterhypothesis, which is that something about the lesion dysregulates
fat accumulation.
What do you think explains the other nine?
You can imagine it, if you dysregulate the fat tissue.
So let's say we take an animal that normally would grow naturally, and we create a dysregulation in its fat tissue such that it's going to store excess fat.
It might not be able to do that if you restrict its calories.
It's that simple.
We all know from personal experience we can lose weight by starving ourselves.
So we can somehow starve our predisposition to get fatter by restricting the amount of food we eat.
The counterargument to the hyperphagia-causes-obesity
evidence is obesity being caused in the absence of hyperphagia.
So what Ranson and Hetherington argued: Ranson was the leading neuroanatomist of the day,
and he had just come from studying diabetes insipidus.
And diabetes insipidus you can also cause with a lesion in a different part of the hypothalamus. You have animals that are extremely thirsty.
I forget the technical term. Yeah. So they have endless thirst and they're
peeing constantly. So the conventional thinking might be the reason they're peeing so much is
because they're so thirsty. And the counterhypothesis is that the thirst is a response: the lesion causes them to
pee constantly, they're losing body water, and that makes them thirsty.
That's become unequivocally the case. We now know that diabetes insipidus is the result
of the loss of the hormone called antidiuretic hormone, or vasopressin. The hyper-drinking is a response,
replenishing the water that they're
losing because they can't concentrate urine.
Exactly. So Ranson thinks, well, if they're losing water and drinking
in response, maybe in these cases the animal is losing calories, energy, into its fat
tissue, and they're eating in response. The hyperphagia is a response to the loss of energy.
And some of his animals, the ones that got obese, were also sedentary. So he said, maybe the ones that got obese without hyperphagia were sedentary. Maybe the reason
they're sedentary is, again, losing calories into the fat tissue. The fat tissue
is creating a sort of vacuum. And the animal is responding, either by eating more or
exercising less.
So changes in energy balance are a response
to the fat tissue being driven to accumulate calories
or fat calories.
They publish this paper, and three months later,
this is 1942, Ranson has a heart attack and dies.
He was eating too much fat.
Clearly.
Hetherington, the postdoc, this is the middle of World War II, joins the Air Force and leaves the
research behind. And the only voice left in the field is John Brobeck. And so Brobeck
dismisses their hypothesis, not really even understanding it, which is clear from the literature.
He still thinks they're trying to blame obesity on sedentary behavior,
because he's trapped in this energy balance thinking.
He doesn't realize that they're trying to blame
both the sedentary behavior and the hyperphagia
on the drive to accumulate fat,
which is the fundamentally different paradigm.
So Brobeck continues to write about
hyperphagia causing obesity. By the 1950s, between
Brobeck's work and Newburgh's work, the conventional wisdom is all fat people get fat
because they eat too much, and there are textbooks that have a statement to
that effect in virtually those identical words. By the 1960s, as I said,
the leading figures in the field are psychologists
and psychiatrists who are trying to explain why fat people eat too much and how to stop
them from eating.
And there's a whole world of physiologists and biochemists studying fat metabolism, learning
about all the ways it's regulated, learning how the deposition and mobilization of fat
go on "independent of the nutritional state of the organism,"
that's a direct quote from one of the seminal papers,
and that what you should be studying is not how much people eat and exercise, but why
they accumulate so much fat.
What's the best explanation for why a lesion to the VMH, which is a very central, very
specific, narrow part of the brain, why would that have such a broad
peripheral consequence of fat hoarding within all of the adipose tissue throughout the
body? What is the mechanistic, or best mechanistic, explanation for that in that model?
Well, in my model, the first observable effect from the VMH lesion is hyperinsulinemia.
So you lesion the brain and the animal hypersecretes insulin in response to even thinking about
food.
Now if you think about it, with an animal that's hypersecreting insulin, the insulin is signaling its fat tissue to take up fat and to store it.
It's putting it into storage for fuel, it's inhibiting the process of lipolysis.
It's also inhibiting the oxidation of fatty acids
in the muscle tissue through the malonyl-CoA pathway.
So think about what happens.
Now, to these rats, or to people with tumors there,
you have the surgery, you hypersecrete insulin,
you wake up from the surgery without carbohydrate supplies
and unable to oxidize dietary fat.
So that would explain the hunger.
It's as though you've created a starvation state in the animal instantaneously.
You know, normally it's going to take a while for the animal to get to the point where it has to start cannibalizing its own muscle for fuel
if you're starving it, but now you've done it instantaneously with the lesion, and that
would explain the gasping for food that these animals manifest in the hyperphagia.
What's the relative increase in insulin that those animals experienced?
You know, I don't have those numbers.
That's an important thing, right?
I mean, if that's the conduit through which a central lesion has such a broad peripheral
implication, I would expect an enormous increase in insulin to produce that, a level that
wouldn't be otherwise physiologically described.
I guess I'm struggling with how that alone could do it.
Well, also, if you think about it, until 1960, they had no way to measure insulin in the
bloodstream.
But we could do it today is what I'm saying.
Yeah, I just don't know the number.
I mean, it was clearly enormous.
And it was observed.
And post-1960, you get Yalow and Berson's radioimmunoassays.
So now you could actually measure insulin levels from blood samples accurately and with sensitivity,
and pretty quickly afterwards,
that's when they realize that the VMH-lesioned animals were hyperinsulinemic.
How did the ob/ob mice also give us a clue into both ways to interpret the same observation?
Yeah, so the ob/ob mouse is a mouse that's discovered at the Jackson Laboratory in Maine.
It's a mutant that manifests obesity, dramatic obesity, and it turns out there are two different phenotypes,
depending on what strain you breed these animals into.
The db/db mouse is diabetic, obese and diabetic.
The ob/ob mouse is just obese.
The work is done at the Jackson Laboratory.
What's interesting is the assumption is always
that these animals get fat
because they're missing some kind of satiety hormone.
The theories through the 1950s are based on Newburgh's work and this idea that obese people get
fat, not because they partition fuel preferentially into fat tissue, but because they take in more
energy than they expend.
The dominant, really the only, hypotheses of obesity in the 1950s: there's a lipostat hypothesis from a guy named Kennedy and a glucostat hypothesis from
Jean Mayer.
There's a thermostat hypothesis, with body heat.
They're all hypotheses trying to explain eating too much in the animal.
And none of these pan out.
Not only do they not pan out, you've got these animals that are obese.
If you think they're caused by eating too much, again, what's the correct control?
Just pair-feed the animal. Don't let the animal eat more than a lean animal eats.
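(A rough sketch of that pair-feeding control logic, with hypothetical numbers rather than data from any of the experiments described here:)

```python
# Illustrative only: the logic of a pair-feeding control. Both groups are held
# to the same food intake; the question is whether the test animals still
# accumulate more fat than the lean controls.

def interpret_pair_feeding(control_fat_gain_g: float, pair_fed_fat_gain_g: float) -> str:
    """If a pair-fed (intake-matched) animal still out-gains the control,
    overeating alone cannot explain its obesity; something is driving fat
    storage independently of intake."""
    if pair_fed_fat_gain_g > control_fat_gain_g:
        return "excess fat gain at matched intake: hyperphagia is not the whole story"
    return "no excess fat gain at matched intake: consistent with an intake-driven explanation"


if __name__ == "__main__":
    # Hypothetical: a lean control gains 1 g of fat on a fixed ration, while an
    # intake-matched ob/ob or lesioned animal gains 4 g on the same ration.
    print(interpret_pair_feeding(control_fat_gain_g=1.0, pair_fed_fat_gain_g=4.0))
```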
This experiment has been done numerous times with the leptin-deficient ob/ob mice. If you take a lean animal, measure how much it's eating,
and give half of that to the ob/ob mouse, so you're literally
semi-starving it, it gets obese anyway. One of the fellows who published these results is Douglas
Coleman, but Jeff Friedman gets credit for doing the work that identified leptin as the hormone
that's missing, dysfunctional, in the ob/ob animal, and he did it based on Coleman's parabiosis
experiments at the Jackson Lab with the ob/ob and db/db animals.
So Jeff Friedman comes along.
A young, ambitious researcher, he wants to look for the obesity hormone, the satiety hormone
that's missing in the ob/ob mice. He comes with this assumption that what's missing is a signal
from the fat tissue to the brain, telling the brain not to eat too much.
When he discovers leptin, he interprets that signal as being the satiety hormone.
It's been identified as the satiety hormone ever since.
When people write books with titles like The Hungry Brain,
they're assuming that the absence of leptin makes the brain hungry.
The person gets hungry, eats too much, and that's why they get fat. And yet, you always have this
observation in the field that the animal gets fat even when it's half-starved. The point is,
I'm doing this book on diabetes now. I was curious: what is it about the db/db animal? Why would
the absence of a leptin receptor cause diabetes? And I never
really thought about it. So you look into the literature; we actually asked Rudy Leibel, and
Rudy said, well, basically it depends what background you breed the animal onto. And then I went
back and I found where Douglas Coleman discussed it. And so the issue is both animals are obese,
both animals are hyperinsulinemic from weaning onward. So
in the db/db animals, the background strain can't sustain the hyperinsulinemia, so the
pancreas fails and you manifest frank diabetes. The ob/ob strain can continue to keep pumping out
the necessary insulin. But now you've got the leptin deficiency basically triggering hyperinsulinemia,
and it's always been known these animals are hyperinsulinemic.
And now you're back to the Bauer-Ranson hypothesis.
I didn't mention, by the way, that when Ranson and Hetherington interpreted the data the way
they did, they cited Julius Bauer.
So Brobeck cites Newburgh.
He's got a preconceived opinion.
And Ranson had his own preconceived opinion, based on the diabetes insipidus work, and Julius Bauer's papers supported that.
But we grew up with one hypothesis, which is the Brobeck eating too much hypothesis and
it influenced how the leptin work was interpreted.
It influenced virtually every experiment afterwards.
And then somebody like me comes along and says, well, wait a minute.
All this evidence supports the alternative hypothesis,
and you have not been considering that for 60, 70 years.
How can you even trust anything you've done
when you were unaware that there was a competing hypothesis?
It's an awkward place to be in.
Now, your work has brought a lot of attention
to this alternative hypothesis over the past decade
and there have been a number of people who have tried to test it. How would you reconcile the findings that have not demonstrated
what would be predicted by the carbohydrate insulin model?
This then gets into the kind of questions we talked about earlier about judging the value of
the experiments, how rigorously they're done, and the biases and preconceptions of the researchers.
So, and again, I've been having these arguments on Twitter this week,
and I have to stay the hell off Twitter.
And you know this: we funded these people at NuSI.
We funded two groups of researchers,
one that basically had a hypothesis
that dietary fat and energy balance were the drivers of obesity,
and one that had a carbohydrate insulin model like
we did. And the researchers who believed the conventional wisdom interpreted their results
as supporting the conventional wisdom and refuting the carbohydrate insulin model, and
the researchers who believed the carbohydrate insulin model interpreted their results as supporting
that model.
So let's pause there for a second because I simply don't see how this field
can make any progress.
And I'm trying to understand how this field can make progress
in a way that physics does.
When, as you point out, everybody in this field is biased,
everybody has a point of view,
and everyone seems to do experiments
that simply confirm their point of view.
There's one other thing that makes this more difficult, which is the inherent messiness
of biology.
As complicated as physics is, I think biology is more complicated.
We don't have a standard model of biology, the way we have a standard model of physics.
So there are more unknown unknowns in biology.
And then secondly, the experiments are far more difficult to control.
I think there's more noise in the biological experiments.
And then you couple that with everything we just said,
I just don't understand what a path forward looks like
towards a reconciliation.
I don't know if reconciliation is the word you're looking for.
You want to know which hypothesis is right.
Yeah, I mean a scientific reconciliation.
We're recapitulating the discussions we had
whatever, eight, ten years ago now,
when NUSI was starting.
One of my favorite stories is from back around 2009, after Good Calories, Bad Calories came out.
I was invited to lecture at the Pennington Biomedical Research Center, which is the largest
obesity research center in the country.
And I gave my lecture, why we get fat, adiposity 101, and I suggested that the energy balance
hypothesis, that thinking, was, as I like to say, not even wrong,
stealing from Wolfgang Pauli, and then all the reasons why it should be replaced with the hormonal regulatory
disorder hypothesis focusing on insulin. And after my talk, one of the faculty raised his hand very politely, a gentleman who was probably then as old as I am now,
maybe 65, and he said, excuse me, Mr. Taubes, would
it be correct to assume that you think we are all idiots? Because the argument I was
making is that they embraced the wrong paradigm. And it was a tough question, because part of me,
the answer I want to say is, well, that's one way to look at it. But I can't say that. So what
I said to them is, I think the problem here is that when you entered the field,
there was a paradigm, a way of thinking about obesity
that seemed intuitively obvious,
this idea that it's an energy balance disorder,
and you never questioned it.
And certainly your mentors didn't question it.
So you assumed this had been well tested
and well proven and unambiguous
and that it deserved to be dogma,
and it hadn't been,
and it didn't, and that's been a problem ever since.
Now we come along 50 years later, 40 years later, and we have to get people, a huge proportion
of the community, to entertain the possibility that their fundamental belief system is wrong.
So this isn't a subtle shift in thinking; this is, you're operating under the wrong paradigm.
You think the earth is flat and it's round.
Or you think the sun rotates around the earth, but it's the other way around.
99.999, throw in as many 9s as you want,
percent of the time, people who say that are quacks.
So why are you not?
It's a good question.
I often argue this with my old friend, our old colleague Mark Friedman, and Mark says,
well, we're not because we're right.
And I say, well, every quack thinks they're right, Mark.
That's not evidence that we're not quacks.
That's the same tautology that you argue against.
Yeah.
Well, the other argument is an even better one.
I'm not a doctor, so technically I can't be a quack.
The best I could hope for is wack job.
I would offer a more compelling reason, if you turn out not to be a quack, which is that the
application of the current model is failing. That's probably the most compelling reason to
continue to question it, I would say. So in other words, if we believed that the earth
was the center of the universe, and every time we tried to launch a rocket into space,
it blew up because we failed to understand gravity
and orbits.
I would hope we would then say, God,
what if the earth is actually moving,
even though it doesn't feel like it?
That's where all these discussions start.
It's failure to prevent obesity,
failure to treat obesity,
the obesity and diabetes epidemics are out of control.
Let us question these fundamental hypotheses.
How do we test them?
I'm thinking at the moment, having done that and moved to the next stage, where we now have
competing researchers arguing, one says I'm right, the other says I'm right, I side
with the one who thinks like I do, how do we get people to care?
How do we move this? It's still this issue. How do you get to move it outside the diet issue,
as we've discussed over the years? The conventional wisdom comes with a lot of explanations
for why people don't lose weight on various dietary interventions. And some of those are true.
You know, people don't comply with interventions. They're like, look, nobody sticks with a diet. So maybe everybody's different.
And so maybe we have to invest huge sums
into individualized nutritional therapy.
Precision nutrition.
Yeah, that's the way the NIH is going.
So the idea is, well, these people are arguing
it's carbohydrates.
Yeah, we don't believe that.
So we're not going to spend any money researching that.
There are people believing it's fat.
There are people who think it's meat. In an ideal world,
and we discussed this for eight years, right,
the federal government would say,
look, COVID is going to pass,
and obesity and diabetes are going to go back
to being the long pole in the tent,
to use NASA terminology.
They're costing a billion dollars a year
in direct medical costs.
We have to understand how we failed.
We have to stop blaming it on industry and blaming it on the
individuals. And so let us put together teams of researchers, red teams, blue teams,
multiple colors, and hash this out and try and figure out how we failed to control these epidemics. And until we do that,
we're not going to reject any reasonable hypothesis.
But how would you even populate said teams?
I mean, to your point, there really aren't that many people that would still stand by this
idea that you propose.
I mean, if we're going to get practical here, Peter, we're not going to make any progress
whatsoever.
In my dreams, it would be like populating a jury in a trial.
You pick unbiased scientists
who are good at what they do,
who have demonstrated that they're good at what they do.
You can't pick obesity researchers,
because they have biases.
You certainly can't pick nutritionists,
because they have biases.
I question whether any of them were ever taught
to think critically enough about scientific progress.
I mean, think about the kind of effort
that went into COVID.
The amount of money that was spent on research
to prevent this disease from killing what,
one-tenth of the number of people who
die from chronic diseases related
to obesity and diabetes every year.
I mean, I'm glad they did it.
But if you make that kind of effort,
you can solve this problem.
And step one is saying, look, we failed.
The conventional wisdom fails.
It's clear it fails because we have obesity and diabetes epidemics that haven't been stemmed
in any way.
We have to question our assumptions.
Who actually does that job?
There's a lot of very intelligent critical thinkers out there.
The question is, how do you get them to care?
Now, some of them would argue that, hey, we know the answer: the palatability of food, the
availability of food, the affordability of food are driving this equation, and the convenience
of modern life is making inactivity an even greater and greater issue. And so you might
argue, well, look, we haven't put all the steps in place to address those
issues.
The answer is going to be won through policy.
We have to make foods that don't taste as good.
You know what I mean?
Yeah, yeah.
I'll give you a funny story.
My son came home the other day and he said, Mommy, one of the kids at school today had
something and I really want it too.
And she was like, okay, what is it?
He's like, I don't know.
It came in a blue bag and it was a triangle.
And of course, it was Cool Ranch Doritos.
So what does Jill go and do?
She goes and buys Cool Ranch Doritos for Reese,
for lunch, for a little snack.
And you know, this is a kid who really likes to eat good food.
And so every day he gets a little Ziploc
with like five Cool Ranch Doritos.
The problem is I can't eat just five Cool Ranch Doritos. So the other day,
I am inhaling half a bag of Cool Ranch Doritos, and all I'm thinking is, this is almost as impressive as
the Apollo 11 program from an engineering perspective. Like the way they made this thing, the crunch,
the taste, the lingering flavor, it's unbelievable. Like there's no denying how good this stuff is. So why did I just sit there and eat a whole bag of that thing?
You know, that's a very good question. But here's the counterargument. I gave a talk a couple months ago in Tahoe, just as the inability to give talks in person was winding down. And the fellow who invited me to give the talk is from Texas. Afterward he sent me a gift box of
Wagyu
steaks from a butcher in New York, and these are the Wagyus that cost about a hundred and seventy dollars a pound.
One of them is called Wagyu sashimi. You're just looking at it, and you know it's by weight as much fat as protein, and by calories
it's going to be 80 percent fat. And I actually
made one of these for lunch, and I looked up how to do it. So you get the skillet hot and you use the fat on the edge to grease the skillet, and then it's a minute on each side.
And it was eight ounces and I couldn't finish it. It was so filling. My son Harry, he got a taste, and he said,
this is beef butter, dad. I still finished it. So it was delicious. However, even if I could afford it,
and I still have some in the freezer, I didn't. It wasn't like a Dorito issue. So yeah, it's not quite
as accessible, but the question would then be, did I get fatter? Did I somehow dysregulate my fat tissue by eating that, the way you might be doing eating the
Doritos? Was it the palatability that drives your particular hyperphagia, or is it
the peripheral response to the macronutrient composition that then drives your
hyperphagia when it comes to Doritos? I mean, I just think if you look at the response time, it has to be more central than peripheral.
It's barely exiting my esophagus while I'm inhaling these things.
But still, the response is going to be a response to an expected peripheral change.
So, you know, your body, evolution and homeostasis, are very good at what they do. They know what's about to happen.
So they're going to prepare your body for that to happen.
Think about your experience when you were 205 pounds
and swimming, working out three hours a day
and eating as healthy as any human being could eat.
The question is, again, why then,
and what's changed between then and now?
And that wasn't a Dorito-induced effect.
You know, this idea that some people are just going to get fat, are pre-programmed;
there are the quotes from George Bernard Shaw, about one of his leads, that we're just going
to get fat no matter how much we eat.
So if that's the case, what triggers that?
The Doritos, I would say, trigger it, where the Wagyu, which is far more calories, does not.
How do we get people to care? How do you get people to right this wrong? There are a lot of people,
clearly, for whom the conventional wisdom is what you just argued: we know what causes it, it's too much
food available. It's very easy to blame industry. You know, Michael Moss has written two terrific
books. They got a lot of publicity because he's blaming industry. He's not blaming the scientists, he's not blaming the administrators or the government, he's blaming the industry.
It's very easy; when I was blaming the industry, people loved what I did. As soon as I shifted to blaming
the scientific community for doing unacceptable science, then I have trouble getting the message
heard. I don't know what the answer is, you know, it's something I think about every day. And when people are programmed to assume you're wrong,
how do you get them to accept the possibility that you're not
and that somebody has to study this?
And I don't know what the answer is.
I do know it's the challenge.
I said this before, during the NuSI years.
It's like, you decide you're going to tilt at
windmills for life,
you've got to get used to the fact that the windmills
are going to kick you in the ass when you ride by.
I don't know.
I think people do care, Gary.
I mean, I guess it depends how you define care.
People certainly care about obesity.
Are you really asking the question,
not how do we get people to care,
but how do we get people to question?
Yeah. And by people, I mean,
people in a position of authority
who have the ability to put the necessary
funds and effort towards addressing this question.
And I think one of the problems is the way we fund research in this country, for instance.
It's a kind of, you know, you get together study groups of mostly like-minded individuals, and they
give out small, but these are all the issues we discussed at NuSI, they give out small R01 grants of $500,000 a year for five years to do what
Kuhn would have called normal science. So we have our conventional wisdom, our
dominant paradigm, and we're just going to continue to fund whatever research
people have to do to create bricks to fit in that paradigm. There is no method by which
paradigms are questioned and research programs can shift. There's an assumption
that it could happen naturally, that science is self-correcting, but the way we
fund research in this country doesn't actually allow that to happen. If there's
money to be made, then capitalism will kick in and people will take advantage of
the opportunities that other people might be missing.
And that's why you have operations like Virta Health,
which is doing very well, advocating
using nutritional ketosis to treat type 2 diabetes,
and you have other people who have started operations,
like Diet Doctor, too.
There is no mechanism.
So when I say people, I mean,
how do I get Francis Collins to care?
And a journalist living in Oakland talking on a podcast
is just so many levels removed from influence.
And even if you did get him to care, could you get the NIH
to do anything about it?
We're having this conversation the day after there
was an article in Science by two researchers
who we know very well, arguing that this carbohydrate
insulin model of obesity is just wrong.
Failed.
It's interesting: we tested it and it failed.
That's what we were talking about when we discussed how do I reconcile my beliefs
with that.
My response was, okay, you leave out the people who tested it and whose results supported the model.
You're looking at only a small proportion of the evidence here. But even if you're
right, we have a problem. You've avoided the elephant in the living room, which
is the obesity and diabetes epidemics. What do we do about it?
The paper that came out yesterday by John Speakman and Kevin Hall is a relatively
short paper, but an interesting summation of
what would be viewed as a possible refutation of this carbohydrate insulin model. I think of it
less as a public health paper; I don't see them trying to address obesity and I don't see them
offering or proposing to offer a solution to obesity. But I think they're really trying to understand
a mechanistic thing. And I don't know John, but I do know Kevin. And my view of Kevin is this guy who came in as an outsider. So he has at least the benefit of
not having some of the limitations that others might have who have come up through the field without
the historical context. And well, maybe Kevin doesn't have that historical context, just as no one would,
but at least he came in, I think, as a physicist looking at this from first principles,
which you could argue could be an advantage to him, right?
Yeah.
In fact, that's how I was introduced to Kevin, by Mitchell Lazar at Penn, because he
thought we saw the problem in similar ways, which we did back then.
And then I introduced him to you: a smart, young researcher without biases, at the time.
Through the course of our interaction with the Energy Balance Consortium, I think our
belief systems got complicated.
If you recall, his initial claim to fame as a researcher, what got him his position
at NIH as I understand it, is the model that he has of metabolism and how that influences
body weight.
And at one point, he had suggested we could use his model to test my beliefs, and his model rejected them.
And I argued, well, you've rejected them because we're working in an area where your model has no data.
So why don't we do the experiment to generate that data? And before NuSI hired him,
I was arguing to Kevin that he could do this experiment with an N of 1 or 2 at NIH with the metabolic chamber.
As soon as they started interpreting their experiment: one of the
fundamental problems with pathological science is you establish what you believe based on premature data.
So rather than entertaining multiple hypotheses as possibly explaining what you
see, you kind of collapse down onto one and then you go public with that prematurely.
And now you're trying to support that hypothesis rather than test that hypothesis.
So once Kevin started to interpret the evidence from the Energy Balance Consortium as a
refutation of the carbohydrate insulin model, and supportive of
his computer model, I think it basically began to create his bias and lock it in. Then the more conflict he and I had,
which you witnessed, the more that locked in his bias,
as it might have locked in mine.
The other possibility is that he's just right and I'm wrong,
which, if we could do more experiments,
we might find out.
One of the critiques of that paper was on the different types
of methodologies that are used to measure energy expenditure, which
is a very important thing to study and understand when trying to test these hypotheses.
You've already alluded to one method, which is indirect calorimetry, where people sit inside
of a medical-grade, hermetically sealed chamber that can, at very minute levels, measure
exhaled carbon dioxide and calculate the consumption of oxygen, and the
ratio of those things allows us to calculate very precisely the amount of energy expended.
An alternative method is something called doubly labeled water, where a person drinks
water that carries two different labels: one is a heavy isotope of oxygen, the other
deuterium. And by collecting urine over a period of days following that ingestion,
you can also estimate energy expenditure,
but do so in a free-living environment.
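(As a rough sketch of the arithmetic behind those two methods: the constants below are the commonly cited abbreviated Weir coefficients, while the doubly labeled water step and the example numbers are illustrative assumptions rather than values from any study discussed here.)

```python
# Illustrative only: how a metabolic chamber and doubly labeled water each
# turn gas-exchange numbers into an energy-expenditure estimate.

def weir_kcal_per_day(vo2_l_per_min: float, vco2_l_per_min: float) -> float:
    """Indirect calorimetry: energy expenditure from measured O2 consumption
    and CO2 production, using the abbreviated Weir equation."""
    kcal_per_min = 3.941 * vo2_l_per_min + 1.106 * vco2_l_per_min
    return kcal_per_min * 1440  # minutes in a day


def dlw_kcal_per_day(rco2_l_per_min: float, assumed_rq: float) -> float:
    """Doubly labeled water yields an estimate of CO2 production only, so the
    conversion to energy requires an assumed respiratory quotient (typically
    the diet's food quotient). A very-low-carbohydrate diet pushes the true RQ
    toward ~0.7, which is part of why the method's validity under carbohydrate
    restriction gets debated."""
    vo2 = rco2_l_per_min / assumed_rq  # back out O2 consumption from the assumed RQ
    return weir_kcal_per_day(vo2, rco2_l_per_min)


if __name__ == "__main__":
    # Hypothetical resting values: 0.25 L/min O2 and 0.20 L/min CO2 (RQ = 0.8).
    print(round(weir_kcal_per_day(0.25, 0.20)))            # chamber estimate
    print(round(dlw_kcal_per_day(0.20, assumed_rq=0.85)))  # DLW with a mismatched RQ
```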
At the outset, you alluded to two experiments that were done in parallel.
The experiment that was done by the group that had the hypothesis
that energy balance determines obesity
used the more precise of these,
calorimetry; the group
who had the preconceived
idea that the carbohydrate insulin model was correct did a longer outpatient study and
used this other method. One of the critiques of the paper is that the method of doubly
labeled water is not valid in people who are carbohydrate-restricted, and that explains
the difference in these findings, because otherwise they're very difficult to reconcile, right?
Right. Even in Kevin Hall's first energy balance paper, remember, they saw a signal. So they
are measuring energy two different ways; I think of science differently than Kevin does.
You measure energy expenditure by the chamber, indirect calorimetry, and by DLW.
Remember, the reason we're measuring energy expenditure is because the researchers didn't believe that in the course of four weeks you could see an effect on fat mass.
So measuring fat mass directly by DEXA wasn't a possibility.
And I'm still curious, because there was also evidence,
remember, back then, that DEXA is going to be confounded by water loss.
And so can you ever use DEXA reliably for the kind of measurements that are made over,
say, two weeks, when you're transitioning somebody from a low-fat to a very low carbohydrate diet? And I don't think so.
So anyway, the point is you've got two ways to measure energy. They give you two different results. One result is more consistent with the energy balance prediction.
The other result is inconsistent with the energy balance prediction and consistent with the carbohydrate
insulin model. One way to interpret this data is to say we measured energy expenditure
two different ways. We got two different results. Therefore our results were not very robust.
We tend to trust the chamber, but if the chamber
were right, we would have expected to see the same thing with the DLW and we didn't.
Unless the DLW is wrong and the chamber is the gold standard.
Which is a possibility, but the first thinking is we get divergent results here.
We have other issues with the experiment, non-randomized.
So let's design a different experiment.
You know, Gary, it's clear that there's probably still enough smoke to question whether there's
some fire with respect to these unanswered questions. Do you think this is an answerable question?
Well, I hope so. Clearly, I keep doing what I'm doing because I'm hoping I can motivate people to
think more deeply about this and
to resolve this issue. I don't think meaningful progress will be made
on obesity without understanding the fundamental cause and without elucidating how treatments are
working, although I have to say the GLP-1 agonists are fascinating. So there's a public health problem,
there's a scientific issue. Things have changed.
So in the 20 years that we've done this: when I started, pre-2010 or so, the conventional wisdom was that low-carb,
high-fat diets like Atkins, what we now call keto, were deadly,
that they would cause heart disease, that they would ultimately make you fatter. Today, for instance,
the American Diabetes Association
recommends these diets for type 2 diabetes,
which afflicts one-tenth of the public.
They're everywhere, they're viral.
The world is saturated with books on keto.
And even this article in Science
we've been discussing acknowledged
that they can be beneficial for weight control.
And nobody's talking anymore about them causing heart disease.
In fact, they're probably the most studied diets in history.
And if you go to clinicaltrials.gov, you'll find over 200 trials in the works looking
at ketogenic diets for everything from epilepsy to diabetes to cancer, yeah.
Cancer and Alzheimer's.
Yeah, far beyond that: traumatic brain injury.
I mean, you've interviewed a lot of these people. So in one sense, we've made an enormous amount of progress
in liberating the intervention that comes out
of this way of thinking such that it can now be used for people.
And anyone who struggles with their weight
or blood sugar control can now know
that they can try these diets. And it's not hard to find a physician to help guide them through it.
And from the personal perspective, we know that they won't kill you, which is what we assumed going in 20 years ago.
And so it would be nice if the research community understood obesity the way I think they should.
So it's a way of saying it would be nice if I was right about all this.
But even if I'm dead wrong, the ability to use this dietary intervention to improve your
health has now become widespread, mostly accepted.
And clearly there are teams of researchers all over the world who find the clinical efficacy
of these diets fascinating.
So the world is changing. It might not be changing as quickly as I would like and it may not change
as much as I would like. But like I said, even in the Science article the other day, there are statements
in there that would have been considered unacceptable 20 years ago, including the acceptance
that insulin plays a major role in fat accumulation, and so in obesity.
It's just not the role, they were arguing, that I think it is and
David Ludwig thinks it is.
Well, Gary, we've been going at this for a while.
This was kind of a tour de force of science and a history of science in many ways.
I'm sure folks found your journey interesting, especially given that many people will know
you from your more recent work, and I think many people will not know the foundation of work that led to your curiosity here.
I've always found that to be the most interesting part of your personal story, by the way: the way that you stumbled into
health sciences almost accidentally. I've always found it interesting, and obviously I think a lot of good has come of it.
So Gary, thanks so much for making the time, especially in this case a lot of time, probably more than your typical podcast.
Thank you for having me.
It's been a pleasure and it's great to have an opportunity to not just catch up but
talk at you for three and a half hours.
Thank you for listening to this week's episode of The Drive.
If you're interested in diving deeper into any topics we discuss, we've created a membership
program that allows us to bring you more in-depth exclusive content without relying on paid ads. It's our goal to ensure members get back much more than the price
of the subscription. Now, to that end, membership benefits include a bunch of things.
One, totally kick-ass comprehensive podcast show notes that detail every topic, paper, person, and
thing we discuss on each episode. The word on the street is, nobody's show notes rival these.
Monthly AMA, or ask-me-anything, episodes, and the ability to hear these episodes completely.
Access to our private podcast feed that allows you to hear everything without having to listen to spiels like this. The Qualys, which are a super short podcast that we release every Tuesday
through Friday, highlighting the best questions and tactics discussed on previous episodes of The Drive.
This is a great way to catch up on previous episodes
without having to go back and necessarily listen to everyone.
Steep discounts on products that I believe in,
but for which I'm not getting paid to endorse.
And a whole bunch of other benefits
that we continue to trickle in as time goes on.
If you want to learn more and access these member-only
benefits,
you can head over to peterattiamd.com forward slash subscribe.
You can find me on Twitter, Instagram, and Facebook,
all with the ID peterattiamd.
You can also leave us a review on Apple podcasts
or whatever podcast player you listen on.
This podcast is for general informational purposes only.
It does not constitute the practice
of medicine, nursing, or other professional health care services, including the giving of medical
advice. No doctor-patient relationship is formed. The use of this information and the materials
linked to this podcast is at the user's own risk. The content on this podcast is not intended to be
a substitute for professional medical advice,
diagnosis, or treatment. Users should not disregard or delay in obtaining medical advice
for any medical condition they have, and they should seek the assistance of their health care
professionals for any such conditions. Finally, I take conflicts of interest very seriously.
For all of my disclosures and the companies I invest in
or advise, please visit peterattiamd.com forward slash about, where I keep an up-to-date and active
list of such companies.