Animal Spirits Podcast - Re-Kindled: Superforecasting (EP.95)
Episode Date: August 5, 2019
On this edition of Re-Kindled we discuss Philip Tetlock's masterpiece on prediction, Superforecasting. Find complete shownotes on our blogs: Ben Carlson’s A Wealth of Common Sense and Michael Batnick’s The Irrelevant Investor. Like us on Facebook and feel free to shoot us an email at animalspiritspod@gmail.com with any feedback, questions, recommendations, or ideas for future topics of conversation. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
Welcome to Animal Spirits, a show about markets, life, and investing.
Join Michael Batnick and Ben Carlson as they talk about what they're reading, writing, and watching.
Michael Batnick and Ben Carlson work for Ritholtz Wealth Management.
All opinions expressed by Michael and Ben or any podcast guests are solely their own opinions
and do not reflect the opinion of Ritholtz Wealth Management.
This podcast is for informational purposes only and should not be relied upon for investment
decisions. Clients of Ritholtz Wealth Management may maintain positions
in the securities discussed in this podcast.
Welcome to a new episode of Rekindled with Michael and Ben.
On today's show, we will be talking about the book Superforecasting by Philip Tetlock and Dan Gardner.
For those who are unaware, Philip Tetlock is a professor who's made a career out of studying experts, basically.
His first big book was called Expert Political Judgment.
I think it came out in 2006-ish, and he ran these forecasting tournaments from 1984 to 2003,
with almost like 300 experts in a variety of fields, including like government officials,
economists, professors, journalists, investors.
And this was the thing that put him on the map.
And he looked at almost 30,000 predictions about the future.
And they basically found these expert forecasters were right, pretty much, less often than the flip of a coin.
So this was one of those that dart throwing monkey could do better than a group of experts type of deal.
So this kind of put him on the map.
Oh, wait a minute.
literally the dart-throwing monkeys came from his research. They called the forecasters in his research the dart-throwing
monkeys. Oh, that's where the term came from? That's on page four. Okay. I guess I skimmed over that.
That's okay, Ben. Listen, you're an admitted skimmer. That book was actually a little harder to get through,
I thought. I read that more recently than Superforecasting. I think Superforecasting, out of his books,
if you're going to pick one, is probably the better one to read. And I had read this one before and
reread it again for our little book club here. And I think it aged nicely. It was really good. And
especially for people like us who are in the investing field who deal with just a constant barrage of
predictions on a daily basis. It's kind of nice to see someone who actually looks at it in sort of a non-partisan,
I guess the word I'm looking for is maybe something else. But just it's a very clear-headed look at
this, right? It goes by the evidence and by the facts. And obviously, he was also kind of downplaying
a little bit of his own research saying, listen, I'm not saying it's impossible to forecast. I'm just saying
that there are better ways to go about it than the way that most people do. Yeah, this book definitely to me
aged like a fine White Claw. So in terms of the forecasting being bunk and dart-throwing monkeys,
he wrote, the message became, all expert forecasts are useless, which is nonsense. So he basically
said that, quote, debunkers go too far when they dismiss all forecasting as a fool's errand.
So on the one hand, he destroys pundits that are on TV all the time and there's no sort of
accountability whatsoever. That's on the one hand. On the other hand, he says, not only does he say,
he proved it in his research that there is such a thing as super forecasting and some of the
characteristics that these people have are stuff that we'll get into later in the show.
Well, one of the problems with it is there's a lot of skill required to be good at forecasting,
but there really aren't that many skills required to be good at being a bullshit artist,
which is kind of the people that he takes to task here.
So he said, the one undeniable talent that talking heads have is their skill at telling a
compelling story with conviction, and that is enough.
And honestly, that's pretty much all that's required is just the ability to tell a good
story to get people to believe you. You just have to be a good salesperson. So why don't we have
accountability? He called it a demand-side problem, because people want to be told, or people want
to hear predictions. Where's the market going? Who's going to be elected? But they don't
care to see the experts', in air quotes, track record. I feel like these days it should be
easier to hold people accountable because everything is being recorded in real time and it's on
the internet if you want to see it. There's video. There's people's writing. And it still doesn't
matter because I think that we're just overwhelmed with so many predictions that people just
almost don't even care what the track record is anymore. They just want to attach to the good
story. But there's no incentive to because it would destroy the entire forecasting slash entertainment
industry. He actually did say sometimes forecasts are meant to entertain, and he mentioned
Jim Cramer. But think about if you saw people on financial TV and you had over their head, like,
how accurate they were. I don't know how you would even quantify it. But that's not the point.
The point is not that these people are more accurate than anybody else. A lot of it is for entertainment.
Right. Yeah, that's true. And sometimes it's hard for people to get down to that idea that that's really all that it's for.
When you watch, like, NFL Network or anything and you see the guys that are picking games, oftentimes they're no better than a coin toss. But what's the benefit of knowing that ahead of time?
Yeah, that's all the analysts on ESPN do. It's again, it's an entertainment thing. So I thought he kind of brought it back. I thought this comparison was pretty apt. So he said when George Washington fell ill in 1799, his esteemed physicians bled him relentlessly, dosed him with mercury
to cause diarrhea, induce vomiting, and raise blood-filled blisters by applying hot cups to the old man's skin.
And he basically said, this is how they would have done it back in, like, Rome's time.
And he said the reason that these unscientific methods of medicine stagnated for so long is because there was just no doubt at all.
So, like, science never adapted because people were so certain and they just listened to the experts.
And he says the forecasting in the 21st century looks too much like 19th century medicine, which is a good way to put it.
Funny, imagine being George Washington and be like, this is fine.
Like, take my blood. I don't care.
Yeah, how could you not be like, this isn't working? But this is sort of the point that
we worship at the altar of pundits and people that have perceived expertise. He had a great
line in terms of hindsight bias that it's really hard to do mental time travel. So it's easy
for us to be like, how the heck did George Washington and millions of other people allow these
procedures to be done on them? But they didn't know. No one knew. That's the problem. And the crazy
thing is, is that there was just no information back then. And now we have information. So people can
actually look stuff up and figure out the facts, but people still don't want to sometimes. That's his
point is that, like, if there's a story that you're attaching to and you don't want to be
persuaded, it doesn't matter what the facts say. You're not going to be persuaded. We have all been
too quick to make up our minds and too slow to change them. There were like a million gems like
this in the book. You know the one that always confuses me, though? And this isn't a lot of books. I don't
know if he was the first one to put it out, the fox versus the hedgehog thing, I always get that
confused. So he talks about the fox knows many things. Oh, I just got it backwards. I was about
to say the fox knows one big thing. The fox knows many things. The hedgehog knows one big thing.
Ah. I assume, did David Epstein talk about that one in range? He had to, right? He must have. I feel like
that's in all these books. You're like a fox-hedge. A hedgefox? I don't know. He had some really
good, I don't think he really likes skewering people, but he uses them as good examples. Like Steve Ballmer in
2007, which it's kind of crazy because 2007 is not that long ago, but in the world of
technology, it's a lifetime ago because when he was CEO of Microsoft, he said there's no chance
that the iPhone is going to get any significant market share, no chance. But then he sort of
breaks that down by saying, well, did it get significant market share? Because actually other
phones have. And so he shows how easy it is for people who make predictions like that to sort
of backtrack on them and never admit that they were wrong. Well, because a lot of times they're
vague. He just pounded Thomas Friedman as like the king of vague forecasts.
Yes. But he said if you actually put numbers on this and you look at Ballmer on more than just that quote, if you go and read what else he said, he can make the case that he actually wasn't that wrong.
So he puts some parameters around forecasting. So he says, it's not only what you're trying to predict, but how far into the future and under what circumstances. So that's, like, one of the perfect things that a charlatan does when they make forecasts: they just don't put a time horizon on it, so they can always basically move the goalposts when they want. And in one of his other books, which I used for my book, he talked about five excuses that people make when they are wrong. And so I thought these are all pretty good. So the if-only clause: like, if only one thing would have happened, then I would have been right. I guess this is Latin. The ceteris
paribus clause, which is, something completely unexpected happened, so it's not really my fault.
I thought that was "all else equal." I'm not good with Latin. I'll admit it. Maybe you're right. I don't
know. But this is the, if the Fed wouldn't have stepped in, the market would have crashed. So I really am
right. The other one is it almost happened. So I was close, but just almost had it. The other one is just
wait: I'm not wrong, just early. And the other one is, don't hold it against me, it's just one
prediction, which when people make big predictions, it's never just one prediction. But I kind of like
those. Can we go back just to the early times? So he was talking about with the Washington
and stuff. This is a great quote, not from him. These physicians were like blind men arguing over
the colors of the rainbow. It is kind of hard to believe in the grand scheme of things. Modern
medicine is not that old. And what they used to have before was, I mean, nothing that they did
really worked. Well, did you see the part about scurvy here? Did you skim over that?
Yeah, and I've seen that story a few times before. That's a great story, too.
So this guy gave the sailors oranges, and obviously they saw the benefit there,
but he didn't even necessarily, like, make the connection.
So there was correlation and causation.
He just didn't pick up on it.
Right.
Even when they figured something out, they almost wouldn't believe it even.
He failed so completely to make sense of his own experiment that even he was left
unconvinced of the exceptional benefits of lemons and limes.
For years thereafter, sailors kept getting scurvy and doctors kept prescribing worthless medicine.
Right.
And again, it's one of those things where people just had no clue.
So they weren't really testing this stuff.
So it's not like they were, like, looking for alternatives at all.
I think I wrote a blog post on this, that's just how people have a compulsion to tell stories.
And this experiment has been in other books, I'm sure.
People's, their hemispheres of their brain were disconnected, and they would see, like, a shovel, and one would say snow, and the other would say, like, chicken, and would just make up nonsense to connect them.
Right, because that part of your brain is, there's a huge storytelling part that just is always looking for that.
This compulsion to explain arises with clocklike regularity.
Every time a stock market closes and a journalist says something like, the Dow rose 95 points today on
news that, dot, dot, dot. And Barron's does this on Twitter all the time. Obviously, it's a joke at
this point. He said, it's a rare day when a journalist says the market rose today for any one of
a hundred different reasons or a mix of them, so no one knows. Right. If they wanted to be
honest with themselves, they would say that every single day, which you can't. Because, again,
part of financial media is entertaining people, right? Because there's not always going to be
a good explanation. There's people that obviously know that they don't know. And the hammer on this is
from Kahneman, who said, it is wise to take admissions of uncertainty seriously. Yeah, I pulled that
one out too. You should listen to people who say, listen, we really can't be sure, but that's so
unsatisfying. So those people will never get airtime. Bob Shiller is terrible on TV because this is sort of
his MO. Right. He looks at both sides of an argument and that that's not what they want. They want
someone to say with certainty, the market is going to fall 5% over the next seven days.
But no, it's funny. So Shiller's not meant for TV, but articles that mention Shiller are
insane because they take some of his quotes out of context. Yes. It's always like the market is
more expensive than it's ever been because he has a lot of historical data. And so the other part
of that Kahneman quote, which, it sounds like Tetlock has worked with Kahneman a lot on his work. So he said,
you talked about like the admission of uncertainty seriously, but he says declarations of high
confidence mainly tell you that an individual has constructed a coherent story in his mind,
not necessarily that the story is true. So the way I look at it, when someone is completely certain
about the future, they're either (a) delusional or (b) just lying to themselves or lying to their
audience. And maybe sometimes a little bit of both. What do you think the breakdown is of those
people? I believe that's a type one, type two charlatan, to quote one Ben Carlson from someone's
upcoming book. Why don't you just explain that concept and tease your book? So in my book,
I talk about financial fraud. What book? Don't fall for it. Which you came up with the name,
actually. We'll get into that later. And I wanted to look at people who deceive others financially.
And I broke it down into two types of charlatans.
And one type is the person who is out there willingly deceiving people and they knowingly do it.
And they go out and they really just want to take advantage of people and they understand human nature.
And so they go out of their way to take advantage of others.
So the type one charlatan would be somebody like Warren Buffett, for example.
Yeah.
Perfect.
And the other kind is someone who is so confident in themselves that they're self-deluded into believing that everything they're doing has a purpose.
And they end up going in the back door to take
advantage of people, even though they're not trying to. And this is almost someone like Dick Fuld
at Lehman Brothers, who, even though he made out with a bunch of money, he rode the ship
down the whole way and went bankrupt with the company. So that type of person who is so delusional
that they think everything is going to work out, they'd only realize the error of their ways
while it's happening. I would just throw this out, that type 1s outnumber type 2s four to one.
That sounds about right. People know that there's always someone to be taken advantage of,
basically. So here's another one that he kind of hammers on a few times. So Niall Ferguson:
I think the debt crisis in Europe is unresolved and may be very close to going critical.
And this is a Harvard economist and historian. He said this in January 2012. The Greek default may be
a matter of days away. And as we talked about a couple weeks ago, this is at the time when Greece
was, it probably sounded smart at the time because Greece was paying like their yields are like 30%.
Now they're at 2% or something. It's always kind of funny to look at these in hindsight.
By the way, if you followed me into Greek bonds in 2011, now is a good time to take 'em off.
All right.
So the other one with Ferguson was he was a member of a group of, I think it was 20 people,
who wrote an open letter to Ben Bernacki.
Now, I just want to get your take on this.
By the way, I have to correct you.
Every time you say Bernacki.
Bernanke?
It's Bernanke.
Isn't it like February where the R is silent?
I don't think so.
I don't say Wednesday.
Different.
Okay.
This time.
Ben Bernanke.
So, first of all, a little tangent here, has there ever been something that started with
an open letter to that has gone well?
Like, has that ever been like a worthwhile read, an open letter?
No, you're absolutely right.
Very good.
Putting it out there.
And so this was a group of people and actually a lot of really well-known investment people.
We won't put them on blast here.
You can go look it up for yourself.
But a lot of really well-known names wrote this.
And they wrote this in...
Wait, real quick, speaking of well-known names, I forgot to mention this earlier.
I'm a person that looks at the back of the books for endorsements, even though I know it really means nothing.
Do you do that? I feel like everybody does that.
A little bit. I think I've read research that says, really, no one ever cares about it.
It doesn't really matter.
What do you mean? It doesn't...
Like, no one buys a book because of what someone else wrote on the sleeve or the back.
I totally disagree. And I can't prove that, but...
Okay, who's in the back of this one?
All right. Kahneman, Steven Pinker, Bob Rubin, Tyler Cowen, and Jonathan Haidt.
Nothing to hate. Pretty good. I thought Twitter canceled Steven Pinker recently. What do you mean?
I don't know. I thought he had some shady things that he endorsed or something. What do you mean? Speak up.
We'll talk about this later. You know people get canceled very easily these days on Twitter?
Steven Pinker got canceled. All right. I look forward to the offline conversation.
Okay. So this is an open letter to Ben Bernacki, and this is dated November 15th, 2010. This is when people were still, the world is falling apart, even though things had gotten a little better in the stock market. So I looked this up.
Unemployment rate was still 9.8%. And again, this is a group of really well-known investors,
professors. They were really worried about what was going on with the Federal Reserve.
And it's easy to look back at these things and make fun of them now.
Things were terrible in 2010.
At the time, I think a lot of people believed this. So here we go.
We believe the Federal Reserve's large-scale asset purchase plan, so-called quantitative easing,
should be reconsidered and discontinued. We do not believe such a plan is necessary or advisable
under current circumstances. The planned asset purchases risk currency debasement
and inflation, and we do not think that they will achieve the Fed's objective of promoting employment.
Now, this is just the first line. So again, unemployment, which was 9.8% back then, has now come down to
3.7%. You could say maybe we're giving too much credit to the Fed for doing this. Obviously,
what they thought would happen in terms of currency debasement and inflation did not happen.
Could have happened. Yes. They weren't wrong just early.
No, no, I'm not even kidding. No, it could have. But the thing is, if you look at the textbooks,
what they were doing, their monetary policy with interest rates at zero, the textbook said,
yes, we should have runaway inflation.
Right.
If you didn't understand how Fed policy actually worked, you would have believed that.
I'm starting to think that monetary policy is complicated.
So what I learned in Economics 101 does not hold up in the real world?
He used this as an example to show how even when you're wrong, you can still move the goalposts.
And so he says, several months after I wrote that chapter, Bloomberg reporters asked the signatories
how they felt about the letter in hindsight.
Those who responded unanimously said they were right and that they wouldn't change a word.
And they interviewed Jim Grant, who is a financial commentator, and he said, I think there's plenty
of inflation, not at the checkout counter necessarily, but on Wall Street.
Asset price inflation.
Yes, which is like something someone who's been wrong would say.
But here's the thing. Jim Grant writes a newsletter, and he's notoriously been anti-Fed,
and he's notoriously been wrong about pretty much everything the Fed has done for a long time.
But again, being a newsletter writer, he's not managing money; he's in the entertainment business. So for
someone like him to ever admit that he's wrong about this stuff makes zero sense. If you're going to be
someone who is a good forecaster, you probably aren't, as you said to the Shiller point, going to be
a good entertainer. So even if someone like him knows in the back of his mind, oh man, I was totally
wrong about that, there's no way in the world he's ever going to admit that, because that just, like, pulls
the curtain back. And yeah, it wouldn't make sense. But the problem is there are people who
follow someone like that and think, that guy is right, it's just going to take some time, hyperinflation
is right around the corner. But that's always existed, and there are always people willing to sell to a
certain audience that wants its views confirmed. Right. And unfortunately, I think after the
financial crisis, there were a group of people who had their brains broken and those people are never
going to admit that they're wrong and they're always going to latch on to these sort of conspiracy theories.
They did an update and he said, four years passed and no one budged. That should be
unsatisfactory to everyone, no matter what their view on the merits.
Right.
There were some people on that list,
I know, in that Bloomberg article, who said, you know what, we were wrong. But it's easy
to pick out the ones who didn't.
One of the things that super forecasters do is they constantly update their beliefs and
change their mind. Which is, nobody has ever said, when the facts show that I'm wrong,
I won't change my mind.
Obviously, everybody thinks that they're going to change their mind.
But a lot of people don't.
Keynes said that.
When the facts change, I change my mind.
What do you do, sir?
which, according to this book, Keynes never actually said.
Right.
But my point is, nobody would disagree with that in theory.
So as an example, when the Celtics beat the Bucks in the first game of the playoffs,
it was sort of like, oh, wow, the Bucks aren't ready.
Celtics were just sort of on cruise control during the season and now they're really turning it on.
So you could have had that opinion and then been proven wrong in game two, in game three, and game four.
A good forecaster would be like, okay, game one, whatever, it's over.
now game two, we have more information. Now game three, we have more information. Now game four,
okay, clearly this series has changed. We have more information. Odds are 80 percent the Bucks are
going to take game five. That's how it's supposed to be done. So he talks about how people update
their probabilities after the fact, but they forget. So he said in 1988, he asked his group of
forecasters, what will happen in the Soviet Union. And the Soviet Union disintegrated basically in
1991. So he said in '92, '93, he returned to those experts.
Russia. Sorry. USSR. He reminded them of the question they were asked in
1988 and asked them to recall what their estimates were. He said on average the experts recalled
a number 31 percentage points higher than the correct figure. So an expert who thought there was
only a 10% chance that it would happen might remember themselves thinking there was a 40 or 50%
chance. And there was even a case in which an expert who pegged the probability at 20% recalled it
as being 70%. And so this is the hindsight bias, or the I-knew-it-all-along effect: even people who
are wrong sometimes don't remember how wrong they were. And they sort of, they move the goalposts to
a spot where they, you know, actually I was kind of right. I think I kind of nailed this one,
actually. There's a lot of people who did this with a financial crisis. Dan Gilbert wrote a lot about
this, how we just misremember the past. So Tetlock said, brushing off surprises makes the past
look more predictable than it was. And this encourages the belief that the future is much more
predictable than it is. When you're reading books like this, I think like the takeaway for everyone
should just be like, it's hard to even trust your own brain, like really hard, which is why
my trading journal was so helpful. Yeah.
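That constant updating of beliefs, revising a probability each time a new game or data point arrives, has a textbook form in Bayes' rule. Here's a minimal sketch of one update step; all the numbers (the 60% prior, the 0.7/0.4 signal likelihoods, the sequence of signals) are made up for illustration and are not from the book.

```python
# Minimal Bayes' rule sketch of belief updating (illustrative numbers,
# not from the book). Hypothesis H is whatever you're forecasting.
# A supporting signal is assumed more likely if H is true (0.7) than
# if it's false (0.4).

def update(prior, signal, p_if_true=0.7, p_if_false=0.4):
    """Return P(H) after observing one yes/no signal."""
    like_true = p_if_true if signal else 1 - p_if_true
    like_false = p_if_false if signal else 1 - p_if_false
    num = prior * like_true
    return num / (num + (1 - prior) * like_false)

belief = 0.60  # starting prior
# One contrary signal, then three supporting ones -- like losing game
# one and then watching the evidence pile up the other way.
for signal in [False, True, True, True]:
    belief = update(belief, signal)
    print(f"P(H) = {belief:.2f}")
```

Each pass nudges the probability toward whatever the latest evidence supports, which is exactly the discipline the hindsight-bias examples show most people failing at.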
The other thing, he says, when something unlikely and important happens, it's deeply human to ask why. And sometimes the why just doesn't exist, especially when something crazy happens. Sometimes the world is just extremely random. And so, remember this from the book: this person, Lin Wells, wrote his thoughts for the 2001 Quadrennial Defense Review. And he said, if you had been a security policymaker in the world's greatest power in 1900, you would have been a Brit looking warily at your age-old enemy, France. And then he goes, decade by decade by decade:
30s, 40s, 50s, 60s, etc. And then in 2001, which was prior to 9/11, he ended it with, all of
which is to say that I'm not sure what 2010 will look like, but I'm sure that it will be very
little like we expect. So we should plan accordingly. Yeah. So I actually use that in my new book
I'm writing as well. It was such a good piece of just no one knows what the hell is ever going
to happen. And so he did go through some of the ways. So the way that they did this is,
Tetlock once again tried to put together a team of people who could actually use their brains
correctly and update their priors and not hold on to their prior beliefs and become better
at forecasting.
A lot of these were just regular people.
And so he put them together and he found a group that could actually do this.
One of the surprising things was that they, maybe it's unsurprising now, but people
were actually better in a group than they were individually.
So it wasn't like someone was better on their own and they were just this person who knew
exactly what was going to happen.
and they updated all their priors.
It was when they were together
and they could have an open mind.
But he put together a group of characteristics
that actually make someone halfway decent at forecasting.
And he said, you know,
it wasn't like foresight is something
that you're just gifted at birth.
He said it's basically the product
of a particular way of thinking,
of gathering information, of updating beliefs.
So he said, super forecasting,
people are open-minded,
they're careful, they're curious,
and they're self-critical.
And he said, that's one of the big pieces
is that you're humble enough to update
when new information comes out.
And I think that's one of the problems
that a lot of forecasters have and people that make predictions have. Even when new information
comes that should make them rethink what the probabilities are, they just hold on to whatever
they had and they come up with excuses for why it isn't. And the other part is he talks about how
just framing these things can make people think different ways. So a 90% chance of failing is
much different than a 10% chance of changing the world. And so it depends how you frame a lot of
these things. The quote from John Kenneth Galbraith, faced with the choice between changing one's
mind and proving that there is no need to do so, almost everyone gets busy on the proof.
Right. And so there are some really good aspects in here that were characteristics where he
shows how people change their mind. And he said, he's trying to figure out what made one group
successful and the other one not. He said it didn't matter, like, their level of intelligence or access
to information. The critical factor was how they thought. And so he said that these people were
united by the way that they thought; they got rid of all their ideological thoughts.
And the people that had problems took all these complex problems and tried to put them into this cause-effect template that they already had in their mind.
Yeah, just good stuff.
In terms of, like, being vague or keeping score, I think he did a really good job nailing why numbers matter and why being specific matters.
So talking about like the Cuban Missile Crisis, which was obviously a disaster or not Cuban Missile Crisis.
That was a great success.
The Bay of Pigs was a disaster.
He wrote, the man who wrote the words fair chance,
in other words, somebody told Kennedy that there was a fair chance of something happening,
later said he had in mind odds of three to one against success.
But Kennedy was never told precisely what fair chance meant.
And not unreasonably, he took it to be a much more positive assessment.
So he said that you should quantify this.
So 100% means you're certain.
93% means you're almost certain.
75% means it's probable.
50% means chances about even.
30% probably not.
7% means almost certainly not.
Zero percent means impossible. So when you see things like there's a 70 percent chance that
it rains and it doesn't rain, you're like, this model's broken. And it's like, wait a minute. No, it's not. If
this simulation happened 100 days, 70 days it would rain, 30 days it wouldn't. I liked how he broke
down by figuring out whether they should make the go or no-go call to get bin Laden, because they didn't
know for sure it was him. And it's interesting because he talked about the numbers. He said, yeah,
if you say there's an 80 or 70 percent chance that's happening, people assume that means 100.
But if you say it's 60, 40 or 50, 50, 50, people know that it could go either way.
And so a lot of it depends on how you frame it.
And people just don't really understand how probabilities work most of the time.
And he said that's true for both people who are taking predictions and people who are giving them,
because you have to kind of define what it actually means.
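The "70 days it would rain, 30 days it wouldn't" framing is easy to check with a quick simulation. This is only an illustrative sketch (the 70% probability and the 10,000-day count are arbitrary choices, not anything from the book):

```python
# Simulate many days under a well-calibrated "70% chance of rain"
# forecast. Illustrative numbers only.
import random

random.seed(42)  # reproducible run
days = 10_000
rained = sum(1 for _ in range(days) if random.random() < 0.70)
print(f"Rained on {rained} of {days} days ({rained / days:.1%})")
```

Roughly 70% of the simulated days see rain and roughly 30% don't, so any single dry day tells you almost nothing about whether the forecast was "broken."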
But it's more than just predictions.
It's that people who view things a certain way can't even see things today.
So Larry Kudlow, for instance, in 2008, said, we are in a mental recession, not an actual recession.
I'm going to use that one in the future.
Tetlock said, the American economy is Larry Kudlow's beat, but in 2008, when it was increasingly
obvious that it was in trouble, he didn't see what others did. He couldn't. Everything looked
green to him. And to me, the most important piece of this book was pages 232 and 233, which
we put it on our Instagram post. So Kahneman's what you see is all there is. Tetlock called that
the mother of all cognitive illusions. And he gives a story about General Michael Flynn, saying,
I think we're in a period of prolonged societal conflict that is very unprecedented.
And obviously that's not true because just look at history.
It's just patently false.
But Tetlock said it felt true, which is the oldest trick in a psychological book, and Flynn fell for it.
And so he showed like those two lines with the arrows.
And he said, even knowing it's an illusion can't switch off the illusion.
We can't switch off the tip of our nose perspective, which is so freaking powerful.
Because there are some things, a lot of things in life, maybe politics
is the most glaring example, where for the life of you, you cannot understand how somebody
feels a certain way. And the obvious answer is because you're not them. You don't see the
world through the lens that they do. I've seen that line trick, where one of them has
arrowheads on the ends and the other has the arrowheads reversed. And I know that the lines
are the same. But if you look at it, your brain still tricks you every time.
I take out the paper every time. Like this time it's not going to match.
Yeah, that's a good one. I would say that this is the too long didn't read quote from the book.
if you want to know what this book was about.
Underlying super forecasting is a spirit of humility,
a sense that the complexity of reality is staggering,
our ability to comprehend limited,
and mistakes inevitable.
That's good.
He talked about, too, how a growth mindset is important.
So he said only people with a growth mindset
pay close attention to information
that could stretch their knowledge.
And for them, learning was a priority.
And he said that was one of the things
that these people all had in common.
So to be a top flight forecaster,
you have to be curious and actually want to improve
and not just stick with whatever model you've already figured out about the world.
Here's one more great one.
Keynes is always ready to contradict not only his colleagues, but also himself,
whenever circumstances make this seem appropriate,
reported a 1945 profile of the consistently inconsistent economist.
So far from feeling guilty about such reversals of position,
he utilizes them as pretext for rebukes to those he saw as less nimble-minded.
So this is where that quote came from, whether or not he actually said it.
But this is the gist.
So back to the book.
Legend says that while conferring with Roosevelt at Quebec, Churchill sent Keynes a cable reading, quote, I'm coming around to your point of view.
His lordship replied, quote, sorry to hear it, have started to change my mind.
That's perfect.
Well done.
So I think this book is good for people in a lot of fields, because there are certain people who, no matter what they're looking at, whether it's politics, the economy, the markets, or whatever is going on in their life,
are easily fooled by randomness and by an intelligent-sounding person. I think this book
is a good way to get you to question what other people are talking about and to try to come up with
your own sort of process for thinking through things, not just believing the first
intelligent-sounding person you hear. So Kahneman, who is mentioned a lot in this book,
is an extreme. He's said that he's a pessimist by nature. So he doesn't really believe that just
being aware of your behavioral biases can help you improve upon them. And I think that's true
to a certain extent. But I don't want to say that's a cop-out because I'm not accusing him
of that. But I think that this book actually is one of those rare pieces of text that can
actually change the way you think a little bit. I'm not saying that all of a sudden you're
going to conquer all of the cognitive biases that we have. I think that's obviously not possible.
But this did a really good job of explaining how you can get better. So this doesn't just
diagnose a problem; it actually prescribes solutions. I think for that alone it's very valuable.
Yes, and he could have spent the whole book dunking on bad predictions, which is something that we probably do too much of, because it's really easy to do. But he looked at both sides of the aisle, and he also said, like, listen, my previous work doesn't mean people can't forecast if they have the right mindset. It just means it's really hard to do. And there are a lot of people out there who take advantage of the fact that people love to hear stories. So think about that when you're listening to the next forecast you hear. Ben, what are we doing next?
I think next time we're going to do a Malcolm Gladwell duo, and we haven't figured out which
one of us is going to read what. What were his first two books called? Blink and The Tipping Point.
So each of us is going to read one of them, and we're going to do a
dual Malcolm Gladwell Rekindled. We'll let you know when we're going to do that, probably in the
next month, and one of us is going to take one and the other is going to take the other. Anyway, Superforecasting:
highly recommended. It's a great way to think about the world and try to understand
people's motives better. By the way, it's probably too late for this, but he had a co-author,
Dan Gardner. Yeah, I think I mentioned that. Okay. I think he helped along with the research
and that sort of stuff. So, but it was nice to, yeah, have the byline. So anyway, thanks for
joining us for our book club. We'll try to do another one of these again. Malcolm Gladwell next time
and we'll see you later.