Radiolab - Tit for Tat
Episode Date: September 17, 2019

In the early 60s, Robert Axelrod was a math major messing around with refrigerator-sized computers. Then a dramatic global crisis made him wonder about the space between a rock and a hard place, and whether being good may be a good strategy. With help from Andrew Zolli and Steve Strogatz, we tackle the prisoner's dilemma, a classic thought experiment, and learn about a simple strategy to navigate the waters of cooperation and betrayal. Then Axelrod, along with Stanley Weintraub, takes us back to the trenches of World War I, to the winter of 1914, and an unlikely Christmas party along the Western Front.
Transcript
Wait, you're listening.
Okay.
All right.
You're listening to Radio Lab.
Radio Lab.
From WNYC.
C?
Yeah.
Hey, this is Radio Lab. I'm Jad Abumrad.
I'm Robert Krulwich.
Oh.
Yes, would you like to say our topic, Robert?
Our topic today is goodness.
Hey, I'm Jad.
So last week, we played you a show that is one of our all-time favorites.
It was three different stories.
about different kinds of collisions between people,
between moral philosophies, between right and wrong,
even right and left.
Anyhow, this week, I want to continue the thoughts we had going in that show
for one more step.
We're going to play part of a different show
that takes the strategies that came up
in those kinds of showdowns last week, between people,
and plays them out on a geopolitical scale.
grand global strategy.
Yes.
Hello, hello.
And we're going to tell you a really cool story, we think, that begins with this guy.
My name is Robert Axelrod.
I'm the Walgreen Professor for the Study of Human Understanding in the Department of Political Science
and the Ford School of Public Policy of the University of Michigan.
I know, that's a mouthful.
That was like your dean was looking over your shoulder, like, say it all, please.
Yeah, well, you know, you could just say I'm a professor of,
public policy and political science or something.
Well, but before he was all of that,
Axelrod when he was in high school,
he was one of those guys who just loved computers.
Well, yes, in 59, 1960.
I hung around the Northwestern University Computer Center.
'59, '60?
So were those large pieces of furniture
in refrigerated buildings?
They were.
In fact, the whole campus had one computer,
and they let me use it for 15 minutes here and 15 minutes there.
And what would you do with the computer?
What I did, I did a very simple computer simulation of hypothetical life forms and environments for a science project.
Ah, really? Yeah. You're a pre-geek, is what you are.
Yes. Before the word had been invented. I think you could say that.
But then in 1962, when Axelrod was down in a computer basement, I guess, somewhere,
all over the world, everybody else was watching one of the great dramas in modern times unfold.
Good evening, my fellow citizens.
The Cuban missile crisis. Within the past week, unmistakable evidence has established the
fact that a series of offensive missile sites is now in preparation on that imprisoned island.
And Axelrod started thinking about the dilemma we were in.
Well, each side wants to spend more money buying missiles and things.
You know, we could build more bombs, but then they could build more bombs.
It would be better if they would both stop, but if we stop and they don't, that would be bad.
Very bad.
Yeah, and so I was interested in what were the conditions that would allow people to get out of this problem.
And then he starts thinking, well, wait, maybe I could just use my computer to help me figure out what's a good strategy for something like the Cuban missile crisis.
Well, yes, right.
And what made you think that computers could help with that?
Well, I came across a simple game called the Prisoner's Dilemma.
Yeah. Right.
(noise from the window)
Okay, so the Prisoner's Dilemma is a very famous thought experiment. It's a little tricky to describe, but I got a friend of mine, Andrew Zolli, who's written about the Prisoner's Dilemma in an upcoming book, Resilience: The Science of Why Things Bounce Back, to lay it out for me.
What is the Prisoner's Dilemma?
So imagine
that two bank robbers
are hanging out across the street
from the First National Bank
and the police
pick them up. They've received a tip
that these two guys are about to rob
the bank. Got it? Yep. So the cops
take these two guys back to the station,
do the whole law and order thing, put them in different
rooms. And they walk into each one.
Let's call them Lucky and Joe.
And they say, to Lucky,
we have enough to make sure that you
go away for a six-month sentence.
But this is not really what the cops want.
They want a longer sentence for one of these guys,
so they make Lucky an offer.
If you, Lucky, rat out Joe,
and Joe doesn't say anything,
you will go free,
and Joe will go to jail for 10 years.
If the reverse happens...
Meaning if you say nothing and Joe rats you out...
You're going to jail for 10 years,
and he's going to walk free.
If you both end up ratting on each other, you both get five.
Five years.
Whereas if you both keep your mouth shut?
You're each going to jail for six months for loitering.
So somehow, if Lucky and Joe could talk to each other, they both say don't speak.
Absolutely.
But the big problem that Lucky and Joe have is they can't talk to each other.
All right, so you're lucky, okay?
What do you do? Do you rat Joe out or not?
Do I know this guy?
Uh-uh.
At all?
I mean, you met for this one job, but tomorrow you'll never see him again.
Ever.
Ever.
Well, if I knew him and I could trust him, then I think I know what I would do.
You'd keep your mouth shut, because he'd keep his, of course.
Because he'd keep his mouth shut.
It would be a sweet thing.
Indeed.
But, see, since I don't know, I might as well, what would happen if he rats me out?
You'd go to jail for 10 years.
He'd go free, that bastard.
Ten years.
Yeah.
But if I rat him out, and the worst I get is...
Five years.
Or, you know, I go away free.
I'm totally free.
Do it, Crow.
Say what's in your heart.
I'm throwing him under the bus, Jad.
Yes, throw him under.
What's his name again?
Joe.
Joe.
You see, he's already gone. I don't even remember him.
You're dead to me, Joe.
So you see, in this type of scenario
where you don't know the guy,
you have a very strong incentive.
To rat the other guy out.
Or as the social scientist would say,
to defect.
That's right.
If you play it only once,
if you only meet somebody once,
whatever the other guy does,
you're better off defecting against them.
From here on out, whenever you hear the word defect, know that it means screw the other guy over.
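The payoff structure just described can be sketched as a small lookup table. This is a hypothetical encoding for illustration (sentences in years; lower is better), not anything from the episode itself:

```python
# Sentences for each pair of moves, as described in the story.
# "C" = cooperate (keep your mouth shut), "D" = defect (rat the other guy out).
# Values are (my_years, his_years) in jail.
SENTENCES = {
    ("C", "C"): (0.5, 0.5),  # both keep quiet: six months each, for loitering
    ("C", "D"): (10, 0),     # I keep quiet, he rats: I get ten years, he walks
    ("D", "C"): (0, 10),     # I rat, he keeps quiet: I walk, he gets ten
    ("D", "D"): (5, 5),      # we rat each other out: five years each
}

def years_for_me(my_move, his_move):
    """My sentence, given what each of us did."""
    return SENTENCES[(my_move, his_move)][0]

# Whatever the other guy does, defecting gives me a shorter sentence --
# which is exactly the one-shot incentive the story describes.
assert years_for_me("D", "C") < years_for_me("C", "C")  # 0 beats 6 months
assert years_for_me("D", "D") < years_for_me("C", "D")  # 5 years beats 10
```

The asserts at the bottom are the whole point of the one-shot game: defection dominates, no matter what the other player chooses.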
But the really interesting stuff happens if you play over and over again, if you're going to meet the same people again.
Because now you're thinking, should I help this guy out the next time?
If he screwed me, should I screw him?
But this secret, swift, extraordinary buildup of communist missiles.
What do you do? You want to cooperate, but you don't want to get screwed.
Which cannot be accepted by this country.
Right. You know, these kinds of thoughts were paramount in those days, because a prisoner's dilemma was being played between the two superpowers.
This is our friend Steve Strogatz, the Cornell mathematician,
who says at that time, all kinds of folks...
Political scientists and economists and psychologists, mathematicians.
...were writing papers about the prisoner's dilemma.
Literally thinking, come on,
we've got to be able to win this game
if we're going to play against the Russians,
and we have to do it right.
Exactly, but there was no consensus on the best way to do it.
And so I was interested in what's a good strategy for this.
And that's when Robert Axelrod is sitting down there in the
basement somewhere in the Midwest with the big computer. That's when he had his idea.
His approach, which was really novel at the time, was to conduct a computer tournament.
A computer tournament?
Yeah. Invite the people that had come up with these different ideas to play with each other.
In other words, what he said is, all right, Mr. Wise Guy, you know, you've written so-and-so-many
articles on the Prisoner's Dilemma. You think you understand it. How about joining this tournament
where you have to submit a program that will play Prisoner's Dilemma
against programs submitted by the other experts.
We'll have a round-robin.
Right. Try these different programs against each other.
So all these computer guys are brought to Caesar's Palace in Las Vegas,
and they all wear tuxedos, and they're all sat down at the table.
No.
It's a nice image, but what really happened was everyone submitted their programs to Axelrod.
They would mail their entries to me.
But there was a trophy.
There was a trophy.
So I wrote to people, and I said,
If you win, I'll send you a trophy.
You know, a little plaque that says you won the computer tournament.
Okay, so here's the deal.
Every program will play every other program 200 times.
There will be points in each round, and then Axelrod will total the scores.
And see what actually worked.
By which he means, in the long run, even if you lose some rounds here and there,
one of these strategies is going to beat all the others, meaning it'll let you survive.
Maybe even prosper?
That's the game.
That's right.
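The tournament rules just laid out can be sketched as a round-robin harness. The episode doesn't give per-round point values; the numbers below are the standard ones from Axelrod's actual tournament (3 each for mutual cooperation, 5 for defecting against a cooperator, 0 for being exploited, 1 each for mutual defection), and the two toy entries are hypothetical, just to exercise the harness:

```python
import itertools

# Points per round as (player A's points, player B's points).
# "C" = cooperate, "D" = defect.
POINTS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play_match(strat_a, strat_b, rounds=200):
    """Play two strategy functions against each other 200 times."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strat_a(hist_a, hist_b)
        move_b = strat_b(hist_b, hist_a)
        pa, pb = POINTS[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

def round_robin(entries, rounds=200):
    """Every program plays every other program; then total the scores.
    (Self-play, which Axelrod also included, is omitted for brevity.)"""
    totals = dict.fromkeys(entries, 0)
    for (na, a), (nb, b) in itertools.combinations(entries.items(), 2):
        sa, sb = play_match(a, b, rounds)
        totals[na] += sa
        totals[nb] += sb
    return totals

# Two toy entries (hypothetical names):
always_defect = lambda mine, theirs: "D"
tit_for_tat = lambda mine, theirs: "C" if not theirs else theirs[-1]

totals = round_robin({"always_defect": always_defect, "tit_for_tat": tit_for_tat})
# → {'always_defect': 204, 'tit_for_tat': 199}
```

Note what the toy run shows: in a single head-to-head match, tit for tat never outscores its opponent. It won Axelrod's tournament on its total across many different opponents, not by beating any one of them.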
And can you introduce us to some of the contestants?
Yeah.
So there was one program called Massive Retaliatory Strike.
On the first move, it just cooperates.
But then as soon as the other program doesn't cooperate, it would then retaliate for the rest of the game.
Like, sorry, man, you blew it.
I'll never trust you again.
Yeah, that's it for you.
This is like the way my wife is.
Whenever a guy in her earlier life stood her up, that was it.
Game over.
But there were also some trickier programs.
I mean, some crafty ones try to make a model of the opponent.
Like you mentioned one that was called Tester.
So Tester would see what you were like.
It would start by being mean.
And then if you start retaliating, it backs off and says, you know,
ho, chill out.
It's okay, man.
And then starts cooperating for a while until it throws in another.
Just to test the other guy. That's why, after all, it was called Tester.
Yeah.
Tester is kind of designed to see how much it could get away with.
I mean, it sounds kind of sensible in a way.
I mean, but...
Well, but if you see, if you think about what happens if these two players play each other.
If Tester plays massive retaliation 200 times...
Pretty soon the tester will defect...
And then massive retaliation will never cooperate again.
Screw you, bow.
Yeah, no, screw you!
Let's go.
You screw me.
You come in there.
You get close to me.
So, in fact, they'll both do very badly.
When you're sitting there, did you have a hunch as to which would be the most successful
program?
Well, I didn't know, which is why I wanted to do it.
But I did have a hunch that, you know, thousands or tens of thousands of lines of code
would be needed to have a pretty competent program.
So when the mailman delivers the fattest envelope to your house, you're like, this could be the one.
Well, yes, right.
Now, it didn't turn out that way.
When it was all said and done, when he loaded all the programs into the computer,
when they'd all played each other 200 times, the program that won?
It's really two lines of code.
Two lines of code?
Yeah, it's got a simple name.
It's called tit for tat.
First line of code?
Be...
Nice.
Nice?
Yeah, nice.
Nice is a technical word in this game.
Nice means I never am nasty first.
And after that...
Second line of code?
It just does what the other player did on the previous move.
Oh.
So if the other player has just cooperated, it'll cooperate.
And if the other player has just defected, it'll defect.
It retaliates on the next move.
Couldn't be clearer.
On the other hand, it only retaliates that one time.
I mean, unless provoked further.
It does its retaliation, and now bygones are bygones, and that's it.
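Those two lines can be rendered almost literally. A minimal sketch, with "C" for cooperate and "D" for defect:

```python
def tit_for_tat(my_history, their_history):
    """The whole strategy, essentially two lines."""
    if not their_history:
        return "C"               # line 1: "nice" -- never be nasty first
    return their_history[-1]     # line 2: do what the other player just did
```

Because it only ever echoes the opponent's last move, it retaliates exactly once per provocation and then lets bygones be bygones.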
So wait, how exactly did it win?
I mean, can you give us a sense of why it won?
Okay, so let's suppose, here let's take an extreme case of some very simple programs.
One of them I'll call Jesus.
Just for the sake of argument.
Just for the sake of a name.
Now, the Jesus program cooperates on every turn.
That is, it's always, you know.
Good.
Yes.
So the Jesus program is a simple algorithm that says, always be good.
Good, good, good, good.
That's right.
And let's say the other program is the Lucifer program, which no matter what always is bad.
Okay.
These are your two extremes, says Steve.
And of course, most programs and most people fall somewhere in the middle.
Right.
But in tit-for-tat, you've got a strategy that can swing both ways.
For instance, with Jesus, tit-for-tat starts by cooperating, as does Jesus.
And then they're going to keep cooperating for the whole 200 rounds.
Which is, you know.
Good.
But let's suppose it plays Lucifer, where there's no chance to cooperate.
Then, says Steve, tit for tat just plays good defense.
So when Lucifer does his thing, tit for tat retaliates.
And they pretty much keep doing that and stay even.
So in other words, it's a very robust program.
It elicits cooperation if the opponent has any inclination to cooperate.
But it doesn't take any guff.
And it wins.
So you might say, in evolutionary terms, this program is the fittest.
So actually, Axelrod played an evolutionary version of his tournament. That is, he had these programs, after they played their tournament, get a chance to reproduce copies of themselves according to how well they did.
You mean the winners would get to have more babies?
Yeah.
And then would the babies play each other?
Yeah, he ran them again. I mean, he ran them for many generations.
And so, like, suppose you have a world of Lucifers, and there are a few tit-for-tat players out there. Can they thrive? Can cooperation emerge in this horribly hostile world?
What an interesting question.
So he looked at that, and the answer was yes: if you have enough of them, so that they have enough chance of meeting each other, they can actually invade and take over the world, even if the world starts horribly mean.
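The evolutionary rerun can be sketched as a simple replicator loop: each generation, a strategy's fitness is its score against the current mix of the population, and it reproduces in proportion to that fitness. The payoff numbers, round count, and starting mix below are illustrative assumptions, not figures from the episode:

```python
# My points per round; "C" = cooperate, "D" = defect.
POINTS = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def match_score(strat_a, strat_b, rounds=50):
    """Points strat_a earns against strat_b over a repeated game."""
    ha, hb, score = [], [], 0
    for _ in range(rounds):
        ma, mb = strat_a(ha, hb), strat_b(hb, ha)
        score += POINTS[(ma, mb)]
        ha.append(ma)
        hb.append(mb)
    return score

def generation(shares, strategies):
    """One replicator step: fitness is the population-weighted score,
    and each strategy's share grows in proportion to its fitness."""
    fitness = {
        name: sum(shares[other] * match_score(strategies[name], strategies[other])
                  for other in shares)
        for name in shares
    }
    total = sum(shares[n] * fitness[n] for n in shares)
    return {n: shares[n] * fitness[n] / total for n in shares}

# A world that starts horribly mean: 95% Lucifers, a 5% cluster of tit for tat.
lucifer = lambda mine, theirs: "D"
tit_for_tat = lambda mine, theirs: "C" if not theirs else theirs[-1]
strategies = {"lucifer": lucifer, "tit_for_tat": tit_for_tat}

shares = {"lucifer": 0.95, "tit_for_tat": 0.05}
for _ in range(30):
    shares = generation(shares, strategies)
# The tit-for-tat cluster, once common enough to meet itself, takes over.
```

The mechanism is visible in the numbers: tit-for-tat players score 150 against each other but only 49 against a Lucifer, so a lone one can't thrive; a cluster of them can, which is the "enough of them to meet each other" condition.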
I mean, what I take to be the big message, though,
I mean, what always sent chills down my spine,
is that we see this version of morality around the world.
You know, be upright, forgiving, but retaliatory.
I mean, that sounds to me like the Old Testament.
It's not turn the other cheek.
It's an eye for an eye, but not ten eyes for an eye.
And to think that it's not something that's handed down by our teachers or by God,
but that it's something that came from biology.
I like that argument personally.
We're going to take a quick break.
When we come back, the story moves from computer programs to real people, real people in the middle of a very real war.
Stick around.
Hey there, this is Greg in Huntington Beach, California.
Radio Lab is supported in part by the Alfred P. Sloan Foundation,
enhancing public understanding of science and technology in the modern world.
More information about Sloan at www.sloan.org.
Hey, I'm Jad. We're back, and we're playing a piece that we called at the time,
tit for tat.
We just heard about the computer program of that name and how it fared in Robert Axelrod's Thunderdome-style competition between computer programs,
but we're going to shift the playing field now.
This is what's so impressive to me about Axelrod's work.
So he's not just playing math games.
He tries to tie this to history and politics, as you'll see.
I like to scan journals.
It happens to be my pastime, because it's part of my profession.
But I came across a book called The Live and Let Live System in World War I.
So here's where we jump away from the math and the computer tournaments and into something very real.
The war began late in July, 1914.
That's Stan.
Stanley Weintraub.
Expert in World War I.
Evan Pugh Professor Emeritus at Penn State.
And the story that Stan's going to help us tell takes place on what was called the Western Front, which was basically these two lines of trenches.
Very close to each other, a few hundred yards apart.
And they stretched for hundreds of miles.
And that fall.
In November, the weather turned bad: heavy rains, then it became icy, and then slush, and then snow.
It became disgusting because the trenches also were filled with rats.
The rats went after not only the food, but after corpses.
And it was oddly in this miserable, disgusting hellhole, that something quite amazing happened.
No one quite knows how it started, but one day, maybe around daybreak, let's say, while the two sides were fighting, some of the British soldiers...
Stop firing long enough to have breakfast.
And as they were eating, they noticed, hmm, the Germans stopped too, to have their breakfast.
When they're both done...
They'd begin firing again.
Next morning, same thing. British take their breakfast break at about the same time.
The Germans do the same thing. The morning after that, the same thing. And then the next.
And after a while...
Both sides caught on: if they didn't interrupt the other one, then they wouldn't be interrupted.
On the whole, there is silence.
This is from a letter a British soldier sent home to his wife at the time.
After all, if you prevent your enemy from drawing his rations,
his remedy is simple.
He will prevent you from drawing yours.
When Axelrod read this?
I thought, gee, this sounds very familiar.
Line one of Tit for Tat.
Be nice first.
Now, the Brits probably didn't mean to be nice first when they started the breakfast truce,
but it happened.
And then the Germans reciprocated,
which is line two.
Now, keep in mind, these two sides are at war,
and implicit in line two is a threat.
If you mess with me, I'm going to mess with you.
Well, think about snipers, for example.
There's letters where they explain where the snipers would shoot at a tree
over and over and over again,
showing that, in fact, they were really accurate,
meaning that if they wanted to kill you, they'd get you.
And this was going on during the breakfast truce,
and these little agreements, you know, like,
I'm going to be nice to you, but I could kick your ass.
Don't forget.
Well, these little truces spread all up and down the Western Front, until things really changed. Fast forward to December, Christmas Eve.
The climate was just about freezing on Christmas Eve, and the Germans had a tradition of tabletop Christmas
trees, small trees. For weeks, he said, the German government had been shipping small trees
literally to the trenches, hundreds and hundreds of trees. And that night, on Christmas Eve,
at dusk the Germans began putting up their trees, mounted them on the rim of their trench,
and lit candles on them, singing Christmas carols.
The British, who might have been no more than 50 or 70 yards away,
crawled forward into no man's land to see better.
And then they were spotted.
Here's a letter from a German soldier sent home to his family,
which describes what happened next.
I shouted to our enemies that we didn't wish to shoot.
I said we could speak to each other.
At first there was silence.
And then very slowly out of the darkness,
the British guys approached.
And so we came together and shook hands.
See, this is where I start to think,
are you making this up?
Because this is where it starts to sound
sort of crazy to me.
That's Pat Walters, our producer.
It sounds as if this is being made up,
and the result was for many decades,
people assumed that this was just myth.
It couldn't possibly have happened.
But we know it had happened
because we have the letters
that the British and the Germans sent back home.
We know that they met in darkness and decided, why don't we have a truce in the morning?
Next morning, thousands of soldiers put down their rifles, climbed out of their trenches into no-man's
land, and started hanging out with each other.
A lot of us went over and talked to them, and this lasted the whole morning.
I talked to several of them, and I must say they seemed extraordinarily fine men.
Soldiers got together, started fires, cooked Christmas dinners.
Swapped presents and drank.
The Germans hauled out these enormous barrels of beer.
They traded stuff.
Cigars and trinkets.
Even helped one another.
Buried the dead.
And in some places on the Western Front, this period of goodwill lasted a whole week.
But then, the generals found out.
They were very angry about this, and they said, we didn't send you to the front to be nice to the other guys, we sent you to kill them.
If the general says, hey, I want you to shoot those Germans.
That's an order.
Well, then they would...
Wouldn't that...
Oh, gee, sorry, General, I missed.
But I'll try to do better next time.
The way the generals finally figured out how to disrupt this whole thing is they would say,
okay, you guys go out on a raid, and I want you to bring back a prisoner or a corpse.
In other words, show me a scalp. That's an order.
And that messed things up royally.
Here's a letter from a British soldier whose unit contained a band, which was apparently pretty common.
He writes this letter about one of the moments when the truce vanished.
At six minutes to midnight, the band opened with Die Wacht am Rhein, which is a German patriotic anthem.
So some of the Germans, according to this letter,
climbed up onto the rim of their trench to listen to this English band playing their song.
Then, as the last note sounded,
every grenade firing rifle, trench mortar, and bomb-throwing machine
let fly simultaneously into the German trench.
So you can imagine the Germans that weren't killed would have felt betrayed.
They had just been hanging out with these guys.
And the next night they would have attacked back, and the British would have attacked them back,
and then the Germans would have retaliated against them, and on and on and on.
And it would kind of echo back and forth forever.
And that's what happened.
There were immense casualties, as many as 50,000 casualties in a day.
And this, says Axelrod, is where you see sort of the dark side of tit for tat.
One of the weaknesses of the tit for tat strategy, or one of the problems with it is these echoes.
Not just echoes of good, obviously, but echoes of violence.
could get bad.
So what I found, though, was that instead of playing pure tit for tat, where you always defect if the other guy defects...
There are certain circumstances, he says, and this I find completely fascinating, where you want to modify that second line of code, so that you're not always retaliating, you're nearly always retaliating.
Right. If you were a little bit generous, by which I mean, say, 10% of the time you don't defect, then what happens is that these echoes will stop. And I would call that generous tit for tat.
So this is kind of interesting. Like, we started with Moses, you know, an eye for an eye. But here it's saying maybe for every nine parts Moses, you need one part Jesus. You know, meaning like, turn the other cheek.
Turn the other cheek.
It sounds like you described, like, a cooking recipe or something.
Well, like nine parts. One thing.
Yeah. I mean, if you abstracted it, it's kind of a recipe.
It's a recipe.
It's a recipe for life.
But it isn't a recipe. That ignores the deep fact of it. Look, if I were punching you in the face right now, what are you going to do?
I'm going to punch you back. Yeah. And I'm going to punch you back. And I'm going to punch you back.
You punch me back and we're in pain.
And somehow in the middle of being blasted by my powerful fist,
you have to come up with the moral courage to say,
I think I'm going to kiss this guy now.
And that is not, as you well know, that is not an easy thing to do.
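Axelrod's "generous" modification amounts to adding one probabilistic line to the strategy. A sketch; the 10% figure is the one mentioned in the episode, and the move encoding ("C"/"D") is an assumption:

```python
import random

def generous_tit_for_tat(my_history, their_history, generosity=0.10):
    """Tit for tat, except about 10% of the time a defection is
    forgiven rather than echoed -- which is what stops the echoes."""
    if not their_history:
        return "C"                      # still never nasty first
    if their_history[-1] == "D" and random.random() < generosity:
        return "C"                      # the occasional turned cheek
    return their_history[-1]            # otherwise, mirror as usual
```

With generosity set to 0, this is pure tit for tat. With any positive generosity, two of these players locked in a retaliation echo will eventually, by chance, have one of them forgive, and cooperation resumes.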
All right, but you're making it all personal.
My point is, if you zoom out,
this is a strategy that just seems to be woven into the fabric of the cosmos.
It works for computers.
It works for people.
It probably works for amoeba.
Okay?
It just works.
And you think that exists on some higher plane?
I do.
I do.
I don't.
I think this is still, as you just called it, very personal.
I think a person has to choose to be kind.
All right.
I'm going to make that choice right now then, okay?
You know you're irritating me?
I'm going to say to you, Robert, you look very nice today.
You know what I'm going to do to you?
All right.
Enough of this.
Radiolab.org is our online home.
You can read lots of stuff there, and you can subscribe to our podcast.
It's WWW...
you. That's implied.
Yeah.
Hi, this is Steve Strogat.
Radio Lab is produced by Jad Abumrad and Pat Walters.
Our staff includes Soren Wheeler, Ellen Horn, Tim Howard, Brenna Farrell, and Lynn Levy.
With help from Abby Wendell and Douglas Smith.
Special thanks to Nick Capadice, Graham Parker, Daniel Neumann, and Meg Bowles.
End of mailbox.
