60 Minutes - 4/22/2018: The Data Miner, The Future Factory, For Better or for Worse
Episode Date: April 23, 2018. Aleksandr Kogan, the app developer at the heart of the Facebook privacy scandal, weighs in on the Cambridge Analytica controversy. We go inside MIT's Media Lab, where scientists are turning futuristic ideas into present-day possibilities. Plus, a decade-long look at the progression of an Alzheimer's patient. Those stories on tonight's "60 Minutes."
Transcript
What's better than a well-marbled ribeye sizzling on the barbecue?
A well-marbled ribeye sizzling on the barbecue that was carefully selected by an Instacart shopper and delivered to your door.
A well-marbled ribeye you ordered without even leaving the kiddie pool.
Whatever groceries your summer calls for, Instacart has you covered.
Download the Instacart app and enjoy $0 delivery fees on your first three orders.
Service fees, exclusions, and terms apply. Instacart, groceries that over-deliver.
He's at the center of the Facebook scandal. Facebook says that you lied to them. And Mark Zuckerberg blamed him for selling the data
of millions of unwitting users. People have a right to be very upset. I'm upset that that happened.
But not everyone believes Facebook's explanation either. You've got a company that has repeatedly
had privacy scandals. You know, if your partner was cheating on you and they cheated on you 15
times and apologized 15 times, at some point you have to say enough is enough.
Welcome to the future.
MIT's Media Lab, a place that follows crazy ideas wherever they may lead.
We get to think about the future.
What does the world look like 10 years, 20 years, 30 years?
What should it look like?
Time to go to sleep.
How about dream control?
Robotic prosthetics?
What's the largest city in Bulgaria?
And what is the population?
Or connecting the human brain to the internet?
Sofia, 1.1 million?
That is correct.
You know, the best way to predict the future is to invent it.
When you heard the word Alzheimer's, what did that do to you?
I was devastated.
What makes this story so unusual is that almost every year for the past 10 years,
we've interviewed Mike and Carol as Alzheimer's took over her brain. Even though this is intensely personal,
they wanted all of us to see the devastating impact of Alzheimer's on each of them over a decade.
What's your husband's name?
My husband's?
Your husband's name?
Yeah.
The guy sitting to your left.
I'm Steve Kroft.
I'm Lesley Stahl. I'm Scott Pelley. I'm Anderson Cooper. I'm Jon LaPook.
I'm Bill Whitaker. Those stories tonight on 60 Minutes.
The new BMO VIPorter Mastercard is your ticket to more. More perks.
More points.
More flights.
More of all the things you want in a travel rewards card.
And then some.
Get your ticket to more with the new BMO VIPorter Mastercard and get up to $2,400 in value in your first 13 months.
Terms and conditions apply.
Visit bmo.com slash viporter to learn more.
Facebook and its CEO, Mark Zuckerberg, are in a whale of trouble,
and not just because the company has lost tens of billions of dollars
in market value in recent weeks.
We now know that during years of essentially policing itself,
Facebook allowed Russian trolls to buy U.S. election ads,
advertisers to discriminate by race,
hate groups to spread fake news.
And because Facebook shirked privacy concerns,
a company called Cambridge Analytica was able to surreptitiously gain access
to personal data mined from as many as 87 million Facebook users.
The man who mined that data for Cambridge Analytica is a scientist named Aleksandr Kogan.
He's at the center of the Facebook controversy because he developed an app that harvested data
from tens of millions of unwitting Facebook users. The main infraction, the main charge is that you sold the data.
So, I mean, at the time, I thought we were doing everything that was correct.
You know, I was kind of acting, honestly, quite naively.
I thought we were doing everything okay.
Facebook says that you lied to them.
That's frustrating to hear, to be honest.
If I had any inkling that what we were going to do was going to destroy my relationship with Facebook, I would have never done it.
If I had any inkling that I was going to cause people to be upset, I would have never done it.
This was the blindness we had back then.
For someone implicated in the biggest privacy scandal on earth, Kogan seems incongruously guileless.
Before all this happened, what was your job and what was your field of study?
So I was a social psychologist.
I was working as a university lecturer at the University of Cambridge.
In England.
In England.
And I ran this lab that studied happiness and kindness.
Happiness and kindness.
Yeah.
That's a far cry from the adjectives lobbed at him now.
Sinister and unethical.
Here's what he did.
He asked Facebook users
to take a survey he designed
from which he built
psychological profiles
meant to predict their behavior.
He failed to disclose,
one,
that what he was really after
was access to their friends,
tens of millions of people he could not otherwise reach easily,
and two, that he was doing the survey for Cambridge Analytica,
a political consulting firm that used the material to influence people on how to vote.
The company's then-CEO bragged about their prediction models on stage.
By having hundreds and hundreds of thousands of Americans undertake this survey,
we were able to form a model to predict the personality of every single adult in the United States of America.
Did you get to the point where you were predicting personalities?
And you gave that to Cambridge Analytica?
Correct.
What did you think they were going to use it for?
I knew it was going to be for elections,
and I had an understanding or a feeling that it was going to be for the Republican side.
As a political consulting firm, Cambridge Analytica is hired by campaigns
to analyze voters and target them with ads.
In the 2016 presidential election,
Cambridge Analytica worked first for the Ted Cruz campaign, then later for Donald Trump, though his campaign says they didn't use
the Kogan data. The Republican benefactors Robert and Rebekah Mercer were Cambridge Analytica's
financial backers. Steve Bannon was on the board. So did you ever meet or hear about
Steve Bannon at Cambridge Analytica? No. The Mercers? No. Jared Kushner? Nope. Nothing. And
those names would not even have like rung a bell for me, to be honest. Tell us what you did. So
I created this app where people sign up to do a study. And when they sign up to do the study,
we would give them a survey.
And in the survey,
we would have just this Facebook login button.
And they would click the button,
authorize us, we'd get their data.
Authorize us to do what?
To collect certain data.
We would collect things like the location,
their gender, their birthday,
their page likes,
and similar information for their friends.
And all of this was... But did you say you collected information on their friends?
We did.
But they didn't opt in.
So they didn't opt in explicitly.
No, no, no.
They didn't opt in, period.
The friends did not opt in.
And that's part of the...
It seems crazy now,
but this was a core feature of the Facebook platform for years.
This was not a special permission you had to get.
This was just something that was available
to anybody who wanted it, who was a developer. How many apps do you think there are? How many
developers who did what you did? Tens of thousands. Tens of thousands? Tens of thousands.
And Facebook obviously was aware. Of course. It was a feature, not a bug. The feature was called
friend permissions, which Sandy Parakilas, who used to work at Facebook, explains.
The way it works is, if you're using an app, and I'm your friend,
the app can say, hey, Leslie, we want to get your data for use in this app,
and we also want to get your friend's data.
If you say, I will allow that, then the app gets my data, too.
What you're saying is, I give permission for the friend. The friend doesn't give permission.
Right.
That, I mean...
It doesn't feel right when you say it out loud.
No, it doesn't feel right.
Right.
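To make the mechanics concrete, here is a minimal sketch of how an app could pull friends' data under the pre-2015 friend-permissions model Parakilas describes. This is an illustrative reconstruction, not Kogan's actual app; the endpoint and field list are assumptions based on what he says he collected.

```python
import requests

GRAPH = "https://graph.facebook.com"          # Graph API v1.0-era endpoint
FIELDS = "id,gender,birthday,location,likes"  # fields Kogan says he collected

def harvest(user_token: str) -> list[dict]:
    """Illustrative reconstruction: one user's consent exposes their friends."""
    profiles = []
    me = requests.get(f"{GRAPH}/me",
                      params={"access_token": user_token, "fields": FIELDS}).json()
    profiles.append(me)
    # A single consenting user exposes their whole friend list...
    friends = requests.get(f"{GRAPH}/me/friends",
                           params={"access_token": user_token}).json()
    # ...and the same token can then read each friend's profile fields,
    # even though the friends themselves never saw a permission dialog.
    for friend in friends.get("data", []):
        resp = requests.get(f"{GRAPH}/{friend['id']}",
                            params={"access_token": user_token, "fields": FIELDS})
        profiles.append(resp.json())
    return profiles
```

The scale follows directly: if each consenting user exposes a few hundred friends, hundreds of thousands of survey takers multiply out to the tens of millions of profiles described in the story.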
Facebook should have been aware of how this could be abused, because they were repeatedly warned,
including by Parakilas, who used to be a manager in charge of protecting
data at the company. He says he raised concerns years before Kogan built his app.
I think they didn't want to know. You know, the impression that I got working there is...
They didn't want the public to know.
Well, they didn't want to know in the sense that if they didn't know, then they could say they didn't know
and they weren't liable, whereas if they knew, they would actually have to do something about
it.
And one of the things that I was concerned about was that applications or developers
of applications would receive all of this Facebook data, and that once they received
it, there was no insight.
Facebook had no control or view over what they were doing with the data.
Once the data left Facebook, did Facebook have any real way to find out what happened to it?
No.
Or was it just gone?
It was gone.
Wow.
They could put it on a hard drive and they could hide it in the closet.
Would you say then policing this was pretty impossible?
It was very frustrating.
Did you bring this to the attention of the higher-ups, the executives?
Yeah, a number of folks, including several executives.
So was the executives' hair on fire?
Did they say, oh, my God, we have to fix this, we have to do something?
I didn't really see any traction in terms of making changes to protect people.
They didn't prioritize it, I think is how I would phrase it.
So would you say that they didn't prioritize privacy?
Yes, I would say that they prioritized the growth of users,
the growth in the data they could collect,
and their ability to monetize that through advertising.
That's what they prioritized because those were the metrics
and are the metrics that the stock market cares about.
Facebook CEO Mark Zuckerberg turned down our request for an interview.
Eventually, the company did change its policy, so app developers can no longer gather friends' data.
Facebook's failure to protect users' privacy
by allowing covert harvesting of so much personal data
became the center of the congressional hearings two weeks ago.
In his defense, CEO Mark Zuckerberg pointed the finger at one particular app developer.
If a developer who people gave their information to, in this case Aleksandr Kogan, then goes and, in violation of his agreement with us, sells the data to Cambridge Analytica, that's a big issue.
I think people have a right to be very upset. I'm upset that that happened.
You're a villain in many eyes, the guy who stole data from Facebook and then sold it.
The idea that we stole the data, I think,
is technically incorrect. I mean, they created these great tools for developers to collect the
data and they made it very easy. I mean, this is not a hack. This was, here's the door, it's open,
we're giving away the groceries, please collect them. Your point, though, I think,
is that they're singling you out. I think there's utility to trying to tell the narrative that this is a special case,
that I was a rogue app and this was really unusual.
Because if the truth is told that this is pretty usual and normal, it's a much bigger problem.
And he says he wasn't hiding anything from Facebook.
When Aleksandr Kogan built his app, he posted its terms of service. That's what users agree
to when they download an app. His terms of service said this: if you click OK, you permit us to
disseminate, transfer, or sell your data, even though it was in direct
conflict with Facebook's developer policy. It says plainly in the developer policy,
clearly, that you are not allowed to transfer or sell data. It says that. Come on, this was
as clear as can be. I understand that now. You didn't understand that then? I'm not even sure if I read the developer policy back then.
He says that nobody read these privacy sign-offs.
Not him, not the users who signed on, not Facebook.
This is the frustrating bit, where Facebook clearly has never cared.
I mean, they've never enforced this agreement.
And they tell you that they can monitor it and they can audit and they can check
and they'll let you know if you're doing anything wrong.
I had a terms of service that was up there
for a year and a half that said
I could transfer and sell the data.
Never heard a word.
The belief in Silicon Valley,
and certainly our belief at that point,
was that the general public must be aware that their data is being sold and shared and used to advertise to them.
And nobody cares.
Facebook did shut down his app, but only after it was exposed in the press in 2015.
The company didn't start notifying the tens of millions of users whose data had been scraped until this month.
They never took action against this man, Joseph Chancellor, who was Kogan's co-worker.
And where is he today?
He works at Facebook.
Wait a minute.
Did he have anything to do with the study you did for Cambridge Analytica?
Yeah, I mean, we did everything together. So they've come after you, but not someone who did exactly what you did with you.
Yes.
And he actually works at Facebook?
Correct.
Are you on Facebook?
No, they deleted my account.
You can't be on Facebook. You're banned.
I'm banned.
And the partner works for them.
Correct.
What's wrong with this picture? I'm missing something. Yeah, I mean, this is my frustration with all this,
where I had a pretty good relationship with Facebook for years. Really? So they knew who
you were? Yeah. I visited their campus many times. They had hired my students, and I even did a
consulting project with Facebook in November of 2015. And what I was teaching them was lessons I learned from working with this data set that we had
collected for Cambridge Analytica. So I was explaining, like, here's kind of what we did,
and here's what we learned, and here's how you could apply it internally to help you with surveys
and survey predictions and things like that. Facebook confirmed that Kogan had done research and
consulting with the company in 2013 and 2015, but said in a statement to 60 Minutes that
at no point during these two years was Facebook aware of Kogan's activities with Cambridge
Analytica. Kogan is testifying before the British Parliament next week. He says he's financially ruined and
discredited. Through his ordeal, he says he's come to see the error in the assumptions made
by the tech world about Americans' attitudes toward privacy. Now we all know what you did.
Was it right? Back then, we thought it was fine. Right now, my opinion has really been changed,
and has been changed in particular, because I think that core idea that we had, that everybody knows, nobody cares, was fundamentally flawed. And so if that idea is wrong, then what we did
was not right, and was not wise. And for that, I'm sincerely sorry. It turns out Kogan has something in common with Mark Zuckerberg.
They're both suddenly contrite.
We didn't take a broad enough view of our responsibility, and that was a big mistake.
And it was my mistake, and I'm sorry.
Mark Zuckerberg says that he cares about privacy now.
I think the real problem is not what he feels in his heart.
I think the real problem is that you've got a
company that has repeatedly
had privacy scandals, that has
repeatedly shown that it doesn't
prioritize privacy over the years.
When you think about that, it's like
put yourself in the position of
if your partner was cheating on you and they
cheated on you 15 times and apologized
15 times,
at some point you have to say enough is enough.
We need to make some kind of a change here.
Sometimes historic events suck.
But what shouldn't suck is learning about history.
I do that through storytelling.
History That Doesn't Suck is a chart-topping history-telling podcast chronicling the epic story of America decade by decade.
Right now, I'm digging into the history of incredible infrastructure projects of the 1930s,
including the Hoover Dam, the Empire State Building, the Golden Gate Bridge, and more.
The promise is in the title, History That Doesn't Suck. Available on the free Odyssey app or wherever you get your podcasts.
Back in the 1980s, a laboratory of misfits foresaw our future.
Touchscreens, automated driving instructions, wearable technology, and electronic ink
were all developed at the Massachusetts Institute of Technology in a place they call the Media Lab.
It's a research lab and graduate school program that long ago outgrew its name.
Today, it's creating technologies to grow food in the desert, control our dreams, and
connect the human brain to the Internet.
Come have a look at what we found in a place that you could call the future factory.
To Arnav Kapur, a graduate student in the Media Lab, the future is silent.
He's developed a system to surf the Internet with his mind.
What happens is when you're reading or when you're talking to yourself,
your brain transmits electrical signals to your vocal cords. You can actually pick these
signals up and you can get certain clues as to what the person intends to speak. So the brain
is sending an electrical signal for a word that you would normally speak, but your device is
intercepting that signal. It is. So instead of speaking the word, your device is sending it into a computer.
That's correct.
That's unbelievable. Let's see how this works.
So we tried it.
What is 45,689 divided by 67?
Sure.
He silently asks the computer,
and then hears the answer through vibrations transmitted through his skull and into his inner ear.
6, 8, 1, point 9, 2, 5.
Exactly right.
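For the record, the answer he hears digit by digit checks out. A quick verification:

\[
\frac{45{,}689}{67} = 681.925\ldots, \quad\text{since } 67 \times 681 = 45{,}627 \ \text{ and } \ \frac{45{,}689 - 45{,}627}{67} = \frac{62}{67} \approx 0.925.
\]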
One more. What's the largest city in Bulgaria, and what is the population?
The screen shows how long it takes the computer to read the words that he's saying to himself.
Sofia, 1.1 million.
That is correct. You just Googled that.
I did.
You could be an expert in any subject.
You have the entire internet in your head.
That's the idea.
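The pipeline Kapur describes can be summarized in a few lines of Python. Every component here is a hypothetical stand-in; the real system's electrodes, decoder, and audio hardware are not detailed in this story.

```python
# Conceptual loop for a silent-speech interface like the one described:
# neuromuscular signals in, internet answer out, with no audible speech.
def silent_query(read_signal_window, decoder, search, bone_speaker):
    signal = read_signal_window()     # electrical signals sent toward the vocal cords
    words = decoder.decode(signal)    # clues to "what the person intends to speak"
    answer = search(" ".join(words))  # e.g., an internet lookup
    bone_speaker.say(answer)          # heard via vibrations through the skull
```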
Ideas are the currency of MIT's Media Lab.
The lab is a six-story tower of Babel,
where 230 graduate students speak dialects of art, engineering, biology, physics, and coding,
all translated into innovation.
The Media Lab is this glorious mixture, this renaissance,
where we break down these formal disciplines and we mix it all up and we see what pops out.
That's the magic, that intellectual diversity.
Hugh Herr is a professor who leads an advanced prosthetics lab.
And what do you get from that?
You get this craziness when you put like a toy designer next to a person that's thinking about
what instruments will look like in the future next to someone like me that's interfacing machines to the nervous system.
You get really weird technologies.
You get things that no one could have conceived of.
The Media Lab was conceived in a 1984 proposal. MIT's Nicholas Negroponte wrote,
Computers are media that will lead to interactive systems. He predicted the rise of flat panel
displays, HDTVs, and news whenever you wanted.
Negroponte became co-founder of the lab and its director for 20 years. When we were demonstrating these things in, let's say, '85, '86, '87,
it was really considered new.
It looked like magic.
It was indistinguishable from magic.
In 1979, MIT developed MovieMap, which predated Google Street View by decades.
Now, notice what's so common today that you didn't even notice it.
He's touching the screen.
If you had seen that on 60 Minutes in the 80s, you would have been amazed. And you might have been dazzled by one of the earliest flat screens. It was six
inches by six inches, black and white. It was a $500,000 piece of glass. It cost a half a million
dollars, that piece of glass. And I said, that piece of glass will be six feet in diagonal with millions of pixels in full color.
In 1997, the lab also gave birth to the grandfather of Siri and Alexa.
Nomadic, wake up.
Okay, I'm listening.
Go to my email.
Where do you want to go? And in 1989, it created turn-by-turn navigation that it called backseat driver.
Turn right at the stop sign.
And the MIT patent lawyers looked at it and said,
this will never happen, never be done,
because the insurance companies won't allow it, so we're not going to patent it.
Look through the glass-walled labs today,
and you will witness
400 projects in the making. The lab is developing pacemaker batteries recharged by the beating of
the heart, self-driving taxi tricycles that you summon with your phone, phones that do retinal
eye exams, and teaching robots.
So we think that the devices of tomorrow have an opportunity to do so much more and to fit better in our lives.
Professor Pattie Maes ran student admissions for the graduate program for more than a decade.
We really select for people who have a passion.
We don't have to tell them to work hard.
We have to tell them to work less hard and to get sleep occasionally. How often does a student come
to you with an idea and you think, we're not going to do that? Actually, for us, the crazier,
the better.
Adam Horowitz's idea was so nutty he was one of 50 new students admitted last year out of 1,300 applications.
I was really interested in a state of sleep where you start to dream before you're fully unconscious, where you keep coming up with ideas right as you're about to go to sleep.
Time to go to sleep.
Horowitz's system plants ideas for dreams.
Remember to think of a mountain.
Then records conversations with the dreamer during that semi-conscious moment before you fall asleep.
Tell me, what are you thinking?
I'm doing an origami pyramid.
Her origami pyramid dream was influenced by the robot saying the word mountain.
It's long been believed that this is the moment when the mind is at its most creative.
Horowitz hopes to capture ideas that we often lose by the next morning.
So it's basically like a conversation.
You can ask, hey, Jibo, I'd like to dream about a rabbit tonight. It would watch for that trigger
of unconsciousness.
And then right as you're hitting the lip, it triggers you with the audio,
and it asks you, what is it that you're thinking about?
You record all that sleep talking, and then later, when you wake up fully,
you can ask for those recordings.
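As a rough sketch of that conversation loop, here is what one incubation cycle might look like in Python. The sleep-onset detector and audio calls are hypothetical stand-ins for the system's actual biosignal sensors and recorder.

```python
import time

def incubate_dream(theme, detect_sleep_onset, play_audio, record_speech):
    """One cycle of the dream-incubation protocol described above.
    All callback arguments are hypothetical hardware stand-ins."""
    # Wait for the semi-conscious window at the edge of sleep.
    while not detect_sleep_onset():
        time.sleep(1)
    # Plant the theme right as the sleeper hits the lip of unconsciousness...
    play_audio(f"Remember to think of a {theme}.")
    # ...then prompt for a report and capture the sleep talking.
    play_audio("Tell me, what are you thinking?")
    return record_speech(seconds=30)  # reviewed after fully waking
```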
And when he brought this idea to you, what did you think? Really?
Crazy enough.
Yeah.
Welcome to the world of biomechatronics.
Nearby in Hugh Herr's lab, Everett Lawson's brain is connected to his prosthetic foot,
a replacement for the club foot he was born with. The very definition of a leg or a limb or an ankle
is going to dramatically change with what they're doing. It isn't just whole, it's 150%.
You feel directly connected, huh?
Yeah, when I fire a muscle really fast, it makes its full sweep.
Herr's team has electronically connected the computers in the robotic foot
with the muscles and nerves in Lawson's leg.
He's not only able to control it via his thoughts, he can actually feel
the synthetic limb. He feels the joints moving as if the joints are made of skin and bone.
For Professor Herr, necessity was the mother of invention. He lost his legs to frostbite at age
17 after he was stranded by a winter storm while mountain climbing.
Through that recovery process, my limbs were amputated. I designed my own limbs. I returned
to my sport of mountain climbing. I was climbing better than I'd achieved with normal biological
limbs. That experience was so inspiring because I realized the power of technology to heal, to rehabilitate,
and even extend human capability beyond natural physiological levels.
You developed the legs that you're wearing today.
Each leg has three computers, actually, and 12 sensors. And they run these computations
based on the sensory information that's coming in. And then what's controlled is a motor system, like muscle,
that drives me as I walk and enables me to walk at different speeds.
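What Herr describes is a classic sense-compute-actuate loop. The sketch below is a generic illustration of that architecture, not the lab's actual firmware; the sensor, model, and motor objects are stand-ins.

```python
def ankle_step(sensors, gait_model, motor):
    """One tick of a sense-compute-actuate loop: 12 sensors feed a
    computation that drives a muscle-like motor. All arguments are
    hypothetical stand-ins for the real hardware."""
    readings = [s.read() for s in sensors]  # incoming sensory information
    torque = gait_model.compute(readings)   # walking-speed-aware computation
    motor.apply_torque(torque)              # drives the walker like muscle
```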
What will this mean for people with disabilities?
Technology is freeing.
It removes the shackles of disability from humans.
And the vision of the Media Lab is that one day,
through advances in technologies,
we will eliminate all disability.
So that was a big deal.
The current director of the Media Lab is Joi Ito, a four-time college dropout and one of
those misfits that the lab prefers. After success in high-tech venture capital, he came
here to preside over the lab's 30 faculty
and a $75 million annual budget. How do you pay for all this? So we have 90 companies that pay us
a membership fee to join the consortium. And then because it's all coming into one pot, I can
distribute the funds to our faculty and students, and they don't have to write grant proposals.
They don't have to ask for permission, they just make things.
Do any of these companies lean on you from time to time and say, hey, we need some product here?
They do. I've fired companies for that.
You've fired them?
Yeah, I've told companies, you're too bottom-line oriented; maybe we're not right for you.
The sponsors, which include Lego, the toy maker, Toshiba, ExxonMobil, and General Electric, get first crack at inventions.
The lab holds 302 patents and counting.
We're inside of the lab.
Caleb Harper's idea is so big it doesn't fit in the building.
So MIT donated the site of an abandoned particle accelerator for this
trained architect who is now building farms.
Welcome to the farm.
He calls these food computers, farms where conditions are perfect.
They're all capable of controlling climate, so they make a recipe.
This much CO2, this much O2, this temperature.
So we create a world in a box.
Most people understand if you say, oh, the tomatoes in Tuscany on the North Slope taste so good and you can't get them anywhere else.
That's those genetics under those conditions that cause that beautiful tomato.
So we study that inside of these boxes with sensors and the ability to control climate.
Tuscany in a box.
Tuscany in a box, Napa in a box, Bordeaux in a box.
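Harper's "recipe" amounts to a setpoint file for the box's climate controller. A minimal sketch with made-up values; the real recipes and their parameter names are not given in this story:

```python
# Hypothetical climate recipe for a food computer: a "world in a box."
# Every value here is illustrative, not an actual recipe.
tuscany_in_a_box = {
    "co2_ppm": 800,            # "this much CO2"
    "o2_percent": 21.0,        # "this much O2"
    "air_temp_c": 24.0,        # "this temperature"
    "humidity_percent": 65,
    "light_hours_per_day": 16,
}

def control_step(read_sensor, actuator, recipe):
    """Nudge each climate variable toward its recipe setpoint."""
    for variable, target in recipe.items():
        if read_sensor(variable) < target:
            actuator(variable).increase()
        else:
            actuator(variable).decrease()
```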
Now these are plants you're growing in air.
Yeah, so this is... These basil plants grow not in soil, but in air.
The plant is super happy.
No dirt.
Air saturated with a custom mix of moisture and nutrients.
So each one of these drops down to the reservoir.
The food computers grow almost anything, anywhere.
What have you learned about cotton farming?
So cotton is actually a perennial plant, which means it would grow, you know, the whole year long.
But it's treated like an annual. We have a season.
So in this environment, since it's perfect for cotton, we've had plants go 12 months.
So how many crops can you get in a controlled environment like this?
You can crop up to four
or five seasons. We're growing on average three to four times faster than they can grow in the field.
The uncommon growth of the Media Lab flows from its refusal to be bound to goals, contracts,
or next quarter's profits. It is simply a ship of exploration going wherever a crazy idea may lead. We get to think
about the future. What does the world look like 10 years, 20 years, 30 years? What should it look
like? You know, the best way to predict the future is to invent it. Wendy's most important deal of
the day has a fresh lineup. Pick any two breakfast items for $4. New four-piece French toast sticks, bacon
or sausage wrap, biscuit or English muffin sandwiches, small hot coffee, and more. Limited
time only at participating Wendy's. Taxes extra. Now, Dr. Jon LaPook, on assignment for 60 Minutes.
Mike and Carol Daly have been married for 53 years. Like more than 5 million American families,
they're dealing with dementia. Carol has been suffering from Alzheimer's, the main type of
dementia. What makes this story so unusual is that almost every year for the past 10 years,
we've interviewed Mike and Carol as Alzheimer's took over her brain. Even though this is intensely personal,
they wanted all of us to see the devastating impact of Alzheimer's on each of them over a decade.
We should have brought the bread.
When we first met Carol and Mike in 2008,
Carol was active, conversational,
and determined to make the best of her failing memory.
How old are you now?
65, 65 now.
I think, right?
Yeah.
Carol's memory had been spotty for several years.
I started to notice it at home, and I used to joke about it to my kids.
I would say, you know, I think she has Alzheimer's, the way she forgets everything.
Then a doctor told her she really did have Alzheimer's.
Mike's mother had had it, now his wife.
Carol, when you heard the word Alzheimer's, what did that do to you?
I was devastated because I saw his mother, what she went through.
It's terrible.
She was walking the streets in the middle of the night, and we had to bring her home.
As Carol's memory deteriorated, she lost her job at a bank
and lost her ability to do a lot of what she'd always done at home.
Did you used to be a good cook?
Yeah.
Oh, yeah.
What happened?
It stopped.
I just couldn't do it.
What couldn't you do?
I didn't know what to do first.
The meatloaf. Oh, the meatloaf.
That was the...
It was terrible.
I couldn't eat it.
Because...
I don't know what I did with the ingredients or whatever.
I just couldn't eat it.
And you're tearing up.
It's upsetting for you.
I don't want to be like this.
I really don't.
Unable to concentrate,
Carol had to give up reading and movies.
Hard for someone who'd worshipped Clark Gable.
Oh, so handsome.
So you remember that?
That I know.
They told us Carol's illness had brought them closer,
but they feared the future.
My fear is, I guess,
maybe it'd get worse, you know, and it probably would.
And it did.
Almost three years later, when we went back to visit,
Carol had no idea how old she was.
80? No, I don't know.
You're actually 67.
67?
67.
Yeah.
And what about her favorite actor?
Do you remember Clark Gable?
Oh, yeah, that was my...
Yeah.
Who is he?
Oh.
Oh.
I...
I don't know what now.
Now Mike, a former New York City cop, had to apply her makeup and dress her.
But he told us this was his chance to repay all that Carol had done for him.
She had a job.
She cleaned house.
She did the wash.
She made the beds.
And she put up with me.
So all that's changed for us is the roles.
Now, I do the wash.
I make the beds.
I help Carol.
But that's not what you signed up for.
Yes, I did.
But when we took our oath, it's for better or for worse.
So I did sign up for it in the beginning.
But Mike had put on almost 20 pounds over the last two years
and started taking pills to reduce anxiety and help him sleep.
The thing is, I could sit here and feel sorry for myself,
but what is that going to do for me?
At our next meeting, one year later,
when Carol couldn't come up with words,
she answered with laughs.
What kind of thing?
That's not right.
And three years since our first visit,
she needed constant watching.
I can't go out by myself, you know, like that.
So, we have to have somebody around.
That's a bad feeling.
Yeah.
You've lost your independence.
Yep.
That's what you do after all these years.
I can't give up, and I'll continue to try.
And I pray to God that she goes before me,
because I'm not going to put her in a nursing home.
I can handle it.
But
we live a life.
But that life was a lot tougher
when we returned two years later.
By then, Carol could no longer
remember her last name
or this.
What's your husband's name?
My husband's?
Your husband's name.
The guy sitting to your left.
That big guy who loves you.
Yeah, who loves me.
Beyond the memory loss,
as Alzheimer's affected more of her brain,
it was destroying more of her physical abilities.
She's losing the ability to control her feet, her hands.
It was six years ago that I first met you.
Yeah.
And at the time, you were shouldering all of the burden.
Right.
And you're still shouldering all of the burden.
I mean, how are your shoulders?
They're sore, no doubt about it.
But you have to do what you have to do.
Carol.
At our next meeting, two years later,
conversation with Carol was impossible.
It's been almost eight years since we first met
and since we first sat on this couch.
Yeah.
Without making you embarrassed, do you remember my name?
No.
What's this called? What I'm wearing on my wrist. What's the name of that?
I don't know.
It's a wristwatch.
Oh, yeah.
Does that sound familiar?
Yeah.
Carol reached a point where she was not able to do anything for
herself at all. She couldn't feed herself, couldn't go to the bathroom by herself. And Mike had reached
the point where he simply couldn't take care of her by himself. So he hired a home care aide during
the day, costing almost $40,000 a year. Now, Alzheimer's was hitting
them financially on top of mentally and physically. What would you say the toll has been of this long
journey on you? I'm dying. I really took a hit. The stress, I thought I had a heart attack to
begin with. You had chest pain. They want to put me in the hospital. I can't go to the hospital. All right, what do I do with Carol?
Then she has anxiety attacks, part of the Alzheimer's. Anxiety attacks may be part of
what's happening to you too. It sounds like you had chest pain, but it wasn't a heart attack,
is that what it was? An anxiety attack? I call it stress.
According to the Alzheimer's Association, the vast majority of caregivers say their toughest challenge is emotional stress.
I can still remember when you said, no, Miguel, I can handle this.
I think about that comment I made and I said, what a jerk I was. Well, not a jerk, but just you were sort of near the beginning of your journey,
and you didn't know.
Yeah.
You know, I thought this was it, you know?
So she can't remember things.
I see people with dementia.
They function normally.
She can't walk.
The impact on everybody else is enormous.
One year later, 14 years since she was first diagnosed,
Carol was spending most of her days sitting silently,
no longer able to understand questions.
We can't communicate.
It's lonely.
Let me just get them so they go on nicely.
But watch what happened when social worker Dan Cohen
put headphones on Carol
and played some of her old favorites.
The words aren't there,
but the beat is, and the melody is.
The melody is pretty good.
In Alzheimer's, older memories are usually the last to go.
But even then, a faded distant memory can sometimes be revived.
And since the music we love is really tied to our emotional system,
and our emotional system is still very much intact,
that's what we're connecting, and that's why it still works.
And it was tied to his emotional system too.
He was tearing up.
I think those tears were happy tears,
knowing that she hasn't lost it all.
It was like, wow, wow.
But the wow did not last.
When we met this past January,
Carol, now 74, was too far gone to react to music. She's so changed just since the last
time I saw her. And her pulse is as strong as can be and regular. I'm feeling it right now.
So her heart seems strong, but she has so deteriorated.
Stand up, Carol. Come on.
Earlier that morning, they'd shown us how hard it is to get Carol ready for the day.
What didn't you realize would happen?
That she becomes a vegetable. That's basically what I feel like she is now. Mike is still too heavy, his blood pressure's too high,
and a few months ago, his thoughts were too dark.
I'm ready to put the gun to my head.
I really thought of suicide.
Really?
Yeah, it got to that point.
Caregiving is really tough.
Hardest job I've ever had.
And that's from a former New York City cop. But suicidal thoughts are not uncommon for people taking care of a family member with dementia. Mike hired more aides, so Carol now has 24-hour
help. It's draining his savings, but allowing him to get out of the house and make new friends.
And that's helping lift his depression. I leave this at home. And when I go out,
it's a new Mike out there now. But at home, he worries that Carol is in danger. Has she fallen?
Yes. She hasn't broken any bones? No. Just bruises. No. So now, despite years of telling us he wouldn't put Carol in a nursing home...
I'm coming to the point where maybe a nursing home is the answer for her, her safety.
Ten days after that, and 53 years after their wedding day, Mike did put Carol in a nursing home.
Do you still love her?
I loved Carol, who was Carol.
But now, Carol's not Carol anymore.
When Carol was still Carol, that would have been the best time to discuss the kind of caregiving decisions Mike Daly eventually had to face alone.
Mike hopes that sharing such intimate details of their lives
will help others be better prepared than they were.
50 Seasons of 60 Minutes, from April 2006.
Bob Simon traveled to Kenya to introduce us to Dame Daphne Sheldrick and the orphanage
she founded for abandoned elephants. Their tremendous capacity for caring is, I think,
perhaps the most amazing thing about them, even at a very, very young age. Their sort of forgiveness
and unselfishness. So, you know, I often say, as I think I've said before,
they have all the best attributes of us humans
and not very many of the bad.
Daphne and the keepers may run this place officially,
but it's the elephants who are really in charge.
They may be little, they may be orphans, but trust me,
they're not as little as they look.
In fact, I feel like I'm in an elephant sandwich.
Yes, you are.
Dame Daphne Sheldrick died earlier this month in Nairobi.
She was 83.
I'm Lesley Stahl.
We'll be back next week with another edition of 60 Minutes.