The Bridge with Peter Mansbridge - A "Bridge" Special - Polling: The Good The Bad The Ugly
Episode Date: September 30, 2019. Day 20 of Canada's 2019 Federal Election. With special guests David Herle and Shachi Kurl. David Herle is host of the podcast "The Herle Burly" and a Principal Partner of the Gandalf Group, and Shachi Kurl is Executive Director of the Angus Reid Institute. | Thanks for subscribing and for submitting a rating and review! * TWITTER @petermansbridge | INSTAGRAM @thepetermansbridge ** https://www.thepetermansbridge.com/ *** Producer: Manscorp Media Services
Transcript
And hello there, I'm Peter Mansbridge. This is The Bridge, and if I sound excited today,
I am excited because this is a very special edition of The Bridge. As I warned you on the
very first day, I was hoping during the campaign that there'd be a couple of days when we'd do special broadcasts,
and one of them would be about polling.
I'm always talking about polling.
I'm not a big fan of polling because I have concerns,
and yet I always end up talking about it.
So the best way to deal with that is to actually do a program on polling
and put out these concerns and some of your concerns
and get them answered by a couple of the best pollsters in the country.
And that's what we've got on tonight's special edition of The Bridge.
Okay, here we go.
As promised, two of the country's top pollsters talking about polling.
That's what we're going to try and do here for the next little while.
Who have we got?
We've got Shachi Kurl, Executive Director of the Angus Reid Institute.
Shachi's here in Toronto, normally in Vancouver.
So we're lucky to have you here for a couple of days.
Hello.
Yes, hello.
And David Herle, political consultant, principal partner of the
polling and research firm Gandalf Group, but best known.
As an insider.
Well, as an insider.
That's true, but best known really these days as hosting The Herle Burly.
Thank you very much.
Which is a fun podcast.
You've never asked me on your podcast, Dave. You've never asked me.
Well, I'm in the best of company then.
I have begged you.
He keeps saying.
He'll tell you, got to have you on the program.
That's the last you'll hear about it.
It's like when a guy says, I'll call you.
I get it.
I get it.
Anyway, the beauty of the Herle Burly is it was one of the original
Canadian political podcasts.
And it's become extremely popular through the election campaign
with David
and Scott Reid
and Jenny Byrne.
Who ran a lot of Harper's stuff.
And so the group of them
are... And whom I blame for Ford as well.
And you blame for Ford.
But she left Ford, right?
She did.
They won, and then she left shortly after winning,
after getting a job in government.
Anyway, enough about the Herle Burly.
This is the bridge.
And the reason I want to do this is because a lot of people are still,
including myself, after years of covering polling and talking about polling,
I'm still confused about a lot of things that happen in polling.
So I think this is a good opportunity to discuss it
and also to have two pollsters together in the same room,
which doesn't happen often.
No, we're not a collegial group.
No.
No, I'll say that.
I mean, pollsters tend to sort of like talk privately
and kind of raise concerns about other pollsters.
Oh, no, no.
Now many of them have just taken to calling each other out on Twitter
and in public and saying terrible things.
And, you know, it's...
Well, don't hold back now.
Everybody's wrong but me.
People tend to have this idea that it's like it's a sector, it's an industry. Okay, it's not a regulated industry. You don't go to school to become a pollster. You don't write an exam to call yourself a pollster. And so as a result, the so-called sector does behave a little bit like the used car industry. And you've got the pickup truck guys saying, well, you can't trust electric cars, they're no good. And you've got the electric people going, we're on the cusp and the cutting edge of polling and ours is the best. And this is really the battle that you have over methodology in this country. But I know you're going to ask questions and then we're going to get into all of this.
Well, we're going to start on kind of that vein, in the sense of, what's the most misunderstood thing about polling?
All of the things?
A lot of what you just mentioned, but generally, for people out there, including journalists, what's the most misunderstood thing about polling, David?
The most misunderstood thing about polling is that
it is not significantly used to measure the current state of public opinion, but to understand how to change the current state of public opinion. And that's how the political parties use it. That's how the corporate world or the labor movement or anybody that commissions polling uses it. Nobody does it because they want to know what people think right now.
They do it because they want to change what people think.
That's what polling does.
So when the media uses polling the way the media uses it, that's not the way the people who really drive the polling industry, in terms of its use, use it.
No, no, no.
If you're inside a political party and you're getting your daily tracking results, the top line number will be dispensed with in two minutes of a 25 or 30 minute presentation. It's mostly about: what are people thinking about, what are they caring about, what do they think about my candidate, what do they think about the other candidate, who's dominating this race right now, who's got the frame for this race? That's what polling is telling you.
That's the difference between a one-question survey that people get through the media
and the kind of detailed information that the parties are working with.
Shachi?
Well, I would agree and I would disagree.
I think David's absolutely right.
That is how the commercial side, and sort of the dark art side, not to ascribe, you know, but really sort of the commercial, effective side, what is going on behind the curtain. I would like to think that when journalists try to look at polling numbers, they are trying, to an extent, to understand a snapshot in time. The work that I do at the Institute, which is a not-for-profit organization, so we don't do that kind of client work, is trying to understand both things: both that high-level top line and also, where is the soft underbelly of persuadable votes, and what are the political parties going to be doing to try to sway it? So I think it's both. But I think that the most misunderstood thing
about polling is people use it like some sort of bloody crystal ball to predict the future.
And yes, there are trends that you can pick up off of it. There are signals in that noise. But
at the end of the day, I went to go give a talk and someone's like, so should I check my lottery
numbers against what you're going to tell me? And I'm like, no, jerk, you shouldn't. I'm telling you what the data says today. Depending
on how long the trend line goes, I can tell you what they've been thinking over the last 30 years.
And is this a departure? Is this more of the same? Is it a big deal? Is it a little deal?
But this idea that we can somehow tell you exactly what's going to happen on election night as a proof point of what's coming out of the field today, that I think is one of the biggest misconceptions about polling.
So can I just put a nuance on what you were saying there? Which is, yeah, you cannot use polling to predict what's going to happen any distance out, because there's intervening events you can't predict. But one of the amazing things about research that people don't understand
is it does tell you quite definitively,
if you say this message to this group of people, they will react that way.
You are right about that.
Right.
And so that's what parties know,
and that's what all professional communicators who have access to research know,
is if I say this message with these words and this intonation to this segment
of the population, they will react to that in this way. And there's been plenty of examples of that
even on this, even in this election, even on this campaign trail. If you look at the trade-off
between energy and the environment, you actually see that Canadians don't want either or, they want
both and. And yet, two things happen.
One, the one party that tries a both-and approach ends up getting punched from both sides
by their political opponents
for being too soft on pipelines
or too soft on climate action.
Or you end up with what I think are missed opportunities.
The screaming finding on that pipeline
versus climate action debate is if the
Conservatives had just moved a little bit more towards the centre and had more to say on the
climate file, I think you'd be looking at a very different campaign dynamic today.
I just wonder how much pollsters feed this belief of being predictive.
I think they do.
You're left with the impression when you hear a pollster explaining
their latest numbers that in fact they are predicting. They don't necessarily say that.
Sometimes they get awfully close to saying it. But I think it's understandable why a lot of
people would think those numbers are being predictive. So David, hurt feelings alert.
A lot of this is the domain.
Again, the industry or many of the main players out there tend to be men over the age of 50 who like to talk about how much they know.
So, and you combine that with the commercial aspect of it.
You go take a meeting with a big swinging client.
I can tell you what's going to happen.
This gentleman, the name of his company is the Gandalf Group, right?
It's all magic.
I know the spells and the secrets.
Thou shalt not pass.
I think she's got you on that one.
So, of course, that bleeds from the conversation with the client
into the conversation into the public square,
into the conversation with the journalist.
And as an ex-journalist and a broadcaster, I get it.
They don't understand math.
Math is hard.
Half the time, they don't know what the hell they're talking about
when they're reporting polls, especially the young ones.
And I talk to newsrooms. Every time I see a poll egregiously wrongly reported on, I will call their news director and say,
I will call their news director and say,
I will come and talk to your people
and give them polling 101 basics.
Well, there's a lot of that.
And they're all like, nobody has time for that.
And I'm like, well, that's great.
They rarely mention, as they always should, when it was done.
Nobody talks about that.
Rarely mention margin of error.
Yeah.
And certainly don't mention margin of error
when they keep drilling down in a national poll and they say, but in the province, and in this particular region, going down and down as the margin of error is going up and up.
You rarely see that.
The pollsters mention it in the tiniest of print in their copies.
It's not that tiny.
It's 12 point font.
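[Editor's note: since margin of error comes up repeatedly here, a minimal sketch of the arithmetic may help. The 1,500 and 100 sample sizes are illustrative assumptions, not figures from this conversation.]

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample.
    p=0.5 is the worst case (widest interval); 1.96 is the usual 95% multiplier."""
    return z * math.sqrt(p * (1 - p) / n)

# A national sample of 1,500 versus a 100-person provincial breakout of the same poll.
print(f"n=1500: +/- {margin_of_error(1500) * 100:.1f} points")  # roughly +/- 2.5
print(f"n=100:  +/- {margin_of_error(100) * 100:.1f} points")   # roughly +/- 9.8
```

[The roughly fourfold wider interval on the breakout is the point being made here about regional sub-samples.]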
When you were doing The National, we always mentioned
all of these things. Did you try to make
your story sound as authoritative as
possible or did you try to make sure
that the viewers knew all of the weaknesses
in the story you were presenting, all the things you didn't
know, all the things that your reporter
had been unable to uncover?
Yes.
But I think that is what drives the lack of transparency. Your clients are expecting some level of authoritativeness from you. They want to know. They're spending a lot of money.
Yeah, your clients are, and they want to know, how are we doing, right?
So you are needing to tell them, literally, how you're doing. It doesn't mean that you ever say, this is how it's going to be three weeks from now, necessarily, but this is the trend line and this is what we're expecting. But that is the narrative, and that is the tone or the attitude, that then bleeds into what I think is a different beast around the public and the transparent
reporting of polling, where you should say, you know what, do not try to take this sub-base of
100 people in one province that's really teeny tiny and start dividing it up into regions and
cities and pretend like you have an accurate look at what's going on. The media have a responsibility
here. The pollsters are giving them a lot of information.
They're giving them the general qualifiers,
and there's lots of, I mean,
you're out there doing your thing,
trying to educate people about polling.
David Coletto is out there doing his thing,
trying to educate people about polling.
And the other day, I saw a tweet from iPolitics
that said that the whole complexion of the race had changed
because the Liberals had tumbled.
Tumbled?
Two points?
No, no.
By nearly a full percentage point.
Oh.
Nearly a full percentage point.
That stuff drives me batty.
And that's not from the pollster.
That's a journalist trying to torque a story.
No.
And some of the journalism around.
I mentioned this last week on The Bridge.
That's the name of this podcast. I mentioned last week how I saw an ad run by one news organization about the poll they had commissioned, where the headline in the ad was, if the vote was to be held tomorrow, this would be the result. And this was a poll that was five days old, or at least the beginning of it had been five days ago. And that's just, like, it's unforgivable. You know, when you asked me about what I did on The National when it came to polls, I was kind of reluctant on polls. You know, I've always reported them to some degree, but I was always kind of reluctant because of the changing nature of polling
and how it's being done,
the different methodologies
and the questions that are raised about that.
One thing I started doing in the early 90s,
and I think we were the first ones to do it,
which was to collect all the polls that were done
and kind of put them all together and average them out.
And once a week we'd say, this is kind of the average of all the polls that are out there.
And then we kind of backed away from that because we started to have doubts about it
because there were different methodologies involved.
And now this sort of aggregating of polls has become a thing.
Some people do it.
The CBC does it.
And they say they're very careful about this issue, about, you know, differing methodologies.
But I still worry about that.
It's like comparing apples and oranges and mixing them all in together as if they were all one.
Well, and so that was a big part of the problem with the 2016 U.S. election was the heavy, heavy reliance on aggregators.
And then all of a sudden the aggregators had it wrong.
So pollsters had it wrong and the polling industry had it wrong. And you just talked
about apples and bananas. I have likened aggregation, when I talk about this, to the art, or the
failure of smoothie making, where you're not just throwing in different methodologies,
but you're also dealing in the fact that sometimes the data might be old,
so it smells a little funky,
or maybe you don't have the banana from that province or state.
So rather than tell the consumer,
look, there again are some gaps in what this data and information contains,
you're just throwing it in there, pouring it out, and saying this is the right thing. You're not talking about the fact that sampling errors are compounded every time you
aggregate. It compounds and compounds and compounds. So it's a tool for understanding what a bunch of
pollsters are talking about. But really, if you're a polling consumer, whether you are a paying client
or whether you are a journalist or whether you're someone who just, if you're a political junkie and you're into this, you've got to arm yourself with a bunch of information and the knowledge to ask questions.
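[Editor's note: a rough sketch of the "smoothie" point. The polls, dates, and recency weighting below are invented for illustration; averaging reduces random noise from any one sample, but it cannot fix stale fieldwork or a bias that several pollsters share.]

```python
from datetime import date

# Hypothetical published polls: (firm, end of fieldwork, sample size, party share in %).
polls = [
    ("Firm A", date(2019, 9, 24), 1500, 34.0),
    ("Firm B", date(2019, 9, 27), 1000, 36.0),
    ("Firm C", date(2019, 9, 20),  600, 31.0),   # older fieldwork, smaller sample
]

def aggregate(polls, today=date(2019, 9, 30), half_life_days=7):
    """Weight each poll by sample size and down-weight older fieldwork."""
    weighted_sum = total_weight = 0.0
    for firm, end, n, share in polls:
        weight = n * 0.5 ** ((today - end).days / half_life_days)
        weighted_sum += weight * share
        total_weight += weight
    return weighted_sum / total_weight

print(f"Aggregate share: {aggregate(polls):.1f}%")
# Nothing above corrects for two firms sharing the same panel or the same flawed
# turnout model; the average simply hides that they err in the same direction.
```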
So you talked about methodology.
That's one thing.
And we know that certain methodologies are better suited to certain kinds of polling than others.
A phone poll may be much better suited
to doing a neighborhood canvass of 300 people
in a very tight environment.
But Gallup had problems with landline calling in 2012
because it skews older.
And who the hell has a landline anymore anyway
if you're under the age of 35?
So there's big gaps and problems there.
Well, but people call cell phones, right?
Yeah, but online has its issues.
I'm saying that, you know, there's a lot of folks out there who are saying that one methodology continues to be a gold standard.
And I, well, I disagree.
I disagree on that.
I think what you actually need to do is look at the track record of the polling organization involved.
Are they generally getting it right?
Are they generally getting it wrong? Are they generally getting it wrong?
That's one thing to arm yourself with.
Who paid for the poll?
What were the questions asked?
You know, we never talk about questionnaires.
We never actually talk about the questions.
There are add-ons.
You could be also talking about what soap you use, right?
Along with who you're going to vote for or which party.
Or something that might have been more relevant
and might have led you to a different answer
than you might have given had you not been asked those questions beforehand.
You know, where you just asked a long series of questions about...
About health care.
Or border security.
And how do you feel about the way your government's doing?
And by the way, how are you going to vote?
You know, those are the types of things that affect the question and affect the outcome.
What about, in terms of, I know you two were kind of screaming back and forth at each other on this issue of how you actually get the person's views. I mean, when polling started in 1840 or whenever it was, it was kind of like word of mouth, a very simple poll. But when George Gallup, I guess, developed it more in the mid-1900s, initially it was door-to-door, and then it was landline phone, and then it became, well, we better add cell phones too, because an increasing number are using cell phones. And now there's an increasing number of pollsters who are doing things online.
In a basic way, what are the differences here,
and how can they impact the result?
Well, I mean, do you want to start at the beginning?
I'm going to start, and you are going to take issue with me,
but I'm going to start.
I think up to the online evolution,
all of the changes had been to make it more accurate.
And when we went from phones to online,
it wasn't because it was more accurate.
It was because it was more convenient and cheaper.
It was cheaper, yes.
Cheaper.
And so it isn't more accurate.
In fact, the big problem with online, although in most cases they seem to be accurate,
although it's fair to say, by the way, for the polling industry,
that outside of elections, we have no accountability whatsoever
because there's no benchmark.
It's only elections where anybody gets to know whether we were accurate or not.
That's why elections are so important in this world.
If someone hires me and I go off and tell them, your reputation, your support, 54% of people have a favorable opinion of you, who's to say I'm wrong or right? There's no other method of knowing. So only in elections do these things get tested. And I would say that online polls are in the ballpark.
Okay, they are in the ballpark.
They are in the ballpark. And if ballpark's all you care about, and for most commercial use, it is,
then you're saving a lot of money and it's pretty good.
If you need to call this thing within one or two percentage points,
you better be doing it by phone.
So the parties, do they do it by phone?
Well, I always have.
I've never done anything by phone.
I can't speak for the other parties.
Not all political parties have done it by phone.
And David's done Liberal Party polling, both provincially and federally.
So the Institute, again, does not do any client work,
does not work with any political parties.
But I have known over the years that other polling organizations
and actors have used online polling. It hasn't
been disastrous for them. I think the point earlier that I was trying to make is not that
one is better than the other. So don't misunderstand me. But that I don't think
any methodology can today in 2019 claim to be pristine or without problems. So phone polling
has its issues: with expense, with high refusal rates, with where are you going to find people. Even if you do get them on their cell phone, do they really want to talk to you? Or worse, it's, you better keep it short, you know, with IVR.
Okay, so that is inherently a problem. One, two questions? Ten minutes, most?
Well, if it's IVR, if it's a robocall, they won't talk to the robot for more than four or five questions.
What's IVR?
So that is basically a robo-dialer where you're talking to a robot: press one for yes, press two for no. They're going to give up after a couple of questions.
No. And my thing is, I talked to one the other day, and for my sins, it's a good thing an IVR pollster isn't next to me. Because I would say, do you really know if it's a cat or a baby answering that question?
But, you know, there's been an evolution to polling.
And every time a new methodology is introduced, everyone loses their proverbial stuff.
So when we switched from the days of door-to-door polling in Gallup to landline polling, people were like,
well, how do we know the person on the other side of the line is
telling the truth? And then that became the big question with the switch to online polling: well, how do you know, because you can't hear the truth in their voice?
But wait a second, there's something different than that. And I have a suspicion that you know far more about the academic theory here than I do.
Okay.
Okay, but there is a fundamental change
from telephone calling to online polling,
and that was the elimination of probability sampling.
Absolutely.
And the entire intellectual foundation of polling,
and the reason one can expect it to be accurate
is because it's based on probability sampling
and that every person had
an equal chance of being surveyed. Now that you're using opt-in online panels, that is gone. So you're just interviewing a lot of people. You're not necessarily interviewing a representative set of people.
Okay, so if it is done well, if it's done well, then I agree with you. The pure randomness is gone.
100% you do not have pure, pure randomness.
But I would also say you don't have pure randomness in phone polling either
because it used to be that you would literally randomly pick phone numbers
out of the phone book.
The phone book doesn't exist anymore.
And in terms of participation and having a home phone
and having somebody, once you get them on their home phone or their cell phone,
wanting to talk to you, I would argue that that is not random anymore either.
When it comes to opt-in panels, and yes, full disclosure, that's the methodology that we use,
it's not as though it's the same thing, and I think this is often fundamentally misunderstood.
It's not as though you're on a news site and it's a poll of the day thing.
Do you approve or disapprove of Justin Trudeau?
You click on it and you don't know if that person is clicking from Kamloops or Kenya.
What you do, if it is done well, if it is done properly,
with an online panel, you are dealing with tens and tens of thousands of Canadians who are found
or recruited or encouraged, not just from the political junkie blogs, but from golf enthusiast
magazine. Maybe they don't like anything to do with politics or from fashion or the new immigrant
or Marie Claire, whatever it is. So you're trying to create a sample of mini Canada, and you're still using the same controls
for age, gender, income, education. Were you born in this country? Were you not? And then I think
the other thing that people should remember is when we deploy that sample online, out of that
panel of tens and tens and tens of thousands, you're talking, again, to 1,500 to 2,000 people,
and there is a degree, a large degree of randomness
in terms of who gets that.
So it's not as though you opt in and all of a sudden people think,
well, I'm going to sign up for this organization
and that way I'm going to influence the result of the poll that comes out.
You can't do that because you can't actually predict
what you're going to receive and be asked about. You want to pick up on that? No, I think it's fair what she
said. And again, the results seem to be indicating that all my polls are in the ballpark correct. So
I don't mean to trash them here in that regard. And I use them a lot in the commercial sense.
And the point is that, you know, the response rates on telephone are so low now.
How low? Is it possible to believe that they are representative?
Eight in ten, nine in ten refusal sometimes. Is that fair? Or higher?
Or higher.
What do you mean, eight or nine?
You make 10 calls to get maybe one respondent. You talk to 10 people before you'd get one respondent, one person that was willing to complete the survey.
So you would think that that itself would be having some impact
on the reliability of the surveys
because there should be some difference
between people that decide to answer them
and people that don't decide to answer them.
But on the other hand, it appears at the moment that that isn't the case,
that the only way these people that answer surveys differ from people who don't is that they're
prepared to answer surveys. It's not that they're lonely and just want somebody to talk to. I mean, if you look at telephone surveys, random telephone surveys, compared to election results, they're going to be within two or three percentage points, generally.
I want to throw in some questions here from the Bridge listeners who have sent a few.
And they've sent a lot.
I'm just going to read a few here.
And by the way, if you've sent other questions aside from polling in the last few
days, I will get to them eventually, but it won't be tonight.
Okay, Pat Wharton from Vernon, BC.
This is the crux of her question.
I'd like to know if you think that pollsters have become influencers.
This goes around this issue, this question that some people have,
and some people have concluded in studies, that pollsters actually
push people towards a result with the results they're seeing,
and that some people figure, I want to be with a winner. Is that conscious or unconscious?
I assume it's conscious. Look, the act of measuring a thing, what's the theory called in science?
Like Heisenberg?
I'm getting it wrong.
But there is a theory that talks about how the act of measuring a thing can change the thing itself.
Wasn't Heisenberg the guy with the meth lab in Breaking Bad?
That's right.
Okay, so it's not that.
Somebody Google it quick.
But the point, it's a real thing.
Clearly my strength is social science and not other science.
But carry on.
The fundamental point I'm trying to make is, yes, the act of changing a thing or measuring a thing can change a thing.
You hear a poll reported two days before an election that says, you know, the incumbent party is a handful of points behind the challenger.
The incumbent party, then their strategists, their campaign managers then turn to all of their volunteers and all of their identified voters and go, look, we're at risk of losing this thing.
You've got to get out and vote.
And then you can see a surge of enthusiasm
that puts that party over the top.
And then the postscript to that is everyone says,
well, the pollsters got it wrong, right?
But yes, so to the question,
to the question are individual pollsters
out there influencing things?
I don't think anybody comes on a chat like this
or goes out and talks to reporters or to their clients and says, I am actively trying to change the outcome of something.
Although you might be trying to do that with a client.
I get it.
But when you're reporting it, yes, people will have reactions and they will react accordingly.
Is that nefarious or is that just human nature?
Well, I'm not sure she's indicating or he that it's nefarious.
It's just an interesting theory that there's an influence there that goes beyond the influence you talked about at the beginning in terms of how you can influence a private company's decision-making based on what you're finding that people want.
But in terms of this, are you influencing voters?
Well, I think you are. And I think it is a discussion of whether that's good or bad. So one way you could influence voters would be just a pure bandwagon effect, right? People like to be with a winner. People want to be associated with something that is successful rather than something that is failing. So that if polls, you know, continue to show somebody winning, it can have both a bandwagon effect for their supporters and it can have a very demotivating effect for the losing party's supporters. You will recall a very, very young Allan Gregg being dragged out by a very young Joe Clark in the 1980 election to desperately try to contradict a late Gallup poll
that they said was wrong and proved to be wrong
by dramatically overstating what the Liberal lead was.
And they knew how dangerous that was late in the campaign
to be shown to be trailing badly.
So I agree it has that kind of impact.
It was actually Bill Neville, the great Bill Neville.
Is that right?
Who dragged out.
Is that right?
Alan, who was the conservative pollster,
and Bill Neville, who was Joe Clark's kind of jury box.
I just remember poor Alan in a leather jacket sitting up there at a table
and looking like he had an impossible test.
It was the worst possible position for any pollster to be in,
for any party, to have to go up there the night before
or two nights before an election and say,
well, the latest polls.
Wrong, especially if they knew it probably wasn't wrong.
Well, it was wrong, though,
because it said the Liberals were up by 20 points
and they only won by 10.
10's enough.
But the impact of that, going back to the question,
then can have a very depressing effect on supporters.
But the impact I think people think is nefarious
is the strategic voting part of it.
And I think that's where pollsters perform a valuable service for voters.
Because it is possible for a voter to say,
my preferred choice is the NDP.
But it is most important to me that another party not win.
And so if the NDP can win my riding, I'll vote for the NDP.
If the NDP can't win my riding,
then I'm going to vote for somebody else who can win
and beat those people I don't want to win.
That's a legitimate voting construct and only polling provides people with
that information about how to best get the government that they want at the end of the day
Yeah, the letters I've had on that, and this issue of strategic voting, the argument to me is, don't vote at all. My party's not going to win. I really wanted them to win, and they're not going to win, and I'm not interested in strategic voting, so I'm just not going to vote.
Well, I mean, I'll just tell you, whatever the polls a week before Election Day say the NDP are at,
they'll get 2% or 3% less than that.
History has shown us that.
They're going to go liberal to defeat the Conservatives.
Yeah, only if the Conservatives are looking like they could win.
And what I would say about that is how much of that is people reacting to polling
and then how much of that is people reacting
to what their leadership is telling them.
So the caveat to that is someone like Elizabeth May,
who whatever she ends up doing in this campaign,
has managed at least this time around to say
a vote for my party is not by necessity a thrown-away vote, because I've been able to show that we got a couple on base in British Columbia, we got a whole bunch on base in New Brunswick, hey, we even got one in Ontario, I won my seat. If you vote your conscience, maybe, just maybe, you'll get what you want. Which has been an interesting antidote to the whole strategic voting thing. That's not, again, to say... Voters are still going to have to say, there's one federal party on offer with a climate plan, one federal party that doesn't care about climate change, and so I could vote Green or I could make sure that Andrew Scheer is not the prime minister.
Well, for climate-motivated voters, that's going to be an interesting dilemma.
And for climate-motivated voters, they're going to, I mean, look,
now we're talking like we are Elizabeth May and Justin Trudeau,
but, you know, so let's not do that.
Let's just stay on polling and not slide into the politics
of what we're witnessing unfold here,
because that's an interesting discussion too,
and we can maybe have that on a later date.
It's polling related.
Yeah.
I mean, isn't everything?
He says carefully.
Bethany Collicutt writes this question.
I've been polled many times, but only on my landline.
Many times?
Like, I've never been polled.
I think there have been attempts to poll, but I haven't picked up.
She really likes talking to pollsters.
But anyway, that's not her question.
With today's trend of younger Canadians only having cell phones,
how do you make sure that your poll sample is representative of the population?
Assuming, of course, that most elderly people or older people are still using
landlines. How do you make sure? So there's, again, there's a whole bunch of things in terms
of sampling, and sampling needs to always be evolving. So it's not just a matter of talking
to people on a cell phone anymore, because what we're finding and what some of the research is
showing on this issue is people now don't want to talk as
much. So if you're trying to reach a young person on their cell phone, it's not going to be, hey,
I'm David and I want to talk to you. And they're like, yeah, click. It's a text message that may pop up.
I mean, we all kind of chuckled about Sarah from the Conservative Party who wanted to know,
you know, press, will you just, can you text back yes or no if you're going to be voting for us? But text-based interviewing is going to be and is
already starting to be the next thing. Facebook messenger-based interviewing is going to be the
next thing. The questions around all of that as practitioners is how do you put up the checks and
balances to make sure you're not talking to a bot
in Russia? How do you make sure you're talking to a real person? But it always has to be evolving
around where do you find people, where they are in the medium in which they actually want to
correspond with you. So the most egregious problems here, and it is a problem, are with the IVR or robocall polling.
Yeah, I would agree.
Because young people will not talk to the robot.
And so they don't answer those surveys.
And so what all polling companies do when they get their raw data in
is they weight that data up against Canadian census data
so that it is reflective of Canadian census data.
It is generally assumed when you do that that you're not doing very much weighting.
You may have a few cases over in the 40 to 50 category and a few cases under in the 29 to 40
category and you're sort of just adjusting that. But what happens with IVR is you can do a 600-person sample
and have five people under the age of 30.
That's supposed to be 150 people.
So then you take that five people and you weight it up
so that it represents 150 people,
and then you've got Naheed Nenshi losing badly in Calgary.
And you're pumping up a tire with less, you know, fewer molecules.
You've got just fewer atoms floating around in that tire.
And that is what happened in Calgary, right?
Yep.
It's assumed.
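[Editor's note: David's IVR example, five respondents under 30 weighted up to stand in for 150, can be made concrete. The counts are the hypothetical ones mentioned above; the "effective sample size" calculation is Kish's standard approximation.]

```python
# Hypothetical 600-person IVR sample: only 5 respondents under 30,
# but the census target says that group should be 25% of the sample.
cells = {
    "under_30": {"respondents": 5,   "target_share": 0.25},
    "30_plus":  {"respondents": 595, "target_share": 0.75},
}
total = sum(c["respondents"] for c in cells.values())

weights = []
for name, c in cells.items():
    w = c["target_share"] * total / c["respondents"]   # weight carried by each respondent
    weights.extend([w] * c["respondents"])
    print(f"{name}: each respondent counts as {w:.2f} people")

# Kish's effective sample size: how much information the weighted sample really carries.
effective_n = sum(weights) ** 2 / sum(w * w for w in weights)
print(f"Nominal n = {total}, effective n = {effective_n:.0f}")   # about 74
```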
All right, here's another one.
Paul, or sorry, David Clark.
Or no, wait a minute.
It's Paul Clark writing from David Clark's account.
Well, there you go.
Oh, wow.
I don't know how we're going to poll you, Paul.
Do the pollsters use any factors to account for the 30% of eligible voters
who do not vote?
Or do they assume the non-voters are distributed in the exact same ratios
as the responses from their sample population?
Yes.
No!
Yes.
The undecided are always allocated proportionally
to what the answers have been.
And so that's how, when you see something that says
the Liberals are at 34 and the Conservatives are at 33 or whatever it is today,
vice versa, right?
They weren't at those numbers.
Those numbers now add up to 100,
but there would have been, say, 15% undecided or won't say.
Oh, you're talking about the netting out of the undecided.
That's exactly right.
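[Editor's note: for readers who want the netting-out step spelled out, a minimal sketch. The raw shares are invented; the mechanics, dropping the undecided and rescaling the rest to 100, are what is being described.]

```python
# Raw answers from a hypothetical poll, including the undecided/won't-say group.
raw = {"Liberal": 29, "Conservative": 28, "NDP": 12, "Green": 8, "Other": 8, "Undecided": 15}

decided_total = sum(v for party, v in raw.items() if party != "Undecided")
reported = {party: round(v / decided_total * 100, 1)
            for party, v in raw.items() if party != "Undecided"}

print(reported)   # {'Liberal': 34.1, 'Conservative': 32.9, 'NDP': 14.1, ...}
# Spreading the undecided proportionally is a modelling choice,
# not an observation about how those people will actually vote.
```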
But are we assuming the undecided are the same as the do not vote?
No.
His question is about the 30% who don't vote.
Right. I mean, you know our usual turnout rates. So there's questions you ask, and again, they're asked in a way that is not meant to ascribe judgment, but realizing everyone's busy: you know, do you think you're going to cast a ballot in this campaign? Are you going to get around to it? Because, you know, voting is a hassle. You might ask people if they know where their polling station
is. You might ask them about their voting history. What was the last election you remember voting in?
You might ask them what the day of the election is. And if they can tell you it's anywhere near
October 21st, chances are you have someone who's probably a little more vote intentional than someone who
actually can't answer that question. So there's ways to understand what you're dealing with. Now,
the difference between the undecideds and the will not votes, the closer you get, realistically
speaking, the closer you get to an election, if you're still dealing with somebody who says,
I don't know who I'm going to vote for 48 hours before an election, chances are, just based on what history tells us, they will not vote.
But to the questioner's point, you're dealing with two different things.
There's people who are non-voters, and then there's people who are still in that pool of, I might change my mind.
The later you get in a campaign,
they tend to kind of merge and become the same person.
This is a slightly different view.
Okay.
I have a slightly different view,
which is I've spent sort of 20 years trying to figure this out through various elections because nothing would be more valuable
and nothing is more of a problem for the polling industry
predicting elections than to model the electorate.
And if you could actually model that 60% or 65% that's going to come out,
and I've tried all the questions that you are suggesting,
and correlations are pretty weak on all of them.
And the reality is that the only thing that's a really valid indicator
of whether you're going to vote in the next election
is whether you voted in the last one.
The Americans have that data.
We do not have that data at our disposal.
So we don't know.
So the Americans can model likely voters based on past voters.
If a Canadian pollster has a likely voter screen,
they're making it up, and it's art and science mixed together.
There's a lot of weighting and a lot of guessing.
They're trying to figure something out.
I'm saying I haven't in 20 years found a good model for that,
for really doing that.
So I think it's very, very hard.
And the problem is people give you socially correct responses.
If you ask people if they intend to vote in the election,
somewhere between 80% and 90% will say yes.
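[Editor's note: since any Canadian likely-voter screen is, as David puts it, art and science mixed together, here is a deliberately made-up sketch of what such a screen can look like. Every question, weight, and cutoff below is hypothetical.]

```python
def likely_voter_score(answers):
    """Toy likely-voter screen: add up a few self-reported signals.
    Weights and the cutoff are invented; self-reported intention alone
    runs 80-90% 'yes', so it gets the least weight here."""
    score = 0
    if answers.get("intends_to_vote"):       score += 1
    if answers.get("voted_last_election"):   score += 2   # the strongest single signal
    if answers.get("knows_election_date"):   score += 1
    if answers.get("knows_polling_station"): score += 1
    return score

respondent = {"intends_to_vote": True, "voted_last_election": False,
              "knows_election_date": True, "knows_polling_station": False}
print("counted as likely voter" if likely_voter_score(respondent) >= 3 else "screened out")
```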
Well, that's why you add caveats. I know you don't want to spend forever on this. I think that you can have a
sense of indication. Is it a magic bullet? No, there are no magic bullets. And what continues
to bedevil us in terms of polling and election outcomes is not measuring the pie. I think
everyone's pretty good at slicing up the pie and figuring out how big the pieces are relative to the parties. It's the size of the pie. It's
the who's going to show up. And young voters, voters under the age of 35, continue to be the
X factor. And so I feel sometimes like we spend too much time talking about the numbers and not
talking about the turnout dynamics. If you have fewer
voters, we were talking earlier about those who are just going to stay home,
staying home, in the last several elections, has only ever benefited the Conservative Party. It's how Stephen Harper won three elections in a row, albeit two of them minorities. If you have large voter
turnout, as we saw in 2015, which was a high watermark for both voters aged 18 to 24 and 24 to 35, the proportionality of their voter participation went up 18 points in one case and 12 points in the other.
They were there.
The big question is, are they turned off and feeling cynical and going to stay home, or are they going to show up again because they're exercised by issues such as affordability and climate change?
I feel like as pollsters, as an industry, as part of this discussion,
we don't spend enough time talking about how turnout affects results.
Okay.
I think we take that message and remember it.
Because it does.
It obviously does.
You know, and I can remember when I started in this business,
when you two were not even born.
You know, our turnouts were in the high 70s.
You know I'm 57 years old, right?
Yeah, right. Yeah, I am.
Unfortunately, I do.
I'm not.
But, you know, our turnouts used to be in the high 70s,
which sounds great now.
I mean, it's low compared with some other areas,
Prince Edward Island, Newfoundland do pretty well.
But, you know, we were in the 50s two elections ago.
Last one, last big one was 88.
Yeah.
Yeah, and that was because we had a very focused debate, right, around free trade.
We could have had a focused debate here.
Maybe it'll still become one around climate change.
Could be.
Could be around health care.
You know, there's a number of things that it could be around.
But at the moment, it is nothing compared with 88.
You were both there in 88.
Sorry to do this, but I want to know how it is that we actually managed
to get to a single ballot question in 88.
That is a remarkable thing because parties are trying to settle a ballot question.
It was the debate.
It was the debate.
Because it wasn't the ballot question before that.
Mulroney didn't want it to be the ballot question.
Broadbent didn't want it to be the ballot question.
Only Turner wanted it to be the ballot question.
And up until the debate, it was not the ballot question.
That worked really well for him.
It became the ballot question because of one very simple ad
and one dynamite exchange.
The erased border.
The erased border.
Everybody could figure that out, right?
You know, true or not, they could figure it out.
And there was a great exchange between Turner and Mulroney,
which was, you know, the flip side of the exchange they'd had in 1984.
They sold us out.
Yeah.
Right?
Exactly.
And, you know, that engaged everybody.
And suddenly that became the debate for the last couple of weeks.
And the fact that it was the ballot question, in a sense,
and Allan Gregg, again, was, you know,
bomb the bridge in terms of the campaign strategy for the final couple of weeks
because the Conservatives were in chaos after that debate.
It's a great story, by the way.
People, this is such a part of political history,
but nobody, I don't think,
really necessarily remembers what it means.
Bomb the Bridge,
which was Allan Gregg's pithy advice
to the Conservative campaign,
was there is growing antipathy toward free trade, right?
And there is growing support for the Liberal Party.
And the link between those two things is John Turner.
And if you blow up John Turner,
you will blow up the link between anti-free trade and Liberal votes.
And so they went after,
they didn't try to defend free trade in the last two weeks.
They went after Turner, right?
Turner was the bridge between free trade and Liberal.
Which was amazing because in the first half of the campaign,
the Liberals actually tried to dump him in the middle of the campaign.
Oh, that I remember.
And then suddenly in the back half of the campaign,
he was their greatest strength.
Mulroney, or sorry, Turner, while he was being attacked,
could have opened up a second front and he didn't.
GST.
GST.
It was just sitting there.
Yeah.
Nobody was talking about it.
It was coming in on January 1st of the following year.
And he didn't.
You know, it was just sitting there.
I can see you've been well briefed by Michael Kirby.
No, it was...
Hey, listen, I remember that campaign really well.
And, you know, 84 as well.
I mean, those were classic campaigns.
They were two of the great campaigns.
To bring it back to 2019, what we don't have is leadership or strategists or an electorate that's either prepared to buy in to the attempts by the parties to frame ballot questions
or the ability to define ballot questions.
We're talking about polling here.
Oh.
But it's a good point.
And we should maybe discuss at some point.
But we've already gone...
That's your polite way of telling us to stop.
46 minutes here, which is great.
This is the new longest ever bridge.
So we've set new records here.
But we actually still have some questions to ask.
Content, content, content.
Okay, lightning round.
Lightning round, exactly.
Valerie Cormier.
She lives two blocks from English Bay.
Isn't she lucky?
This is going to be about tankers.
No tankers in these
questions. What do you consider a fair
cross-section of the population and how do
you attempt to achieve this in your polling?
I feel like
we touched on that. We have touched on
it, so we're looking for the one-sentence answer.
You're looking for, ultimately,
a sample that almost perfectly mirrors
the Canadian population census data. You're looking for mini Canada.
You're looking for mini Canada. And it's based on the census data.
Based on the census data. Yeah.
Okay. What method do you depend on? Well, we kind of did this too, in this century, you know, phone, in-person, internet, we've kind of gone there. This is good. More details about how and where you find poll participants.
So I talked a little bit about recruiting. So people, ultimately it comes down to not just
talking to people, but talking to people who want to talk to you, who want to take five or 10 or
however many minutes out of their day. And best practice is really, you don't want to go much
longer than 10 to 15 minutes max. You're going to lose them after that. So the best way to go about it is, again, when people write to me and go, how do I sign up for your polls, why did nobody ask me, those aren't necessarily the people we want to talk to. We don't not want to talk to them, but that's not going to help us with our blind spots.
But how do you find those people you think are going to help?
Well, methodologies are all different.
We find them online.
There's advertising.
Some people recruit them by phone.
Some people advertise to get them in.
They're paid an incentive, right?
Okay, you want to talk about incentives?
Yes, there is an incentive.
The incentive in many cases is $0.25, $0.50.
Some people, some online panels, will deal in points. You've got to collect so many points.
The idea of the incentive, for 25 cents, people aren't talking to you because they want to. They're filling in the survey while they're watching television.
Ah, they are talking to you. They are filling in a survey. They are responding to a questionnaire because they want to have their say. If at some point they have earned enough points to get their Chapters gift card or their Amazon gift card, fine. But the idea of, oh well, we're all susceptible in the online world to professional respondents who are watching soap operas all day.
And therefore, you know... Yeah, David's giving me the look. Look, look at our track record.
I stand by what we do and how we find them.
I'm not attacking your business.
I really am not.
I'm just going to say, to the questioner, in the traditional method of a telephone survey,
you will have banks and banks of people sitting at computer terminals with a headset on,
and the computer will be randomly generating phone numbers.
It's called random digit dialing.
So it's not coming from a phone book anymore.
It's not coming from a list.
The computer is randomly generating a Canadian telephone number.
So you have no idea who you're going to talk to.
No idea who you're going to talk to.
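[Editor's note: a minimal sketch of the random digit dialing idea just described. The area codes are examples only; a real frame is built from assigned area codes and working exchange banks rather than fully random digits.]

```python
import random

AREA_CODES = ["416", "604", "514", "902", "306", "780"]   # illustrative Canadian area codes

def random_phone_number():
    """Generate one candidate number independent of any list or phone book."""
    area = random.choice(AREA_CODES)
    exchange = random.randint(200, 999)   # central office codes start at 2
    line = random.randint(0, 9999)
    return f"{area}-{exchange}-{line:04d}"

calling_frame = [random_phone_number() for _ in range(10)]
print(calling_frame)
```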
But what's interesting about telephone polling, and what does make it again a little different from the other methods, is that since you are looking for a specific person, now you've randomly, we talked earlier about that random possibility of being surveyed, you've picked up a person. A good company doing telephone interviewing will try over the course of a week, seven or eight times, at different times of the day, to get that person to answer the survey before they move on to somebody else. And the other methods don't have that same rigor of finding who exactly were we supposed to be talking to.
Well, you can deploy in an online world, which is the other main world where you're doing it, where you have a set of
profiling questions that you've asked the person who's opting in. And by the way, opting in is
simply saying, I'm going to give you my name, maybe my cell number, maybe my email address.
And at some point I accept that if you try to contact me in the era of privacy laws and the
way that works, that yes, I've given you my permission to give me a call
or to reach out to me via email and ask me some questions in a poll.
We will then try to find out, okay, well, how old are you?
What part of the country do you live in?
Maybe what are some of your interests?
Maybe what's your ethnic background?
Or what languages do you speak at home?
So we have a sense, again, of some of these census marks that we're trying to hit.
And then we will randomly, within that online sample, deploy to a group that represents.
Right, so like if I hired you to do a survey, you wouldn't send out the invitation to your entire panel.
No.
No, you would send it out to a portion. There is a randomization of the portion of that panel that we know represents, again, that mini Canada by census. And then if you've got shortfalls, because people haven't responded, or because you've got shortfalls elsewhere, you are then going to say, well, let's release another tranche of sample. Ideally that's, again, a very small amount, to find that 18 to 34-year-old gamer in Sudbury
that you just really need to make sure that your sample is watertight.
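[Editor's note: a sketch of the panel deployment being described, drawing a random slice of an opt-in panel against census-style targets and then flagging the cells that come up short for a second tranche. The panel, response rate, and quota targets are all invented.]

```python
import random
from collections import Counter

random.seed(1)

# Toy opt-in panel: each member tagged with an age bracket
# (a real panel carries many more profile fields).
panel = [random.choice(["18-34", "35-54", "55+"]) for _ in range(20000)]

targets = {"18-34": 0.27, "35-54": 0.34, "55+": 0.39}      # invented census-style shares
wanted = {cell: int(share * 1500) for cell, share in targets.items()}

invited = random.sample(panel, 1500)                        # first random tranche
completed = [cell for cell in invited if random.random() < 0.6]   # not everyone responds

counts = Counter(completed)
shortfall = {cell: wanted[cell] - counts.get(cell, 0) for cell in wanted}
print("shortfall per cell:", shortfall)
# A second, targeted tranche would go only to panelists in the short cells,
# e.g. the under-34 group, rather than re-inviting the whole panel.
```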
And in fairness to these methodologies, there are other issues.
And one of them you should be clear about is the cost and speed.
So to do a national survey of 1,500 Canadians, if you do it online, I'm not, I don't know what your pricing is, probably has hard costs of less than five thousand dollars associated with doing that survey.
Sample size of what?
1,500. Pure cost per complete.
Yeah.
Yeah, I would say that's
in the ballpark. Okay. By telephone,
$60,000.
Whoa. $60,000.
And
by telephone, it will take you 10
days to complete that survey. Online,
you'll have the results in two days.
Is that fair?
Yes.
Is that all fair?
Yes.
Well, so the temptation, obviously, then is online.
Yeah. Faster, cheaper.
And cheaper, again, we may disagree on this, cheaper isn't necessarily worse. There are going to be specialized situations, and I talked about this: if you need to understand what's going on in 300 households in a very specific quadrant of a neighborhood in a city, do two things: knock on doors yourself, or hire David. But if you want a snapshot of what's going on in a region, in a province, across the country, you can get pretty good results by going online.
And we're not the only online practitioners out there.
Okay, here's the last one from a listener.
This is from Ken Polk in Brockville, Ontario.
He sent a number in.
I know Ken Polk.
You do?
Do you really?
If it's the Ken Polk I know, he worked on our 2004 election campaign.
Well, I think you've got to throw that question out. He used to work for Mr. Ketcham.
So you fixed this thing?
I did. I fixed all your phones.
Where's the question from Angus? Why aren't you reading the question from Angus? Where is Angus?
He's in some Central American exotic place.
He's in the office today. I just talked to him. He said not to say hi. No, I'm kidding. He said say hi to you both.
Well, know him or not, this Ken Polk in Brockville, Ontario,
it's a good question. Any sense of where millennials are in terms of likely voting?
There seems to be a sharp demographic distinction on where voters stand on climate change, for instance.
Where's this millennial vote? I mean, the thing about millennials, it's such a big bracket,
right? What is it? What do we actually count? That's a very good point. Well, actually,
they're getting older, David, right? They're about to turn 40, or under.
Yeah. So when we're now referring to millennials, but in our heads, and I see this a lot, what we're really talking about is the 20-somethings, and that's more like the iGen or Gen Y or whatever we're calling them. Millennials are hitting middle age, just like the boomers are all doing.
Careful.
Yeah, I'm trying to find a nice way out of that sentence. They're going...
Most seniors are now boomers.
They're aging like fine wine.
They are.
Yes.
But what we do see, and I think we see it regardless of methodology,
is some real generational lines being driven on a lot of questions.
So whether that's climate change,
whether that is even the way Canadians reacted
to the blackface scandal.
Young, progressive people were far more likely
to be outraged, disturbed.
Older people, people over the age of 40, 45,
it's not like anybody was condoning it,
going, yes, blackface is good.
But there was a much higher rate of the shrug factor, or, what's the big deal? And where this really came to play in the data that we looked at was actually among visible minorities themselves. Visible minorities under the age of 35: furious. Their parents were more like, I couldn't get a job, I couldn't find an apartment, people used to, you know,
push me off the sidewalk or do terrible things. This is not the worst thing to happen. So
generationally, when we look at values, it's not just the issues of the day, patriotism.
Canadians over 40 much more likely to claim a strong sense of connection to their country
than Canadians under the age of 40
who have been raised in a really globalized internet era environment. You look at the notion
of family. Canadians over 35, 40, your family is your family. You get what you get. Younger Canadians
know you can pick your family. That applies to their views of marriage, same-sex marriage, LGBTQ2 rights, the economy,
innovation, you name it. We are in this period in this country today where the generations really
are coming apart in a very cavernous way.
Last question goes to you, David, and it's actually a repeat of the first question that went to you, about the most misunderstood part. Because what I found especially interesting about that was this difference between what the public polls are trying to determine versus what the political, the party, polls are.
Because the polls that can have the greatest impact, one assumes, on this election are not the ones we read about in the paper or hear about on television.
They're these ones that the parties are taking.
So tell me, once again, the kind of things that the parties are asking.
What are they looking for in their polling?
And I assume this is nightly polling.
I also assume it's not national.
It's not like everywhere.
It's in particular ridings that could have an impact on the vote on October 21st
that are the ones that they see as potential for victory or potential for defeat,
and they're trying to understand why.
Yeah, I don't want to get too much into that. There's a million different approaches to that.
Some people, when they're polling for elections, poll a bunch of different swing ridings to get
a sense of that. Other people, which is what I always did, is poll the entire jurisdiction
and apply those results to a seat model to see how things move. So there's a bunch of different ways
of doing that. But I think, you raised the question of questionnaires earlier, you should think of it in terms of the questionnaire.
So if you see a poll in the media, it may tell you that Justin Trudeau's favorability ratings
are up or down, or choice for best prime minister is up or down. Why? So that's what the parties
will know. The Liberals will be testing for:
the Conservatives are calling him a phony.
Is that rising, right?
Conservatives are saying he's not as advertised.
The NDP say he can't be trusted.
Which of those things is moving?
And which of those things is correlated to the reduction in his support?
So therefore, what do we need to be positioning against?
What do we need to be balancing against? To go back to the strategic
voting, one of the questions I always asked people was, if you had to choose, would you rather have
a Liberal government or a Conservative government? And so 75% of NDP voters would say I'd rather have
a Liberal government than a Conservative government. And then I will say to those people, is it more important to you to not have a Conservative government or to elect an NDP member?
And then I know who's really vulnerable to me in that NDP cohort.
They want a Liberal government and it's more important to them than electing a New Democrat.
So what do those people care about?
What do I need to say to those people in the last week?
Other questions like second choice.
2015 was really remarkable because what we realized even when the Liberals were in third place was they were everyone's consensus candidate. They were the second choice of both New Democrats and Conservatives. You know you might be on to something there. So again, it's about getting beyond the horse race numbers. And whether that's a party poll or whether that's just a well-constructed public poll, it should always go further than who do you plan to vote for on October 21st.
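[Editor's note: the sequence David describes, vote intention, preferred government, then the trade-off question, amounts to a simple cross-tab. The respondent records below are invented; only the filtering logic reflects the conversation.]

```python
# Hypothetical records from a party's nightly tracking survey.
respondents = [
    {"vote": "NDP",     "prefers": "Liberal",      "priority": "stop_conservatives"},
    {"vote": "NDP",     "prefers": "Liberal",      "priority": "elect_ndp_mp"},
    {"vote": "NDP",     "prefers": "Conservative", "priority": "elect_ndp_mp"},
    {"vote": "Liberal", "prefers": "Liberal",      "priority": "stop_conservatives"},
    # ...thousands more in a real tracking file
]

# The "vulnerable to me" segment: NDP voters who would rather have a Liberal
# government and who rank stopping the Conservatives above electing an NDP MP.
persuadable = [r for r in respondents
               if r["vote"] == "NDP"
               and r["prefers"] == "Liberal"
               and r["priority"] == "stop_conservatives"]

print(f"{len(persuadable)} of {len(respondents)} respondents in the persuadable NDP segment")
```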
But what you're also saying is that when we watch a leader,
if they're a newser or their speech or whatever.
Research tested to within an inch of its life.
Right.
Strategically driven by quantitative surveys,
detailed language driven by focus groups. Nothing, nothing impromptu about that. They're not talking about it unless they know exactly who they're going to win and who they can potentially lose, and that's okay, they've already calibrated for that.
So they're constantly focus grouping speeches as they're taking place? Real-time focus groups?
Absolutely.
They will be doing some of that for sure.
I'm sure they'll be doing that for the debates.
Right.
But they have got messages, right,
that they have researched to know exactly,
not just the kind of thing you need to be saying.
I'm talking about the research gets down to,
how do you construct this sentence?
What words go in?
What words go out?
What do you emphasize?
What do you not emphasize?
And don't say it any other way.
Don't get experimental about this.
Say it that way over and over and over again.
Because when you get tired of saying it, people have just started to hear it.
This has been fascinating.
As I fully expected, Shachi and David, it's been great.
And there's a little, I was going to say politics 101 course there.
It's actually like politics 401.
It's really been good.
Fascinating.
Thank you.
Thanks for having me on.
Yeah, thanks for having us.
Honoured to be on The Bridge, and great to see what your next venture is.
Well, I look forward to my next appearance on the Herle Burly.
Yeah, me too.
Well, you know what? You can both come on the Herle Burly.
I'd be delighted.
But I will say one thing, right?
If this is Peter's source of income, the next time we're on this thing,
it's not going to be at the Shangri-La.
All right, you two.
Look, you know, he gave you a nice can of Bud Light.
I'm finishing off a nice, you know, glass of white wine.
It's all good.
I don't understand how lucrative these podcasts are.
Oh, goodness.
Yeah, exactly.
This is where you want to retire.
Well, if you like to be paid in likes, David, you're a winner.
Thank you both.
Thank you.
Well, I hope you enjoyed that.
It was different and it was long.
That's the longest Bridge on record.
We'll see whether we can beat that in the days ahead.
If you have some thoughts on what you've heard over the last hour,
I consider it really, as I suggested there, a kind of master class in the art of polling
and raising the issues that surround polling.
And I thought both Shachi and David were very honest
and straightforward in their answers about how they handle some of those questions.
I think I learned a lot, and as somebody who's been covering politics
for almost 50 years and polling for most of that time,
I learned a lot in those discussions.
So I hope you did as well.
Anyway, if you have thoughts, drop me a line
at themansbridgepodcast at gmail.com.
That's themansbridgepodcast at gmail.com.
We'll be back tomorrow night with a regular version of The Bridge.
Hope you've enjoyed today. Thank you.