Front Burner - What’s up with these political polls?

Episode Date: November 4, 2024

With the U.S. election just a day away and a Canadian one that could be called very soon, we're all spending a lot of time talking about polls. But how exactly do they work, and what happens when they... get it wrong? Last week, Saskatchewan Premier Scott Moe won another majority government for the Saskatchewan Party despite some polls beforehand showing the NDP in the lead. And famously, the polls badly underestimated Donald Trump's voter base in both 2016 and 2020. So to better understand the ins and outs of the polling business and the challenges of adapting it to changing habits and politics, we're talking to David Coletto, founder and CEO of Abacus Data.

For transcripts of Front Burner, please visit: https://www.cbc.ca/radio/frontburner/transcripts

Transcript
[00:00:00] This is a CBC Podcast. Hi, I'm Jamie Poisson. With the U.S. election just one day away, and a Canadian one that could really be called any minute now, we've spent a lot of time talking about one thing, polls. Election strategies are built around them.
[00:00:46] Pundits base their forecasts and analysis on them. Shows like ours look to them to get a sense of what's really going on. But what happens when they get it wrong? In 2016 and 2020, the polls severely underestimated the groundswell of support for Donald Trump. Just last week in Saskatchewan, polls showed a tight race with the NDP slightly ahead. But in the end, Premier Scott Moe's Saskatchewan Party still ended up with a majority government. To get to the bottom of why polls sometimes fail and what pollsters are doing to adapt to changing tech and political landscapes, I'm talking to David Coletto, founder and CEO of Abacus Data.
[00:01:30] David, hey, it's always great to have you. Thanks for having me, Jamie. Great to see you. It's good to see you too. So maybe we can actually start here. How has polling been done up until recently, and how has it changed? Well, the first thing to keep in mind is that polling is a relatively young science or field, right? It wasn't until basically the 1930s that you started to see researchers, starting in an academic environment, being able to do mass, large-sample-size surveys. But it has evolved as technology has evolved. So how people communicate, how we get a hold of them, has changed. In the early days of polling,
[00:02:14] you'd go to people's homes and you'd sit down with them and you'd ask them questions. And then as the telephone became universal, you called them. And the telephone is still universal. Everybody has one. In fact, everyone carries one around pretty much, but they don't answer them as much anymore. And if they do, they don't talk to us. So in the last two decades, we've moved towards mostly online research, and we recruit people through different methods online. But what has become, I think, very clear to me is that fewer and fewer people are easily accessible to conduct surveys. And I think fewer and fewer people are interested in answering them, which makes the accuracy of polling far more challenging. And we have seen
[00:03:00] a shift from it being very much a science, where you could randomly survey, you know, a thousand people and it would be pretty accurate, to one where pollsters like me have to adjust knowing that this is very much imperfect. I just have a quick logistical question. When you're trying to grab people online, like, how? Are you sending them emails? How are you reaching them? Yeah, we're reaching them primarily by email, but these are not random emails. We cannot spam people. You know, you can't generate random email addresses and then send out millions of them. So we, like most pollsters, all pollsters, have access to what we call panels.
[00:03:39] These are people who have indicated their willingness to participate in surveys. They're sometimes called opt-in panels, opt-in surveys, because people are opting in to do them. And so, for example, Abacus Data has access to about half a million Canadians that we can recruit from these different panels and find the right mix of people based on their demographics, where they live, and, in the case of our political surveys, sort of their political outlook. But they are coming in from these panels. And so if you were
[00:04:13] not a member of this panel, and Abacus Data is doing an online survey, you're not going to be invited. You have to be a member of one. Does a certain kind of person agree to be part of that panel? And I ask this in part because, well, you know, I interviewed the prime minister last year. We were talking about how the polling shows that young people are moving away from him in droves. And he made this comment to me that was like: First of all, how many of our young people are actually answering polls? Well, I don't think David Coletto at Abacus Data is talking to landlines. I think he's talking to them.
[00:04:46] Well, there's all sorts of questions. But the reality is, in the polls, we were low. I'm not saying that that was his best answer to that question. But does he have a point? Well, look, the thing that keeps me up at night as a researcher, as a pollster, is: are the people who we're talking to, who are answering our surveys, the same as the people who don't? Because if they are fundamentally different, then we have a problem. When the prime minister or others say that some subgroups of our population are not participating, are not able to be reached, there's, I think, some truth to that, which is why there's so much uncertainty around the U.S. election today. Because there's this notion, we'll talk about this, I know, Jamie, but, you know, the notion of: are there shy Trump voters? Are there certain types of demographics that just aren't reachable at the same rate or
[00:05:39] the same level as others, that make these estimates inaccurate or miss something? And so I think, though, that right now, at least in the Canadian context, we have seen a lot of evidence that our methods still are reaching a representative sample of people, which is what we aim to do. David, when you say that there's been evidence that it's working, like, well, so give it to me. What do you think? Like, when has it been borne out recently? Well, look, there hasn't been a federal Canadian election in which the polls have been, you know, wrong, that people point to. We can talk about Saskatchewan just last week, in which the polls, yeah, certainly missed estimating the popular vote.
[00:06:25] The last round of polls are indicating a very tight race, as the Saskatchewan NDP under Carla Beck seemed to have gained the momentum in the last days of this provincial campaign against Scott Moe and the Saskatchewan Party. If we look at the polling average, I have 48% for the Saskatchewan NDP and 47% for the Sask Party. CBC News is projecting the Saskatchewan Party will form its fifth straight majority government. The Saskatchewan Party was elected in 32 seats, one more than is needed for a majority in that province. The NDP won 22. That's an improvement on the 14 the party had at dissolution. There have been other provincial elections that are clear misses, but for the most part, Canadian pollsters have been pretty good
[00:07:11] at being able to estimate how people are going to vote, how people are feeling about things. So I'm still really confident that we can do this. But we also have to recognize that these are imperfect tools. And in a race like the one in the United States, or in some jurisdictions in Canada, that is incredibly close, we have to also recognize that these things can't measure to that level of precision. There's a margin of error for a reason. And we have to recognize that at some point, if it's so close, we just have to wait until the votes are counted, because we're not really going to get a good sense of it from the polling itself. You often hear people talk about why they think the polling was off in this election or that race afterwards, I suppose, with the benefit of hindsight. Because you just mentioned Saskatchewan, why do you think the polls there were just off?
[00:08:22] Do you have a sense of what happened there? Yeah, I think, typically speaking, it's a technical term called non-response bias. But basically, in layman's terms, it means that certain voters were not participating in our surveys at the same rate as they should have been, and they voted differently than the rest. And as a result, the polls underestimated them. In the case of Saskatchewan, I think the polls did a pretty good job at estimating that the NDP were seeing massive swings in their favor in the cities, in Saskatoon and Regina, but totally underestimated just how strong the Saskatchewan Party support was in the smaller communities and rural parts of the province, which is where they ran up their numbers and outperformed the polls.
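To make the mechanics of non-response bias concrete, here is a toy back-of-the-envelope sketch. All of the numbers are invented for illustration; they are not from Abacus Data or the Saskatchewan election, but they show how a group that responds at half the rate of everyone else, and votes differently, drags the raw poll number away from the truth:

```python
# Toy illustration of non-response bias (all numbers invented).
# A 40% rural electorate votes 70/30 for Party A; a 60% urban electorate
# votes 40/60. True Party A support is 0.4*0.70 + 0.6*0.40 = 0.52.
rural_share, urban_share = 0.40, 0.60
rural_support, urban_support = 0.70, 0.40    # Party A vote share in each group
rural_response, urban_response = 0.05, 0.10  # rural voters respond at half the rate

true_support = rural_share * rural_support + urban_share * urban_support

# The achieved sample's composition is proportional to share * response rate.
rural_weight = rural_share * rural_response
urban_weight = urban_share * urban_response
polled_support = (rural_weight * rural_support + urban_weight * urban_support) / (
    rural_weight + urban_weight
)

print(f"true support:   {true_support:.3f}")    # 0.520
print(f"polled support: {polled_support:.3f}")  # 0.475 -- a 4.5-point miss
```

No amount of extra interviewing fixes this on its own: as long as rural respondents keep coming in at half the rate, a bigger sample just reproduces the same skew more precisely.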
[00:09:11] Right. So at the end of the day, it was still a majority government for the Sask Party, but a slim majority, the slimmest they've seen since they were first elected. But the polls as a whole missed the fact that rural Saskatchewan still overwhelmingly voted for them. So what does that mean? Yeah. Like, who did they miss? Who is the voter that they missed? Yeah. They missed folks who live in small and rural communities who likely don't sign up for online panels or don't pick up the phone and tell people how they're feeling. Right. And we see it in a lot of communities across the country. I think polls systematically do underrepresent small-c conservative-oriented Canadians who maybe don't normally participate in politics, maybe just aren't interested in sharing their views.
[00:10:05] I don't think there were voters who were shy to admit they were voting for Scott Moe and the Sask Party, but we missed them. We weren't able to get them to participate, whether the surveys were done online or, as many of them were, done by phone using automated telephone surveys. When you hear people talk about how the polls underestimated support for Trump in 2016, and then in 2020, is it for some of the same reasons here? I think it is. But also different, because one of the things we have to remember about American politics right now is we have historic levels of participation, right? The US, even in presidential elections, compared to Canada, typically has much lower participation, meaning
[00:10:53] fewer eligible voters have actually participated in presidential elections in the past. What is unique about the Trump era is that he has engaged both his own supporters, people who may never have voted in elections, who may have looked at past Republican candidates or past presidential candidates and thought none of them were for them. Trump has spoken to many of those people, and they've engaged and come out to vote. At the same time, the threat of Trump has also engaged, I think, some voters who are opposed to him to come out and vote who might not have normally done so. But in the Trump universe, I do think there is a group of people who, again, probably don't read a lot of news and probably wouldn't have ever voted because there wasn't a candidate who spoke to them. But these are also probably people who, again, won't answer a survey, won't join an online panel, won't pick up the phone and talk to a pollster for 20 minutes, right? The pollsters this time, I believe, have adjusted their methods to try to, if not reach those people more effectively, at least weight them up. And that's a technical thing we do to make sure that they're picking them up in the survey.
[00:11:54] Can you explain that to me, this weighting? I've heard a lot of people talk about this in the last couple of weeks. Yeah. So pollsters make choices around how to make their samples more representative, or as representative as they can be. We know that it's harder to get young people to participate, that some people, depending on where they live or their background or their socioeconomic status, just are less likely to participate. And so what weighting basically does is it makes some groups
[00:12:39] in our survey worth more in terms of their responses than others. It's an adjustment, basically. And so in the case of the US, for example, back in 2016, one of the things that was identified as to why the pollsters were perhaps missing some Trump voters is that many state-level polls didn't weight by education, right? And your level of education, whether you went to college or university or not, became such a powerful predictor of whether you voted for Trump or Clinton that if you didn't do it, you were in a way biasing your surveys, right? And so that's an example. We do that. So when we do a national survey in Canada, we weight by age, gender, education, official language and region on most surveys.
[00:13:27] And then for others, we might adjust for some political variable, knowing that conservatives, for example, are less likely to answer our surveys.
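As a rough sketch of what that adjustment looks like in practice, here is a minimal cell-weighting example in Python. This is not Abacus Data's actual procedure, and the population targets are invented; it just shows the core idea that each respondent's weight is their group's population share divided by its sample share, so under-represented groups count for more:

```python
from collections import Counter

# Minimal sketch of cell weighting (not any pollster's actual procedure;
# the population targets below are invented for illustration).
respondents = [
    {"id": 1, "education": "degree",    "vote": "A"},
    {"id": 2, "education": "degree",    "vote": "A"},
    {"id": 3, "education": "degree",    "vote": "B"},
    {"id": 4, "education": "no_degree", "vote": "B"},
]
population_share = {"degree": 0.35, "no_degree": 0.65}  # hypothetical census targets

# Sample shares: 3/4 of respondents have a degree, 1/4 do not.
counts = Counter(r["education"] for r in respondents)
sample_share = {group: n / len(respondents) for group, n in counts.items()}

# weight = population share / sample share for the respondent's group
weights = {group: population_share[group] / sample_share[group] for group in counts}
for r in respondents:
    r["weight"] = weights[r["education"]]

total_weight = sum(r["weight"] for r in respondents)
weighted_a = sum(r["weight"] for r in respondents if r["vote"] == "A") / total_weight
unweighted_a = sum(1 for r in respondents if r["vote"] == "A") / len(respondents)

print(f"unweighted share for A: {unweighted_a:.2f}")  # 0.50
print(f"weighted share for A:   {weighted_a:.2f}")    # 0.23
```

Because degree holders are over-represented in this toy sample, weighting shrinks their influence and the estimate moves, which is exactly the education effect David describes from 2016.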
[00:14:09] What makes a bad poll? Like, if you're looking at polls and you,
[00:14:40] being a pollster, think, oh, this is garbage. Like, what is it that you're seeing in that poll that you don't like? Well, the first thing, if it goes against conventional wisdom, the first step is to say, well, that's not necessarily garbage, because there are moments where things happen and something has to pick it up. But if something seems weird and off, odds are it is. And so bad polls come in many different shapes or forms. Sometimes it's because of the pollster. It's not malicious, they don't intend to put crap out, but sometimes they do. And we've seen maybe some evidence of that in the United States, some so-called pollsters trying to tilt the models to favor one side or the other. But typically speaking, a quality poll comes together based on the questions that are asked,
[00:15:33] the order of the questions they're asked, the quality of the sample, and then the quality of the analysis of the data. And so sample size alone doesn't indicate quality, right? People often think, well, oh, the larger the sample size, the better the survey. Not true. It is potentially more accurate if the other pieces of that poll are good. You can do a large national survey of 10,000 people, and if the questions are garbage, if the weighting and the representativeness of the sample is off, no big survey sample is going to solve for all those other things. So the reputation of the pollster, their performance in the past, their transparency and willingness to share what they're doing and how they're doing it, I think, are good indicators to look for. Now, that's hard for the average consumer to figure out, well, what do I know? And so I think, you know, as somebody who started a company 14 years ago
[00:16:32] and has made it, I think, to a point where our research is well respected, it took time. And you've got to earn your dues, I think, before you can build the confidence that people need to trust what you're putting out there, because it's a big responsibility. On that note, do you have a sense that the polls affect future polls? Do you know what I mean? Like, we poll so far out from elections. And do we have any evidence that they, you know, factor into how people end up thinking about issues or politicians or political parties?
[00:17:12] I think there is evidence. Academic studies have been done that show, you know, polls do have an effect on people. You know, look at the Canadian political landscape. There is no doubt in my mind that the number of polls that suggest and indicate that the Liberals are deeply unpopular and the Conservatives are well ahead has helped frame things in people's minds. When I ask them, who do you think is going to win the next election, now almost half of Canadians think the Conservatives are going to win that election. And that doesn't just happen, right? I think the narrative that media and journalists write, which is informed by the polls, has an impact. And so, yeah, I do think polls can influence the results. Not necessarily in a bad way. I don't think that's a bad thing. But I think what the virtue of a survey is, is it helps people understand
[00:18:02] what their neighbors are thinking, what Canadians in other parts of the country are thinking. And I use the 2011 federal election as a good example of that, because in that election, the polls, when the election started, said, you know, Stephen Harper was fairly well ahead. Michael Ignatieff and the Liberals, who were the official opposition, were in second. And Jack Layton was in the typical place he usually was. But about halfway through that campaign, something was happening in Quebec in particular. The NDP vote was starting to rise. And so the polls picked it up and indicated to Canadians
[00:18:36] in other parts of the country, perhaps those who did not want Stephen Harper to win that election, that Jack Layton might become the best alternative to stop him. My friends, Canadians have asked New Democrats to take on more responsibility in Parliament. For the first time in our history, they have asked us to serve as Canada's official opposition. And so in that case, polls helped inform voters. In the absence of any of them, nobody would have known the orange wave was happening in Quebec until it happened. Right. And so, you know, the political parties would have known that was happening, but the public polls allowed the rest of the country and voters to make a slightly more
[00:19:23] informed choice. Now, as long as the polls are accurate and there's no nefarious kind of disinformation-type behavior happening, then I think that's a good process. It's when it isn't, when it's bad stuff, that you could then have the negative effect that we're trying to prevent. Just going back to the U.S. election, have we seen any of that happening in this election? Because I know there are a lot of partisan polls out there, right? Yeah, I mean, I think there's evidence that in some states, particularly, the state-level polls and some of the national polls are being released by companies I've never heard of, or by news organizations that I've never heard of. And there is some view that because of how influential some of these models are, so think of Nate Silver, or FiveThirtyEight, which is ABC News' model, or The Economist, or, you know, think of the equivalents of Éric Grenier, right, who take all the polling data and then try to tell us, okay, what's likely to happen? Yeah, some have said that
[00:20:36] some people are putting out bogus-type polls to try to move that average in favor of one candidate or the other. And many are arguing it's the Trump side that's doing it. The modelers say, well, look, it's not having a real effect on it. But nonetheless, there's an effort, which then reminds us of how important people believe these models are, that they're informing the conversation, that we're coming into, you know, tomorrow's election, basically, with a flip-a-coin scenario. And if you're a campaign, you kind of want that, because it means your voters are going to be motivated to vote. Are you confident that the pollsters this time have weighted properly and thought about past mistakes and corrected for them? Or are you not willing to go that far? We have to evaluate the success, the so-called success, of polling through a lens that recognizes there's going to be, there should be, error. There should be variation between the polls.
[00:22:02] And in an environment like the one we're in right now, where in most of the states that are going to decide this outcome it's a one- or two-point difference between the two main candidates, we have to step away from the idea that the polls are going to, quote, miss, right? Now, if there is a miss in the sense that one of these candidates wins by a large margin on Tuesday night, or when all the votes are counted, that to me will be a big mess and a big miss. But if this is close, and look, I haven't done a lot of polling in the US during this election, but one of the things I've heard about the pollsters is they're all hedging their bets. They're all herding around this 50-50 kind of mark. Nobody wants to take a position and say who they think is going to win. Which, by the way, as a pollster who's been in that situation in very close elections before, I understand, right, that fear of picking an outcome that I know in my heart my polls can't estimate, because it is that close.
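For a sense of just how little precision a poll can offer in a race like that, here is the textbook margin-of-error calculation for a simple random sample. Real panel-based polls aren't simple random samples, so treat these numbers as a rough floor on the uncertainty rather than an exact figure:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p in a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (500, 1000, 10000):
    print(f"n = {n:>5}: +/- {100 * margin_of_error(0.5, n):.1f} points")

# n =   500: +/- 4.4 points
# n =  1000: +/- 3.1 points
# n = 10000: +/- 1.0 points
```

Even a 10,000-person poll carries roughly a point of random error each way, so a one- or two-point race genuinely sits inside the noise. And this formula only counts sampling error: if the sample is skewed, a bigger n just shrinks the error bars around the wrong number, which is David's earlier point that sample size alone doesn't indicate quality.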
[00:22:57] So basically what you're telling me is that you're not going to tell me who you think is going to win right now. If you had to bet, how much would you bet? Yeah, if I had to bet, I don't know if I'd put any money on this one. But look, as I look at what we've seen over the last few days, right, there was a poll out on Saturday night from a very well-respected pollster in Iowa, a state that no one thought was in the mix, right? And Ann Selzer, who polls for the Des Moines Register, said that her poll has Kamala Harris up by three in a state
[00:23:36] that Donald Trump has won twice before. Now, that's, again, a very courageous pollster, who puts out a result that goes entirely against even a poll that came out earlier in the day that had Trump up by 10. Now, if that poll is right, it's because Harris is winning, you know, college-educated voters, white seniors, women, by large margins that are going to put her over the top in many of these states. So, you know what, I'll say it. I have a feeling that Kamala Harris will win the election, but my confidence on that call is not very high. David, thank you, as always, for being here. It's a pleasure. My pleasure.
[00:24:29] Thanks, Jamie. All right, that is all for today. I'm Jamie Poisson. Thanks so much for listening. Talk to you tomorrow. For more CBC Podcasts, go to cbc.ca/podcasts.
