Pollsters face midterm test after missing mark last couple of cycles


Soon, the country will learn the results of the 2022 midterm elections, a key test of whether public opinion polls accurately reflect how the electorate is feeling and voting this cycle. In recent elections, the polls have been overestimating Democrats’ performance, leading some to question whether they are a good barometer.

Pre-election polls during the 2020 election underestimated former President Donald Trump’s vote margin by about 5 percentage points, the highest error in the last 20 years, according to the American Association for Public Opinion Research. In 2018, Republicans performed better than polls predicted in Florida, Missouri, Indiana, Tennessee, and Ohio.

In recent cycles, pollsters said their results have been plagued by something called nonresponse bias — a phenomenon in which Trump’s supporters have been less likely to respond to surveys than demographically similar voters. The New York Times reported that white registered Democrats were more than 20% likelier to respond to their surveys than white registered Republicans.

“Because many Republicans aren’t participating, many pollsters are left to weight the data to try to increase the weight that’s given to Republican respondents. People who tend to vote Republican just don’t participate in surveys as much, and that’s what messes things up so much,” said Samara Klar, an associate professor at the University of Arizona School of Government and Public Policy.

To account for the differences in response rates, pollsters weight their results to correct the imbalance and match the demographics of the electorate, but that has proven challenging. In 2020, some of pollsters' usual statistical techniques were unable to fix the problem, particularly among demographic groups such as Hispanic voters, who tend to respond to surveys at lower rates than others. Weighting up respondents from underrepresented groups so they stand in for the broader population didn't do the trick.
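In principle, that kind of weighting is simple bookkeeping: each respondent's weight is the share of the target electorate in their demographic cell divided by the share of the sample in that cell. The sketch below illustrates the idea in Python with made-up population shares and sample counts; it is not any pollster's actual model, just the basic arithmetic the article describes.

```python
# Minimal sketch of cell-based post-stratification weighting.
# Population shares and sample counts below are hypothetical, for illustration only.

population_shares = {          # target share of the electorate in each cell
    ("college", "Dem"): 0.20,
    ("college", "Rep"): 0.15,
    ("no_college", "Dem"): 0.25,
    ("no_college", "Rep"): 0.40,
}

sample_counts = {              # how many respondents landed in each cell
    ("college", "Dem"): 300,
    ("college", "Rep"): 200,
    ("no_college", "Dem"): 300,
    ("no_college", "Rep"): 200,  # the group that answers surveys least often
}

n = sum(sample_counts.values())

# Weight = (population share) / (sample share): cells underrepresented in the
# sample get weights above 1, overrepresented cells get weights below 1.
weights = {
    cell: population_shares[cell] / (sample_counts[cell] / n)
    for cell in population_shares
}

for cell, w in sorted(weights.items()):
    print(cell, round(w, 2))
# In this toy example, ("no_college", "Rep") ends up with the largest weight (2.0),
# showing how a few respondents carry a lot of influence when a group rarely responds.
```

The flip side, as 2020 showed, is that a handful of hard-to-reach respondents end up carrying large weights, and the adjustment only helps if the people who did respond actually resemble those who didn't.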

Some past polls did not properly represent less educated voters, a failure thought to be one of the major sources of error in 2016. Klar, who specializes in public opinion and survey research, said pollsters are much more cognizant of this now.

“National-level polls are actually pretty good at weighting by education because we have pretty good national data that we use as benchmarks. If you look at census data or country-level data, we have a pretty good idea of the proportion of voters who, for example, graduated from college,” she said.

However, some pollsters said they are adopting other ways to deal with the wide disparity between Democratic and Republican response rates, beyond making statistical adjustments.

“Not everyone is weighting them, that’s not the solution everyone is deploying,” said Celinda Lake, a leading Democratic pollster and political strategist who worked on Joe Biden’s 2020 campaign. Lake said another approach is to continue seeking out Trump Republicans until pollsters have the right number in their sample.

Lake is concerned about the accuracy of public opinion polling this election cycle because, she said, pollsters are relying on a broad range of turnout models.

“I think, ironically, polling in the off-year is even harder than polling in the on-year. While we have corrected a lot of things, some of the problems may be greater this cycle because one of the key problems is estimating who is going to turn out. I don’t think anyone knows who is turning out. The reports are so uneven by state and so different by state,” she explained.

This cycle, there’s been a significant divide between some of the independent polling and polls released by firms affiliated with one party or another. Lake warns against giving these kinds of polls too much consideration.

“I think it’s really wrong to use polls to influence the outcome of the races. You’re seeing more and more of that and campaigns releasing cheap public polls to influence outcomes.”


Chuck Coughlin, the president and CEO of Highground, a political consulting firm in Phoenix, said not all polls are created equal. Live-caller telephone polls with transparent methodologies are still considered the gold standard, and that is how his company conducts its polling. Unfortunately, he said, he has seen fewer "gold standard" polls and more sloppy, quick-and-dirty ones that tend to show large, double-digit spreads between candidates.

“We are calling asking for specific voters because we want to fill a quota in our poll. The quota is based on where you live, how old you are, what sex you are, what party you are. We are filling those quotas, and I think we are pretty accurate,” Coughlin explained. “We do not do any push button polls either. I don’t trust those, you don’t know who is doing them. Ours are all live-caller responses, which are more expensive to do and it takes longer, but it’s the right thing to do.”
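Quota sampling of the kind Coughlin describes can be thought of as bookkeeping over demographic cells: the pollster sets a target number of completed interviews per cell (region, age, sex, party) and keeps dialing until each cell is full. The following is a minimal, hypothetical sketch of that bookkeeping; the cell definitions and targets are invented for illustration and are not Highground's actual quotas.

```python
# Minimal sketch of quota tracking for a live-caller poll.
# Cell definitions and targets are hypothetical, for illustration only.

from collections import Counter

targets = {                       # desired completed interviews per cell
    ("Maricopa", "18-44", "F", "Rep"): 60,
    ("Maricopa", "18-44", "F", "Dem"): 55,
    ("Maricopa", "45+", "M", "Rep"): 70,
    # ... one entry per (region, age, sex, party) cell
}

completed = Counter()             # interviews finished so far, per cell

def needs_more(cell):
    """True while this cell is still short of its quota."""
    return completed[cell] < targets.get(cell, 0)

def record_interview(cell):
    """Call after a completed live interview; returns False if the cell is already full."""
    if not needs_more(cell):
        return False              # quota met: stop dialing this cell
    completed[cell] += 1
    return True

# Example: log one completed interview and check progress toward the quota.
cell = ("Maricopa", "45+", "M", "Rep")
record_interview(cell)
print(completed[cell], "of", targets[cell], "interviews done in", cell)
```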

The largest challenge for pollsters like Coughlin is actually getting the electorate to participate in their polls and answer their phones.

“These most recent ones are quick, it’s a 12-minute poll. So, it’s not long. Earlier in the cycle, you are doing 25-minute polls where people are like, ‘Heck, I’m not going to do this.’ Now, it’s a shorter poll, so it’s just about getting people to do it. That’s the largest challenge,” he said.
