What ‘blue wave’? Why pre-election polls faltered again.

A man leaves the polling place at Reed High School in Sparks, Nevada, on Nov. 3, 2020. In many battleground states, Democrat Joe Biden won by a narrower margin than polls had predicted. High turnout may be one factor that pollsters had trouble grappling with.

Scott Sonner/AP

November 10, 2020

A majority of opinion polls showed Democrat Joe Biden leading President Donald Trump by a wide margin in the run-up to the Nov. 3 election. But President-elect Biden’s winning margins in battleground states like Wisconsin and Michigan proved to be much narrower than the polls predicted.

Similarly, many Democrats in congressional races underperformed their polls. The "blue wave" that was supposed to carry Democrats to a majority in the U.S. Senate never materialized.

Critics say the polling industry has failed to learn from its misses in the 2016 presidential election.

Why We Wrote This

A conundrum of the recent U.S. election is how pollsters failed again, despite their soul-searching after the 2016 election. Here are some theories on this important industry that reflects – and helps to shape – public thought.

How wrong were the polls in predicting the 2020 election results? 

In terms of the popular vote, not as wrong as you might think. The final RealClearPolitics average of polls had former Vice President Biden ahead by 7.2 percentage points, and FiveThirtyEight projected a margin of victory for Mr. Biden of 8.4 points.

As ballots are still being counted, including provisional and mail-in ballots in non-battleground states, the final tally is likely to change. Mr. Biden is currently ahead by 3 percentage points, but analysts say that could climb toward 5 points, given where the outstanding ballots are. The average polling error for presidential elections since 1968 was 3 points, according to FiveThirtyEight. So pollsters could end up within – or not far from – normal margins of error on the overall popular vote.


Where the polls erred more was in battleground states that both candidates needed to win. The final RealClearPolitics average for Wisconsin predicted a 7-point win for Mr. Biden, with smaller margins in Michigan and Pennsylvania. Most polls also had shown little change during the campaign, suggesting that Mr. Biden’s advantage was stable. In the end, his victories in these three crucial states were thin; Wisconsin was won by 20,540 votes. 

Some pollsters did better in predicting state-level votes. Suffolk University Political Research Center was within 2 points of final results in Florida, Arizona, New Hampshire, and Minnesota, although it also overestimated Mr. Biden’s margin in Pennsylvania. 

Even more erratic was the polling for closely watched congressional races, such as the Maine Senate seat defended by Republican Susan Collins. Pre-election polls gave her opponent, Sara Gideon, a solid lead; Senator Collins won by 9 points. In South Carolina, incumbent Sen. Lindsey Graham beat his Democratic challenger by 10 points, defying polls that had shown a dead heat.

Analysts say ticket-splitting may have been an underappreciated factor in states like Maine, where Mr. Biden ran far ahead of Hillary Clinton's 2016 margin but failed to lift Ms. Gideon. Because intensifying partisanship usually boosts straight-ticket voting, pollsters may have discounted that possibility.

What all these polling inaccuracies have in common is a direction of travel: support for Democratic candidates was often wildly overestimated. 


What is behind these misses in election polling? 

Accurate polling rests on two critical calculations: the makeup of the electorate and which eligible voters are most likely to cast ballots. These calculations allow polling agencies to weight survey responses and project the outcome of an actual election.
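To make that weighting step concrete, here is a minimal sketch in Python of reweighting a poll sample to match an assumed electorate. The demographic groups, electorate shares, and support figures are invented for illustration; they are not from the article or from any actual poll.

```python
# A minimal sketch of demographic weighting of a poll sample.
# All numbers below are illustrative assumptions, not real polling data.

# Assumed share of each group in the expected electorate (the "makeup" estimate)
electorate_share = {"college": 0.40, "non_college": 0.60}

# Assumed share of each group among the poll's respondents
sample_share = {"college": 0.50, "non_college": 0.50}

# Assumed support for Candidate A within each group of respondents
support_a = {"college": 0.60, "non_college": 0.45}

# Unweighted estimate: average the respondents as collected
unweighted = sum(sample_share[g] * support_a[g] for g in support_a)

# Weighted estimate: scale each group back to its share of the electorate
weights = {g: electorate_share[g] / sample_share[g] for g in electorate_share}
weighted = sum(sample_share[g] * weights[g] * support_a[g] for g in support_a)

print(f"Unweighted support for A: {unweighted:.1%}")  # 52.5%
print(f"Weighted support for A:   {weighted:.1%}")    # 51.0%
```

In this toy example, over-sampling college-educated voters inflates Candidate A's support by 1.5 points until the weights correct for it; if the electorate estimate itself is wrong, no amount of weighting fixes the projection.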

In 2016, most state polls failed to predict Mr. Trump’s Electoral College victory, in part because their surveys didn’t include enough non-college-educated voters and underestimated the turnout in rural areas. In the aftermath, surveys were adjusted to account for these demographics. 

Experts caution that it’s too early to pinpoint what went wrong in 2020 since not all ballots have been counted. But it appears that pollsters may have been thrown off by high turnout – the highest in at least 50 years – and the popularity of mail-in and early voting during a pandemic. In Texas, which expanded early in-person voting, turnout by eligible voters rose 9 points. 

Early voting meant that pre-election surveys could identify more actual voters as opposed to likely voters. This may have led to a pro-Biden bias in their sample, since fewer Republicans voted in advance amid Mr. Trump’s baseless claims about fraud in mailed ballots. 

Michael Traugott, a research professor emeritus at the University of Michigan, compares it to a cake recipe in which the ingredients are listed correctly but their proportion is unknown. “The portion of the recipe that was early voting was too large,” he says. 
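To illustrate the recipe analogy with numbers, here is a small worked calculation, again with invented figures, showing how assuming too large an early-vote share can inflate a projected margin when early voters lean heavily toward one candidate.

```python
# Illustrative only: how an overstated early-vote share skews a projection.

def projected_margin(early_share, margin_early, margin_election_day):
    """Blend the two voting modes by their assumed share of the electorate."""
    return early_share * margin_early + (1 - early_share) * margin_election_day

margin_early = 20          # assumed Democratic lead (points) among early/mail voters
margin_election_day = -10  # assumed Republican lead (points) among Election Day voters

# Suppose early voters are really 50% of the electorate, but the model assumes 65%.
actual = projected_margin(0.50, margin_early, margin_election_day)   # +5.0 points
assumed = projected_margin(0.65, margin_early, margin_election_day)  # +9.5 points

print(f"True margin:      D+{actual:.1f}")
print(f"Projected margin: D+{assumed:.1f}")
```

Under these assumed numbers, getting the early-vote "portion of the recipe" wrong by 15 points nearly doubles the projected Democratic margin.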

Unlike in 2016, surveys this year found few undecided or third-party voters. That year, a significant number of late-deciding voters in battleground states broke for Mr. Trump, but that dynamic does not appear to explain the underestimation of his support in 2020.

Conservatives argue that pollsters missed Mr. Trump's support because respondents were reluctant to state a preference they feared would be socially unacceptable, particularly in professional circles. But studies have failed to confirm this "shy Trump voter" hypothesis. A bigger factor may be that Trump supporters are less likely to participate in surveys at all, because they don't trust pollsters.

Has it become harder to survey public opinion on voting intentions? 

Caller ID and call blocking have made it harder to conduct live telephone surveys. Some polling agencies rely more on robocalls; others have turned to online surveys that may not be as reliable. These difficulties drive up the cost of polling and may have contributed to the errors of 2020, though they were already present in 2018, when midterm polling was largely accurate.

However, this year the pandemic led to higher response rates since more voters were at home, says David Paleologos, the director of the Suffolk University Political Research Center. “We were finishing projects a day earlier than scheduled,” he says. Voters seemed happy to talk to a pollster, perhaps because they were tired of talking politics with others in their household. 

But as noted, the propensity to respond to polls isn't evenly distributed. Surveys may overrepresent Democrats, since Trump supporters with lower levels of social trust are harder to reach.