What pollsters learned from 2020 failure

W. Joseph Campbell

Sen. Susan Collins (R-ME) talks with reporters as she walks through the Senate subway following the weekly Senate Republican policy luncheon at the U.S. Capitol on June 14, 2022 in Washington, DC. (Chip Somodevilla/Getty Images)

When it became clear his poll had erred in the 2021 New Jersey governor's race, Patrick Murray, director of the Monmouth University Polling Institute, acknowledged:

"I blew it."

The campaign's final Monmouth poll estimated Gov. Phil Murphy's lead over Republican foe Jack Ciattarelli at 11 percentage points – a margin that "did not provide an accurate picture of the state of the governor's race," Murray later said in a newspaper commentary. Murphy won by 3.2 points.

It was a refreshingly candid acknowledgment by an election pollster.

More broadly, the error was one of several in recent years and stands among the disquieting omens confronting pollsters in the 2022 midterm elections. Will they be embarrassed again? Will their polls in high-profile U.S. Senate and gubernatorial races produce misleading indications of election outcomes?

Such questions are hardly far-fetched or irrelevant, given election polling's tattered recent record. A few prominent survey organizations in recent years have given up on election polling, with no signs of returning.

Treat polls warily

It is important to keep in mind that polls are not always in error, a point noted in my 2020 book, "Lost in a Gallup: Polling Failure in U.S. Presidential Elections." But polls have been wrong often enough over the years that they deserve to be treated warily and with skepticism.

For a reminder, one need look no further than New Jersey in 2021 or, more expansively, to the 2020 presidential election. The polls pointed to Democrat Joe Biden's winning the presidency but underestimated popular support for President Donald Trump by nearly 4 percentage points overall.

That made for polling's worst collective performance in a presidential campaign in 40 years, and post-election analyses were at a loss to explain the misfire. One theory was that Trump's hostility to election surveys dissuaded supporters from answering pollsters' questions.

In any case, polling troubles in 2020 were not confined to the presidential race: In several Senate and gubernatorial campaigns, polls also overstated support for Democratic candidates. Among the notable flubs was the U.S. Senate race in Maine, where polls signaled defeat for the Republican incumbent, Susan Collins. Not one survey in the weeks before the election placed Collins in the lead.

She won reelection by nearly 9 points.

Recalling the shock of 2016

The embarrassing outcomes of 2020 followed a stunning failure in 2016, when off-target polls in key Great Lakes states confounded expectations of Hillary Clinton's election to the presidency. They largely failed to detect late-campaign shifts in support to Trump, who won a clear Electoral College victory despite losing the national popular vote.

Past performance is not always prologue in election surveys; polling failures are seldom alike. Even so, qualms about a misfire akin to those of the recent past have emerged during this campaign.

In September 2022, Nate Cohn, chief political analyst for The New York Times, cited the possibility of misleading polls in key races, writing that "the warning sign is flashing again: Democratic Senate candidates are outrunning expectations in the same places where the polls overestimated Mr. Biden in 2020 and Mrs. Clinton in 2016."

There has been some shifting in Senate polls since then, and surely there will be more before Nov. 8. In Wisconsin, for example, recent surveys suggest Republican incumbent Ron Johnson has opened a lead over Democratic challenger Mandela Barnes. Johnson's advantage was estimated at 6 percentage points in a recent Marquette Law School Poll.

The spotlight on polling this election season is unsurprising, given that key Senate races – including those featuring flawed candidates in Pennsylvania and Georgia – will determine partisan control of the upper house of Congress.

Worth doing?

Polling is neither easy nor cheap if done well, and the field's persistent troubles have even prompted the question of whether election surveys are worth the bother.

Monmouth's Murray spoke to that sentiment, stating: "If we cannot be certain that these polling misses are anomalies then we have a responsibility to consider whether releasing horse race numbers in close proximity to an election is making a positive or negative contribution to the political discourse."

He noted that prominent survey organizations such as Pew Research and Gallup quit election polling several years ago to focus on issue-oriented survey research. "Perhaps," Murray wrote, "that is a wise move."

Questions about the value of election polling run through the history of survey research and never have been fully settled. Early pollsters such as George Gallup and Elmo Roper were at odds about such matters.

Gallup used to argue that election polls were acid tests, proxies for measuring the effectiveness of surveys of all types. Roper equated election polling to stunts like "tearing a telephone book in two" – impressive, but not all that consequential.

Screenshot from The New York Times: polls in 2016 predicted that Democratic presidential candidate Hillary Clinton would win some states that she actually lost.

Who is and isn't responding

Experimentation, meanwhile, has swept the field, as contemporary pollsters seek new ways of reaching participants and gathering data.

Placing calls to landlines and cellphones – once polling's gold standard methodology – is expensive and not always effective, as completion rates in such polls tend to hover in the low single digits. Many people ignore calls from numbers they do not recognize, or decline to participate when they do answer.

Some polling organizations have adopted a blend of survey techniques, an approach known as "methodological diversity." CNN announced in 2021, for example, that it would combine online interviews with phone-based samples in the polls it commissions. A blended approach, the cable network said, should allow "the researchers behind the CNN poll to have a better understanding of who is and who is not responding."

During an online discussion last year, Scott Keeter of Pew Research said "methodological diversity is absolutely critical" for pollsters at a time when "cooperation is going down [and] distrust of institutions is going up. We need to figure out lots of ways to get at our subjects and to gather information from them."

So what lies immediately ahead for election polling and the 2022 midterms?

Some polls of prominent races may well misfire. Such errors could even be eye-catching.

But will the news media continue to report frequently on polls in election cycles ahead?

Undoubtedly.

After all, leading media outlets, both national and regional, have been survey contributors for years, conducting or commissioning – and publicizing – election polls of their own.

W. Joseph Campbell, Professor of Communication Studies, American University School of Communication

This article is republished from The Conversation under a Creative Commons license. Read the original article.
