How you can tell which polls to trust — and why we need to fix our broken system


The 2022 elections have buried us under hundreds of polls, even as more and more of us grow skeptical of them. Why do so many polls seem to get things wrong more often than before, and how can we sort the wheat from the chaff?

Always take into account the all-important “margin of error,” which can sometimes exceed 4 percentage points. Don’t put too much trust in any poll that surveys only a few hundred voters.
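To see why a few hundred respondents isn't enough, recall the standard textbook formula for the sampling margin of error of a simple random sample at 95% confidence. The sketch below is illustrative (the sample sizes are hypothetical, and real polls add further error from weighting and nonresponse, so published margins run higher than this floor):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case (50/50 split) sampling margin of error at 95%
    confidence for a simple random sample of n respondents,
    returned in percentage points."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

for n in (400, 600, 1000):
    print(f"n={n}: +/-{margin_of_error(n):.1f} points")
# n=400: +/-4.9 points
# n=600: +/-4.0 points
# n=1000: +/-3.1 points
```

A poll of 400 voters can miss the true number by nearly 5 points in either direction from sampling error alone, which is why a race shown as 48-45 in such a poll is effectively a toss-up.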

And a poll of likely voters is better than one of “registered voters.” A sample of registered voters includes people who say they will vote in this midterm but skipped even the high-turnout 2020 presidential race, and those people are unlikely to actually show up this time.

Recognize that polls are now harder to take. Nate Cohn, the chief political analyst at The New York Times, admitted last month that only 0.4% of attempted phone calls for the latest Times/Siena College poll yielded a completed interview.
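That 0.4% completion rate means roughly 250 dial attempts for every finished interview. A back-of-the-envelope calculation (the 0.4% figure is from the Times; the 1,000-interview sample size is an illustrative assumption) shows the scale of the calling operation this implies:

```python
response_rate = 0.004        # 0.4% of attempted calls yield a completed interview
target_interviews = 1000     # illustrative statewide sample size

calls_needed = target_interviews / response_rate
print(f"Attempted calls needed: {calls_needed:,.0f}")
# Attempted calls needed: 250,000
```

A quarter-million dials for a single statewide survey helps explain both the rising cost of quality polling and the temptation to cut corners.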

Cohn worries that many Republicans are less likely to respond to surveys than Democrats, even among people with the same demographic characteristics. In the Times’ final wave of Senate polling this year, white registered Democrats were 28% likelier to respond than Republicans — a disparity bigger than what the Times found in 2020.

Hochul speaks during a New York Women “Get Out The Vote” rally in New York City on November 3.
Michael M. Santiago/Getty Images

Many people are now suspicious of pollsters and won’t tell strangers what they really think. Some even fear they will end up on a list identifying people by their political views. Thank political correctness.

In general, pay more attention to surveys from private pollsters working for candidate clients, who demand accuracy. Such polls are usually not released, but when they are, they tend to outperform polls from mainstream media outlets. A media poll whose release date was inexplicably delayed, or whose questions were worded to steer responses, can be agenda-driven rather than a genuine snapshot of where the electorate is.

Generic ballot polls are more popular than ever but probably also less informative. Polls that lean too heavily on voters’ self-reported party identification, or that don’t push “independents” to admit they regularly vote for one party or the other, are often meaningless.

Pay attention to pollsters that consistently have a low error rate even if they break from the pack on polling techniques. Dan McLaughlin of National Review notes that “The Trafalgar Group has racked up a string of polling successes in recent years by talking to the very sorts of voters that other pollsters have missed,” namely voters sympathetic to Trump-like themes. In the last six years, Trafalgar’s average error rate has been 2.4%, and it has predicted the winner in races 92% of the time.

How do pollsters get accurate information leading up to elections?
Jonathan Mattise/AP

I spoke with Trafalgar CEO Robert Cahaly this week, just after he released a stunning poll that showed Gov. Kathy Hochul and Lee Zeldin tied in the New York race for governor. He credits his overall accuracy rate to the following elements:

  • Using shorter questionnaires. “Anyone willing to answer a survey of 30 items is either obsessed with politics or so lonely they want to spend time with a stranger on the phone. I ask six or fewer questions and find I get more normal voters.”
  • Reaching voters through a mix of six different methods: live callers, interactive voice response, text messages, emails and two others he calls his “secret sauce.”
  • Never conducting a statewide poll with fewer than 1,000 respondents, which brings his margin of error down.

Let’s hope the polls this year are more accurate than in the past. If they’re not, says Patrick Murray, director of the Monmouth University Polling Institute, “we have a responsibility to consider whether releasing horse-race numbers in close proximity to an election is making a positive or negative contribution to the political discourse.”

Lee Zeldin speaks to the media following the arrest of a person involved in the shooting near his home.
Lev Radin/Pacific Press/LightRocket via Getty Images

Groups like the Pew Research Center and Gallup have stopped taking election polls in recent years and now focus on issue-oriented surveys.

The problem is that if pollsters worried about the accuracy of their surveys withdraw from the field, those who remain will include more hucksters, corner-cutters and propagandists. That’s why all of us should hope that pollsters get it more right than wrong this election.

John Fund is a columnist for National Review and co-author of “Our Broken Elections: How The Left Changed The Way You Vote.”

