
Opinion polls show wildly conflicting horserace figures. Why?

ASK THIS | September 19, 2004

Sometimes too many Republicans (or Democrats, perhaps) are included in the sample. The explanation can be that simple.


By Barry Sussman
Editor@niemanwatchdog.org

Q. A blogger named Ruy Teixeira says that some polls showing Bush with a big lead have more self-described Republican respondents than seems correct. How many Republicans, Democrats, and independents were polled in the conflicting surveys?

Q. In each of the surveys, what are the breakdowns by region, age, and gender?

Q. In telephone surveys, what time of day, and what days of the week, were the calls made?

Some recent polls show George Bush substantially ahead of John Kerry; some show the two about even. Some years ago, when I did the polling for the Washington Post, I used to get called on regularly to explain such apparently irreconcilable results.

The first thing I did was to lay out, side by side, the demographics in the conflicting polls, looking for differences of the kind Teixeira says he found. There was almost always something in the sampling or in the procedures that stuck out.
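The party-mix effect Teixeira describes is simple arithmetic: if voters' preferences within each party group are identical in two polls, a different partisan composition alone can flip the topline. A minimal sketch, using invented support rates and sample mixes (not figures from any actual 2004 poll):

```python
# Illustrative only: hypothetical support rates and party mixes,
# not figures from any actual survey.

def topline(mix, support):
    """Weighted overall share for a candidate, given the sample's
    party mix and per-party support rates (all fractions)."""
    return sum(mix[p] * support[p] for p in mix)

# Suppose within-party preferences are identical across both polls.
bush_support = {"rep": 0.90, "dem": 0.07, "ind": 0.48}
kerry_support = {"rep": 0.08, "dem": 0.90, "ind": 0.45}

# Poll A's sample has more self-described Republicans than Poll B's.
poll_a = {"rep": 0.38, "dem": 0.31, "ind": 0.31}
poll_b = {"rep": 0.31, "dem": 0.36, "ind": 0.33}

for name, mix in [("Poll A", poll_a), ("Poll B", poll_b)]:
    b = topline(mix, bush_support)
    k = topline(mix, kerry_support)
    print(f"{name}: Bush {b:.0%}, Kerry {k:.0%}, margin {b - k:+.0%}")
```

With these invented numbers, Poll A shows Bush ahead by about six points and Poll B shows Kerry ahead by about three, even though no individual voter's preference differs between the two samples.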

I'd say that is no doubt true today, and that is what I'd look for. Except now there is one added problem: response rates. What with cell phones and answering machines, it's much harder to interview people by phone. Response rates today are a lot worse than in my day, and they were no bargain then.

The question about response rates boils down to one point: In terms of voting intention, are the people who take part in the survey different from those who don't? Pollsters like to think there is no difference, or an infinitesimal one. Traditionally, they have been correct. But that was when a typical response rate was 50 percent or higher; now it can be less than 30 percent.
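The same arithmetic shows why the response rate matters only when responders differ from non-responders. A sketch with invented numbers: in a true 50-50 race, equal response rates produce an unbiased poll, but even a modest gap in willingness to respond shifts the reading.

```python
# Illustrative only: invented numbers showing how nonresponse bias
# depends on responders differing from non-responders.

def observed_support(rate_a, rate_b, true_a, true_b):
    """Candidate A's share among those who actually respond, when
    A's and B's supporters respond at different rates.
    true_a and true_b are the candidates' true population shares."""
    responders_a = true_a * rate_a
    responders_b = true_b * rate_b
    return responders_a / (responders_a + responders_b)

# True 50-50 race; both camps respond at 30%: the poll is unbiased.
print(f"{observed_support(0.30, 0.30, 0.50, 0.50):.0%}")  # 50%

# Same 50-50 race, but one camp responds at 33% and the other at
# 27%: the poll reads 55-45, despite no change in actual opinion.
print(f"{observed_support(0.33, 0.27, 0.50, 0.50):.0%}")  # 55%
```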

Sometimes, in comparing poll techniques and procedures, I found out more than I wanted to know.

The pollster Louis Harris, for example, used to ask the presidential preference question twice in a survey — once at the outset, once later on, after questions that might have made respondents change their minds about their planned vote. Harris reported only one of the responses; I guess it was the one he personally liked better.

Sometimes what appear to be small variations can explain big differences.

Once I did a poll for the Post shortly before a Virginia governor's election in which Charles Robb was running against a Republican named Marshall Coleman. At the same time Larry Sabato, then as now a well known political analyst, did a survey for another newspaper. With the election just a few days off, I had Robb ahead by 7 percentage points, as I remember it, and Sabato had them dead even.

That weekend he and I compared our samples over the phone. Almost everything seemed the same. There was one difference, however, that I believe was crucial. We both had about the same number of women in our surveys, but mine went strongly for Robb — a major gender gap — and Sabato's broke evenly.

I asked Sabato what hours his phone calls were made. Many had been made in the daytime, so that the women in his sample were mostly retired or homemakers. The Post's survey, done in the early evenings, had a more representative sample, including a much higher proportion of working women. The difference in their views alone explained the entire disparity in our two polls, as I recall it.

Robb won by about the margin shown in the Post's poll. Asked to explain his victory, he told reporters to just read the story Post reporters wrote the Sunday before the election, based on our poll. That was a nice compliment, of course. Sabato, for his part, maintained at the time that his poll was correct but that opinion changed the weekend before the election, after polling stopped. We can all argue it round or argue it flat, can't we?



The Washington Post Poll
Frequently asked questions about Post polls; see also nearby items on polling methodology

The Left Coaster blog
A highly partisan blog that raises questions about recent Gallup and NY Times/CBS horserace polls.

What if the polls are wrong?
Wall Street Journal column by Albert R. Hunt.
