Reporting on political public opinion polls, and should your newsroom undertake polls of its own?
ASK THIS | March 24, 2004
By Barry Sussman
Q. The smaller the sample, the less useful the poll. Is the sample large enough?
Q. Should your news organization do its own political polls? Why or why not, and how difficult would that be?
Q. Where do focus groups fit in? (Answer: Nowhere.)
Done properly, opinion polls are marvelous contributions to understanding politics. But too often they're not done properly.
Chances are there will be more news media opinion polls and focus groups in 2004 than in any previous election season. That would be fine if these tools were used intelligently. But for the most part, they won't be. So here's a primer.
Keep in mind the first law of public opinion, which is that it changes and sometimes moves in powerful, unexpected waves. Forget summertime polls that tell us who'll win in November; the horse race, for all intents and purposes, hasn't even begun. And as much as you can, go beyond the horse-race element in conducting and interpreting polls.
For a democracy, opinion polls are of enormous value. George Gallup explained why in 1935, just as he was starting his syndicated newspaper column: "Not only must the citizen have the information brought to him, so that he may know what is happening and what the issues before him are, but he must have a way to make his wishes known to government. It is here that students of our government have found the weakest rib of its framework."
From time to time public pollsters do a good job at just what Gallup wanted---through simple cross-tabs they underscore similarities and differences in groups by region, gender, race, ideology and socio-economic characteristics. They describe a nation's hopes and fears better than the best reporters can do on their own.
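The cross-tab Gallup-style work relies on is simple enough to sketch. Here is a minimal illustration, with made-up respondent data (the regions, answers, and percentages are invented for the example, not drawn from any poll):

```python
from collections import Counter

# Hypothetical (region, answer) pairs -- illustrative data only.
responses = [
    ("urban", "favor"), ("urban", "favor"), ("urban", "oppose"),
    ("rural", "oppose"), ("rural", "oppose"), ("rural", "favor"),
]

# A cross-tab is just a count of respondents in each (group, answer) cell.
crosstab = Counter(responses)

for region in ("urban", "rural"):
    total = sum(v for (r, _), v in crosstab.items() if r == region)
    favor = crosstab[(region, "favor")]
    print(f"{region}: {100 * favor / total:.0f}% favor")
```

The point is that the same interviews, broken out by group, say far more than the one-number topline does.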
In sum, when it comes to politics, good polls not only tell us which candidates are ahead; more importantly, they tell us why. They are dynamic, not static.
But if some public polls are done well, too many aren't. Because of the expense, only the largest news organizations do much polling, and in the last three or four election campaigns most of them have cut back by reducing sample size. Lost are richness and nuance.
With only 500 or 600 interviews instead of 1,200 or 1,500, opinion polls aren't useful for much more than guessing at the horse race. Obviously, horse-race results are important; they're what people care about the most. But news organizations are doing a disservice when they give head-to-head figures with little or no explanation of what moves citizens.
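The arithmetic behind that complaint is the standard sampling-error formula, not anything special to this article: for a simple random sample, the 95 percent margin of error shrinks only with the square root of the sample size, so cutting interviews in half does not merely halve your precision in subgroups. A quick sketch:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample,
    at the worst case p = 0.5 (textbook formula)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (500, 600, 1200, 1500):
    print(f"n = {n:>4}: about ±{100 * margin_of_error(n):.1f} points")
```

At 600 interviews the full-sample margin is about ±4 points, versus about ±2.5 at 1,500; and a subgroup of 150 respondents inside the small poll carries a margin near ±8 points, which is why cross-tabs from small samples are so thin.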
Some organizations cut corners on sampling methods and interview procedures, resulting in what could be called the blind-squirrel system of polling.
One result of small samples in public polls is that reporters turn to candidates' own polls to get at what's behind voters' thinking, and that's not much help at all. For while some political pollsters would never lie to reporters, they won't tell the truth either if it reflects badly on their clients.
Here are some recommendations on how to do a better job with opinion polls in the 2004 campaigns.
1. If your newsroom doesn't do any opinion polling because of the cost or technical skills needed, you may want to reconsider. You can probably do your own polls, at quite reasonable cost, if you team up with a nearby college, as survey research is taught almost everywhere.
2. If you are a reporter or editor working on polls or reporting on other people's polls, keep in mind that polling is one part science, two parts common sense. That goes for creating questions and everything else. If something doesn't ring true, don't just accept it; question it.
3. Have a substantial number of demographic questions, so that you'll be able to do cross-tabulations by important groupings, such as educational attainment, income level, locality (e.g., ward number), political affiliation, and others. Odd as it sounds, you need about as many interviews in a good state or local poll as you do in a national one.
4. If your organization does polls, take a role in shaping the questions to be asked. You don't have to provide the exact language; the pollster will do that. But be sure to ask about issues you feel are important. Consult beat reporters---even call a staff meeting---to make sure you are hitting on worthwhile areas.
5. There are some simple interpretive rules. Don't make too much of small differences, as there's not much difference between 60 percent favoring something and 65 percent favoring it. Do make a big deal of big differences. Keep in mind that one poll is just a snapshot in time; to find patterns or trends you'll have to go back and poll again.
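Why a 60-versus-65 gap is usually nothing: when you compare two independent estimates, their sampling errors compound. Using the standard margin-of-error formula (the sample sizes here are hypothetical, chosen only to match the modest polls discussed above):

```python
import math

def moe(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for one simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# Comparing two independent polls of 500 interviews each, 60% vs. 65%:
# the margin on the *difference* adds the two errors in quadrature.
diff_moe = math.sqrt(moe(500) ** 2 + moe(500) ** 2)
print(f"5-point gap, but about ±{100 * diff_moe:.1f} points of error "
      f"on the difference")
```

With 500-interview samples the error on the difference is around ±6 points, so a 5-point movement is inside the noise; a 15- or 20-point gap is a story.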
Follow these simple steps and you'll learn a lot more than just who's ahead for president or U.S. Senate or city council. Do it more than once, and you'll find the numbers starting to talk to you. Do it well, and you'll develop a better understanding of the public's agenda. You won't have to rely on what politicians are telling you about what people think.
If, in the end, your organization decides not to do opinion polling, then be cautious when dealing with the politicians' surveys. AAPOR (the American Association for Public Opinion Research) and NCPP (National Council on Public Polls) have checklists of items the candidates or their pollsters must make available to you if you are to have any faith at all in the numbers they are promoting.
I mentioned focus groups as something the news media may be expected to do more of in 2004. That's sad but true. About the only worthwhile use for focus groups is to provide slogans or lines of inquiry for quantitative surveys. Other than that, what people find in focus groups, whether they know it or not, is too often just what they were looking for in the first place. Don't generalize from focus groups; don't pay attention to anyone who does.