A Note on Talking About Polls
by fbihop, Sun Jun 15, 2008 at 07:36:24 PM EDT
We've all seen horrible reporting of the facts about polling. We've all seen people misinterpret polls, exaggerate, even blatantly lie. I give many people the benefit of the doubt; they probably just don't know better.
A great resource, forwarded to me by an AP writer, is this link from the National Council on Public Polling: "20 Questions a Journalist Should Ask About Poll Results." It is an invaluable resource for any journalist or blogger who mentions or writes about polling.
Most are obvious to any of us political junkies who know a little about polling. Like, "Who did the poll?" Obviously some pollsters are more accurate and reputable than others. If you haven't heard of the polling firm, you might want to dig a little deeper to see if the firm is reputable.
I wrote a story last month about an internal poll commissioned by one of the campaigns in the Democratic primary for the 2nd Congressional District. The poll was done by Winning Connections, a firm not known for polling. I e-mailed Mark Blumenthal of Pollster.com to ask him about it.
Blumenthal wondered about the lack of methodology, along with the polling outfit's reputation. "While Winning Connections has a lot of experience doing political telemarketing, this is the first time I've heard of their doing an opinion poll," he said. This also leads to another question: Who paid for the poll, and why was it done?
In this case, it was done by the Harry Teague campaign as an internal poll. I believe internal polls can be reported on as long as there is a prominent caveat saying the poll is internal. Why? Because even if the poll is done perfectly by a reputable polling company, you must ask why it was released.
At FiveThirtyEight.com, Nate Silver says in the site's FAQ:
All scientifically-conducted polls are included provided that they meet our reporting requirements (see below). The lone exception is "leaked" internal polls, as campaigns may be selective about which polls they leak, biasing the results.
So I would amend the question to include "Why was the poll released?"
Other questions cover the sample size, how those polled were chosen, when the poll was done, and so on. These are all common-sense questions that less-than-careful bloggers or journalists might forget to ask or report.
Here is a section I will post in its entirety, because I see this mistake made all the time by all kinds of bloggers and journalists, most recently by Jeralyn at Talk Left.
First, here is what the NCPP says about reporting on who is ahead:
Sampling error raises one of the thorniest problems in the presentation of poll results: For a horse-race poll, when is one candidate really ahead of the other?
Certainly, if the gap between the two candidates is less than the sampling error margin, you should not say that one candidate is ahead of the other. You can say the race is "close," the race is "roughly even," or there is "little difference between the candidates." But it should not be called a "dead heat" unless the candidates are tied with the same percentages. And it certainly is not a "statistical tie" unless both candidates have the same exact percentages.
And just as certainly, when the gap between the two candidates is equal to or more than twice the error margin - 6 percentage points in our example - and if there are only two candidates and no undecided voters, you can say with confidence that the poll says Candidate A is clearly leading Candidate B.
Jeralyn at Talk Left said this:
Barack Obama and John McCain are in a statistical dead heat according to the Daily Gallup tracking poll.
If you go to the daily tracking poll, it shows Obama polling two points ahead of McCain. According to Gallup, "the maximum margin of sampling error is ±2 percentage points" on the poll. So to say the race is "roughly even" would be accurate. To say it is a dead heat is not correct.
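For readers who like to see the rule spelled out, the NCPP guidance above can be sketched as a small check. This is just a sketch of the rule of thumb as quoted, assuming a simple two-candidate race and the poll's reported margin; the function name and labels are mine, not NCPP's:

```python
def describe_race(a_pct, b_pct, moe):
    """Classify a two-candidate horse race per the NCPP rule of thumb.

    a_pct, b_pct: the candidates' shares, in percentage points
    moe: the poll's reported margin of sampling error (e.g. 2 for +/-2)
    """
    gap = abs(a_pct - b_pct)
    if gap == 0:
        return "dead heat"      # only with identical percentages
    if gap < moe:
        return "roughly even"   # inside the error margin: don't call a leader
    if gap >= 2 * moe:
        return "clear lead"     # at least twice the error margin
    return "ahead, with caveats"  # the murky zone in between

# Rasmussen's numbers from below: 49-43 with a +/-2 margin
print(describe_race(49, 43, 2))  # -> clear lead
```

Note that a gap exactly equal to the margin, like Gallup's two points on a ±2 poll, falls in the murky middle zone, which is exactly why "roughly even" is the safer wording there.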
This also leads to #17 on NCPP's list: "What other polls have been done on this topic? Do they say the same thing? If they are different, why are they different?"
The Gallup poll shows an extremely close contest nationwide; there is "little difference between the candidates" one might say. But what about other similar polls?
Rasmussen's national tracking poll, from the same day, has Obama leading 49 percent to 43 percent in a poll of 3,000 likely voters. Like the Gallup poll, its margin of error is "+/- 2 percentage points."
In reporting such a poll, in my opinion, one should say Gallup has a very close race while Rasmussen has Obama with a lead well outside the margin of error.
One difference between the two is that Rasmussen uses a three-day rolling average while Gallup uses a five-day rolling average.
I really recommend going through all 20 questions in the NCPP article. It will help you even if you only comment or lurk; you will be able to see who is reporting the polls correctly and who is not. You will be able to analyze the polls yourself, without the Washington Post, CNN, or even your favorite blog telling you about the results.
I have to admit, I sometimes forget all the suggestions, but I try to check back once every couple of weeks just to keep my knowledge fresh.