A Note on Talking About Polls

We've all seen horrible reporting on polling.  We've all seen people misinterpret polls, exaggerate, even blatantly lie.  I give many people the benefit of the doubt; they probably just don't know better.

A great resource, forwarded to me by an AP writer, is this link from the National Council on Public Polling: "20 Questions a Journalist Should Ask About Poll Results." It is an invaluable resource for any journalist or blogger who mentions or writes about polling.

Most are obvious to any of us political junkies who know a little about polling.  For example: "Who did the poll?" Obviously, some pollsters are more accurate and reputable than others. If you haven't heard of the polling firm, you might want to dig a little deeper to see whether it is reputable.

FiveThirtyEight.com has pollster ratings which are a good resource.  Another is the SurveyUSA Report Card, which shows who did the best over an election season.

I wrote a story last month, during the Democratic primary, about an internal poll by one of the campaigns in the 2nd Congressional District.  The poll was done by Winning Connections, a group that wasn't known for polling.  I e-mailed Mark Blumenthal of Pollster.com to ask him about it.

Blumenthal questioned both the absence of any stated methodology and the polling outfit's track record. "While Winning Connections has a lot of experience doing political telemarketing, this is the first time I've heard of their doing an opinion poll," he said.

This also leads to another question: Who paid for the poll, and why was it done?

In this case, it was done by the Harry Teague campaign as an internal poll.  I believe internal polls can be reported on as long as there is a prominent caveat saying it is an internal poll.  Why?  Because even if the poll is done perfectly by a reputable polling company, you must ask why it was released.

At FiveThirtyEight.com, Nate Silver says in the site's FAQ:

All scientifically-conducted polls are included provided that they meet our reporting requirements (see below).  The lone exception is "leaked" internal polls, as campaigns may be selective about which polls they leak, biasing the results.

So I would amend the question to include "Why was the poll released?"

Other questions cover the sample size, how those polled were chosen, when the poll was done, and so on: all common-sense questions that less-than-careful bloggers or journalists might forget to ask or report.

Here is a section that I will post in its entirety, because I see this mistake made all the time by all kinds of bloggers and journalists, most recently by Jeralyn at Talk Left.

First, here is what the NCPP says about the issue of reporting on who is ahead:

Sampling error raises one of the thorniest problems in the presentation of poll results: For a horse-race poll, when is one candidate really ahead of the other?

Certainly, if the gap between the two candidates is less than the sampling error margin, you should not say that one candidate is ahead of the other. You can say the race is "close," the race is "roughly even," or there is "little difference between the candidates." But it should not be called a "dead heat" unless the candidates are tied with the same percentages.   And it certainly is not a "statistical tie" unless both candidates have the same exact percentages.

And just as certainly, when the gap between the two candidates is equal to or more than twice the error margin - 6 percentage points in our example - and if there are only two candidates and no undecided voters, you can say with confidence that the poll says Candidate A is clearly leading Candidate B.

Jeralyn at Talk Left said this:
Barack Obama and John McCain are in a statistical dead heat according to the Daily Gallup tracking poll.

If you go to the daily tracking poll, it shows Obama polling two points ahead of McCain.  According to Gallup, "the maximum margin of sampling error is ±2 percentage points" on the poll.  So to say the race is "roughly even" would be accurate. To say it is a dead heat is not correct.
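To make the NCPP rule concrete, here is a minimal Python sketch. The describe_race helper and the 46/44 figures are my own illustration, not anything published by Gallup or the NCPP, and it treats a gap exactly equal to the margin of error as "roughly even," which matches the diary's reading of the two-point Gallup lead.

```python
def describe_race(pct_a, pct_b, moe):
    """Rough classification of a two-candidate result, following the NCPP
    guidance quoted above. The wording for the in-between region (a gap
    larger than one margin of error but smaller than two) is mine, not
    NCPP's."""
    gap = abs(pct_a - pct_b)
    if gap == 0:
        return "dead heat (identical percentages)"
    if gap <= moe:
        return "roughly even -- do not report a leader"
    if gap >= 2 * moe:
        return "clear lead"
    return "nominal lead, but be cautious -- gap is under twice the margin"

# Illustrative numbers: a two-point lead with a +/-2-point margin of error,
# and a six-point lead with the same margin.
print(describe_race(46, 44, 2))   # roughly even -- do not report a leader
print(describe_race(49, 43, 2))   # clear lead
```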

This also leads to #17 on NCPP's list: "What other polls have been done on this topic? Do they say the same thing? If they are different, why are they different?"

The Gallup poll shows an extremely close contest nationwide; there is "little difference between the candidates" one might say.  But what about other similar polls?

Rasmussen's national tracking poll has it 49 percent to 43 percent for Obama in a poll of 3,000 likely voters.  Like the Gallup poll, the margin of error is "+/- 2 percentage points," and it was taken on the same day.

In reporting such a poll, in my opinion, one should say Gallup has a very close race while Rasmussen has Obama with a lead well outside the margin of error.
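As a sanity check, the textbook worst-case margin of error for a simple random sample of 3,000 lines up with the reported figure, and the six-point gap clears the NCPP's "twice the error margin" bar. This is only a back-of-the-envelope calculation; real pollsters adjust for weighting and design effects, so it is not how Rasmussen actually computes its number.

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case 95% margin of error, in percentage points, for a simple
    random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n) * 100

print(f"{margin_of_error(3000):.1f} points")  # about 1.8, consistent with the reported +/-2
print(49 - 43 >= 2 * 2)                       # True: the 6-point gap is at least twice the margin
```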

The difference between the two is that Rasmussen uses a three-day rolling average while Gallup uses a five-day rolling average.
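Window length matters because a shorter window reacts faster to new interviews while a longer one smooths them out. Here is a small sketch with made-up daily numbers, just to show the effect; it is not meant to reproduce either firm's actual weighting or methodology.

```python
def rolling_average(daily_results, window):
    """Average the most recent `window` days at each point in the series."""
    return [
        sum(daily_results[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(daily_results))
    ]

# Hypothetical daily shares for one candidate over a week.
daily = [46, 47, 45, 48, 46, 49, 47]
print(rolling_average(daily, 3))  # shorter window: moves more with each new day
print(rolling_average(daily, 5))  # longer window: smoother, slower to shift
```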

I really recommend going through all 20 questions in the NCPP article.  It will help you even if you only comment or lurk; you will be able to see who is reporting the polls correctly and who is not.  You will be able to analyze the polls yourself without the Washington Post, CNN, or even your favorite blog telling you about the results.

I have to admit, I sometimes forget all the suggestions, but I try to check back once every couple of weeks just to keep my knowledge fresh.

Tags: meta, polls

Comments

Gallup

Gallup uses a three-day rolling average now. They used a five-day average in the primaries.

Frankly, I have no idea why Gallup has more undecideds than Rasmussen. I can only guess it has to do with question order and framing. It isn't that McCain scores better in Gallup than Rasmussen.

by elrod 2008-06-15 08:06PM | 0 recs
Re: Gallup

Maybe Gallup uses "Barack Hussein Obama" (just kidding).

by animated 2008-06-15 08:37PM | 0 recs
Re: Gallup

You are right, they have switched back to three-day averages.

If Gallup continues to show a tight race, I suspect Rasmussen would become tighter.

I think Gallup is preferable.

by lori 2008-06-15 09:30PM | 0 recs
This was a really good read.

Thanks for posting this.

by CAchemist 2008-06-15 08:13PM | 0 recs
Re: A Note on Talking About Polls

[I never trust much what people report as polls here, just because I don't expect them to be able to weed out the shit; I can't even weed out the shit]

good link; great diary;

by alyssa chaos 2008-06-15 08:16PM | 0 recs
