You may have noticed that I do not blog about polls. The other bloggers here are free to; I have not made it an editorial policy against it, and I have no intention of doing so. I just don't do it myself.
The reason is simple. I do not trust polls. Or, rather, I do not trust pollsters.
And now comes some more justification for my distrust:
When a New York Times poll found that the number of Americans who think it was right for the United States to go to war in Iraq rose from 35 percent in May to 42 percent in mid-July, rather than promptly report the new poll findings, the paper conducted another poll. As the Times' Janet Elder wrote Sunday, the increased support for the decision to go to war was "counterintuitive" and because it "could not be easily explained, the paper went back and did another poll on the very same subject."
Round Two found that 42 percent of voters think America was right to go into Iraq, while the percentage of those polled who said that it was wrong to go to war had fallen from 61 percent to 51 percent. The headline for Elder's piece read, "Same Question, Different Answer. Hmmm." But it should have read: "America's Paper of Record Out of Touch With American Public."
So why should I trust any poll? How can I tell if the pollster just ran one poll or kept trying until they got the answer they liked?
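There's a simple statistical reason that question matters: even if public opinion never moves an inch, a pollster who keeps re-polling until the number "looks interesting" will eventually get one. Here's a minimal sketch of that effect in Python. The true support level, sample size, and target figure are all invented illustration values, not estimates of any real poll:

```python
import random

# Toy Monte Carlo experiment: if true opinion never moves, how often does
# "poll again until the number looks interesting" produce one anyway?
# TRUE_SUPPORT, SAMPLE_SIZE, and TARGET are made-up illustration values.
TRUE_SUPPORT = 0.38   # assumed fixed level of actual support
SAMPLE_SIZE = 1_000   # respondents per poll
TARGET = 0.42         # the "answer they liked"
TRIALS = 2_000        # simulated pollsters per strategy

def run_poll() -> float:
    """Simulate one poll: the observed share answering 'yes'."""
    yes = sum(random.random() < TRUE_SUPPORT for _ in range(SAMPLE_SIZE))
    return yes / SAMPLE_SIZE

def chance_of_hit(max_polls: int) -> float:
    """Fraction of trials where at least one of max_polls polls hits TARGET."""
    hits = sum(
        any(run_poll() >= TARGET for _ in range(max_polls))
        for _ in range(TRIALS)
    )
    return hits / TRIALS

for n in (1, 2, 5):
    print(f"stop after up to {n} poll(s): "
          f"{chance_of_hit(n):.1%} chance of reporting >= {TARGET:.0%}")
```

In this made-up setup, a single honest poll almost never shows 42 percent; give yourself five tries and the odds climb several-fold, with no change in the underlying opinion at all. None of this proves any particular pollster did that, but it's why a reader can't tell, from the published number alone, whether they're seeing poll one or poll five.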