Straw Poll Fallacy

Good think tanks do research, and they also do advocacy, but think tanks that fail to make any distinction between the two squander valuable reputational capital.

Last Friday, my former MI colleague, Josh Barro, scolded the Florida-based James Madison Institute for conducting a “push poll” about the state’s federally subsidized Medicaid expansion plans.  “This isn’t a poll designed to figure out how Floridians feel about the Medicaid expansion,” Barro complained, “it’s one designed to get them to say they oppose it, so the organization commissioning the poll can say it’s unpopular.”

Cato Institute health policy guru Michael Cannon, also a former colleague of mine, had apparently reviewed the poll questions for the James Madison Institute before the poll hit the field.  Cannon fired back:

Medicaid expansion is not a benefits-only proposition. When a poll only asks voters about benefits, the results are meaningless. Yet to my knowledge, JMI’s poll is so far the only poll that has asked voters about both costs and benefits. All other polls—for example, the hospital-industry poll Barro cites—ask only about benefits, as if the costs don’t exist or shouldn’t influence voters’ evaluation of the expansion. Those polls are “push” polls, while JMI’s poll is the only honest poll in the field.

I consulted an experienced GOP-leaning political pollster in the Washington, DC area to get the skinny.  The pollster, responding on condition of anonymity, expressed “serious concerns about the poll.” To wit:

First, it’s not a true survey of registered voters, because they focus mostly on pulling from registration lists those who voted in at least two of the last four elections. You can’t say that’s representative of Florida registered voters, though you could say it’s representative of likely voters. That’s a distinction that should be made clear, as it will bake in a slight right-leaning skew compared to straight-up registered voters.

I stopped reading and started writing this email when I hit that first debt question. A good poll would have asked a more “clean read” without loading up a big message before the ask about how important the debt is. The interviewer says “well everyone else cares about the debt, so, how concerned are you?” Really not good. This is the kind of question you push further down in the questionnaire as a message test, not as a legitimate gauge of concern about debt.

Then I got to the question [posed as] “some say we need reform” vs. “some say we need to preserve a government program.” How [often do] Democrats actually say, “we must preserve a government program!” Never. They say, “we must preserve needed health services for our poorest citizens,” etc. A good poll puts our best message against their best message. Already, the poll is putting up a weak version of the opposition’s position.
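To make the pollster’s first point concrete, here is a rough sketch of what a “voted in at least two of the last four elections” screen does to a sample frame. It is purely illustrative, not drawn from JMI’s or Public Insight’s actual methodology: the field names, the election cycles, and the toy voter file are my own assumptions.

# Illustrative sketch only: a hypothetical voter-file screen of the kind the
# pollster describes. Field names and election cycles are invented.

def likely_voters(voter_file, history_fields=("g2006", "g2008", "g2010", "g2012"), minimum=2):
    """Keep registrants who voted in at least `minimum` of the listed elections."""
    screened = []
    for voter in voter_file:
        turnout = sum(1 for election in history_fields if voter.get(election))
        if turnout >= minimum:
            screened.append(voter)
    return screened

registered = [
    {"id": 1, "g2006": True,  "g2008": True,  "g2010": True,  "g2012": True},   # habitual voter
    {"id": 2, "g2006": False, "g2008": True,  "g2010": False, "g2012": True},   # presidential-year voter
    {"id": 3, "g2006": False, "g2008": False, "g2010": False, "g2012": True},   # new or infrequent voter
]

sample_frame = likely_voters(registered)
print(f"{len(sample_frame)} of {len(registered)} registrants survive the screen")
# Infrequent voters drop out, so the frame represents "likely voters," not
# "registered voters" -- and any group that turns out less often is
# underrepresented, which is the skew the pollster flags.

The screen itself is a perfectly ordinary likely-voter technique; the pollster’s complaint is only that results built on it shouldn’t be billed as a survey of registered voters.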

The point thus goes to Barro, though I’m sympathetic to Cannon, who is not a pollster and was only asked to review these questions for the accuracy of their substantive claims about Medicaid.

Such bad methods reflect poorly on the James Madison Institute, which holds itself out to be a research and educational organization, complete with a Research Advisory Council primarily composed of university-based social scientists. Think tank research isn’t expected to be peer-reviewed academic journal fodder, but it usually aspires to inform the public policy debate by telling us something new about the world we live in.  Think tank findings are often presented in light of researchers’ prior ideological commitments, but they should not merely be talking points in support of predetermined conclusions.

Not surprisingly, a James Madison Institute press release reveals that a division of the Florida-based public relations* firm Cherry Communications conducted the Medicaid expansion poll under contract.  “[While] a polling firm’s first goal is to create situational awareness,” the DC-area pollster explained, “a PR firm’s first goal is to create good headlines. These are each valuable but are not the same thing.”  Nor are they necessarily mutually exclusive:

There are really two different ways to approach designing a poll. One is if you want an accurate read on public opinion to guide strategic decision making. The other is to “message test” and to figure out how best to move opinion and build a communications plan. You can do both in one survey as long as the “clean read” part comes first.

The fundamental problem here is that this poll was conducted with public release in mind and to show right off the bat that conservative messages on the issue work. This is a PR firm’s goal clearly. There’s no time taken to get the clean read.
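The ordering principle the pollster describes is easy to picture. The sketch below is my own illustration, not JMI’s questionnaire: the question wording and labels are invented, but the structure shows unframed “clean read” items asked before any persuasive “some say” framings.

# A minimal sketch of questionnaire ordering, assuming invented question text.
# The clean-read items come first so later message tests cannot contaminate
# the baseline measure of opinion.

CLEAN_READ = [
    "Do you support or oppose expanding Medicaid in Florida?",
    "How concerned are you, if at all, about the federal debt?",
]

MESSAGE_TESTS = [
    "Some say Medicaid expansion will add to the federal debt. "
    "Does that make you more or less likely to support it?",
    "Some say expansion will extend needed health services to the poorest "
    "Floridians. Does that make you more or less likely to support it?",
]

def questionnaire():
    # Order matters: asking the loaded items first would "push" respondents
    # before their unprompted view is ever recorded.
    return CLEAN_READ + MESSAGE_TESTS

for i, question in enumerate(questionnaire(), start=1):
    print(f"Q{i}. {question}")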

The James Madison Institute hasn’t yet responded to my request for comment, but it isn’t hard to surmise what happened here: the communications department probably commissioned a poll as a way to get airtime for the Institute’s message on Medicaid expansion.  But a poll isn’t just a message.  A poll is a social scientific method, which is why a lousy poll from a think tank casts doubt on the quality of its other research.

Journalists and policymakers afford more weight to think tank research than they do to press releases from PR firms because think tanks aren’t supposed to just spin.  The James Madison Institute may have rationalized this survey as the digital equivalent of liquid courage for skittish pols, but it should worry instead about what techniques like these suggest about its institutional values.  Reputation matters, because media and government consumers often don’t have the time or expertise to independently assess the quality of every report.  I am less likely now than I would have been last week to take anything in the James Madison Institute’s new policy brief on Medicaid expansion at face value, because I have reason to question the organization’s commitment to good research methods.

UPDATE:  I have just been informed that the Cherry Communications website I linked above belongs to a different firm with the same name as the “Cherry Communications” referenced in the James Madison Institute press release, whose division, “Public Insight,” conducted the Medicaid expansion poll.  I have eliminated the incorrect link, and I apologize to both firms for the error.

The James Madison Institute has offered some comments regarding the poll, and Jim Cherry of the Cherry Communications whose division conducted the Medicaid expansion poll has offered to comment also, so stay tuned for a follow-up post.

*Cherry Communications is really better characterized as a Republican political consulting firm specializing in phone-based services such as voter identification calls, persuasion/advocacy calls, get-out-the-vote calls, surveys, and polls.  Its website is here.

One Response to Straw Poll Fallacy

  1. Ray Lopez February 27, 2013 at 12:37 pm

    Nice article. So the issue then is: does a think tank squander its “reputational capital” in a transparent way, or stealthy? That is, if the poll is clearly a push poll, it can be laughed away as was done here, and nobody save the sub-100 IQ crowd in TV land will fall for it. But what about a more cleverly constructed push poll? Is that even more dangerous? We all know how opinions ‘follow the money’, and the most credible academics are nothing but hired mouthpieces at times. In litigation, they even have a phrase for it: ‘The Third Law of Litigation’ (after Newton’s laws): “For every expert, there is an equal and opposite expert”.