Vanderbilt University Must Absolutely Stop These Polling Practices
Let's start with an easy one. Vanderbilt University conducted a poll of Tennessee adults between May 2 and May 9. It then teamed up with the state's Tennessean newspaper to release the poll on May 20. Are you kidding me? They released a poll nearly three weeks after the first interviews were conducted. As quickly as politics changes, they may as well have released poll results from last year. It's interesting to get a sense of what voters felt three weeks ago, but it isn't exactly news today.
Keep in mind, the poll was not some baseline survey covering voters' general attitudes towards government and politics -- the kinds of items that trend very slowly over time. Rather, it was about issues very much in the news now: gay marriage, ObamaCare, taxes, and, of course, the presidential campaign. This isn't a partisan issue; it's a common-sense one. Releasing a three-week-old survey as news is ridiculous.
Vanderbilt then withheld the full survey results for three days, while the Tennessean ran multiple stories "reporting" on the poll findings. Polling organizations are increasingly providing this information as soon as the top-line results are released. It is sometimes a bit harder to find than I would like, but it is usually out there. For Vanderbilt to deliberately withhold the full results while its "findings" are being reported on for several days is so ludicrous it borders on malpractice.
The University's third troubling practice gets a bit more complicated, so let me boil it down: quit playing around with the samples. Its last political poll, conducted in February, surveyed registered voters only. This latest poll surveyed all adults. Why the switch?
Worse, the poll tries to compare results from the current survey of adults with the February survey of registered voters. Seriously, they put the numbers side by side, trying to intimate trends. Given that surveys of adults carry an inherent seven-point bias toward Democrats, this misleading comparison would generally either suggest movement toward the Democrat position or obscure movement toward the GOP position. Putting aside any deliberate attempt to massage the numbers for partisan advantage, what intellectual argument could there be for comparing numbers drawn from completely different universes? The answer, clearly, is some higher-level polling theory one can only understand within the confines of academia.
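To see how switching universes can manufacture a phantom trend, here's a minimal sketch in Python. All the numbers are hypothetical, invented purely for illustration; they are not taken from either Vanderbilt poll:

```python
# Hypothetical illustration: comparing two different sample universes
# can create an apparent "trend" even when no opinion has changed.

# Assume (made-up numbers) the same underlying population, where a
# position polls 7 points better among all adults than among
# registered voters -- the kind of skew described above.
support_registered = 0.45   # support among registered voters
support_adults = 0.52       # support among all adults

feb_result = support_registered   # February poll: registered voters
may_result = support_adults       # May poll: all adults

apparent_shift = (may_result - feb_result) * 100
print(f"Apparent shift: +{apparent_shift:.0f} points")
# The entire "shift" is an artifact of changing the sample universe;
# not a single respondent's opinion moved between the two polls.
```

Put the two numbers side by side, as the poll release did, and a reader sees a seven-point swing that never happened.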
This is especially perplexing, because the data on registered voters seems to exist. The Tennessean led its poll coverage with the breathless headline that Obama had "closed the gap" with Romney among adults. Much further into the story, it reported that among registered voters in the poll, Romney had a more comfortable 7-point lead. So, if you want to compare to an earlier poll of registered voters, why not just report this poll's registered-voter results?
Even when Vanderbilt released its "full results" late yesterday, it didn't include any data on registered voters. I can't even verify whether the claim that Romney led Obama by 7 among this group is accurate.
Finally, it also doesn't appear that Vanderbilt did any screening for party affiliation or ideology. Really. For a political poll. The only questions related to party affiliation asked respondents their state legislator's political party. Even assuming respondents knew that little factoid with 100% certainty, it doesn't tell us much. I suppose it could be used to weight the sample so it matches the distribution of seats in the state legislature, but I'm not sure that's meaningful. Lots of individual Democrats live in GOP districts and vice versa. Without knowing the partisan makeup of the individual respondents, it's hard to glean anything definitive from this poll. Remember, the one that's three weeks old.
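A quick back-of-the-envelope sketch shows why a respondent's district party is such a poor proxy for the respondent's own party. The seat shares and district compositions below are made-up numbers, chosen only to make the arithmetic clear:

```python
# Hypothetical: a legislature can be lopsidedly one party even though
# the underlying voters are closely divided, because many partisans
# live in districts held by the other party.

gop_seat_share = 0.70             # 70% of legislative seats held by the GOP
rep_share_in_gop_district = 0.55  # GOP districts: only 55% Republican residents
rep_share_in_dem_district = 0.45  # Dem districts: still 45% Republican residents

# Statewide Republican share implied by these assumptions:
statewide_rep = (gop_seat_share * rep_share_in_gop_district
                 + (1 - gop_seat_share) * rep_share_in_dem_district)
print(f"Seat share: {gop_seat_share:.0%}, implied voter share: {statewide_rep:.0%}")
# Weighting respondents to the 70% seat share would badly overstate
# Republican identification in a population that is only ~52% Republican.
```

That gap between seat share and voter share is exactly why "which party holds your district?" cannot substitute for asking respondents their own party identification.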