You'd be hard-pressed to talk me into a mail or telephone survey these days. From the Chronicle of Higher Education:
... a host of researchers, spurred by the rising cost of telephone polls and plummeting participation rates, are pushing to use a new generation of online-only surveys for their research. This work, which relies on subjects volunteering to be polled, carries great promise of allowing researchers to expand experiments beyond the usual suspects. But it also carries perils. ...
"Right now we don't know if the methods they employ are, or are not, going to have a catastrophic failure," says Robert Santos, chief methodologist at the Urban Institute, a nonpartisan think tank.
"The vast majority of social scientists would say these volunteer polls have no reason to be right," says Robert M. Groves, provost of Georgetown University and a former director of the U.S. Census Bureau. "They have no theory behind them at all."
Such arguments nag at Andrew Gelman, a professor of statistics and political science at Columbia University. "There's a reason why people aren't sticking with the old stuff," he says. Every survey now requires massaging data to account for low response rates; the ideal poll no longer exists. Researchers need new methods, he says. "Traditional polls are not so wonderful."
This simmering debate popped into public view this month, when the American Association for Public Opinion Research, the discipline's professional body, sent a letter to The New York Times warning against its recent use of online surveys. The letter was broadly written and seemed to indict the whole discipline of online polls.
Members of the association's email list were soon in a furious internal debate, and Mr. Gelman prominently criticized the statement. The association's president, Michael W. Link, regrets some of the language chosen for the letter. It was meant as a caution to the public—especially news outlets—and not as a condemnation of the research, he says. "Maybe the statement could have been a little clearer."
However it was meant, the letter highlighted the curious state in which survey research finds itself. As Mr. Groves has written, if there were a war between our guts and our statistics, the quants have won. Data are the currency of business, government, science, even higher education. There has never been more interest in polling; following Nate Silver, the media have rowed toward data analysis. The truth, if it can be found, simply must be hidden among those numbers.
Yet at this moment of demand, polling is in crisis. The costs have spiraled out of control. The public is harder than ever to reach. Landlines are dwindling, and rare is the person who takes an unknown call on her cellphone. Robocalls and junk polls clog the air. We all want to know what the public thinks—but who has the time to talk?
There is lots of good stuff in the rest of the piece.
I've used data collected by, in order of cost, Knowledge Networks, Online Survey Solution, and SurveyMonkey (which bought Zoomerang). I endorse the first two but won't use SurveyMonkey's panel again, at least for surveys with hypothetical questions; I've presented a paper comparing telephone, mail, and online panels at a couple of conferences that explains why. I'm involved in data collection using the ridiculously cheap Mechanical Turk this summer, but I haven't seen the results yet. I've also heard good things about Survey Sampling's panel.
And today, I'll be in the field supervising an in-person survey of people lined up waiting to get into the High Country Beer Fest. We have a five-question postcard survey to pass out, where the main question is something like: if the ticket had cost $X instead of $40, would you still have shown up today? And in about a week, we hope to survey those who bought their tickets online. I've done a bunch of these tourism event surveys and have always gotten a good response rate.
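For readers curious what you can do with a question like that: if each respondent is shown a single hypothetical price, tallying the "yes" share at each price traces out a rough demand curve for the event. Here's a minimal sketch of that tabulation; the prices and responses below are entirely made up for illustration, not actual Beer Fest data.

```python
# Hypothetical sketch: tabulating answers to a postcard question of the form
# "If the ticket had cost $X instead of $40, would you still have shown up?"
# Each respondent sees one assigned price X. All data here are invented.
from collections import defaultdict

responses = [  # (assigned price, answered yes?)
    (40, True), (40, True), (40, False),
    (50, True), (50, False), (50, False),
    (60, True), (60, False), (60, False), (60, False),
]

yes = defaultdict(int)    # count of "yes" answers at each price
total = defaultdict(int)  # count of all answers at each price

for price, would_attend in responses:
    total[price] += 1
    yes[price] += would_attend

# The yes-share at each price is a crude point on the demand curve.
for price in sorted(total):
    share = yes[price] / total[price]
    print(f"${price}: {share:.0%} would still attend")
```

With a real sample you'd want larger cells at each price point, and you'd worry about hypothetical bias (people say "yes" more readily than they pay), which is presumably part of what the panel-comparison paper mentioned above gets at.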