Can we trust the polls? Under the best of circumstances, the answer is “Not necessarily without a fair amount of detailed information about how they were conducted.”
Polls have saturated the media narrative – and with methodological concerns increasing within the polling industry, much more caution is needed.
Polling inaccuracy has grown hand in hand with the rise of mobile phones and the internet. Many people now have only a mobile phone, so they don’t turn up in the polls: mobile numbers are excluded from the samples that most public polling firms can buy.
A further problem linked to new technology is incoming-call screening on landlines, which alerts households to who is calling and makes it possible to avoid calls from out-of-area or unfamiliar numbers. Yet most polling still takes place over landlines, because it is cheap and easy for someone who wants to get into the polling business to buy a sample, write a short questionnaire, buy interviewing services from a telesales team, and receive a report based on the information gleaned.
As a consequence, poorly conducted polls now surface more frequently. The problem is made worse by hack journalists and news organisations who claim to “report on public opinion” but in fact wish to guide it like a tethered cow.
“THE REPORTING OF ‘POLL DATA’ AS IF IT WAS NEWS IS A PROBLEM. NEWSPAPERS, WHEN THEY CAN’T BE BOTHERED TO RESEARCH ANYTHING AND DO ANY WORK, COMMISSION AN OPINION POLL AND REPORT ITS FINDINGS AS IF IT WAS FRONT PAGE NEWS, ALREADY, BY ITSELF.
THIS IS STUPID, AND THE POLL DATA LANGUAGE OF ‘CONSENSUS’ THAT GOES WITH IT, I THINK, IS VERY STUPID TOO.” – C. HITCHENS
Such people are not generally well trained in assessing poll results and thus cannot always weed out “bad” poll results before they enter the news stream and become “fact”. So the risk is growing that bad polling data is infecting our body politic.
Neither poll consumers nor journalists who write about polls have access to quality-control criteria or certification processes by which to assess specific firms or individuals. As a result, all must rely on news organisations to evaluate polls according to whatever standards of disclosure each organisation has adopted. When you get a poll from them, you are essentially getting some of that news organisation’s propaganda. Why?
Falling response rates. The fact that no one is even answering these spam artists is a concern for the entire survey research industry, whether academic researchers, political consultants who work for candidates, or news organisations. Response rates in telephone surveys are now averaging about 10 percent, although most media polls claim to have response rates in the 30-45 percent range. Although analysts have identified many factors behind this long-term trend—such as the negative impacts of telemarketers posing as pollsters and the increased use of various call-screening devices—we don’t yet understand well how much each contributes to the overall decline.
Researchers are also beginning to understand that declining participation rates probably affect different kinds of political polls in different ways. The reliability of polls depends on the collection of high-quality data, well analysed and appropriately interpreted. It also depends on knowing the biases of the company conducting the poll, of the client commissioning it, and of the people who choose to respond to it.
Poll consumers, as ever, have no recourse but to pay as much attention as they can to where the data came from and how they were analysed. This point is doubly true today, given the failures to predict the 2015 general election, the Brexit vote and the Trump win. When so many others could clearly see what was coming from analysis and research, the methodologies used to glean these results plainly do not bear the scrutiny of the ballot box.