The evidence that I am going to give you now will provide ammunition for those who don’t like opinion polls, think the numbers mean little and reckon our political culture would be healthier without them.
Of course, that’s not how I view this evidence. I believe it shows the need to read poll results more carefully. We need to distinguish between top-of-the-head answers to questions asked out of the blue, and how people react once they have thought about the matter in hand.
Here’s what we did. To inaugurate a new partnership between YouGov and the department of politics at Cambridge University, we tested how the wording and context of a question can shape responses. We all know how results can be biased by loaded questions, such as “Do you think Britain should allow the tinpot dictators in Brussels to tell us what to do?” So we asked alternative, unbiased questions. We chose the BBC licence fee: do people regard it as good or bad value? We posed the question in nine different ways, with nine different samples of more than 2,000 people, in late August and September.
The graphs overleaf show what we found. When we asked the simplest question: “Overall, do you think the BBC licence fee is good or bad value for money?” we found that equal numbers of people replied “good” and “bad.” This yields a net score (good minus bad) of zero.
But when we started adding information, the figures changed. Reminded that the licence fee costs £145.50 a year, people told us by two-to-one that it was bad value: a net score of minus 27. When we divided the annual sum into smaller timescales, responses became steadily more positive. Told that the licence fee works out at 40p a day, people gave a net score of plus 8.
Next, we added in the cost of a Sky package, quoting figures (as Sky does) as monthly subscriptions. Told that a basic Sky package costs £19.50 a month and the licence fee £12.13, people gave Sky a positive net score and the BBC a negative one. But when we mentioned that a premium Sky package (including sports and movies) costs £55.75 a month, the BBC’s net score turned positive and Sky’s turned negative.
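For readers who want to check the sums, here is a minimal sketch (in Python, my choice of notation rather than anything used in the survey) showing how the single £145.50 annual figure produces the daily and monthly framings quoted above, and how the net score is calculated:

```python
# A quick check of the framing arithmetic behind the survey questions.
# The £145.50 annual fee is the figure quoted in the article; the helper
# below is illustrative, not part of the survey itself.

ANNUAL_FEE = 145.50  # BBC licence fee, pounds per year, as quoted in the text

daily = ANNUAL_FEE / 365    # 0.3986... pounds, i.e. roughly 40p a day
monthly = ANNUAL_FEE / 12   # 12.125 pounds, quoted in the survey as £12.13 a month

def net_score(pct_good: float, pct_bad: float) -> float:
    """Net score as used in the article: % saying 'good value' minus % saying 'bad value'."""
    return pct_good - pct_bad

print(f"Daily framing:   {daily:.3f} GBP (about 40p)")
print(f"Monthly framing: {monthly:.3f} GBP")
print(f"Baseline net score (even split): {net_score(40, 40)}")
```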
Finally, we asked a warm-up question: how people feel about the service provided by the BBC. Three-quarters of the public say they are satisfied. Having expressed satisfaction, people are then more likely to say the licence fee is good value. When the satisfaction question was linked to the 40p-a-day cost, the net score soared to plus 25.
In short, responses ranged from two-to-one saying good value to two-to-one saying bad value. Critics will say this renders such polls meaningless. I disagree. The figures help to explain, for example, the government’s success in freezing the licence fee last year. One reason ministers seemed to have popular support was that public debate centred on the high-sounding annual fee rather than its daily equivalent.
The deeper point is this. A typical poll question offers people between two and five possible answers. Except for respondents who answer “don’t know,” their views are reduced to a single, simple word or phrase: “doing well,” “oppose,” or whatever. The results frequently arouse media interest. Indeed, YouGov is often commissioned to ask stark questions to generate bold headlines and clear findings that this politician is hated or that cause is doomed.
It’s not that these headlines or allegations are wrong, but they are often too crude. A single question, or even a short sequence of questions, will seldom tell us all we need to know. One respondent may be knowledgeable, passionate and certain in his or her opinions; another, misinformed, indifferent and uncertain. Further thought, or the injection of particular facts, will cause one person to change his or her mind, and leave another unmoved. Unless we explore such matters, we are missing half the story.
Recently YouGov asked one of its routine questions about Nick Clegg and found him less unpopular than usual. How come? This time, we preceded this question with others about him and his party. I believe this affected his rating. We should not label one result as “right” and the other “wrong.” Rather we should regard the two sets of figures as telling us something important: that when the public spends a little time thinking about the Lib Dem leader, some of them start to warm to him. This is one reason why the trajectory of British politics over the years ahead could surprise us all.
A more sustained example of polling variation took place last winter ahead of May’s referendum on the alternative vote (AV). Until the final few weeks, YouGov’s standard question included a brief description of AV and asked people whether they wanted to switch to this system or stick with first-past-the-post. We found the “no” camp moving into the lead last September, and never losing it after that.
Other pollsters asked a simpler question, with no explanation. They showed the “yes” camp ahead until March. I lost count of the times that “yes” supporters complained to me about YouGov’s approach. In terms of technical purity, they had a case. By adding information, we altered the way some people responded. But that is precisely what made our polls last winter a more useful guide to the eventual outcome.
Nobody should have been surprised. Other research showed widespread ignorance of how AV worked. This meant that simpler questions, asked months ahead of the referendum, were lousy predictors of what would happen. Think about the process: people were called out of the blue and asked to give a view on a subject they probably neither understood nor cared about. As knowledge of AV spread, support evaporated. By adding a brief explanation, our question effectively anticipated this trend.
None of this will surprise YouGov’s corporate clients. They pay for insight, not headlines. They want to understand what makes their customers tick. Some people dislike applying commercial notions to debates about values and ideology. But shouldn’t we explore how voters make their choices at least as thoroughly as the way consumers choose their brand of dog food?