Why the Polls Got It Wrong Over the UK General Election and Why It Matters

Polling before UK Election Day proved surprisingly inaccurate at predicting the final result; the British Polling Council is reportedly setting up an independent enquiry into what went wrong.

The BBC Poll of Polls up until May 6th put the Conservatives on 34% of the vote share, Labour on 33%, UKIP on 13% and the Liberal Democrats on 8%.

At the Election, the Conservatives got 36.9%, Labour 30.4%, UKIP 12.6% and the Liberal Democrats 7.9%, so pre-election polls were accurate about the smaller parties but badly understated the Conservative lead over Labour.
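To put the miss in perspective, here is a minimal sketch (in Python, using only the figures quoted above) of each party's polling error and, crucially, of the error on the Conservative-Labour gap.

```python
# Compare the BBC Poll of Polls (to 6 May) with the actual 2015 vote shares.
poll   = {"CON": 34.0, "LAB": 33.0, "UKIP": 13.0, "LD": 8.0}   # % share
actual = {"CON": 36.9, "LAB": 30.4, "UKIP": 12.6, "LD": 7.9}   # % share

for party in poll:
    error = poll[party] - actual[party]
    print(f"{party}: polled {poll[party]:.1f}%, got {actual[party]:.1f}%, "
          f"error {error:+.1f} points")

# The striking miss is on the gap between the two main parties:
poll_lead   = poll["CON"] - poll["LAB"]        # 1.0 point
actual_lead = actual["CON"] - actual["LAB"]    # 6.5 points
print(f"Conservative lead: polled {poll_lead:+.1f}, actual {actual_lead:+.1f}")
```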

A recent scientific study concludes that British polling has, at times, over-estimated Labour support (and underestimated the Conservative vote). The study, published in the academic journal 'Electoral Studies', points out that this tendency was exhibited most dramatically in the 1992 General Election, when polling predicted a Labour victory.

Stephen Fisher, Robert Ford, Will Jennings, Mark Pickup and Christopher Wlezien conducted the study of forecasting election-day vote share from polls, using polling data covering 16 UK elections from 1950 to 2005.

The results indicate that polls before elections have been fairly good predictors of the vote for the Conservatives and Liberal Democrats, but somewhat less so for Labour.

Entitled 'From polls to votes to seats: Forecasting the 2010 British general election', the study points out that polling tells us how people would vote if the election were held 'tomorrow', not how they will vote when the election actually occurs.

Between the day of the poll and the election, events can change voters' minds. Pre-election poll leads also tend to fade, an effect regularly observed in US presidential elections.
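The paper's forecasting model is considerably more sophisticated, but the rough idea of turning a current poll share into an election-day forecast, with leads shrinking towards historical form, can be sketched as a simple regression on past elections. All the numbers below are invented for illustration and are not the authors' data or estimates.

```python
# Illustrative sketch only (not the model in the Electoral Studies paper):
# fit vote = a + b * poll on past elections; a slope b < 1 captures the
# tendency of poll leads to fade by election day.
import numpy as np

past_poll = np.array([44.0, 38.5, 31.0, 42.5, 36.0])   # hypothetical poll shares
past_vote = np.array([42.0, 37.5, 32.5, 41.0, 36.5])   # hypothetical results

b, a = np.polyfit(past_poll, past_vote, 1)   # least-squares slope and intercept
forecast = a + b * 33.0                      # forecast for a party polling 33%
print(f"slope b = {b:.2f}, forecast for a 33% poll: {forecast:.1f}%")
```

With these made-up figures the fitted slope is below one, so a party polling above its historical average is forecast to fall back towards it, which is exactly the fading-lead pattern the study describes.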

Another possibility is that smaller parties are starved of media coverage during normal politics but benefit from increased exposure during election campaigns, when the media must legally give equal coverage to all the main parties.

The study, published by authors from the Universities of Oxford, Manchester and Southampton in the UK, Temple University and the University of Texas at Austin in the USA, and Simon Fraser University in Canada, argues that governing parties may be subject to particular campaign effects: voters might use polls before the election to protest against the government, but come to a more 'considered' judgement on election day.

Another possible explanation for polling inaccuracy has been uncovered by new research revealing that individuals under high stress tend not to vote. The study entitled 'Cortisol and politics: Variance in voting behaviour is predicted by baseline cortisol levels', finds that lower baseline salivary cortisol (a stress hormone) in the late afternoon was significantly associated with increased actual voting frequency in six national US elections.

Conducted at the University of Nebraska at Omaha, the University of Nebraska-Lincoln and Rice University in Houston, the study found that although more stress meant less voting, elevated stress hormone levels were not linked to whether participants had attended a political meeting or rally, or taken part in any other political activity.

The authors, Jeffrey French, Kevin Smith, John Alford, Adam Guck, Andrew Birnie and John Hibbing, found that variation in voting specifically, but not in other political behaviour, is significantly predicted by this stress hormone.

The study, published in the academic journal 'Physiology & Behavior', argues that voting typically involves decision-making and emotional conflict, which may put off those with elevated afternoon stress hormones.

So what people say to pollsters may be very different to what they actually do, particularly if they are stressed.

But even exit polls - the supposed gold standard - have got it badly wrong in the past.

Jose Pavia from the Department of Applied Economics, University of Valencia, Spain, in an investigation entitled 'Improving predictive accuracy of exit polls', points out that striking errors in exit polls have been found all over the world.

Striking errors occurred, for instance, in the 2004 Indian Legislative elections, the 2007 Croatian Parliamentary elections and the 2008 Paraguayan Presidential elections.

In the USA in 2000, exit polls famously failed to predict the election results in Florida and some other states, and in 2004 Kerry was wrongly projected to win the electoral college and the popular vote. In fact, the Bush vote was underestimated in 41 of the 50 exit polls, and the winner was incorrectly identified in five states.

Published in the 'International Journal of Forecasting', the study points out that 'nonresponse bias' is possibly a key reason for inaccurate exit polls: it emerges when voters of different parties differ in their propensity to respond to the exit poll. Furthermore, nonresponse is usually very high in exit polls.
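A toy calculation makes the mechanism concrete: if one party's voters are simply less willing to answer, the exit poll understates that party even when every respondent reports their vote honestly. The figures below are invented purely for illustration.

```python
# Toy illustration of nonresponse bias in an exit poll (all numbers invented).
true_share    = {"A": 0.52, "B": 0.48}   # actual vote shares of two parties
response_rate = {"A": 0.40, "B": 0.55}   # party A's voters respond less often

# Expected composition of the voters who agree to be interviewed:
responding = {p: true_share[p] * response_rate[p] for p in true_share}
total = sum(responding.values())
exit_poll_estimate = {p: responding[p] / total for p in responding}

print(exit_poll_estimate)
# Party A's true 52% shows up as roughly 44% in the exit poll,
# even though every respondent reports their vote truthfully.
```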

Jose Pavia also points out that recent research suggests political preferences have genetic bases, so willingness to cooperate with pollsters might be influenced by the same personality features. If true, polls could be systematically biased against accurately measuring certain political viewpoints.

But does any of this arcane statistical debate actually matter?

A study entitled 'Exit Polls, Turnout, and Bandwagon Voting: Evidence from a Natural Experiment' exploited a voting reform in France to estimate the effect of exit poll information on voting.

Before the change in legislation, those in some French overseas territories voted after the election result had already been made public, via exit poll information from mainland France.

The authors of the study, Rebecca Morton, Daniel Muller, Lionel Page and Benno Torgler, estimate that knowing the exit poll decreases voter turnout by about 11 percentage points.
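The logic of that natural-experiment estimate can be sketched as a simple difference-in-differences calculation: compare the change in turnout in territories affected by the reform with the change in territories that were never exposed to early results. The turnout figures below are invented purely to illustrate the arithmetic and are not the paper's data.

```python
# Toy difference-in-differences sketch (all turnout figures invented).
# "Treated" territories voted with exit-poll information before the reform
# and without it afterwards; "control" territories were never exposed.
turnout = {
    ("treated", "before_reform"): 0.51,   # voting after mainland results known
    ("treated", "after_reform"):  0.62,   # voting before results are public
    ("control", "before_reform"): 0.70,
    ("control", "after_reform"):  0.70,
}

change_treated = turnout[("treated", "after_reform")] - turnout[("treated", "before_reform")]
change_control = turnout[("control", "after_reform")] - turnout[("control", "before_reform")]
effect = change_treated - change_control
print(f"Estimated effect of removing exit-poll information on turnout: {effect:+.2f}")
# With these invented numbers the estimate is +0.11, i.e. knowing the
# result ahead of voting depressed turnout by about 11 percentage points.
```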

The study, published in the academic journal, 'European Economic Review', is the first outside of the laboratory to demonstrate the effect of polls on voter turnout.

The authors, from New York University, University of Mannheim and Queensland University of Technology, find exit poll information significantly increases 'bandwagon' voting; voters who choose to turn out are more likely to vote for the expected winner.

The authors point out that, in August 2009, exit poll results for key regional elections in Germany were leaked on Twitter before voting ended. Such reporting is against German law and carries a fine of up to 50,000 euros.

A survey of 66 countries worldwide finds that, of the 59 that permit exit polls during an election, 41 prohibit publication of results until after all voting has ended. Yet in recent elections, incidents similar to the 2009 Twitter controversy in Germany abound. One candidate is reportedly being investigated by police for allegedly tweeting exit poll information before voting closed on the most recent UK Election Day.

In 2007, the websites of several Swiss and Belgian newspapers crashed when French citizens attempted to access exit poll results during an election, and in the 2012 French presidential election, results were also available online while voting was still in progress.

If polls and their inaccuracies disrupt voting, and if this effect gets amplified through social media and technological advances, genuine democracy will crash, and may never re-boot.
