As many of you may be aware, the UK general election took place yesterday, resulting in a surprising victory for the Conservative Party. The pre-election opinion polls predicted that the Conservatives and Labour would be roughly equal in terms of votes cast, with perhaps a small Conservative advantage leading to a hung parliament; instead the Conservatives got 36.9% of the vote to Labour's 30.4%, and won the election outright.

There has already been a lot of discussion about why the polls were wrong, from methodological problems to incorrect adjustments. But perhaps more interesting is the possibility that the polls were right! For example, Survation did a poll on the evening before the election, which predicted the correct result (Conservatives 37%, Labour 31%). However, that poll was never published because the results seemed "out of line." Survation didn't want to look silly by breaking with the herd, so they just kept quiet about their results. Naturally this makes me wonder about the existence of other unpublished polls with similar readings.

This seems to be a case of two well-known problems colliding with devastating effect. Conformity bias caused Survation to ignore the data and go with what they "knew" to be the case (for which they have now paid dearly). And then the file drawer effect meant that the generally available data was skewed, misleading third parties. The scientific thing to do is to publish all data, including "outliers," both so that information can change over time rather than stay anchored, and to avoid artificially compressing the variance. Interestingly, the exit poll, whose methodology was agreed beforehand and whose publication was committed to in advance, was basically right.
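To make the point about compressed variance concrete, here is a toy simulation (in Python; every number in it is invented for the example) of what happens when pollsters file-drawer any result more than a couple of points away from the perceived consensus:

```python
import random
import statistics

# Toy setup (all numbers invented): the true Conservative lead is 6 points,
# and each poll measures it with ~2 points of sampling noise.
random.seed(0)
true_lead = 6.0
polls = [random.gauss(true_lead, 2.0) for _ in range(1000)]

# Herding: only results within 2 points of the perceived consensus
# (a near-tied race, lead of about 1 point) get published.
consensus_lead = 1.0
published = [p for p in polls if abs(p - consensus_lead) <= 2.0]

print("all polls:       mean %.1f, stdev %.1f"
      % (statistics.mean(polls), statistics.stdev(polls)))
print("published polls: mean %.1f, stdev %.1f"
      % (statistics.mean(published), statistics.stdev(published)))
# The published subset is anchored near the consensus and shows far less
# spread than the full set: skewed data and artificially compressed variance.
```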

This is now the third time in living memory that opinion polls have been embarrassingly wrong about the UK general election. Each time this has led to big changes in the polling industry. I would suggest that one important scientific improvement is for polling companies to announce the methodology of a poll, and any adjustments to be made, before the poll takes place, and to commit to publishing all polls they carry out. Once this became the norm, data from any polling company that didn't follow this practice would be rightly seen as unreliable by comparison.

25 comments

The, ahem, money quote:

Interestingly, those who invested their own money in forecasting the outcome performed a lot better in predicting what would happen than did the pollsters. The betting markets had the Conservatives well ahead in the number of seats they would win right through the campaign and were unmoved in this belief throughout. Polls went up, polls went down, but the betting markets had made their mind up. The Tories, they were convinced, were going to win significantly more seats than Labour.

A bet is a tax on bullshit :-)

FWIW, betting odds were almost exactly 50:50 on who would be prime minister after the election, right up until the BBC exit poll was announced.

I just read the Wikipedia article on Prediction Markets, and thought others might find it interesting.

Basically, people can bet monopoly money or only pennies, and they will still make significantly better predictions than if you just ask them to guess. Some will use tools like Applied Information Economics to make surprisingly accurate predictions, at least on average. Others will still just guess, but they'll have a calibrated probability assessment, or perhaps will just be playing the game for the sake of better calibrating their probability assessments.

At some point, I need to calibrate my probability assessments, and put all this LW theory into practice.
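If anyone wants a concrete starting point, here is a minimal sketch (the predictions are invented examples) of one standard way to score calibration, the Brier score:

```python
# Brier score: mean squared error between stated probabilities and outcomes.
# 0 is perfect; always answering 50% scores 0.25; lower is better.
predictions = [
    (0.90, True),   # "90%: Conservatives win the most seats" -- happened
    (0.70, False),  # "70%: Lib Dems hold 20+ seats"          -- didn't
    (0.60, True),   # "60%: turnout is above 65%"             -- happened
]

brier = sum((p - (1.0 if happened else 0.0)) ** 2
            for p, happened in predictions) / len(predictions)
print("Brier score: %.3f" % brier)  # 0.220 for these three predictions
```

Track enough of these and the score starts to tell you whether your 70%s really come true about 70% of the time.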

I was looking at Paddy Power (they take bets on the election), and while they were more accurate than the polls, they were still underestimating the number of seats the Tories were going to get by ~50. I think the safest bet the day before was the Tories in a coalition.

Yes, the bookies performed better, but still not particularly well.

It has been pointed out that it's hard to predict, especially the future :-)

Discussed on More or Less, the BBC's programme for statistics nerds.

If you want to improve the work the polling industry does, it would make sense to ask why they do election polls in the first place.

Are they paid by someone to carry out the poll? If so, who pays? Do they do it to get their name out there and get paid for other polls?

Each time this has led to big changes in the polling industry. I would suggest that one important scientific improvement is for polling companies to announce the methodology of a poll, and any adjustments to be made, before the poll takes place, and to commit to publishing all polls they carry out.

This would be a great change, but if they are worried about looking silly for having an incorrect outlier it may be very hard to create an incentive structure to get them to actually do this.

What do you mean by an outlier in this context? That they screwed up in their methodology, or that by random chance they sampled all the old curmudgeons in some district, but none of the bright young things?

Essentially the second, although the first could be a problem also. The point is that the incentive structure isn't there for them to precommit.


A third effect could be that Conservative supporters are less likely to admit their support openly, as it is considered less "cool". If you polled people on their favourite food, Big Macs would be underreported, because there is a bit of a status stigma against admitting to liking them.

A similar potential effect would be replying with the name of a "cool" party, then not voting at all because of having something else to do, such as work (the election was held on a workday, which seems a bit odd to me; why not a Sunday?).

I wonder if there's some element of cause and effect at work here.

Let's say that I'm a British citizen who supported the Labour party before the election. As I watch the BBC, I see that all the polls show that the Labour party will do well.

Does this affect my choice of whether or not to vote?

Personally, I live in a (very) Democratic state in the US, to the point where I don't even bother voting for state officials. The "one person can make a difference" argument doesn't seem to hold up for me in the voting booth.

In short: how much does what the polls say affect the actual voting? Is there some way to measure this?

[This comment is no longer endorsed by its author]

Very good and important points. Not much to add, except a brag that the Bayes points I made from betting on a Tory majority should keep me on top of the leaderboard of the prediction market at the Citadel rationalist house for another month. In honesty, this probably had less to do with the superiority of my Bayes-fu and more to do with blind luck.

[This comment is no longer endorsed by its author]

I think there used to be a name for a particular polling effect in the US where black candidates did better in polling than in actual votes.

Here it is: https://en.wikipedia.org/wiki/Bradley_effect

Which some theorized was based on social desirability bias: https://en.wikipedia.org/wiki/Social_desirability_bias

In the US, the media tends to be biased against conservatives, and from the little I've seen of the BBC, it's even more biased.

With such a media bias, I'd expect conservatives to tend to poll lower than the election results, particularly in polls from media orgs, who tend to predict according to their own biases.

It might be that a conservative is simply less likely to give a stranger who calls them on the phone ten minutes of their time for free.

In the US, the media tends to be biased against conservatives,

Is that a fact?

So people have opinions. That's not proof of biased reporting. These people have bosses, including Rupert Murdoch.

Yes, in the UK this is called the "Shy Tory" effect. I don't think all the media here is biased against the Conservative Party (some are pro, some are con) but you are quite right that the BBC (which is easily the largest and most powerful media organisation) is hostile. It's possible this is (part of) the cause of this effect, and it's notable that the three general elections where the polls messed up all did so by underestimating the Conservatives and overestimating Labour.

But I don't know that this is a case of the polls getting it wrong. If this was a case of the "Shy Tory" effect, why did the exit poll (and the unpublished Survation poll) get it right? By comparison, in 1992 the exit poll was wrong.

Whilst the BBC may lean towards the left and be a powerful organisation, I doubt that the effect it had was anywhere near as great as the combined forces of the Daily Mail, The Sun, the Daily Telegraph, and The Times. These newspapers, which account for a huge percentage of the UK's circulation, conducted what I can justifiably describe as a co-ordinated campaign of vitriolic rhetoric against the Labour party and Ed Miliband's character. It's telling that two of the four are owned by Rupert Murdoch, who has a lot to gain from a continued Conservative government.

There is simply no way the BBC would be able to pull the same kind of trick, as their reputation rests on them at least appearing politically neutral. It can't have been that hard to come up with some derisive slogan regarding the welfare cuts Cameron is inevitably going to try and make, but the country (both left and right) would be up in arms if it was seen to be the opinion of the BBC. The only left-leaning newspaper with a comparable circulation is the Daily Mirror, which ran the headline "Keep Cameron Out". I haven't seen this article myself, but I doubt it has the same attacks on character as the right-wing papers.

I'd be grateful if someone could give/suggest a reason for the downvotes.

I'd guess it's a combination of several things:

First, you're not really adding anything to the overall debate (why the polls were wrong). Second, your post is mostly about politics, and about how the party you disagree with is evil (or at least engages in ethically questionable behavior). Third, your bias is evident (why do you assume the left-leaning newspaper wouldn't engage in character attacks? As written, it comes off as little more than "My side is better than their side, and wouldn't resort to such tactics").

Don't stress the voting too much, though. It doesn't mean "You've done something wrong" (in general), it means "I don't want to see this kind of thing". Look at what you've posted - if you care what other people want to see - and don't write that kind of thing anymore. That, of course, depends on being able to tell what it is you've actually done.

Yes, in the UK this is called the "Shy Tory" effect.

Anyone got the same effect in other countries?

There may be another side to it: the Tories do worse in cities and better in the country.

BBC offices tend to be in cities, so their staff are going to be surrounded by more Lab/Lib people. Meanwhile, apart from debates (where the audience is selected to match national statistics), BBC studio audiences tend to be people who live in cities and are willing to stand in line for a long time for free tickets, so they are more likely to be young and poorer, which makes them more likely to be Lab/Lib.