The certainty of chance

September 6th, 2008 by Ben Goldacre in bbc, statistics, times

Ben Goldacre
The Guardian,
Saturday September 6 2008

Britain’s happiest places have been mapped by scientists, according to the BBC: Edinburgh is the most miserable place in the country, and they were overbrimming with technical details on exactly how miserable we are in each area of Britain. The story struck a chord, and was lifted by journalists throughout the nation, as we cheerfully castigated ourselves. “Misera-Poole?” asked the Dorset Echo. “No smiles in Donny,” said Doncaster Today.

From the Bromley Times, through Bexley, Dartford and Gravesham, to the Hampshire Chronicle, everyone was keen to analyse and explain their ranking. “Basingstoke lacks any sense of community or heart,” said Reverend Dr Derek Overfield, industrial chaplain for the area. And so on.

Exactly what kind of data is the good reverend explaining there? The Times had some methodological information. “Researchers at Sheffield and Manchester universities based their findings on more than 5,000 responses from the annual British Household Panel Survey.” According to the BBC it was presented in a lecture at some geographical society. “However,” they said quietly, “the researchers stress that the variations between different places in Britain are not statistically significant.”

Here, nestled away, halfway through their gushing barrage of data and facts, was an unmarked confession: this entire news story was based on nothing more than random variation.

There are many reasons why you might see differences between different areas in your survey data on how miserable people are, and people being differently miserable is only one explanation. There might also be, of course, the play of chance: 5,000 people in 274 areas doesn’t give you many in each town – fewer than 20, in fact – so you might just happen to have picked out more miserable people in Edinburgh, and miss the fact that misery is, in fact, uniformly distributed throughout the country.
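A quick simulation makes the point. This is my own sketch, not the researchers' method: misery scores and their distribution are invented, but the sample sizes mirror the story (roughly 5,000 respondents spread over 274 areas, so about 18 per area). Every simulated person draws from the same national distribution, yet the area averages still spread out convincingly:

```python
import random
import statistics

random.seed(1)

AREAS = 274
PER_AREA = 18  # roughly 5,000 respondents / 274 areas

def misery_score():
    # Everyone, everywhere, draws from the SAME distribution:
    # by construction, misery is uniform across the country.
    # (Mean 5, sd 2 are arbitrary choices for illustration.)
    return random.gauss(5.0, 2.0)

area_means = [
    statistics.mean(misery_score() for _ in range(PER_AREA))
    for _ in range(AREAS)
]

print(f"'happiest' area mean:  {min(area_means):.2f}")
print(f"'gloomiest' area mean: {max(area_means):.2f}")
print(f"spread from chance alone: {max(area_means) - min(area_means):.2f}")
```

With only ~18 people per area, the standard error of each area's mean is large, so a league table built from these averages ranks noise: rerun with a different seed and a different town comes bottom.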

This is called sampling error, and it quietly undermines almost every piece of survey data ever covered in any newspaper. Although the phenomenon has spawned a fiendish area of applied maths called “statistics”, the basic principles are best understood with a simple game.

Dr Deming was a charismatic management guru who railed against performance-related pay on the grounds that it arbitrarily rewarded luck.

Working in a theatrical field, he demonstrated his ideas with a simple piece of stagecraft he called The Red Bead Experiment.

Deming would appear at management conferences with a big trough containing thousands of beads which were mostly white, but 20% were red.

Eight volunteers were then invited up on stage from the audience of management drones: three to be managers, and five to be workers.

“Your job,” Deming explained solemnly, “is to make white beads.”

(Fast forward to 4m30s in the video below…)



He then produced a paddle with 50 holes cut into it, which was passed to each worker in turn. They dipped the paddle into the trough, wiggled it around, and tried to produce as many white beads as they could manage, through this entirely random process.

“Go and show the inspectors,” he would say, sternly. “Only five red beads, well done!”

“14 red beads? I think we need to re-evaluate your skill set.”

Workers were sacked, promoted, retrained and redeployed, to great amusement.
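The experiment above is easy to replay in code. This is a sketch, not Deming's own apparatus: the worker names are hypothetical, and I approximate a dip of the 50-hole paddle as 50 independent draws with a 20% chance of red each (the real trough is finite, but with thousands of beads the approximation is close):

```python
import random

random.seed(7)

PADDLE_HOLES = 50
RED_FRACTION = 0.20  # 20% of the beads in the trough are red

def draw_paddle():
    """One dip of the paddle: each hole fills red with probability 0.2."""
    return sum(random.random() < RED_FRACTION for _ in range(PADDLE_HOLES))

workers = ["Anna", "Bob", "Cara", "Dev", "Ed"]  # hypothetical names
for day in range(1, 4):
    results = {w: draw_paddle() for w in workers}
    print(f"Day {day}:", results)

# The expected count is 10 red beads per paddle (50 x 0.2), but single
# dips routinely land anywhere from about 4 to 16 -- the "skill"
# differences that get workers sacked or promoted are pure chance.
```

Run it a few times and yesterday's star performer reliably becomes tomorrow's disciplinary case, which was exactly Deming's point about performance-related pay.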

We ignore basic principles like sampling error at our peril, because the illusion of control, which we all carry around for the sake of sanity, is more powerful than we think, and countless workers have had their lives turned to misery for the simple crime of pulling out 15 red beads.

Back in the world of misery, were the journalists blameless, and guilty only of ignorance? For any individual, nobody can tell.

But Dr Dimitris Ballas, the academic who did the research, had this to say: “I tried to explain issues of significance to the journalists who interviewed me. Most did not want to know.”

· Please send your bad science to


Gimpy bothered to email the BBC, and they emailed him back. Bad BBC.

Meanwhile here is the fun you could be having in the Bad Science Forum with fellow nerds, pedants, cranks and geniuses:

If you like what I do, and you want me to do more, you can: buy my books Bad Science and Bad Pharma, give them to your friends, put them on your reading list, employ me to do a talk, or tweet this article to your friends. Thanks!

18 Responses

  1. gimpyblog said,

    September 6, 2008 at 7:00 am

    Ben, apgaylard and Dougal Stanton have also covered this.

  2. Weirdbeard said,

    September 6, 2008 at 9:41 am

The Beeb had a ‘Have your say’ attached to this article. I don’t normally read these as they’re nearly all from the “It’s common sense innit?” brigade. Sure enough it was full of people either agreeing that the article was just reporting the obvious or people disagreeing because, well, in their opinion their town was the happiest. I added my bit, pointing out that the article had said that the result was not significant. The moderators thought it not worth publishing. When I looked back a couple of days later there was precisely one response (not mine) pointing out that it was a non-story. Mustn’t get in the way of a good story must we?

  3. SubMoron said,

    September 6, 2008 at 11:38 am

    Didn’t want to know because it would have spoiled the story?

  4. David Mingay said,

    September 6, 2008 at 2:03 pm

    Which over-enthusiastic press office put this out in the first place, I wondered. I’ve had a look at the output from Manchester and Sheffield universities. Not guilty. But there’s this from the Royal Geographical Society:

    It’s not till note to editors number 7 that the truth comes out.

  5. oneoffmanmental said,

    September 6, 2008 at 2:33 pm

    I don’t think the BBC is entirely to blame. The Royal Geographical Society spun the story, with only tiny footnotes saying it’s all bollocks. No surprise that Ballas’ work is influential in that Cities Unlimited report either?

Also, the authors published their stats on arXiv, not in a journal.

  6. BenN said,

    September 6, 2008 at 5:28 pm

    Nice Divine Comedy reference (oh, and interesting article, as always, thanks).

  7. mrmuz said,

    September 7, 2008 at 7:35 am

    That Deming guy is a menace. Getting rid of evaluations and promoting co-operation! Why, there’d be nothing left for most managers to do. Think of the unemployment!

  8. isitmedicine said,

    September 7, 2008 at 5:32 pm

    You might be interested to know that last week the BBC News website had in its ‘most read’ box a series of articles by a Michael Blastland explaining the myths behind statistics, how to read them, etc – similar in a sense to this column – part 5 with links to the other parts here:

    Maybe the BBC journos should be reading the BBC’s own popular stories.

  9. Dudley said,

    September 7, 2008 at 6:07 pm

I also bought “How to Lie with Statistics” – very good, but a slightly misleading title. It hasn’t taught me how to lie with statistics at all.

  10. Pro-reason said,

    September 8, 2008 at 7:42 am

    I think I agree with those businessmen over Deming.

I think that Deming is against penalising people for red beads simply because it is unfair. However, from a business point of view, it doesn’t matter that your rewards and punishments are sometimes arbitrary. All that matters is that the workers understand that they need to do whatever they can to achieve results.

    Oderint dum metuant, as some shite once rightly said.

    Business isn’t about fairness and co-operation. It’s about “do this because I say so, and I can fire you”. It’s unfair and illogical to begin with, so making a fair and logical version of it won’t succeed.

    That’s not to say he doesn’t have some good practical ideas, though.

  11. JMS said,

    September 8, 2008 at 11:21 am

@ Pro-reason

I don’t think Deming is against performance related pay simply because it is unfair, but because it is unproductive. In the red bead experiment suppose that there was some subtle way to influence the outcome; say the red beads were slightly magnetic, for example. Now what would be the best strategy for a smart worker who discovered this? They would not tell their colleagues, because then they would lose their advantage. They would not tell the company, because then there is no game/experiment and no extra pay. Worse (from the company’s point of view), the smart worker would have to be careful not to exploit his advantage too much for fear of being found out. If he/she won best employee every time people would begin to suspect and try to discover the secret.

The “treat ’em mean to keep ’em keen” school of management is really about enforcing and maintaining power relationships, not about maximizing profits. That may be relevant to military situations, but if you are trying to improve production or quality, cooperation is by far the best way. The people who know best how to do the job are usually the ones doing it, not those trying to manage the process. In my experience organizations work despite their management rather than because of it. I have often thought that a good way to destroy a company or business would be for the workers to do exactly what the managers told them to do; however, my union has advised me that the courts would consider this industrial action!

  12. kooshster said,

    September 10, 2008 at 6:17 pm

    While the media have been wrong to run with the “happiness rankings” the way they have done, with no evidence in the study to suggest that these rankings weren’t the result of random variation, this ranking table didn’t come from nowhere. It was irresponsible for such a ranking table to have been constructed in the first place.

    It’s not a new phenomenon that the media try to sensationalise academic studies into scare stories, but you can hardly fault them in this case when such a juicy piece of gossip such as a ranking of British towns and cities by how happy they are is waved right under their noses.

    If the findings were not “statistically significant” then the researcher should never have allowed the ranking table to be written.

  13. Ben Goldacre said,

    September 10, 2008 at 7:21 pm

well, we can see from the press release that this was irresponsibly presented to the media in any case, but i am quite concerned by the suggestion from people here and in my emails that academics should not dare to release some forms of information for fear that it will be misunderstood or overstated by journalists. that seems like a pretty ridiculous gagging order to me, the media arent the only consideration when academics decide what utterances to make to each other, or “in public”, whatever the difference might be between those two these days.

  14. gimpyblog said,

    September 11, 2008 at 7:23 am

    The press release might have been badly presented but the ‘notes for editors’ were clear enough in explaining the implications of no statistical significance. I find it even more damning of the BBC that the editors ignored the ‘notes for editors’.

  15. frontierpsychiatrist said,

    September 13, 2008 at 4:00 pm

What I don’t understand is that this study appears to have planned from the outset only ever to recruit a small number of people from each of the places it wished to study. Does this not mean that, from the outset, it would have struggled to achieve statistical significance? And if you could work this out before even starting, why do it in the first place?

  16. RossAberdeenUK said,

    October 1, 2008 at 2:23 pm

    Watching Deming speak at the end of the video there – his words sound like an indictment of the situation the west is now in commercially, due (I have read) to competition and achievement targets within the stockbroking community.

In psychological terms, believing with certainty something which is patently untrue is termed a psychosis. Does this mean that all of our commercial institutions are in fact mad? Perhaps some Fish Oil supplements will help.

  17. mobiledisco said,

    December 17, 2008 at 12:31 pm

    As sound as your point is, this story was not ‘discovered’ by enterprising journalists, it was handed, indeed promoted to them by people representing the scientists you are so keen to let off the hook. Scientists who knew what was being done in their name (either that or they are so unaware of what goes on around them that they shouldn’t be let out of the house unaccompanied) and who, I bet, did not start and end their conversations with journalists with ‘by the way, this data means nothing of great significance’. The scientist you spoke to said he ‘tried to explain’. As if he was going to fess up to doing anything else! You accept this as gospel truth without question. Did he really do that? Why not apply your usual rigour to this claim too? There will be notebooks, tapes, etc which could prove this, and I bet they would not show him demonstrating nearly as much caution as he claims he did, and which you unquestioningly accept. If the stats are really as weak as they seem to be he could simply have refused to be interviewed, or repeated the bit about lacking statistical significance over and over until the journalists went away, or not actively participated in the promotion of his research in the first place.

I am a huge admirer of your stuff Ben, but you of all people realise that the heart of all rational consideration, scientific or otherwise, is impartiality. Question everything, or your conclusions will be lopsided and will at least appear to hand a pass to your fellow scientists while blaming absolutely everything on the journalists who write it up. Both sides bear responsibility for nonsense like this. It would not harm your case a jot to acknowledge it.
