Funnel vision

July 7th, 2005 by Ben Goldacre in bad science, herbal remedies, penises, references, statistics | 5 Comments »

Ben Goldacre
Thursday July 7, 2005
The Guardian

· There I was two weeks ago, making sarcastic jokes about how Bad Science was just a cover for the [coughs] popular statistics lecture series I secretly yearned to give, and now I’m about to try to explain funnel plots to you, in a national newspaper, and without any diagrams. Here we go. Publication bias makes the published scientific evidence on something look skewed. Let’s say lots of people do studies on something. Let’s say … we have a theory that rubbing semen into your face prevents skin cancer. Just go with it.

· So lots of groups run studies where people cheerfully rub semen into their faces, and then they measure the rates of skin cancer years later. The studies where they find that semen protects you get written up and published. Plenty more studies find that semen has no effect: but the researchers can’t be bothered to write up these unintriguing, negative findings, or journals can’t be bothered to publish them, so nobody hears about them. If all the studies were published, the evidence would show that semen has no effect, but because of publication bias, it looks as if it works. Everyone starts rubbing semen on their face, and for none of the usual reasons.

· You can get a hint of whether there is publication bias in the evidence on something by doing a funnel plot. I’m not really going to explain funnel plots in full here, although they are jolly interesting and well worth reading up on. The basic principle is this: all other things being equal, trials with a smaller error in their result, perhaps the larger ones, will tend to cluster more tightly around the true answer of how useful something is. The less accurate studies, perhaps the smaller ones, should scatter about this point randomly, some overestimating efficacy, some underestimating it. On a funnel plot, you can see whether there seem to be more positive trials than you might expect: if there are, then you have a non-random distribution, and publication bias might be the explanation.
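The idea behind that paragraph can be sketched in a few lines of Python (my own illustration, not from the column, and the numbers are arbitrary assumptions): simulate many studies of a treatment with no real effect, then "publish" only the ones with a statistically significant positive result, and compare the average published effect with the average over all studies. The published subset looks impressively positive even though the truth is zero, which is exactly the asymmetry a funnel plot of estimate against precision would reveal.

```python
import random
import statistics

random.seed(42)

def run_study(n, true_effect=0.0):
    """Simulate one study of n participants.

    The standard error of the estimate shrinks as 1/sqrt(n),
    so big studies cluster near the true effect and small
    studies scatter widely around it (the "funnel" shape).
    """
    se = 1 / n ** 0.5
    estimate = random.gauss(true_effect, se)
    return estimate, se

# 2,000 studies of varying size, all of a treatment that does nothing
studies = [run_study(n=random.randint(10, 400)) for _ in range(2000)]

# Publication bias: only "significant" positive results (z > 1.96)
# get written up and published
published = [(est, se) for est, se in studies if est / se > 1.96]

all_mean = statistics.mean(est for est, _ in studies)
pub_mean = statistics.mean(est for est, _ in published)

# On a funnel plot you would put estimate on the x-axis and
# precision (1/se) on the y-axis: the full set of studies forms a
# symmetric funnel around zero, while the published subset is
# shifted to the right and lopsided at the bottom (small studies).
print(f"mean effect, all studies:       {all_mean:+.3f}")
print(f"mean effect, published studies: {pub_mean:+.3f}")
```

The full set of studies averages out to roughly zero, while the published subset shows a clearly positive average effect, despite being drawn from exactly the same underlying (ineffective) treatment.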

· There are circumstances in which you don’t need to do a funnel plot. A study in 1998 found that there had never been a single trial published in China that found a treatment to be ineffective. Perhaps that’s another “tradition” in traditional Chinese medicine. And the BMJ this month published a paper called A Systematic Review of Publication Bias in Studies on Publication Bias, with a right lovely funnel plot. And that’s what I call a punchline.

If you like what I do, and you want me to do more, you can: buy my books Bad Science and Bad Pharma, give them to your friends, put them on your reading list, employ me to do a talk, or tweet this article to your friends. Thanks!

5 Responses

  1. Mike Canty said,

    January 31, 2006 at 7:18 pm

    Eagerly awaiting your article ripping the evidence of the professionals in Prof Kathy Sykes programme on acupuncture apart.

  2. John Shimmin said,

    March 9, 2006 at 4:07 pm

    References please? I’d like to take a look at those articles.

  3. Blog Atopowy » Blog Archive » Koniec homeopatii? said,

    November 22, 2007 at 10:25 pm

    […] can we have a mass of positive, though untrue, results? Because there is such a thing as “publication bias”. In all fields of science, positive results get a far greater […]
