Research is all about error. Either learn how to interpret data yourself, or trust those who can do it for you
Wednesday November 2, 2005
Whatever you have been told, science is not about certainty. And this creates problems for those health professionals who are charged with interpreting and relating data to the general public. We are expected to refute wholesale misunderstandings, in a popular forum, to people who may well be intelligent but who know nothing of evidence-based medicine, in soundbite format.
Health scares are like toothpaste: they’re easy to squeeze out, but very difficult to get back in the tube. On Monday, for example, Melanie Phillips of the Daily Mail wrote yet another attack on the MMR vaccine. She suggested that the journalists who trusted the new Cochrane review, which shows that MMR is probably safe and not linked to autism, were lazy stooges who took the press release at face value.
The problem is that Phillips seems to misunderstand basic epidemiology. She cites “research data” of highly dubious status and misrepresents what data there is. Her response is a microcosm of the problems that can arise when journalists engage with science.
The Cochrane Collaboration is an independent, international non-profit organisation that produces systematic reviews of the literature, written by and for scientists who understand critical appraisal. It would be a happier world if journalists who write about health issues were also au fait with the intricacies of evidence-based medicine, and were trained to read academic papers. But because the majority of them can’t, the report is converted into a more easily digested press release for journalists.
This creates its own problems. Science is all about the error bar, a graphic representation of the uncertainties in the data. I look forward to the day when every politician’s speech has an error bar next to it, fluctuating in response to the margin of certainty around their claims.
This is how science works: you do a piece of research, you get some results and you write them up, in full, so that everyone can see what you’ve done, replicate it and critique it. You say: here are our results, they are this statistically accurate (error bar), and on the basis of this finding, and many other findings from other labs (which are clearly referenced) we come to the following verdict on a particular hypothesis.
The process is transparent, but it relies on people being able to critically appraise a paper, pull apart its methodology and understand the statistics in order to come to their own verdict. It follows that all scientific papers are flawed, to a greater or lesser extent, and each in its own unique way, because of the different ways of working around the practical problems of doing research: imperfect follow-up, imperfect measurements and so on. Scientists all know that.
A Cochrane review is a systematic review of the literature, which critiques each paper it encounters in exactly this manner. So the press release for this particular Cochrane review said, overall, that this overview of the evidence suggests that MMR is pretty safe.
But Phillips has read the full report, and is outraged by what she thinks she’s found: “It said that no fewer than nine of the most celebrated studies that have been used against [Andrew Wakefield, who first claimed in 1998 that there was a link between the vaccine and autism] were unreliable in the way they were constructed”. Of course it did. Cochrane reviews are intended to criticise papers.
She seems surprised that the report doesn’t walk the reader through the reasons why Wakefield’s speculative 12-subject case-series report is less worrying when considered alongside the many large epidemiological surveys of up to half a million people that the Cochrane review focuses on. But the reasons are clear to anyone who understands biomedical research.
Phillips presents material as experimental scientific research when it has never been published in a peer-reviewed academic journal and is not indexed in PubMed, the standard search tool for finding medical academic papers.
One piece of scientific research she refers to has been published only in the in-house magazine of a rightwing US pressure group well known for polemics on homosexuality, abortion and vaccines. She says “autistic enterocolitis” is a “disease” and a “new syndrome” that has been “replicated in studies around the world”. As a rough guide, the phrase “autistic enterocolitis” appears in only five PubMed-indexed academic papers, one by Wakefield and four others mostly doubting its status.
What’s more, she seems to misunderstand basic epidemiology. Large population surveys have greater power to detect a small increase in a given condition. Apparently not for Phillips: “Only a very small proportion are said to have been badly affected …” she says. “But population-wide studies are considered too large and insensitive to pick up small numbers such as this.”
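The point about statistical power can be sketched numerically. This is a minimal illustration, not anything from the Cochrane review itself: all the rates and sample sizes below are invented for the example, using a textbook normal approximation for comparing two proportions.

```python
# Illustrative sketch: bigger surveys have MORE power to detect small
# increases in a rare condition, not less. All numbers are assumptions
# chosen for illustration, not figures from the article or the review.
import math

def phi(x):
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_proportions(p0, p1, n, z_alpha=1.96):
    """Approximate power of a two-sample z-test to detect a rise from
    baseline rate p0 to rate p1, with n subjects in each group."""
    se = math.sqrt(p0 * (1 - p0) / n + p1 * (1 - p1) / n)
    return phi(abs(p1 - p0) / se - z_alpha)

# Hypothetical: a 20% relative increase on a 1-in-1,000 baseline rate.
p0, p1 = 0.001, 0.0012
print(power_two_proportions(p0, p1, n=2_000))    # small study: power ~4%
print(power_two_proportions(p0, p1, n=500_000))  # large survey: power ~85%
```

On these made-up numbers, a small study would almost certainly miss the effect, while a survey of half a million people would usually catch it — the opposite of the claim that large studies are “too large and insensitive” to detect small numbers.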
It is self-evident to anybody who understands biomedical research, and if you don’t get it then you have only two choices: you can either learn to interpret data yourself and come to your own informed conclusions; or you decide who to trust. Choose wisely.
· Ben Goldacre is a doctor and writes the Bad Science column in the Guardian firstname.lastname@example.org
Comments below. Personal anecdotes will be deleted, as people get easily upset by the subsequent responses; childish personal attacks on anyone will be deleted unless they are funny. If your post is more than one thousand words long you are officially a loser.