Ben Goldacre
Saturday September 16, 2006
The Guardian
Regular readers will have established by now that most journalists are so scientifically inept, and so eager to run with “pill solves complex social problem” stories, that companies like Equazen, selling their Eye-Q fish oil tablets for children with blanket media coverage, can come out very nicely indeed.
So here’s the background you might have missed. Firstly, it costs 80p a day for you to feed your child these Eye-Q omega-3 fish oil tablets that Equazen have provided for Durham Council, to give to their GCSE students, in last week’s fish oil “trial” which received such phenomenal widespread media adulation. Meanwhile, Durham Council spend 65p a day on the ingredients for the same children’s school meals. If I was going to get to work on improving children’s diets, Durham, let me tell you, I would not start with pills, nor would I start with the long promotional press releases that you have been sending out with Equazen’s press office contact details on the bottom.
Dave Ford from Durham Council, the man behind these fish oil “trials”, thinks that the evidence on omega-3 for intelligence and behaviour is in, and that this evidence shows it is of benefit. Well now, let’s go through the five published papers shall we: skip to the end for a punchline if you find evidence boring.
Not a single one of these trials is in “normal” mainstream children; three were positive (to a greater or lesser extent, as you will see), two were negative. Voigt et al in 2001 did a double-blind, randomised, placebo-controlled trial on omega-3 fish oil in 63 kids with ADHD: they found no significant differences between the fish oil and control groups (full citations and precis for all studies on badscience.net). Richardson et al in 2002 did a trial on 41 children with learning difficulties, and found improvements in 3 out of the 14 things measured (Conners ADHD score, inattention, and psychosomatic symptoms). Stevens et al in 2003 did a pilot study on 50 children with inattention, hyperactivity, and other disruptive behaviours (one third dropped out during the trial) and found improvements in the fish oil group for 2 out of the 16 things measured (parent-rated conduct problems and teacher-rated attention symptoms).
We’re nearly there. This is important. Hirayama et al had a trial with 40 subjects with ADHD, and in fact, not only was there no improvement for the fish oil group, the placebo group showed a significant improvement in visual short term memory and continuous performance. And lastly Richardson et al, looking at 117 subjects with developmental coordination disorder, found no significant differences between placebo and fish oil groups for motor skills, but improvements in reading and spelling. This last one, incidentally, was the “Oxford-Durham” trial, performed by Oxford academics, published in a peer reviewed academic journal, and it has nothing whatsoever to do with these unpublished methodologically inept “Durham trials” now being performed by Equazen’s friends in Durham County Council.
Why does Dave Ford of Durham Council think the case is proven? Why does Equazen think the case is proven? Why does the entirety of the media think the case is proven?
Because Equazen keep going on about all these trials they’ve done, and they keep getting reported all over the newspapers and telly. “All of our research, both published and unpublished, shows that the Eye Q formula can really help enhance achievement in the classroom,” says Adam Kelliher, director of Equazen. All of it? I tried to get all of these studies (there are 20 now, apparently) so I could read them. “Nullius in verba” is the Royal Society’s motto: “on the word of no-one”. This isn’t childishness. I’m not accusing people of lying. I simply want to read the research, in full, see their methodology, and results, and statistics, in full, critically appraise it, like you do, when you read a piece of academic research. That’s what science is all about. It’s about not taking things on faith or authority.
But I couldn’t read them. These studies are not published, and Equazen told me I would have to sign a confidentiality agreement to see them: a confidentiality agreement, to review the research evidence for these incredibly widely reported claims, in the media and by Durham Council employees, about a very interesting and controversial area of nutrition and behaviour, and about experiments conducted – forgive me if I’m getting sentimental here – on our schoolchildren. Well I suppose I could have signed it just to find out for my own curiosity. You wouldn’t even know if I had.
Academic References
These are the five published trials looking at what happens in children (with various diagnoses) when you give them fish oil supplements. I do not wish to undermine these studies in any sense, but it is worth noting, along with your other readings around them, that in most only a small number of the many variables measured were changed by fish oil, and that the p-values for the variables that were found to be changed were only just below 0.05 — that is, they only just reached statistical significance. If you disagree with any of these brief summaries or have anything to add to them then do please let me know. In general, you will see if you get the original papers that they were methodologically meticulous and reported to a high standard. Top Jadad scores all round.
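One way to see why “a few of many measures significant, each only just below 0.05” warrants caution is to compute the familywise error rate. A minimal sketch (mine, not from any of the papers), assuming the 14 or 16 outcomes were tested independently at alpha = 0.05 with no true effect:

```python
# Familywise error rate: chance of at least one spurious "significant"
# result when many independent outcomes are each tested at alpha = 0.05.
alpha = 0.05

for n_outcomes in (14, 16):
    p_at_least_one = 1 - (1 - alpha) ** n_outcomes
    expected_false_positives = alpha * n_outcomes
    print(f"{n_outcomes} outcomes tested: "
          f"P(at least one spurious hit) = {p_at_least_one:.0%}, "
          f"expected false positives = {expected_false_positives:.2f}")
```

With 14 outcomes, the chance of at least one false positive is already over 50% — which is why “3 out of 14 things measured” at p ≈ 0.05 is weaker evidence than it sounds, unless the analysis corrected for multiple comparisons.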
Voigt, R.G. et al., A randomized, double-blind, placebo-controlled trial of docosahexaenoic acid supplementation in children with attention-deficit/hyperactivity disorder. Journal of Pediatrics, 2001. 139(2): p. 189-96.
Kids with ADHD, found no significant differences in objective or subjective ADHD measures between treatment and control group. 63 subjects, 14.3% dropped out.
Richardson, A.J. et al., A randomized double-blind, placebo-controlled study of the effects of supplementation with highly unsaturated fatty acids on ADHD-related symptoms in children with specific learning disabilities. Progress in Neuro-Psychopharmacology & Biological Psychiatry, 2002. 26(2): p. 233-239.
Kids with LD, improvements in Conners ADHD score, inattention, and psychosomatic symptoms (P = 0.05, 0.03, 0.05 respectively) (3 out of the 14 things measured). 41 subjects, 22% dropped out.
Stevens, L.Z. et al. EFA supplementation in children with inattention, hyperactivity, and other disruptive behaviors. Lipids, 2003. 38(10): p. 1007-21.
Kids with ADHD or other disruptive behaviours, only a pilot study, improvements in fish oil group for parent-rated conduct problems (P = 0.05) and teacher-rated attention symptoms (P = 0.03) (2 out of the 16 things measured). 50 subjects, 34% dropped out.
Hirayama, S. et al., Effect of docosahexaenoic acid-containing food administration on symptoms of attention-deficit/hyperactivity disorder – a placebo-controlled double-blind study. European Journal of Clinical Nutrition, 2004. 58(3): p. 467-73.
Kids with ADHD, no difference between placebo and fish oil group (oh, except the placebo group, rather than the fish oil group, showed a significant improvement in visual short term memory and continuous performance). 40 subjects.
Richardson, A.J. and Montgomery, P., The Oxford-Durham study: a randomized, controlled trial of dietary supplementation with fatty acids in children with developmental coordination disorder. Pediatrics, 2005. 115(5): p. 1360-6.
Kids with Developmental Coordination Disorder, no significant differences between placebo and fish oil groups for motor skills, but improvements for the fish oil group in reading and spelling (P = 0.04 and P < 0.01) and the CTRS-L global scale (P < 0.05), with some subscale improvements (P < 0.05). 117 subjects, 6% dropped out.
Incidentally, for those of you haven’t been back to the blog in the past few days, if you have the stomach for very long posts, you might enjoy this slightly odd episode from the previous comments:
followed by amusing banter, and then stirring rebuttal here:
Dr Aust said,
September 18, 2006 at 4:31 pm
“philosophy of statistics is a mindfuck up there with anything quantum physics can offer”
..but of course many professions require you to have some kind of basic knowledge of stats.
..not just scientists.
Sample dialogue from 1st yr medical school tutorial group:
Tutor: “What is a normal pulse rate?”
Students: “About 70 per minute”
Sadistic tutor: “And how many people in your medical school yr gp would have a resting pulse rate of 70?… or between 65 and 75? ..how would you find out? Could you make an educated guess without measuring them all?”
etc. etc.
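The tutor’s point — that “a normal pulse of 70” is really a distribution — can be made concrete with a back-of-envelope calculation. A sketch under invented assumptions (resting pulse roughly normal, mean 70 bpm, SD 10 bpm — neither figure is from the thread):

```python
import math

def normal_cdf(x, mu, sigma):
    """Normal cumulative distribution function via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Illustrative assumptions: mean 70 bpm, SD 10 bpm, a year group of 200.
mu, sigma = 70.0, 10.0
year_group = 200

frac_65_75 = normal_cdf(75, mu, sigma) - normal_cdf(65, mu, sigma)
print(f"Expected fraction with resting pulse 65-75 bpm: {frac_65_75:.1%}")
print(f"In a year group of {year_group}: roughly {frac_65_75 * year_group:.0f} students")
```

Under those assumptions only about 38% of the year group would fall within 5 bpm of the “normal” value — a neat illustration of why quoting a single number hides most of the picture.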
Ben Goldacre said,
September 18, 2006 at 4:47 pm
that is pretty sadistic. i usually start with “this olympic swimming pool is full of red marbles, but a few green ones here and there. how are you going to..”
RS said,
September 18, 2006 at 4:54 pm
Andrew, just the example of known prior probability, I can’t see many real life circumstances where it is at all relevant. The whole point is we don’t know what the prior probability is (in fact, I’m not even sure it is meaningful to talk about prior probabilities like that). So I think he’s using a bit of Bayesian sleight of hand to reinforce his argument (which is otherwise cogent).
Dr Aust said,
September 18, 2006 at 5:06 pm
It is a bit hard-core, I guess. Other ways to get them to start addressing the idea of distributions in the context of a “standard value” are to get them to all measure their own pulse rate, or to ask them whether, if “70 is normal”, that means the bloke sitting next to them with a pulse of 50 is a freak, or is ill.
BTW, if you think that’s sadistic… I remember once being sat in an Hons viva (for end of 1st yr MBChB) with a very eminent medical social scientist. Said luminary fixed the student with a piercing gaze and then asked “Do I have the average number of arms?”
RS said,
September 18, 2006 at 5:13 pm
Question is what kind of average is he using.
KJ said,
September 18, 2006 at 5:16 pm
Having dropped a pebble, I’ll (gently!) chide Ben for his implication that statistics is beyond him.
You don’t need much (any?) mathematics for the essential ideas, but IMHO those ideas are essential for anyone seriously wanting to understand data. They’re simple but deep(!) and well worth the effort. I view lay people’s refusal to try to grasp ideas important to the domain of discourse as almost as reprehensible as such an attitude in scientists. (Did that sound pompous enough?)
The Cohen article mentioned earlier in the thread is good, I hadn’t seen it before. Personally I echo Jerry Dallal’s Tufts Home Page where (here) he says
Perhaps the finest series of short articles on the use of statistics is the occasional series of Statistics Notes started in 1994 by the British Medical Journal. It should be required reading in any introductory statistics course. The full text of all but the first ten articles is available on the World Wide Web. The articles are listed here chronologically.
Absence of evidence is not evidence of absence is something every investigator should know, but too few do. Along with Interaction 2: compare effect sizes not P values, these articles describe two of the most common fatal mistakes in manuscripts submitted to research journals. The faulty reasoning leading to these errors is so seductive that papers containing these errors sometimes slip through the reviewing process and misinterpretations of data are published as fact.
Those two articles should be understandable by any scientist and IMHO should be required reading!
RS said,
September 18, 2006 at 5:19 pm
KJ, I don’t think I agree that “The presence or absence of significant p-values tells more about sample size than effect size” because in very many areas of biomedicine the sample sizes are restricted to a fairly narrow range, which brings the p-value back to effect size and variance.
Dr Aust said,
September 18, 2006 at 5:27 pm
Exactly right, RS. Really a Q to see if the victim knew any basic stats / could think on his/her feet. Bit of a mind-bender with no advance warning, though. I think the “subject” said something sensible after some prompting.
Ben Goldacre said,
September 18, 2006 at 5:31 pm
kj: straight stats is not that much of a problem, it’s when people start getting into “what does it all mean though, really…” that things get tricky, especially once you get away from the easy stuff like a Student’s t-test.
Filias Cupio said,
September 19, 2006 at 6:01 am
Rosemary Kesick, quoted in post 46:
“I don’t have time for statistics when I see a sick child in front of me.”
My response:
You should be thankful that somebody has had time for statistics, or else you’d be treating that sick child by bleeding her.
Dr Aust said,
September 19, 2006 at 11:00 am
RS wrote:
“…in very many areas of biomedicine the sample sizes are restricted to a fairly narrow range, which brings the p-value back to effect size and variance.”
Agreed – in most laboratory studies n will be between 4 and 10 – so one rarely uses anything other than simple parametric testing w student’s t – or analysis of variance for comparing same parameter between multiple groups. And in some branches of mol biol no stats at all – if you are trying to clone “the gene for x” it either works or it doesn’t.
The complexity of the statistics in proper trials comes as a definite surprise to people encountering them for the first time…. as usually do the very small effects.
RS said,
September 19, 2006 at 11:21 am
4 and 10! I’ll have you know I can reach as high as fifteen!
KJ said,
September 19, 2006 at 1:02 pm
Hi RS. You said
—————
‘I don’t think I agree that “The presence or absence of significant p-values tells more about sample size than effect size” because in very many areas of biomedicine the sample sizes are restricted to a fairly narrow range, which brings the p-value back to effect size and variance.’
————
I agree that IFF I know sample size then the p-value tells something about the relative magnitudes of effect size and variance, but surely that doesn’t gainsay the phrase you quoted?
In the context of this thread I see “in most laboratory studies n will be between 4 and 10”, “63 kids”, “41 children”, “50 children”, “40 subjects”, “117 subjects”. Then we have ‘the trial that ate itself’ with a planned 5000! Admittedly that last trial seems to have more fundamental failings, but, all other things being equal, it would have much more power than the others. Assuming that the others were appropriately sized to detect effects of clinical significance (I can’t judge that) a (reasonably designed!) 5000 size trial would detect clinically insignificant effects with very high statistical significance (low p-values). This is what I meant when I said
“They will show a (statistically) significant effect if they try hard enough, that doesn’t mean it’s big enough to matter! ”
(By the way, albeit not in a biomedical area, I often work with sample sizes of thousands).
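KJ’s point can be checked with arithmetic: hold a trivially small effect fixed and watch the p-value collapse as the sample grows. A sketch using an assumed standardised effect of d = 0.1 (a number invented for illustration, not taken from any of the trials), unit SD, and a two-sided z approximation to the two-sample test:

```python
import math

def two_sided_p_from_z(z):
    """Two-sided p-value for a z statistic, via the error function."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical scenario: two-arm trial, true standardised effect d = 0.1,
# unit SD in both arms; z = d / sqrt(2/n) = d * sqrt(n/2).
d = 0.1
for n_per_arm in (40, 60, 500, 2500):
    z = d / math.sqrt(2 / n_per_arm)
    p = two_sided_p_from_z(z)
    print(f"n = {n_per_arm:5d} per arm: p = {p:.4f}")
```

At the sample sizes of the published fish oil trials (tens per arm) this tiny effect is nowhere near significance, but at 2,500 per arm (a 5,000-subject trial) the same effect sails past p < 0.001 — statistically significant, clinically negligible.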
stan sharred said,
September 19, 2006 at 1:03 pm
I work as a teacher in a hospital setting.
Recently received free samples of snake oil- sorry fish oil.
Do these people really think that we (teachers) are going to start dispensing these thinks?
They were despatched to pharmacy for distruction.
stan sharred said,
September 19, 2006 at 1:03 pm
sorry, destruction
RS said,
September 19, 2006 at 1:22 pm
Sample sizes of thousands, bah, that’s not science, that’s physics, psychology, or (eugh) sociology.
I agree that when you have these sorts of massive disparities in sample sizes, and in medical studies in general, concentration on p-values is misleading. I was just defending the use of p-values in science in general, and I was thinking particularly of less applied fields: the argument that with a big enough sample size a significant difference will eventually be found is purely hypothetical and doesn’t necessarily apply in practice, and in fact, with the generally small sample sizes found in (non-applied) biomedical research, significant p-values are fairly meaningful.
RS said,
September 19, 2006 at 1:26 pm
KJ, so what I am disagreeing with is “A significant p-value doesn’t say the effect is “real” in any meaningful sense, it just means the sample is big enough to demonstrate that the effect is non-zero, which we knew anyway!” But I think we all know what everyone else is talking about here, it’s just what way we choose to spin it to make a particular point.
RS said,
September 19, 2006 at 3:18 pm
Wow, that’s another post I’ve seen appear higher up the list after a big delay (e.g. #64, #56) – do they get lost in the system or is there a set of keywords that initiate manual moderation by Ben?
Ben Goldacre said,
September 19, 2006 at 3:30 pm
“I work as a teacher in a hospital setting. Recently received free samples of snake oil- sorry fish oil.”
dude – what brand were they, and what promotional material came with them?
Ben Goldacre said,
September 19, 2006 at 3:32 pm
“Wow, that’s another post I’ve seen appear higher up the list after a big delay (e.g. #64, #56) – do they get lost in the system or is there a set of keywords that initiate manual moderation by Ben?” some do yeah, from the days before registration, i should get round to switching it off but there are a few spammers around and they can do a lot of damage very quickly. god knows what the triggers youre hitting are.
pv said,
September 19, 2006 at 6:18 pm
Evidence-biased said, amongst other things in the other thread (www.badscience.net/?p=297 )
“September 17, 2006 at 3:04 pm
As leading author on the Oxford-Durham study (Pediatrics 115 (5) 1360-1366), I’ve tried to stand back from this debate – but for the benefit of #98 and #105, I must make clear that:
1) Durham LEA have all the data collected for our trial, so if they wanted to publish any further papers, they’re very free to do so. (By contrast, some questionnaire data was never provided to the Oxford researchers; and the final data on motor skills – a primary outcome measure – only came across in summer 2004). To suggest that my well-known dislike of Equazen’s promotional activities makes it ‘unlikely that other papers will be released’ is thus misleading in the extreme. I do wonder why Durham LEA haven’t got on with this themselves – but presumably they’re too busy. Or maybe it’s because they know we’d insist on checking their sums.
2) Unfortunately, media attention dogged our study from early 2002 when data collection first began. The Oxford researchers repeatedly asked Durham LEA to desist from such activities until this trial was completed, analysed and published – but to no apparent avail. This constant premature leakage of information to the media, and its apparent exploitation for commercial purposes, was the reason why our previously good relations with Durham LEA broke down….”
The last couple of sentences speak volumes as to the (lack of) credibility and integrity of the Durham LEA. If I were a parent with a child at a Durham LEA school I would be demanding an explanation at the very least.
cath having fun said,
September 19, 2006 at 8:58 pm
Durham seem to be obsessed by spin at the expense of science – never one to let facts get in the way of a hyped story – but why this approach, when a more impartial, objective and scrutinised trial would generate more kudos (especially if fish oil were proved useful)?
Hang on – isn’t Sedgefield part of County Durham?
…and isn’t the Rt Hon Tony Blair PM, MP for Sedgefield?
I make a ridiculous hypothesis for the Daily Mail Health Section that it must be something in the drinking water that prevents a rational and objective response to the questions asked…
stever said,
September 19, 2006 at 11:49 pm
fuck. ive battled through all that, and Im now too shattered to contribute.
*bed*
Coobeastie said,
September 20, 2006 at 8:37 am
Ben – you have another fan 🙂
www.fabresearch.org/view_item.aspx?item_id=1019
RS said,
September 20, 2006 at 1:13 pm
Isn’t that the poster Evidence-biased?
stan sharred said,
September 21, 2006 at 3:26 pm
Ben,
Sorry, can’t remember which company it was, since I received them quite a few months ago.
The accompanying literature was of the “Evidence/research shows” and how beneficial they would be.
If I receive anymore, I’ll let you know…
jdc325 said,
September 28, 2006 at 9:18 am
Come on now, you can’t compare fish oil to snake oil – everyone knows that snake oil (aka Echinacea) is a placebo, whereas fish oil has significant evidence behind it… for certain conditions NOT including “may help to maintain concentration” or any similar schoolchildren-related claims. Omega 3 fatty acids are almost certainly beneficial in depression (references available if you wish to review) but the only study I have seen into omega 3 fatty acids and concentration / school performance that I felt I could trust was the Oxford-Durham trial.

Don’t worry – as they say, “it will all come out in the wash”. The Common Position (EC) No. 3/2006 will lead to an EU regulation (instantly enforced, unlike directives that must be enshrined in member state law first) that will require nutrition and health claims made on foods (including ALL supplements) to be backed up by generally accepted scientific evidence – or the claims to be removed. Let’s see how many claims are removed – I would ‘guesstimate’ that the majority will have to go, as the standards are quite stringent and claims such as “may help to maintain concentration” etc… etc… will be completely barred.

I don’t believe that anyone should dismiss use of fish oil / a diet rich in fish as being of no benefit due to their own prejudices, as that science is just as bad as the ‘science’ promulgated by certain unethical souls (or businesspeople as they prefer to be known).
Regards.