Ben Goldacre, The Guardian, Saturday 5 June 2010
“Fish oil helps schoolchildren to concentrate” was the headline in the Observer. Regular readers will remember the omega-3 fish oil pill issue, as the entire British news media has been claiming for several years now that there are trials showing it improves school performance and behaviour in mainstream children, despite the fact that no such trial has ever been published. There is something very attractive about the idea that solutions to complex problems in education can be found in a pill.
So have things changed? The Observer’s health correspondent, Denis Campbell, is on the case, and it certainly sounds like they have. “Boys aged eight to 11 who were given doses once or twice a day of docosahexaenoic acid, an essential fatty acid known as DHA, showed big improvements in their performance during tasks involving attention.” Great. “The researchers gave 33 US schoolboys 400mg or 1,200mg doses of DHA or a placebo every day for eight weeks. Those who had received the high doses did much better in mental tasks involving mathematical challenges.” Brilliant news.
Is it true? After some effort, I have tracked down the academic paper. The first thing to note is that this study was not a trial of whether fish oil pills improve children’s performance; it was a brain imaging study. They took 33 kids, divided them into three groups (of 10, 10 and 13 children) and then gave them either: no omega-3, a small dose, or a big dose. Then the children performed some attention tasks in a brain scanner, to see if bits of their brains lit up differently.
Why am I saying “omega-3”? Because it wasn’t a study of fish oil, as the Observer says, but of omega-3 fatty acids derived from algae. Small print.
If this had been a trial to detect whether omega-3 improves performance, it would be laughably small: a dozen children in each group. While small studies aren’t entirely useless – contrary to what amateurs often claim – you do have a very small number of observations to work from, so your study is much more prone to error from the simple play of chance. A study with 11 children in each arm could conceivably detect an effect, but only if the fish oil caused a gigantic and unambiguous improvement in all the children who got it, while none of those on placebo improved.
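To put a rough number on that, a quick power calculation is instructive. The sketch below is only illustrative: it assumes a straightforward two-sample t-test comparison at the conventional 5% significance level and 80% power target, figures chosen for illustration rather than taken from the paper.

# Rough power calculation: how big an effect could a comparison with
# ~11 children per arm reliably detect? (Sketch only: the t-test
# framing, alpha = 0.05 and 80% power are illustrative assumptions.)
from statsmodels.stats.power import TTestIndPower

min_detectable_d = TTestIndPower().solve_power(
    effect_size=None,  # solve for the minimum detectable effect size
    nobs1=11,          # children in the first arm
    ratio=1.0,         # equally sized comparison arm
    alpha=0.05,
    power=0.8,
)
print(f"Minimum detectable effect (Cohen's d): {min_detectable_d:.2f}")
# Comes out at roughly d = 1.25: the groups would need to differ by well
# over a full standard deviation before a study this size could be
# expected to detect the difference.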
This paper showed no difference in performance at all. Since it was a brain imaging study, not a trial, they only report the results of children’s actual performance on the attention task in passing, in a single paragraph, but they are clear: “there were no significant group differences in percentage correct, commission errors, discriminability, or reaction time”.
So this is all looking pretty wrong. Are we even talking about the same academic paper? I’ve a long-standing campaign to get mainstream media to link to original academic papers when they write about them, at least online, with some limited success on the BBC website. I asked Denis Campbell which academic paper he was referring to, but he declined to answer, and passed me on to Stephen Pritchard, the readers’ editor for the Observer, who answered a couple of days later to say he didn’t understand why he was being involved. Eventually Denis confirmed, but through Stephen Pritchard, that it was indeed this paper (qurl.com/denis) from the April edition of the American Journal of Clinical Nutrition.
If we are very generous, is it informative, in any sense, that a brain area lights up differently in a scanner after some pills? Intellectually, it may be. But doctors get very accustomed to drug company sales reps and enthusiastic researchers who approach them with an exciting theoretical reason why one treatment should be better than another (or better than life as usual without the miracle treatment): maybe their intervention works selectively on only one kind of receptor molecule, for example, so it should therefore have fewer side effects. Similarly, drug reps and researchers will often announce that their intervention has some kind of effect on some kind of elaborate measure of some kind of surrogate outcome: maybe a molecule in the blood goes up in concentration, or down, in a way that suggests the intervention might be effective.
This is all very well. But it’s not the same as showing that something really does actually work, back here in the real world, and medicine is overflowing with unfulfilled promises from this kind of early theoretical research. It’s not even in the same ballpark as showing that something works.
And oddly enough, someone has finally now conducted a proper trial of fish oil pills in mainstream children, to see if they work: a well-conducted, randomised, double-blind, placebo-controlled trial, in 450 children aged 8–10 from a mainstream school population. It was published in full this year (qurl.com/fish), and they found no improvement. Show me the news headlines about that paper.
Meanwhile Euromonitor estimates global sales of fish oil pills at $2bn, having doubled in five years, with sales projected to reach $2.5bn by 2012, and they are now the single best-selling product in the UK food supplement market. This has only been possible with the kind assistance of the British media, and their eagerness for stories about the magic intelligence pill.
Stuff:
You might also enjoy this takedown of the Observer piece by Dorothy Bishop, professor of neuropsychology in Oxford, and this at HolfordWatch. I should say, it makes me admire the Guardian even more that they publish a column like this about one of our own news articles. Although I do think the headline they used (“Omega-3 lesson, not so much brain boost as fishy research”) is wrong, as it wasn’t the research that was problematic, it was the reporting of it.
pv said,
June 5, 2010 at 9:54 am
It’s Denis Campbell again!
What’s his connection to the fish oil industry? Are they paying him?
Just asking.
SimonW said,
June 5, 2010 at 9:59 am
“This has only been possible with the kind assistance of the British media, and their eagerness for stories about the magic intelligence pill.”
False dichotomy: that fish oil pills don’t make children clever doesn’t mean that taking fish oil isn’t good for people (or at least for those taking the supplements). You’d need different evidence to make that case.
I take a fish oil supplement; it was the cheapest form of vitamin D supplementation available (excluding sunshine), at a penny a day for 125% of the RDA of vitamin D3.
thomaswp said,
June 5, 2010 at 10:20 am
I guess I am naive in thinking “Fish oil pills proved bogus in intelligence increasing study” would actually be a good headline?
colinlawson said,
June 5, 2010 at 11:01 am
Having read the Observer article I see that it includes quotes from Dr McNamara (one of the paper’s authors) which encourage the impression given by the article. Surely Dr McNamara is more at fault in this instance (assuming he was quoted correctly; no source given)?
olivertownshend said,
June 5, 2010 at 12:01 pm
So this article in the Economist won’t impress you? It credits fish oil with the evolutionary growth of Man’s brain. www.economist.com/sciencetechnology/displayStory.cfm?story_id=16214142
ewan said,
June 5, 2010 at 12:01 pm
It’s all very well publishing the Bad Science column in the Guardian, but can we expect a proper retraction and correction on the website and in the next print edition of the Observer? And if not, why not?
HungryHobo said,
June 5, 2010 at 12:01 pm
The obsession with pills as easy answers seems a bit odd when people are so resistant to non-pill easy improvements.
An interesting evidence-based trial in the US:
easy to read news article:
www.time.com/time/nation/article/0,8599,1978589-1,00.html
paper:
www.edlabs.harvard.edu/pdf/studentincentives.pdf
people were sending death threats to the guy running that one.
fontwell said,
June 5, 2010 at 1:49 pm
@thomaswp said,
June 5, 2010 at 10:20 am
“I guess I am naive in thinking “Fish oil pills proved bogus in intelligence increasing study” would actually be a good headline?”
Amazingly, it seems you are. You, like me and other Bad Science readers, are probably interested in gaining knowledge, information and understanding about the world. However, this is a minority activity.
Most people read newspapers to have someone tell them that things they already believe in are actually true. Hence our papers differentiate themselves by socio-political viewpoint rather than by quality of research and writing.
As Ben has pointed out many times, when it comes to medicine and social problems what people want to hear is how a simple pill can fix everything. They don’t want to hear that a study was flawed or even non-existent, because that’s dull and possibly complicated. The only type of medical mistake people want to know about is a patient dying from some terrible mix-up, ideally due to a foreign doctor whom the NHS is paying huge sums of money.
JonDurham said,
June 5, 2010 at 1:57 pm
Thanks for this. The Observer article drove me nuts [insert joke here]. I’m surprised we don’t have an education policy based entirely on food ‘supplements’ … maybe I shouldn’t have said that …
pv said,
June 5, 2010 at 4:04 pm
@SimonW said,
Where does it say that fish oil isn’t good for anyone? The specific claim in the Observer article is that fish oil pills improve concentration in school kids. Even more specifically it is Denis Campbell’s claim.
1. The evidence doesn’t support the claim.
2. The paper to which Campbell refers isn’t about fish oil.
3. Denis Campbell is scientifically illiterate – which I suppose qualifies him greatly to pontificate in the press about medical matters.
4. Your comments are irrelevant. Perhaps a fish oil deficiency there?
5. Do you know the meaning of “dichotomy”?
pv said,
June 5, 2010 at 4:06 pm
I think Denis Campbell should come clean and explain his fascination with fish oil.
osipacmeist said,
June 5, 2010 at 5:13 pm
In 2007 Denis Campbell was on the front page of the Observer about MMR and I wrote this to them:
It’s been a long time and started as a stimulating weekly intellectual treat when I was in my teens. But the front cover story on MMR is so low, you may as well paint your top red. The information is partial, and where it is not, it is simply wrong; the opinion disingenuous (at best), the research shoddy. I cannot believe the Observer has come to this. You should be ashamed. I will not be reading you again for a while. I can only hope you get better.
They obviously have not got better, and no, I’m not impressed that they let BG correct an obviously silly article if they don’t then take a more intelligent view of the subject in future. Repeatedly publishing such low-grade rubbish is the issue….
They seem to have removed the article from the website, but there is a weak “readers editor” spiel at:
www.guardian.co.uk/commentisfree/2007/jul/15/comment.society
emmer said,
June 5, 2010 at 7:21 pm
I’m not very impressed with the title that The Guardian have given to Ben’s article today. Surely ‘fishy journalism’ rather than ‘fishy research’?
mary.atherton said,
June 5, 2010 at 9:08 pm
Brilliant as ever. Thank you. When are you going to do a ted talk?
reprehensible said,
June 6, 2010 at 1:44 am
As you might have guessed from the user name and previous posts, I was in fact a drugs rep at one point. I can’t deny it: marketing used to commission loads of pointless papers to try and get one up on the competition, especially regarding which molecule bound to what and how strongly.
I’ve linked the funniest paper I ever had to discuss. I promise it will make you laugh when you see a diagram of the machine custom-built to get a reliable control (and maybe cringe).
www.scribd.com/doc/32587138/stensballe
I also asked the company’s clinical nurse specialists in a big meeting what the different outcomes would really mean to a patient, since a nurse might ask. When she finally got the point that I didn’t care about the numbers themselves, just the pain they would translate to, my idiot national sales manager started glaring at me to shut up and stop being clever, muttering something about nurses never asking stuff like that. Anyway, I thought he was being a prick, so I bet I’d personally do some comparative analysis if he would. He wouldn’t take me up on it though, wimp.
(seriously click the link it will make you laugh, and yeah, didn’t last at that company long)
Unfortunately I might have to return to sales for a bit till I can get on a decent PhD. Ben, I promise I won’t even bother with the papers if I do; to be honest it would probably mean I’d get better results!
helenphilsci said,
June 6, 2010 at 12:21 pm
I’m with your argument until the final para, which seems to me to be of the ‘correlation implies causation’ type and so illegitimate. The increase in fish oil capsule consumption may correlate with the increase in publicity about supposed links to intelligence, but it also correlates with increased publicity about fish oil being beneficial for other reasons, in particular for some rheumatoid arthritis sufferers. (Irrelevant to the argument, but it’s got me moving again!) One or two of your contributors need to understand that dietary supplements in pill/capsule form are very necessary for those of us with compromised digestive systems.
Dudeistan said,
June 6, 2010 at 7:42 pm
Oh dear! It seems that I am one of those amateurs who overplay the weakness of small trials, as Ben alludes to in his article above.
I recently criticised a paper (see reference below) on acupuncture for depression in pregnancy. There were only 50 in each group. It seemed to me that the sample sizes were too small given that there was not a huge difference between intervention and controls.
Here are the stats breakdown:
Women who received acupuncture specific for depression experienced a greater rate of decrease in symptom severity (P<.05) compared with the combined controls (Cohen's d=0.39, 95% confidence interval [CI] 0.01–0.77) or control acupuncture alone (P<.05; Cohen's d=0.46, 95% CI 0.01–0.92). They also had significantly greater response rate (63.0%) than the combined controls (44.3%; P<.05; number needed to treat, 5.3; 95% CI 2.8–75.0) and control acupuncture alone (37.5%; P<.05: number needed to treat, 3.9; 95% CI 2.2–19.8).
I wonder whether any statisticians would argue that these results suggest that the sample sizes were adequate?
I know the point of Ben's article is that the Observer quoted a basic science study for a sensationalised news story that was not directly addressing the efficacy of fish oil/omega-3 for underachieving kids. But that aside, looking at the study itself, the sample sizes seem very small regardless of what the study was hoping to explore.
journals.lww.com/greenjournal/Abstract/2010/03000/Acupuncture_for_Depression_During_Pregnancy__A.7.aspx
thewinelake said,
June 6, 2010 at 10:31 pm
Can anyone tell me what Robert Winston’s current take on fish oils for kids is?
Dr DLD said,
June 7, 2010 at 12:16 pm
Hi Dudeistan
I’m a reasonable statistician, and I would argue that if a significant result was obtained, then by nature the study was adequately powered. Small sample sizes only really matter if no sig result was found (as by definition, power is just the probability that you will find a sig effect if one exists). In behavioural research, for some factors, you can obtain very large effect sizes on very small (n <10) sample sizes.
Hope that helps. Andy Field's (2009) book 'Discovering Statistics' is a very good start on the area.
tomrees said,
June 7, 2010 at 12:43 pm
@Dr DLD: 1 in 20 studies will give you a statistically significant difference between the samples, even if there is no effect – no matter what the sample size.
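tomrees’s point is easy to check with a quick simulation: draw two groups from the same distribution (so there is genuinely no effect), run a t-test at p < 0.05 many times over, and vary the sample size. A minimal sketch in Python, with group sizes and repetition counts chosen purely for illustration:

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_trials = 2000

for n in (11, 50, 500, 5000):
    false_positives = 0
    for _ in range(n_trials):
        a = rng.normal(size=n)  # both groups drawn from the same
        b = rng.normal(size=n)  # distribution: there is no real effect
        _, p = ttest_ind(a, b)
        false_positives += p < 0.05
    print(f"n = {n:>4} per group: {false_positives / n_trials:.1%} 'significant'")

# Each sample size comes out at roughly 5% "significant" results,
# even though no effect exists.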
Dudeistan said,
June 7, 2010 at 1:52 pm
@Dr DLD
Thanks for that explanation. Very helpful. I shall look into the book.
@tomrees
Is this referenced anywhere?
Guy said,
June 7, 2010 at 3:06 pm
Dudeistan, it can’t possibly be referenced because it is simply the meaning of a p=0.05 (ie 1 in 20) probability. That’s why trials, to be useful, need to be replicated by different workers for the findings to be really secure.
Dr DLD said,
June 7, 2010 at 3:27 pm
Hi
Of course, if you set your p value at .05 or .01 you have a 1 in 20 or 1 in 100 (respectively) chance of obtaining a significant result by chance; that is what the alpha is – a probability statement. However, that is unrelated to ‘power’. This is one of the reasons why replication is very important – rather than sample size per se. Sorry to have moved away from the original article (which was great). On a separate note – also great to see Ben likes Momus (I noticed from his tweet yesterday) – one of Britain’s most interesting and unsung lyrical geniuses.
Dudeistan said,
June 7, 2010 at 4:51 pm
@Dr DLD
It was me that moved the topic away a little, not you.
@ Guy
Well your explanation pretty much seals my fate as an amateur! Thanks 🙂
Michael Grayer said,
June 7, 2010 at 4:53 pm
@tomrees and @Guy:
Further to what Dr DLD said (and I hope he’ll correct me if I’m wrong):
The 1-in-20 figure would be true if and only if all trials that yielded a “statistically significant” result yielded a p-value exactly equal to 0.05. However, 0.05 is the conventional cutoff point, not the result of the trial itself.
In other words, rather than “1-in-20 studies will give you a statistically significant difference between the samples, even if there is no effect”, it’s “1-in-at-least-20-and-probably-more-depending-on-the-p-value-you-actually-got studies will give you a statistically significant difference between the samples, even if there is no effect”.
(of course, this assumes your statistical techniques were valid in the first place, and you then have to look at bias and confounding, but those are separate issues)
Ian Preston said,
June 7, 2010 at 6:43 pm
“Show me the news headlines about that paper.”
Aren’t these some? Perhaps things aren’t totally black.
Telegraph: Fish oil capsules ‘don’t boost brain power in children’
Daily Mail: Fallacy of fish oil revealed as study finds supplements DON’T boost children’s brain power
BBC News: ‘More omega-3 research’ needed, says Prof Amanda Kirby
Jon d said,
June 8, 2010 at 9:43 am
“… as the entire British news media has been claiming for several years now that there are trials showing it improves school performance and behaviour in mainstream children”
Really? All of it? Or is a bit of less than totally verifiable journalistic hyperbole sneaking into this piece which is errm criticising journalists for being inaccurate?
MedsVsTherapy said,
June 10, 2010 at 4:46 pm
prefrontal.org/files/posters/Bennett-Salmon-2009.pdf
Ben, I believe you have already covered the study showing the lit-up brain parts of the dead salmon run through the MRI just for grins. Here is a direct link to the pdf.
The lead author, Bennett, has more sober MRI/false-positive work – but still awesome stuff – at his website…
prefrontal.org/blog/
Guy said,
June 11, 2010 at 9:31 am
Meds, thanks for linking that. It is both brilliant and extremely funny. I will never eat salmon again without thinking of this paper!
thom said,
June 11, 2010 at 9:38 pm
@Dr DLD “I’m a reasonable statistician, and I would argue that if a significant result was obtained, then by nature the study was adequately powered. In behavioural research, for some factors, you can obtain very large effect sizes on very small (n <10) sample sizes."
Yes and no. A small study will usually only show a significant effect if the observed effect size is large. So you are right about the statistical power to _detect_ an effect. But, for the same reason, small studies that reach statistical significance tend to overestimate the true effect size (even if you discount important factors such as publication bias). An alternative conception of power is to think of the confidence interval around the observed effect. Typically it takes large studies (or meta-analyses of small studies) to get a narrow, precise confidence interval. So even having detected an effect, you can't be too sure that it has any practical importance if the study is small.
This is neatly illustrated by the study that Dudeistan cites where the 95% CI for Cohen's d runs from 0.01–0.92. Cohen's d of .01 is probably equivalent to zero for practical purposes (being 1/100th of a standard deviation difference between groups). The study does not have sufficient power to estimate the size of effect precisely – so yes it is underpowered if the goal is to determine whether the treatment has any practical value.
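thom’s point about small significant studies overestimating the true effect can also be shown with a quick simulation. The sketch below assumes a modest true effect of d = 0.3 – an arbitrary figure for illustration, not taken from either study – and compares the effect sizes reported by small and large studies that happen to reach p < 0.05.

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
true_d = 0.3  # assumed true effect, chosen purely for illustration

for n in (25, 400):
    significant_ds = []
    for _ in range(5000):
        a = rng.normal(loc=0.0, size=n)
        b = rng.normal(loc=true_d, size=n)
        _, p = ttest_ind(a, b)
        if p < 0.05:
            # observed effect size: Cohen's d with pooled standard deviation
            pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
            significant_ds.append((b.mean() - a.mean()) / pooled_sd)
    print(f"n = {n:>3} per group: mean observed d among significant "
          f"results = {np.mean(significant_ds):.2f} (true d = {true_d})")

# The small studies that cross the significance threshold report an
# average d well above 0.3; the large studies estimate it accurately.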
csrster said,
June 13, 2010 at 3:20 pm
And from today’s Observer (www.guardian.co.uk/theobserver/2010/jun/13/for-the-record)
“Fish oil helps schoolchildren concentrate” (News), claimed that research published in the American Journal of Clinical Nutrition showed omega-3 fish oil could combat hyperactivity and attention deficit disorder.
We wish to make clear that the study did not include ADHD. Additionally, algal DHA, not fish oil, was used in the study and the simple attention task employed during MRI scans did not specifically identify improvements in attention. We have removed the story from our website and apologise for these errors.
harryr3 said,
June 15, 2010 at 10:58 pm
BBC Radio 4’s “You and Yours” this morning (Mon 15th June) had a section about genetically modified foods. Someone called in and talked about some crop being GM’d to make fish oil (which actually comes from algae). Can’t recall the exact details as I was working at the time.
If the fish oil issue is not true, despite the general media claiming it to be so, why has it not been challenged – until now – by the ‘responsible press’?
Dan Kimberg said,
June 17, 2010 at 2:41 pm
It’s worth noting that the authors of this study did nothing to rule out the possibility of a treatment effect on cortical blood flow. This is a common issue with drug fMRI studies: the most trivial explanation for the observed effect is simply that the drug caused an increase in cortical blood flow (which would quite naturally tend to turn up as significant in one region and not others).
Thomas said,
January 23, 2015 at 1:33 pm
Sometimes your desire to be negative far outweighs your desire to reduce confusion or misconceptions. Why the bee in your bonnet? There is a lot of good science regarding the benefits of fish oil supplementation, but one wouldn’t know it by reading this snipey article.
Thomas said,
January 23, 2015 at 1:35 pm
… In denying the specific claim, you leave a negative impression of fish oil supplementation, making its supporters seem more like snake oil salesmen, which we both know they generally are not.
DHA and EPA are marvellous tools we can use to positively influence many pathological conditions, and even to change outcomes.