Ben Goldacre, 19 September 2009, The Guardian
This week, at a debate at the Royal Institution, I was told off by the science minister for not praising good science reporting, because journalists – famously kind to their targets – are sensitive to criticism. So before we dismantle this Home Office report on drugs policy, can I just say I’m sure they’ve probably produced some other perfectly acceptable reports, and I shouldn’t like any brittle souls in government to be dispirited by the criticisms that will now follow.
The Blueprint programme is an intensive schools intervention to reduce problematic drug use, and a lengthy research project to see if it works – costing at least £6m – finished some years ago. We have been waiting for the results ever since, and this quote from Vernon Coaker, then Minister for Drugs & Crime Reduction, explains what we have been waiting for: “The Blueprint drugs education programme is designed to evaluate the effectiveness of a multi-component approach to school-based drug education… The programme is currently being evaluated to determine its impact on all drug use.”
This is odd, because the minister said that in October 2006, yet as early as 2002, before the study even began, the government had been told that the research was incapable – by design – of telling us anything useful about the effectiveness of the Blueprint intervention. The report is now out, and its own executive summary admits that they always knew it could give no such information.
They explain that they started off planning a big randomised trial, but were told they’d need 50 schools to get a sufficiently large sample, and they could only afford 23. So they decided to go ahead with 23 schools anyway, as some kind of gigantic pilot study, to gather information about whether a proper trial of Blueprint was feasible. This, already, is a bizarre explanation: a pilot study would not need £6m, or 23 schools, and to then do the job properly they would wind up paying for 73 schools to be studied in total, instead of 50. Experts in trial design, such as Professor Sheila Bird of Cambridge University, also offered to help them run a meaningful trial on the available budget. This did not happen.
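To get a feel for where a figure like “50 schools” comes from, here is a minimal sketch of the standard sample-size arithmetic for a cluster-randomised trial, where the school, not the pupil, is the unit randomised. Every number in it – the drug-use rates, the pupils per school, the intra-school correlation – is an invented illustration, not a figure from the Blueprint report.

    from math import ceil
    from statistics import NormalDist

    def schools_needed(p_control, p_intervention, pupils_per_school,
                       icc, alpha=0.05, power=0.8):
        """Schools per arm for a cluster-randomised comparison of two
        proportions, using the usual design-effect inflation."""
        z_a = NormalDist().inv_cdf(1 - alpha / 2)
        z_b = NormalDist().inv_cdf(power)
        p_bar = (p_control + p_intervention) / 2
        # sample size per arm if pupils were randomised individually
        n_individual = ((z_a + z_b) ** 2 * 2 * p_bar * (1 - p_bar)
                        / (p_control - p_intervention) ** 2)
        # pupils in the same school resemble each other, so inflate by the
        # design effect: 1 + (cluster size - 1) * intra-cluster correlation
        deff = 1 + (pupils_per_school - 1) * icc
        return ceil(n_individual * deff / pupils_per_school)

    # illustrative guesses: hope to cut drug use from 15% to 10%, measure
    # about 150 pupils per school, intra-school correlation of 0.03
    per_arm = schools_needed(0.15, 0.10, 150, 0.03)
    print(per_arm, "schools per arm,", 2 * per_arm, "schools in total")

Under these made-up assumptions the answer comes out at 26 schools per arm, 52 in total. The point is the design effect: because children in the same school resemble each other, each extra pupil in a school you already have adds far less information than a pupil in a new school, which is why the number of schools, not the number of children, dominates the calculation.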
Then it gets even stranger. They have data from 6 normal “comparison” schools which didn’t receive the Blueprint intervention, but these schools weren’t randomised or properly matched, so you can’t use them to make any comparisons. So you bin the data. In fact, you don’t even need to collect it in the first place, because you know, before you start, that it’s no use. You’ve been told that. You understood it.
But no. Instead, the report goes for a strange cop-out. “While it was still planned that the local school data would be presented alongside the Blueprint school data to enable some comparisons to be drawn between the two samples,” it says, “recent academic and statistical reviews concluded that to present the data in this way would be misleading, given that the sample sizes are not sufficient to detect real differences between the two groups.” So you binned it? “Instead, findings from the local school data are presented separately in the report to provide some context to this work but do not act as a comparison group.” This is absurd.
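The report’s own excuse – “sample sizes are not sufficient to detect real differences” – is easy to check for yourself. Here is a toy simulation, with invented numbers, of what happens when you compare 23 intervention schools against 6 comparison schools and the intervention does nothing at all: the observed gap bounces around by several percentage points from between-school variation alone.

    import random

    random.seed(1)

    def observed_gap(n_blueprint=23, n_comparison=6,
                     true_rate=0.12, school_sd=0.04):
        """Gap in mean drug-use rate between the two groups of schools
        when the intervention has exactly zero effect."""
        # school-level rates only; within-school sampling noise is ignored,
        # so this actually understates the real uncertainty
        blueprint = [random.gauss(true_rate, school_sd)
                     for _ in range(n_blueprint)]
        comparison = [random.gauss(true_rate, school_sd)
                      for _ in range(n_comparison)]
        return sum(blueprint) / n_blueprint - sum(comparison) / n_comparison

    gaps = sorted(observed_gap() for _ in range(10_000))
    print(f"middle 95% of pure-noise gaps: "
          f"{gaps[250]:+.3f} to {gaps[9750]:+.3f}")

With these assumptions the pure-noise gap spans roughly plus or minus 3.5 percentage points, bigger than any plausible effect of a drugs-education lesson. And that is before you worry about the deeper problem: without randomisation or matching, the 6 schools may simply differ from the 23 for reasons that have nothing to do with Blueprint.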
And it’s not as if this was an impossible project: randomised trials of educational interventions are done, and sometimes done very well. The SHARE trial was designed to discover whether a specific new sex education programme could help prevent unwanted teenage pregnancies. It was conducted in 25 schools and cost about £1m (although remember the intervention itself may have been cheaper than Blueprint’s). Schools were randomly assigned to deliver either the SHARE programme or the normal sex education that everyone gets. Then, at age 20, the young women were followed up to see whether they had become pregnant, had a termination, and so on. There were no differences between the two groups, and this was a useful finding, because it stopped us wasting money on something that doesn’t work.
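For what it’s worth, the randomisation step that SHARE got right is not technically hard; the hard part is committing to it and recruiting enough schools. A minimal sketch, with made-up school names standing in for the real ones:

    import random

    # 25 invented placeholder names: the school is the unit randomised
    schools = [f"school_{i:02d}" for i in range(1, 26)]
    random.seed(2009)              # fixed seed so the split is reproducible
    random.shuffle(schools)
    share_arm = sorted(schools[:13])    # these deliver the SHARE programme
    control_arm = sorted(schools[13:])  # these deliver standard sex education
    print(len(share_arm), "SHARE schools;",
          len(control_arm), "control schools")

In practice trials like SHARE stratify or match schools before randomising, to keep the arms balanced on things like size and deprivation, but the principle is exactly this simple: chance, not convenience, decides which schools get the intervention.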
Did anything that good come out of the Blueprint study? Well, it was a gigantic pilot study, so they now know a lot about things like “can you practically deliver the Blueprint programme in schools?” (yes, you can) and “do parents like their children being taught about the risks of drugs?” (yes, they do). The Blueprint report also celebrates the fact that knowledge about drugs was good among the children they taught (although, of course, there was nothing to formally compare them with). This sounds great, but in reality improvements on this kind of “surrogate outcome” are often unrelated to real-world benefits: the SHARE trial, for example, found that knowledge about pregnancy improved, but rates of teenage pregnancy remained unchanged.
And we should notice, finally, that in the Blueprint trial, rates of drug use were often slightly higher among the children who received the special drugs education programme than among those in the non-comparable comparison group. These results are meaningless, of course, because, as they have admitted, from the very outset this £6m trial was not designed in such a way that we can make that comparison. But we can only speculate about whether the Home Office would have been so scrupulous about rigour if the flawed results from this inadequate trial had suggested that their expensive new idea actually worked.
All references are links in the text above, as ever. If you’re interested in reading some of the early feedback in which they were told about these problems, I recommend this 10MB pdf: