Saturday February 23 2008
People have done some terrible things, over the years, with science, and with their science skills. I’m talking about Zyklon B, electrocuting gay people straight, torturing people in concentration camps, leaving syphilis untreated in large numbers of black men for an experiment (without telling them, in the US, until the 1970s), and more. Stuff where it’s hard to find any humour.
This is why we have research ethics committees, codes of practice, professional bodies, and regulators like the US Office for Human Research Protections. Sometimes these organisations can cock up quite badly. Let me tell you about two stories which have been unfolding over the past few months.
In New York, a fiendishly clever trial in ITU departments has looked at one of the simplest interventions imaginable: a ticklist for putting in IV lines, a helpful little reminder to wash your hands, wear gloves, and so on. Can something as simple as “using a ticklist”, to check if people are doing the right thing, reduce infections and save lives?
This is the bread and butter of medical academic research, which is usually not about pills, or placebos, or molecules, but about looking pragmatically at whether one thing works better than another. You will remember that homeopaths and various other quacks are philosophically opposed to this process.
The results were spectacular: in 3 months, the incidence of blood infections from these IV lines fell by two-thirds, and over 18 months, the program saved 1,500 lives and an estimated $200 million. Then someone complained to the OHRP, because this was a research study, and they did not have ethics committee clearance. The project was shut down. This week, the OHRP grandly lifted their ban, explaining that now – since it turns out the research bit is over, and the hospitals are just putting the ticklist into practice – they may tick away unhindered.
This is what we might call the “ethical paradox”. You can do something as part of a treatment program, entirely on a whim, and nobody will interfere, as long as it’s not potty (and even then you’ll probably be alright). But the moment you do the exact same thing as part of a research program, trying to see if it actually works or not, adding to the sum total of human knowledge, and helping to save the lives of people you’ll never meet, suddenly a whole bunch of people want to stick their beaks in.
Hilary Hearnshaw did an elegant study where she pretended to apply to do a medical research project in Israel, the UK, and 11 other countries in Europe. She said she wanted to do a trial on a leaflet – contain your excitement – which was designed to help older patients get more engaged with their GP.
Only three countries required the project to go through a process of ethical approval, and in the UK, this was more arduous than in any other country. Getting ethical clearance took ten weeks, required two submissions (because they demanded changes), and five full days of administration, during which the proposal had to be reviewed by full committees, some of which required multiple copies of the application paperwork.
This is just the tip of the iceberg (and I would always welcome more examples by email). For one multicentre clinical trial, each of 125 local research ethics committees required between 1 and 21 copies of a protocol. Ethics approval for another trial, involving 51 centres, required over 25,000 pieces of paper, 62 hours of photocopying, and an average of 3.3 hours of investigator time for each centre. It feels like you’re dying when administrators drag their heels. In the case of medical research, when you delay research findings, and deter researchers from even bothering, people really are dying. This wider harm seems to be a blind spot for the ethics committees, captivated by their own mission creep.
But it’s not the only ethical blind spot. These regulations have their roots in the Nuremberg Code. But while the world of clinicians and academics splits ethical hairs, with our eye off the ball, an elephant has walked into the room. February has seen another string of prominent psychologists resigning from their membership of the American Psychological Association, in disgust at its failure to take a stand on “abusive interrogation techniques”, cruel, inhuman and degrading treatment, and other activities which you might consider to be torture. Psychologists are key to these interrogations and other activities, both in designing and enacting what I would rather not call “protocols”, out of compassion for the people on whom they are grimly enacted, in places cameras do not go.
APA members, trained clinical professionals on their register, who have signed up to their codes of practice, now participate in these activities. The APA’s response has been to specifically bend the codes of conduct to permit their actions, and to obfuscate. Where’s your ethics committee now, science boy?
Ken Pope, prominent member of the American Psychological Association (and a former chair of its Ethics Committee), resigned his membership on February 6. He’s the latest of a growing number of professional psychologists who have quit APA in protest of its position on the use of psychologists in government interrogations in the “War on Terror.”
Lots more on the APA and torture at Mindhacks.
Here’s a nice letter from Hilary Hearnshaw, the researcher who did the study looking at different countries’ ethics committee palaver.
Dear Ben Goldacre,
I was very pleased to see that in your article on February 23rd, in The Guardian, you had mentioned my research study comparing the ethics approval protocols in several European countries for an actual study (not a pretend one as you described it). I followed this up with my colleagues, especially students who attempted to conduct research studies for their master’s degrees, to gather information on the effects and delays that acquiring approval from research ethics committees (RECs) incurred.
We found that the effects were so great for students that we decided to change the requirements for them to do a research study for their courses, since the approval process usually took longer than the whole course. It would have required students to start their applications before they had done any studying and learning on the course, which was clearly inappropriate (nay stupid).
Thus, as your article was making clear, obtaining research ethics approval led to a reduction in the number of research studies conducted by our students (mostly current health care professionals) and hence to a reduction in the research skills and experience of these health professionals.
I am pleased to note that the REC requirements are under constant review by the national COREC organisation, which might make them usable eventually. The whole REC process seems to me to be an excellent example of using a sledgehammer to crack a nut (preventing bad, unethical science) and losing any useful part of the nut in so doing.
I wish you well in future attempts to prevent bad science.