Yeah well you can prove anything with science

July 2nd, 2010 by Ben Goldacre in bad science, irrationality research | 41 Comments »

Ben Goldacre, The Guardian, Saturday 3 July 2010

What do people do when confronted with scientific evidence that challenges their pre-existing view? Often they will try to ignore it, intimidate it, buy it off, sue it for libel, or reason it away.

The classic paper on the last of those strategies is from Lord and colleagues in 1979: they took two groups of people, one in favour of the death penalty, the other against it, and then presented each with a piece of scientific evidence that supported their pre-existing view, and a piece that challenged it. Murder rates went up, or down, for example, after the abolition of capital punishment in a state, or when compared with neighbouring states, and the results were as you might imagine. Each group found extensive methodological holes in the evidence they disagreed with, but ignored the very same holes in the evidence that reinforced their views.

Some people go even further than this, when presented with unwelcome data, and decide that science itself is broken. Politicians will cheerfully explain that the scientific method simply cannot be used to determine the outcomes of a drugs policy. Alternative therapists will explain that their pill is special, among all pills, and you simply cannot find out if it works by using a trial.

How deep do these views go, and how far do they generalise? For a study now published in the Journal of Applied Social Psychology, Professor Geoffrey Munro took around a hundred students and told them they were participating in research on “judging the quality of scientific information”. First, their views on whether homosexuality might be associated with mental illness were assessed, and then they were divided into two groups.

The first group were given five research studies that confirmed their pre-existing view. Students who thought homosexuality was associated with mental illness, for example, were given papers explaining that there were more gay people in psychological treatment centres than in the general population. The second group were given research that contradicted their pre-existing view. (After the study was finished, we should be clear, they were told that all these research papers were actually fake, and given the opportunity to read real research on the topic if they wanted to.)

Then they were asked about the research they had read, and were asked to rate their agreement with the following statement: “The question addressed in the studies summarized… is one that cannot be answered using scientific methods.”

As you would expect, the people whose pre-existing views had been challenged were more likely to say that science simply cannot be used to measure whether homosexuality is associated with mental illness.

But then, moving on, the researchers asked a further set of questions, about whether science could be usefully deployed to understand all kinds of stuff, all entirely unrelated to stereotypes about homosexuality: “the existence of clairvoyance”, “the effectiveness of spanking as a disciplinary technique for children”, “the effect of viewing television violence on violent behavior”, “the accuracy of astrology in predicting personality traits”, and “the mental and physical health effects of herbal medications”.

Their views on each issue were added together to produce one bumper score on the extent to which they thought science could be informative on all of these questions, and the results were truly frightening. People whose pre-existing stereotypes about homosexuality had been challenged by the scientific evidence presented to them were more inclined to believe that science had nothing to offer, on any question, not just on homosexuality, when compared with people whose views on homosexuality had been reinforced.

When presented with unwelcome scientific evidence, it seems, in a desperate bid to retain some consistency in their world view, people would rather conclude that science in general is broken. This is an interesting finding. But I’m not sure it makes me very happy.


++++++++++++++++++++++++++++++++++++++++++
If you like what I do, and you want me to do more, you can: buy my books Bad Science and Bad Pharma, give them to your friends, put them on your reading list, employ me to do a talk, or tweet this article to your friends. Thanks!
++++++++++++++++++++++++++++++++++++++++++

41 Responses



  1. Dean Morrison said,

    July 3, 2010 at 1:21 am

    When presented with unwelcome scientific evidence, it seems, in a desperate bid to retain some consistency in their world view, people would rather conclude that science in general is broken. This is an interesting finding. But I’m not sure it makes me very happy.

    Not entirely a surprise to those of us who have been defending science against the assaults from creationists, global warming denialists, and Post-Modernists (okay, Steve Fuller).

    Trust is hard to cultivate: distrust thrives in an environment of ignorance and neglect.

  2. iliff said,

    July 3, 2010 at 1:34 am

    “…it seems, in a desperate bid to retain some consistency in their world view, people would rather conclude that science in general is broken.”

    Is it possible that people would rather *express the view* that science in general is broken, since they are bright enough to realise that they look foolish if they act inconsistently (and possibly macho if they don’t)?

    I don’t think yours is the only possible conclusion from this data.

    µ

  3. paddyfool said,

    July 3, 2010 at 1:56 am

    I wish I hadn’t seen so many examples of confirmation bias along these lines… and of the general rejection of science by many whose views are challenged by the evidence found.

  4. 300baud said,

    July 3, 2010 at 2:23 am

    I’d be interested to see how much we could change that with better presentation. Even setting aside the skeptics who are outright dicks, a lot of science-related dialog can be pretty brusque.

    Personally, I tend to prefer that kind of bracing, just-the-facts presentation. But as studies like this suggest, that’s not what a lot of people want. It’d be nice to have clear studies that show us how to make the intellectual medicine go down smoothly.

  5. Timmy said,

    July 3, 2010 at 3:42 am

    Consider this: how about a similar analysis of this subject.

    (loose description)

    Determine the positions of individuals on several topics and later poll them on the media sources that they use to question or reinforce their positions. I’ll guess that they will select media data that reinforces and “on the face” shun that which questions.

  6. Zeriador said,

    July 3, 2010 at 3:47 am

    Makes me wonder if this article is one of these fake articles aimed at making people question their faith in science.

    Madness! I don’t know what to believe anymore.

  7. Travis said,

    July 3, 2010 at 4:49 am

    This post and the abstract seem to imply they went the other way too. That they showed contrary ‘research’ to people who thought there was no link between homosexuality and insanity and those people ALSO came to the conclusion that science is unreliable.

    Can you confirm this?

  8. bagpuss7 said,

    July 3, 2010 at 9:35 am

    I recommend Robert Cialdini’s “Influence” as a good explanation on the desire for consistency and quite how far people will go to be ‘consistent’.

  9. Big M said,

    July 3, 2010 at 11:29 am

    “When presented with unwelcome scientific evidence, it seems, in a desperate bid to retain some consistency in their world view, people would rather conclude that science in general is broken. This is an interesting finding. But I’m not sure it makes me very happy.”

    To be fair, I don’t think this is quite as sinister, or as surprising, as you seem to be making out. It looks like just another side of confirmation bias, especially given that we’re dealing with laypeople, who are far less interested in dry statistical evidence than they are in emotionally powerful personal experience, and who have had years of selective reinforcement of their opinions.

    It would have been astounding if they had accepted the single piece of evidence presented to them that changed their views. And once they’ve been shown something that they think must be flawed, of course they’re going to be put in a frame of mind in which they think that other evidence must also be flawed.

    The results of this study are in no way surprising, and I doubt they’re indicative of anything other than an unsurprising lack of emphasis on the scientific method, compared to emotional jumping to conclusions, among the general population.

  10. Synchronium said,

    July 3, 2010 at 1:00 pm

    All I can think of is how this hilariously applies to the religious.

  11. Kapitano said,

    July 3, 2010 at 1:35 pm

    There is a flipside to the mindset that dismisses science itself when it challenges beliefs.

    The very same people who assert that science can’t evaluate astrology when scientific evidence counts against it loudly proclaim “science proves astrology” when a single researcher finds a weak correlation between zodiac predictions and actual events.

    Lunatics like Jenny McCarthy use the ‘science’ of Andrew Wakefield when it’s convenient. Frauds like Deepak Chopra use terms misappropriated from quantum physics. Self-appointed economists from all political fields willfully misunderstand chaos theory to support their ideas.

    For these people, science has an authority – one which most of the time works against them, but occasionally they can use.

  12. TwentyMuleTeam said,

    July 3, 2010 at 1:39 pm

    I suspect we feel humiliated when provided evidence contrary to our conclusions; the primary affect of shame stops dead our rational processes. This is a remarkably emotional response. Story: I’ve sent adults for independent psychiatric assessment for insurance purposes, and received reports back stating the adult is unable to work at all, in any occupation. Then, after finding pay stubs for the adult’s work at 30-hours weekly, and submitting them to the examiner to include in an addendum opinion, invariably the hackles are raised and the examiner sticks to the same opinion.

  13. elluskott said,

    July 3, 2010 at 1:43 pm

    What is wrong with the Guardian editors? Even as media staff themselves become increasingly fact-illiterate and unable to distinguish between fact and opinion, many readers are actually able to.

  14. mch said,

    July 3, 2010 at 2:41 pm

    heh, ‘research studies’ are ‘scientific evidence’? A good, healthy distrust of scientists is not the same as ignoring scientific evidence. Conflating the two is not useful or scientifically healthy.

    It’s hardly surprising that people require solid evidence to change viewpoints. So what?

    Scientists too are heavily prejudiced in all sorts of ways; your emotive choice of *those* stoopid homophobes (ie, them not us) to make your point is similar prejudice to avoid a bit of introspection.

  15. jdc said,

    July 3, 2010 at 3:48 pm

    I second bagpuss7’s recommendation of Robert Cialdini’s “Influence” – Cialdini explains judgemental heuristics and highlights the ways that they can be used in order to persuade people in a way that even I could understand (he also points out that automatic, stereotyped behaviour is in many cases “the most efficient form of behaving”). Influence: Science and Practice sits on my bookshelf between Irrationality and Gut Feelings.

    Cialdini uses various examples of salespeople using these shortcuts as ‘weapons of influence’ and I think that they sometimes crop up in the promotion of CAM. Here’s a badly-written blogpost on Alternative Medicine and Weapons of Influence, and here’s a rather better one on Cialdini, Consistency, and trust in CAM.

  16. kbscout said,

    July 3, 2010 at 10:54 pm

    Ben,

    Not sure if you read Farnam Street, but I think you’d enjoy it.

  17. H. E. Pennypacker said,

    July 4, 2010 at 10:52 am

    Travis:

    Yes you’re right. They had people who believe in an association between mental illness and homosexuality (believers) and people who do not believe in such a link (non-believers). The exposure to belief-confirming or disconfirming evidence had the same effect on both groups.

    Big M:

    I agree that it would have been surprising if a single piece of evidence changed their views. But an attempt was made not to present the information as a single piece of evidence; the experimenter claimed that this evidence was derived from a review of a large body of research. It’s of course possible that laypeople would still treat that as a single piece of evidence. But the most significant point of this study and Ben’s article was that being exposed to scientific data that disconfirms your previous belief leads you to discount scientific evidence regarding a whole host of other, completely unrelated topics. In this regard I think these data ARE surprising, and rather depressing.

  18. Honesty in Science said,

    July 4, 2010 at 10:55 am

    Ben Goldacre is not used to proving things with Science as he is a psychiatrist which is very very bad science.

    Psychiatry: Zero Science, Zero testing and Zero cures.
    One finds the irony amusing that we get a bad science web site from the practitioners of Psychiatry or lets call it its real name quackery as its 100% subjective opinion bunkum and more akin to homeopathy than real science.

  19. Mark P said,

    July 4, 2010 at 11:47 am

    All it proves is that few people change their mind quickly. So?

    Actually that is a good thing, or we would have people wildly fluctuating between strong opinions based on the last piece of information.

    People do change their minds on issues like the death penalty, homosexual law reform, evolution, climate change or health care reform, but they like to do so gradually.

    Ben – I bet if I showed you scientific evidence that shed doubt on global warming, you would pick holes in it. Hang on! That experiment has been done! And your response was? You behaved exactly like the people in the experiment. (Hardly surprising, as I assume you are human.)

    If I were able to show scientifically, beyond doubt (it’s a thought experiment, OK?), that the CO2 theory is wrong (somehow!), I don’t think for a second you would believe me. It would take years before you came round.

    So why do you expect other people to suddenly change their minds when confronted with something they don’t agree with?

  20. RossAnderson said,

    July 4, 2010 at 8:33 pm

    It’s not just science that people will reject when it suits them. Read Bill von Hippel and Bob Trivers on “The evolution and psychology of self-deception”.

  21. Sqk said,

    July 4, 2010 at 10:12 pm

    ‘Honesty in Science’, please read your posts through, out-loud, before you post them.

    I’d love to read your views, and I’d hate to misunderstand or misquote you, but you’re going to have to meet me halfway here and post something coherent.

  22. SteveGJ said,

    July 5, 2010 at 7:32 am

    I believe there are good reasons to question the interpretation and methods of scientific studies once they start dealing with the incredibly complex area of the interplay of human psychology and difficult matters in society. Quite apart from confirmation bias from those studying the results, there is plenty of scope for those defining the boundaries of the study to affect the results. Trying to boil complex issues down to a few simple propositions that can be questioned is, itself, rather troublesome.

    Too often we are left with correlations, and any causative mechanism is so necessarily abstracted from fundamental science that it is closer to the language of politicians and novelists.

    Now this is not to say that useful contributions can’t be made using scientifically controlled studies in more complex areas. In medicine, double blind controlled trials have proved their worth, as have statistical studies on the outcomes of different medical procedures. However, once it gets into areas of social policy we are dealing with the difficult area of the psychology of human beings, as individuals and en masse. I think we are right to be wary of the results often reported. There are genuinely areas where science, if not broken, certainly has very large areas of uncertainty.

    So is science, in general, broken? Well, that’s another of those over-simplified questions. However, there are certainly areas where science is more firmly grounded than in others, and it should be recognised. It’s down to the nature of the problems being studied.

  23. Healthwatch said,

    July 5, 2010 at 11:48 am

    There has been a considerable cry of ‘nothing proved’ in the ‘Climategate’ debate from those who seek to defend science. Partly this has been due to the fact that the emails do not provide a definitive example of what happens when scientists play ‘tricks’ with data to justify an argument. However, a recent paper in the Indian Journal of Emergency Pediatrics has shown how scientists manipulate data in the field of snakebite. The paper includes statements from scientists in this field which directly link the misrepresentation of numbers to attempts to justify funding.

    It is a useful guide to how widespread the issue of ‘data’ is when produced by scientists to drive policy.

    The paper “The Absence of Progress for both Children and Adults in Global Snakebite Management; Scrabbling for Funding and Business as Usual Ignores Available Solutions” can be accessed from healthpublishing2010@gmail.com and makes very interesting reading.

  24. Brady said,

    July 5, 2010 at 2:46 pm

    A classic book that shows how groups of people behave when their catastrophic predictions fail is Leon Festinger’s When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the Destruction of the World (1956):
    web.mac.com/sinfonia1/Global_Warming_Politics/A_Hot_Topic_Blog/Entries/2008/8/20_More_On_Cognitive_Dissonance.html

    “What do people do when confronted with scientific evidence that challenges their pre-existing view?…”

    Well … it is amazing how much the theory of Cognitive Dissonance is supported by the events of the current collapse of the Catastrophic Manmade Global Warming (CAGW) movement 🙂

    Just look at George Monbiot’s recent Guardian article, which is full of Cognitive Dissonance, shown most obviously in this comment:
    www.guardian.co.uk/environment/georgemonbiot/2010/jul/02/ipcc-amazongate-george-monbiot?showallcomments=true#CommentKey:7e05c6c2-0fd1-479f-ad99-0d59945c949f

    However, the CAGW movement now seems to be moving into its final stages of cognitive dissonance, where it is doing a virtual 180-degree turn and being reunited with the real world again.

    Have a look at the wording that is now allowed to be printed in this Guardian (a former bastion for CAGW believers) article:
    www.guardian.co.uk/environment/2010/jul/04/climatechange-hacked-emails-muir-russell

    ” … the idea of IPCC scientists as “self-appointed oracles, enhanced by the Nobel Prize, is now in tatters…”

    “… its most visible and activist wing, appeared to want to go back to waging an all-out war on its perceived political opponents”. He added: “Such a strategy will simply exacerbate the pathological politicisation of the climate science community.”

    Amazing!

    We live in historical “Berlin Wall crumbling” times 🙂

  25. AdamJacobs said,

    July 5, 2010 at 3:55 pm

    Yes, confirmation bias is very powerful.

    A classic study from 1977 showed that it applies to scientific peer reviewers, not just ordinary folk. Pretty scary really when you think that peer review is supposedly the cornerstone of the scientific literature.

    I’ve blogged about that study at dianthus.co.uk/papers-from-the-past (follow the links to the full citation).

  26. BikerMondo said,

    July 5, 2010 at 5:20 pm

    @Healthwatch – now why would I want to give you my email address?

  27. pv said,

    July 5, 2010 at 11:29 pm

    300baud said,
    July 3, 2010 at 2:23 am

    Even setting aside the skeptics who are outright dicks, a lot of science-related dialog can be pretty brusque.

    Some people are dicks. And the reality of nature isn’t personal, so of course it and much of the dialogue might seem brusque.

    To be sceptical is simply to be curious and not take things as read. To prefer evidence to authority. To prefer the option of being wrong or unknown, and the provisional nature of evidence, as opposed to the absolutism of certainty and authority.

    So there are the competing interests of abiding curiosity and the quest for knowledge on the one hand, and Authority, certainty and fear of the unknown on the other.

    People would rather fear the unknown and trust to Authority than embrace the unknown and try to find the truth of what it is.

  28. psybertron said,

    July 6, 2010 at 9:15 am

    That Munro paper title “whether homosexuality might be associated with mental illness” and aim “judging the quality of scientific evidence” begs its own conclusion:

    “The question addressed in the studies summarized… is one that cannot be answered (entirely) using scientific methods.”

    I appreciate that latter statement was used to differentiate the biases in the competing stories (in the Munro research), but in fact that statement is often substantially true.

    Science is often “bad” precisely because it is applied to a situation that cannot be adequately formulated for scientific evidence to provide the answers, or can provide only limited answers.

    “Judging Quality” is not a scientific exercise.

  29. irishaxeman said,

    July 8, 2010 at 1:16 pm

    Being a sceptic means that you take ‘science’ on its merits too, not that you blindly accept conclusions. Ben’s position has always seemed to me to be sceptical, not a blind defender of science.

    I must have missed the (no doubt decisive and overwhelming) evidence about the collapse of the CMCW 🙂 Not that it bothers me, as I am one with Lovelock that humans need culling about 90%, along with their stupid capitalism, imperialism, egoism, narcissism and most other blindfolds.

  30. philosopher-animal said,

    July 11, 2010 at 1:47 pm

    jdc calls attention to what I think is the most important thing to do with results like this. That is, take the next step and figure out what does change people’s minds and avoids confirmation bias, etc. I am reminded of my conversations with a friend of mine about nonclassical logic: I am skeptical of (say) paraconsistent logics, for any number of reasons, but they raise an interesting question: how did the late 19th-early 20th century revolution in logic happen?

    Something similar applies here: this is also the correct part of what the inane stuff about “framing” is about. But I would want people to believe X if X is the case, and not if not, etc., so it cannot just be a matter of “rhetoric” or the like. Psychologists talk about the degree of openness to new evidence; it would be interesting to see whether rigorous scientific training (these would have to be largely the natural sciences) improves that, or whether there’s a self-selection effect, for example. I have a sinking feeling there’s a lot of that.

  31. suggsygirl said,

    July 16, 2010 at 3:28 pm

    I think it just comes down to the fact that people will not admit when they’re wrong about anything, not just science. I remember my sister claiming that I had stolen her doll despite overwhelming evidence to the contrary (I was with my mum at the time); she would not admit that she was wrong even though lying about it got her a smack. This instinct is very strong and deeply rooted, and maybe there’s a child in every scientist, kicking and screaming that life is not fair when they’re proved wrong.

    Admitting to being wrong and adapting one’s views is one of the hardest things you can do.

  32. T.J. Crowder said,

    July 19, 2010 at 1:39 pm

    “But then, moving on, the researchers asked a further set of questions, about whether science could be usefully deployed to understand all kinds of stuff…”

    At this point it’s useful to recall research related to the psychiatry of persuasion. I expect someone here has the relevant citations to hand (I don’t), but basically you can lead someone down the garden path very effectively indeed by starting at one point and then successively going further afield. People have a strong bias toward consistency in their answers. So if you start with “Can science answer question X?” and the person responds “no”, they will have a much stronger tendency to answer “no” to the question “Can science answer question Y?”, especially if Y is a small increment away from X. Successive questions can continue to get further afield, into the realm where, had you asked the 10th question first, their answer would have been “yes”, but having asked it 10th, the answer is “no”. I’m not a scientist at all, much less in the area of persuasion, but I’d think this effect would be powerful enough to bias the results of the successive questions.

    @”Honesty in Science”: “Ben Goldacre is not used to proving things with Science as he is a psychiatrist…”

    Ben’s a GP, not a psychiatrist.

  33. T.J. Crowder said,

    July 19, 2010 at 1:40 pm

    Trying again with different quoting characters. 🙂

    “But then, moving on, the researchers asked a further set of questions, about whether science could be usefully deployed to understand all kinds of stuff…”

    At this point it’s useful to recall research related to the psychiatry of persuasion. I expect someone here has the relevant citations to hand (I don’t), but basically you can lead someone down the garden path very effectively indeed by starting at one point and then successively going further afield. People have a strong bias toward consistency in their answers. So if you start with “Can science answer question X?” and the person responds “no”, they will have a much stronger tendency to answer “no” to the question “Can science answer question Y?”, especially if Y is a small increment away from X. Successive questions can continue to get further afield, into the realm where, had you asked the 10th question first, their answer would have been “yes”, but having asked it 10th, the answer is “no”. I’m not a scientist at all, much less in the area of persuasion, but I’d think this effect would be powerful enough to bias the results of the successive questions.

    @”Honesty in Science”: “Ben Goldacre is not used to proving things with Science as he is a psychiatrist…”

    Ben’s a GP, not a psychiatrist.

  34. T.J. Crowder said,

    July 19, 2010 at 1:42 pm

    I should have said “…to bias the results of the successive questions *in this study*.” Otherwise the argument is really quite circular. 🙂

  35. Tinopener said,

    July 20, 2010 at 7:30 am

    Anybody here familiar with John Ioannidis’ provocatively titled paper, “Why Most Published Research Findings Are False” (www.plosmedicine.org/article/info:doi/10.1371/journal.pmed.0020124)?

    This paper, or at least its title, seems to give some comfort to those who would generally pooh-pooh science. I am aware it applies only to the field of medical research, and that replication of results increases their probability of being correct, but only when the replication is exact: see also www.plosmedicine.org/article/info:doi/10.1371/journal.pmed.0040028 and this commentary that Google threw up from the WSJ: online.wsj.com/article/SB118972683557627104.html?mod=djemasialinks.

    I have googled this website to see what Ben’s comments on the Ioannidis paper are, but have not found any. I am sorry if I did not search well enough.

    It clearly cannot be used to rubbish all results: otherwise our computers would not work, planes would fall out of the sky etc etc.

    However, it raises an issue, I think, about whether the press places too much importance on scientific results, and about the conditions that might have to exist for more to be claimed about the significance of results than they can truly bear.

    Does anyone know if Ioannidis’ work has been replicated, and if anyone has tried a similar analysis outside of the field of medical research?

  36. Rididill said,

    July 27, 2010 at 3:02 pm

    @philosopher-animal – lose the jargon. What is paraconsistent logic? I’d love to know if you would but give me the chance, instead of speaking only to your exclusive community of philosophy-speakers.

    @T.J. Crowder – I think you’re probably right. It would be interesting to know what kinds of things the ‘challenged’ and ‘non-challenged’ thought science was good for.

    It’s a shame that in these studies they never ask people why, often either because of cost or because of the ‘subjective’ nature of the answers (honestly, just because it isn’t easy to represent in numbers, when the opinion scale is just as subjective). That might shed further light on this, instead of leaving it down entirely to the researchers’ speculations and, dare I add, potential personal bias. (I’m looking at you, BG: you often say the public is smarter than the media give them credit for, but you also seem very ready to believe they all hate/distrust/are ignorant of science. If you think they’re that smart, then they aren’t just gullibly lapping up the bullshit as you might think.)

    You will find out far more from how people justify these choices than just the numbers.

  37. ChrisBJ said,

    July 30, 2010 at 4:49 pm

    This reminds me of recent research on “confirmation bias”, in people with strongly held political views. There must be some sort of link. Perhaps it is the same mechanism.

    Brendan Nyhan and Jason Reifler described the “backfire effect”, when people’s erroneous beliefs about issues such as the Iraq War were actually enhanced and strengthened when they were challenged with the true facts. See: When Corrections Fail: The Persistence of Political Misperceptions. 30 March 2010
    www.springerlink.com/content/064786861r21m257/fulltext.html
    A possible neurological mechanism was described in an editorial in the BMJ (How cognitive biases affect our interpretation of political messages 27 April 2010) www.bmj.com/cgi/content/full/340/apr27_1/c2276

    This article describes magnetic resonance imaging of subjects’ brains while they were being challenged with facts which contradicted their strongly held views.

    “Rejection of obviously contradictory evidence arose from a combination of switching off neurones associated with distress and switching on those associated with positive emotions. Perversely, the latter provided a “positive reinforcement” for making biased decisions” …”Crucially, this processing of information and updating of preferences occurred extremely rapidly, bypassing circuits normally associated with reasoning, and it was thought to be outside the realm of conscious control.”

    Not happy news – Hence arguing with a Climate Change denialist will only make their denial stronger, no matter how much scientific evidence you present. Similarly for believers in homeopathic medicine….

  38. mch said,

    August 1, 2010 at 10:00 am

    I’m highly skeptical of Munro and Ben’s extrapolation from study abstracts to science in general. I’ve read Munro’s study, and it’s not at all clear that the students actually did any such thing. As Crowder says above, some of it appears rather ‘leading’.

    Does anyone know where to find the abstracts of the false studies or the actual questions asked? I’ve asked Munro but had no response. The only false study given in Munro’s paper isn’t even really relevant; it’s a survey of homosexuals in mental institutions, which wouldn’t really tell us whether it could be classed as a mental illness.

    FWIW, there was some (somewhat ‘robust’) discussion about this here: www.badscience.net/forum/viewtopic.php?f=3&t=17423

  39. mch said,

    August 1, 2010 at 10:07 am

    @Chris: “Not happy news – Hence arguing with a Climate Change denialist will only make their denial stronger, no matter how much scientific evidence you present. Similarly for believers in homeopathic medicine…”
    Similarly for believers in Climate Change alarm. It’s not a one-way bias, even though Ben rather skips over that: the study showed that people who did *not* believe that homosexuals were mentally ill *also* rejected contradictory evidence.

  40. Alan Kellogg said,

    August 4, 2010 at 12:41 pm

    Folks, you are not that much different from the study group, depending on the subject. Time and time again I have heard the opinion that “the universe doesn’t work that way” from people one would think would apply the scientific method to the problem presented. We all have our bêtes noires, and it reflects badly upon us.

  41. incitatus said,

    November 5, 2010 at 3:58 pm

    Turns on pedant mode…

    I have a slight issue with the title… most of us poor working whitecoats are Popperian scientists, which means that you can’t prove ANYTHING with science. Only disprove it.

    Turns off pedant mode and slinks back to lurking corner…