• Post hoc rationalisation – reasoning, our intuition and changing our minds

    Post hoc rationalisation is what most of us end up doing when we reason. We have a gut instinct – a potentially irrational or arational decision based on the underlying cognitive faculties connected to our whole person: physical reactions and visceral feelings.

    For those of you who have not watched this utterly brilliant TED talk from David Pizarro, do so now. It shows how we make moral judgements based on intuition, the idea being that we rationalise this ‘decision’ after the fact. Gut reaction first, think up the reasons later.

    The excellent Jonathan Haidt, a philosophical psychologist, has done a lot of research in this area. As the New York Times says in a review of his book The Righteous Mind:

    To the question many people ask about politics — Why doesn’t the other side listen to reason? — Haidt replies: We were never designed to listen to reason. When you ask people moral questions, time their responses and scan their brains, their answers and brain activation patterns indicate that they reach conclusions quickly and produce reasons later only to justify what they’ve decided. The funniest and most painful illustrations are Haidt’s transcripts of interviews about bizarre scenarios. Is it wrong to have sex with a dead chicken? How about with your sister? Is it O.K. to defecate in a urinal? If your dog dies, why not eat it? Under interrogation, most subjects in psychology experiments agree these things are wrong. But none can explain why.

    The problem isn’t that people don’t reason. They do reason. But their arguments aim to support their conclusions, not yours. Reason doesn’t work like a judge or teacher, impartially weighing evidence or guiding us to wisdom. It works more like a lawyer or press secretary, justifying our acts and judgments to others. Haidt shows, for example, how subjects relentlessly marshal arguments for the incest taboo, no matter how thoroughly an interrogator demolishes these arguments.

    To explain this persistence, Haidt invokes an evolutionary hypothesis: We compete for social status, and the key advantage in this struggle is the ability to influence others. Reason, in this view, evolved to help us spin, not to help us learn. So if you want to change people’s minds, Haidt concludes, don’t appeal to their reason. Appeal to reason’s boss: the underlying moral intuitions whose conclusions reason defends.

    Of course, this is disheartening for people like me who set out to change people’s minds through rational discourse, presenting empirical evidence in the process. What this actually appears to do is entrench people in their original beliefs rather than change their minds. As Joe Keohane states:

    In the end, truth will out. Won’t it?

    Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

    Keohane goes on to explain how this misinformation can become dangerously entrenched:

    On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

    Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

    What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.

    New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

    For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

    I was talking to a theologian friend of mine who has become progressively more liberal the more he researches theology – but primarily psychology – on his own terms. His ability to change his mind is not about listening to me disagreeing with him; it is more about researching ideas on his own terms and having ownership of that research, so that it is he who has responsibility for changing his own mind. This is perhaps the strength behind the recent move for organisations to take on coaching as a method – using Socratic dialectic to draw new thoughts and ideas out of the coachee themselves.

    I remember that this aforementioned Christian friend used to deny evolution. I used to hit him with facts and reason, which was met with strong cognitive dissonance and an entrenchment of his view: I did not convince him. However, he struck up a conversation with the theistic evolutionary biologist Simon Conway Morris who, over several emails presenting the same evidence as mine, managed to convince him otherwise.

    The power of a ‘friend’ – i.e. someone who broadly shares the same worldview – to change one’s view cannot be overestimated. Belief is deeply psychological. If we want to convince people to change their minds, we have to pick the cognitive and psychological locks which permit this. This includes choosing the right person to do the picking.

    Belief in God is one of the strongest, most pervasive beliefs there can be, with ramifications across the believer’s life. Rational evidence and argument, however, won’t cut the mind-changing mustard. More’s the pity.

    Category: Epistemology, Featured, Psychology

    Article by: Jonathan MS Pearce

    • Jess Rice

      I have been looking for this article without even knowing it. Thank you for articulating something that has been driving me nuts. Ya nailed it.

    • Andy_Schueler

      I’d love to see whether these results can be reproduced in cultures other than the USA.
      This stuff – “But instead, it appears that misinformed people often have some of the strongest political opinions.” – is simply the Dunning-Kruger effect and seems to be reproducible across different cultures and also for non-political opinions.
      This, however – “In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds.” – I’m not so sure about. The political discourse in the USA is very… well, let’s call it “unusual”. I have seen plenty of political partisans here in Germany, but insisting that “the other side” is not merely wrong, but evil as well, is not something I see very frequently here. But it is completely routine in the USA. And if you don’t merely think that Obama is wrong, but that he is an evil satan-worshipping-Communist-Muslim-Maoist-Socialist-Fascist-Stalinist-Hitler-Nazi, it’s not surprising that facts don’t change your views. I think that the two-party system (which is more prone to tribalism than a system with more diversity of political views) and professional hate-mongers like Rush Limbaugh and Glenn Beck have created a system that is rather unique, and I don’t think that these results can be extrapolated to other cultures.

      • Some interesting points. I wonder what search terms I would use to find something on Google Scholar…

        • Andy_Schueler

          “[People] who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.”

          That is so depressing….
          I also remember some research that showed how making a comprehensive case for a proposition – summarizing a representative sample of all the available evidence – can actually be counter-productive. If the person you try to convince finds a minor (perceived or real) flaw in your case, they tend to assume that everything else must be just as flawed and are not swayed at all. So presenting very few arguments (only the very best ones) can be much more effective than making a comprehensive case.

          • Yes, I know that. I can’t remember where I read it. Must look that up.

            • Andy_Schueler

              The only problem with that is that what the “best” argument is won’t necessarily be the same for everyone. Many atheists I know (particularly those who used to be Christians) were ultimately persuaded by the problem of evil, but I know others (myself included) who think that this is one of the weakest arguments against Christianity.

            • Too true. I think the POE is very powerful, though I think free will is the powerfulestest.

            • Andy_Schueler

              Yup, free will would rank very high for me as well – but I think the two best ones (only for me personally, of course) would be divine (mis)communication and the “idle God” argument that Hitchens loved to use: “God sits idly by while humanity engages in 120,000+ years of tribal warfare, and then finally selects an obscure Bronze Age tribe in a completely random part of the world and ignores everyone else except for this tribe and their neighbors”.

            • Definitely strong contenders.

              I also like the creation of non-God objects, and photosynthesis (why create corporeal bodies at all?).

              http://www.skepticink.com/tippling/2013/08/01/the-case-for-god-on-trial/

            • Andy_Schueler

              Oh, and I also like the “my mom” argument, which I just made up :-D

              My mom is an inexhaustible source of practical wisdom and when I read the gospels for the first time, I was completely underwhelmed because my mom seemed to be a much better teacher than Jesus. It’s of course not an argument that I would use to try to persuade anyone else, but for me it was very powerful ;-)

            • just got its own post…

            • labreuer

              This is fascinating when you consider Paul’s description of non-Christians as slaves and Jesus as freeing us from slavery. Paul’s use of ‘flesh’ gives me the distinct impression of someone ruled by their genes and circumstances. Paul seemed to think that one could be pulled toward God by ‘spirit’. I don’t know if you’ll do anything with this, but I am fascinated that you think FW is the strongest argument against Christianity. Paul might agree, except for agreeing that it obtains. :-p

          • Think I’ve found it – Jason Long in The Christian Delusion referencing Petty and Cacioppo in Attitudes and Persuasion, p.72.

    • Aquinas1

      I’m the ‘theologian friend’ mentioned in the post. Ironically, it was my belief in Johno’s cognitive dissonance that made me wary of his views, since – as a devotee of naturalism – he needed evolution to be true. Conway Morris is an expert witness in the field who did not need to believe in evolution on the basis of his worldview, and so he was an important stepping stone for me. I think, however, the best teachers know that, at the end of the day, true learning must come from the student.

      • Thanks for the comment, and I hope I was reasonable enough in my representation of you. Let me know if not. That is certainly how I remember the state of play! I understand that he is an authority whereas I am not, and that citing him therefore does not qualify as a fallacious appeal to authority.

        But you get the point about changing minds. It was you who were telling me the other day of the excellent analogy in Jonathan Haidt’s book about riding elephants. Can you remind me of that again?

    • labreuer

      I remember that this aforementioned Christian friend used to deny evolution. I used to hit him with facts and reason, which was met with strong cognitive dissonance and an entrenchment of his view: I did not convince him. However, he struck up a conversation with the theistic evolutionary biologist Simon Conway Morris who, over several emails presenting the same evidence as mine, managed to convince him otherwise.

      One doesn’t just present evidence; one presents interpretation. I would hazard a guess that the presuppositions behind the interpretation you offered and the presuppositions behind the interpretation Conway Morris offered were different. Alternatively, it could be that the philosophical barnacles which people attach to the scientific theory of evolution had made it into the mind of your theologian friend, and you didn’t do enough to combat them.

      Too many people think that a given set of evidence just screams out the ‘right’ interpretation, if only you can drop the scales from your eyes. This is wrong on so many levels. I’m not really accusing you of doing this, Jonathan, although your post does smell of it a bit.

      P.S. The thing that finally convinced me away from creationism, as far as I can tell, is that the theory of evolution helps us discover new, neat things and understand how organisms work, at a deeper level than creationism ever could. It helped to know that evolution actually says nothing about the existence of God, except for those who buy into the god-of-the-gaps.

    • MonkeyG

      In “Trump times”, with fake news, this article resonates well.
