The Backfire Effect

Attention! Brace yourself. A major announcement from the world of science came down yesterday morning.
Are you ready?
Apparently, people are not always, or even often, persuaded by the facts when they are involved in an argument over an issue they feel strongly about.
Stunning, I know.
There is even a name for this scientific, ahem, discovery: “The Backfire Effect.” A study was done where people were given two copies of an article about George Bush and his confidence in the presence of weapons of mass destruction in Iraq. One article contained a discrediting quote casting doubt on the existence of these weapons, while the other did not. After reading one version or the other of this article, the participants were asked whether they agreed or disagreed with a statement claiming that Iraq had weapons of mass destruction prior to the US invasion. The conclusion?
The people who rated themselves as liberal, left of center, or centrist did not agree, and whether they read the correction had little effect on their views. The people who rated themselves as conservative did agree. And they agreed even more when they read the article with the correction than when they read the article without it.
Further tests were done to probe the extent of these astonishing results.
Participants in the experiments were more likely to experience the Backfire Effect when they sensed that the contradictory information had come from a source that was hostile to their political views. But under a lot of conditions, the mere existence of contradictory facts made people more sure of themselves — or made them claim to be more sure.
Wait a minute. Do these scientists really expect us to believe that human beings do not always behave rationally? Are we seriously expected to believe that we tend to adopt positions that reinforce what we already think even when this seems to fly in the face of logic and reason?
I’m still trying to get over this revelatory breakthrough in human understanding.
Perhaps the only real discovery here is that there is no ordinary feature of human behaviour and discourse that can’t be dressed up and made new by re-describing it in the language of science (or pseudoscience).
Of course, anyone who has, I don’t know, had a conversation with another human being (or, worse, observed/participated in an online discussion) already knows about the “Backfire Effect.” Just the other day I watched as one poor intrepid soul ventured into an online gathering of the (proudly liberal) theological intelligentsia who were mostly patting themselves on the back for how unlike the unwashed conservative masses they were. This brave individual tried to introduce a fact or two, and even tried to demonstrate how their discourse was a pretty good example of precisely the practices they were gleefully critiquing in others. And for this, he was admonished for his tone, condescendingly urged to make sure he understood things, mocked, etc. Everything except addressing the concerns he actually raised. Why let facts or logic get in the way of all that delicious posturing and preening, after all?
The exact same thing happens at the other end of the spectrum. In the earlier years of this blog, I would somewhat regularly encounter two groups of fairly hostile people, either strident atheists or very conservative Christians. I remember patiently cutting and pasting sentences and paragraphs of their angry comments, systematically (and with devastating theological clarity and humility) addressing each point, and then having all this effort completely disregarded in favour of a fresh stream of vitriol. I remember how frustrated this would make me. They’re not even addressing what I’m saying! I would whine to anyone who would listen. Poor me.
And, of course, if I’m honest, I know that I’ve done this myself. It’s far easier (and quite a bit more fun!) to ignore inconvenient facts or logical inconsistencies and just amp up the rhetoric than it is to address them. Especially online. Looking back, I’m embarrassed by some of the things I have said in heated conversations. Evidently I, too, am a victim of the “Backfire Effect.”
Or not.
Perhaps the real lesson to be learned in all this isn’t so much a lesson as a reminder. Human beings are irrational, selfish, and proud. We are personally invested in our most deeply held views and we cling to them like a dog with a bone when we think they are under threat. These are old, old realities that we have actually understood quite well with old, old words and categories for a long time, regardless of how powerless we seem to be to correct them. This isn’t a condition or an “effect” that we are victims of, even if it sounds more respectable to say, “I’m experiencing what scientists call ‘the backfire effect’” than “I was totally irrational and I behaved like a jerk.” Or “I need to do a better job of listening.” Or “I should have just remained silent.”
Anyway, that’s what I think about the “Backfire Effect.” And if you disagree with me, well, I don’t know quite what to do about your incorrigible stupidity and stubborn recalcitrance in the face of the obvious truth.
(Sorry about that. I think I’m currently experiencing what’s known as the “I sometimes behave like an idiot” effect.)
Guilty as charged.
The thing that is missing here is a discussion about our identity and how that shapes what we think and who we take seriously when they speak. I took a great university class about identity conflict. Essentially, when someone is attacking your identity (and theology is part of that), rationality is left at the door. I think the big thing here is that we must dig deeper than the ‘facts’ on an issue. Why do people think gun control is a bad idea even though the stats are shocking? An identity, or ideology, or whatever you call it, that tells them to be afraid of the government – a founding principle/identity/ideology of being an American (thankfully only for some). Our main fault is to think the issue is the issue (that’s not a typo).
Yeah, great point, Jon. We have such a difficult time disentangling our identities from the issue under discussion at any given point, or even acknowledging that our identity is really the thing that is under threat in our reactions.
http://en.wikipedia.org/wiki/Confirmation_bias
Persistence of discredited beliefs
“[B]eliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.”
—Lee Ross and Craig Anderson[38]
Confirmation biases can be used to explain why some beliefs remain when the initial evidence for them is removed.[39] This belief perseverance effect has been shown by a series of experiments using what is called the “debriefing paradigm”: subjects read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.[38]
A typical finding is that at least some of the initial belief remains even after a full debrief.[40] In one experiment, subjects had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after a full debrief, subjects were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.[41]
Wikipedia
So, maybe among the lessons to be learned here is that human beings are much, much more than evidence-crunching machines… That there are important things at stake for us in the convictions that we reject or embrace.
Not rocket science, I suppose, but I guess we could always use the reminder :).
So in short, most of us are stubborn. On a different note, CBC radio has a new show called The 180, which sometimes discusses when people do a ‘180 degree’ turn in their beliefs. Perhaps a better study would be to look at those people who do change direction and beliefs and try to discover what is at play there. I remember hearing about a cattle rancher who went vegetarian based on a book he read. What breaks down the barriers in belief? Obviously, over the past hundreds of years our society has been slowly changing its beliefs about many social issues. Maybe we need to ask the opposite question?
It’s a fascinating subject, the variables involved in situational changes of mind/beliefs, especially as it relates to a specific religious experience(s) that creates a psychic change in an individual.
“Obviously over the past hundreds of years our society has been changing beliefs slowly about many social issues.” I believe this slow-motion change is actually Evolutionary in mankind by God’s design, as it relates to the Quantum Field of Consciousness/Universal Mind. ~~then there’s that. 🙂
@ Jon… I like the shift in emphasis. Rather than focusing exclusively on what makes us cling to beliefs, sometimes even in spite of overwhelming evidence, we should also pay attention to how and why people change. This, too, is an important part of the story that needs to be told.
@ Mike… Quantum theory, eh? Well, that can be (and is) used to explain almost as many things as God these days :).
Yeah…you know, the butterfly effect and all that….. 🙂
Hi Ryan, missed your mind and company in Edmonton this week; I had fun visiting with Gil and our friends. I am wondering how the Backfire Effect is working itself out right now in the hearts and minds of those who were leaning hard in either direction on Christian responses to marginalized sexualities given what was presented at the SC. Food for thought.
kmp
I missed being there, Ken. Truly, I did. It would have been great to reconnect with you and so many other good friends.
And yes, the conversations around sexuality going on in Edmonton this week were certainly in the back of my mind as I wrote this post. Plenty of backfiring, I imagine, whatever point of the spectrum one inhabits :).