The Backfire Effect
Attention! Brace yourself. A major announcement from the world of science came down yesterday morning.
Are you ready?
Apparently, people are not always, or even often, persuaded by the facts when they are involved in an argument over an issue they feel strongly about.
Stunning, I know.
There is even a name for this scientific, ahem, discovery: “The Backfire Effect.” A study was done in which people were given one of two versions of an article about George Bush and his confidence in the presence of weapons of mass destruction in Iraq. One version contained a discrediting quote casting doubt on the existence of these weapons, while the other did not. After reading one version or the other, the participants were asked whether they agreed or disagreed with a statement claiming that Iraq had weapons of mass destruction prior to the US invasion. The conclusion?
The people who rated themselves as liberal, left of center, or centrist did not agree—and whether they read the correction had little effect on their views. The people who rated themselves as conservative did agree. And they agreed even more when they read the article with the correction than when they read the article without it.
Further tests were done to probe the extent of these astonishing results.
Participants in the experiments were more likely to experience the Backfire Effect when they sensed that the contradictory information had come from a source that was hostile to their political views. But under a lot of conditions, the mere existence of contradictory facts made people more sure of themselves — or made them claim to be more sure.
Wait a minute. Do these scientists really expect us to believe that human beings do not always behave rationally? Are we seriously expected to believe that we tend to adopt positions that reinforce what we already think even when this seems to fly in the face of logic and reason?
I’m still trying to get over this revelatory breakthrough in human understanding.
Perhaps the only real discovery here is that there is no ordinary feature of human behaviour and discourse that can’t be dressed up and made new by re-describing it in the language of science (or pseudoscience).
Of course, anyone who has, I don’t know, had a conversation with another human being (or, worse, observed/participated in an online discussion) already knows about the “Backfire Effect.” Just the other day I watched as one poor intrepid soul ventured into an online gathering of the (proudly liberal) theological intelligentsia, who were mostly patting themselves on the back for how unlike the unwashed conservative masses they were. This brave individual tried to introduce a fact or two, and even tried to demonstrate that their discourse was a pretty good example of precisely the practices they were gleefully critiquing in others. And for this, he was admonished for his tone, condescendingly urged to make sure he understood things, mocked, and so on. Everything, that is, except a response to the concerns he actually raised. Why let facts or logic get in the way of all that delicious posturing and preening, after all?
The exact same thing happens at the other end of the spectrum. In the earlier years of this blog, I would somewhat regularly encounter two groups of fairly hostile people, either strident atheists or very conservative Christians. I remember patiently cutting and pasting sentences and paragraphs of their angry comments, systematically (and with devastating theological clarity and humility) addressing each point, and then having all this effort completely disregarded in favour of a fresh stream of vitriol. I remember how frustrated this would make me. They’re not even addressing what I’m saying! I would whine to anyone who would listen. Poor me.
And, of course, if I’m honest, I know that I’ve done this myself. It’s far easier—and quite a bit more fun!—to ignore inconvenient facts or logical inconsistencies and just amp up the rhetoric than it is to engage them. Especially online. Looking back, I’m embarrassed by some of the things I have said in heated conversations. Evidently I, too, am a victim of the “Backfire Effect.”
Perhaps the real lesson to be learned in all this isn’t so much a lesson as a reminder. Human beings are irrational, selfish, and proud. We are personally invested in our most deeply held views, and we cling to them like a dog with a bone when we think they are under threat. These are old, old realities that we have actually understood quite well with old, old words and categories for a long time, regardless of how powerless we seem to be to correct them. This isn’t a condition or an “effect” that we are victims of, even if it sounds more respectable to say, “I’m experiencing what scientists call ‘the backfire effect’” than “I was totally irrational and I behaved like a jerk.” Or “I need to do a better job of listening.” Or “I should have just remained silent.”
Anyway, that’s what I think about the “Backfire Effect.” And if you disagree with me, well, I don’t know quite what to do about your incorrigible stupidity and stubborn recalcitrance in the face of the obvious truth.
(Sorry about that. I think I’m currently experiencing what’s known as the “I sometimes behave like an idiot” effect.)