A new study published in a journal called Sociological Inquiry points out that people sometimes have a vested interest in believing things that aren't true, and that such beliefs can be very persistent.
As an example, the authors bring up "the strength and resilience of the belief among many Americans that Saddam Hussein was linked to the terrorist attacks of 9/11."
So far, so painfully obvious. But suddenly, things take a bizarre turn:
Although this belief influenced the 2004 election, they claim it did not result from pro-Bush propaganda, but from an urgent need by many Americans to seek justification for a war already in progress.

A war launched by whom? With what official justification? In order to protect what Great Nation from which Doomsday Weapons hoarded by what Brutal Dictator, who was clearly in league with which Cave-Dwelling Evildoers?
Who knows? Who cares? The important thing is, it's time to dissolve the people, and elect another.
The study, "There Must Be a Reason: Osama, Saddam and Inferred Justification" calls such unsubstantiated beliefs "a serious challenge to democratic theory and practice" and considers how and why it was maintained by so many voters for so long in the absence of supporting evidence.The absence of supporting evidence, that is, other than the personal guarantee of leaders who presented themselves as super-patriots and imitators of Christ, and an army of politicians and media figures who treated skepticism of those leaders as frivolity at best and treason at worst.
There's no denying that people tend to believe whatever makes them happy. But it's a bit high-handed to treat the Bush Administration's propaganda campaign as largely irrelevant to the formation of precisely those opinions that it sought to form at all costs. The phrase "blaming the victims" springs to mind, just for starters.
While numerous scholars have blamed a campaign of false information and innuendo from the Bush administration, this study argues that the primary cause of misperception in the 9/11-Saddam Hussein case was not the presence or absence of accurate data but a respondent's desire to believe in particular kinds of information.

Granting the pitfalls of causality -- and the fact that Nature her Selfe is out of Tune; and Sicke of Tumult and Disorder -- I think that when you're searching for primary causes, it makes sense to start from as early a point as possible. For instance, if you're trying to understand why someone believes that Saddam Hussein was involved in 9/11, it makes sense to take note of when, where, why, and by whom that theory was first advanced, instead of treating it as some spontaneous phantasmatic response of the collective unconscious to the cold hard fact of war. Especially since the connection was suggested before the war.
"People were basically making up justifications for the fact that we were at war," he says.The war, you see, was simply "a current reality":
"They wanted to believe in the link," he says, "because it helped them make sense of a current reality. So voters' ability to develop elaborate rationalizations based on faulty information, whether we think that is good [!!!] or bad for democratic practice, does at least demonstrate an impressive form of creativity."At the risk of revealing my unfitness for civilized debate, and tarnishing the heretofore stainless reputation of the radical-left blogosphere, I have to say this is absolute fucking nonsense.
While it may be foolish or pre-critical to trust that an American president wouldn't lie to you about a matter of life and death, it's not necessarily an "elaborate rationalization." Nor is it "an impressive form of creativity" to accept their lies as facts. And while the need to persist in believing these things despite years of evidence to the contrary may indeed require elaborate rationalizations, I'd balk at calling them creative, as opposed to reflexive.
The extent to which these beliefs were accepted had a great deal to do with the authority of their sources, and the government's ability to set the terms of debate (while shrugging off substantive criticism as the effete ressentiment of a terrorist-coddling elite). The Bush Administration gave the public factually incorrect information, obviously. But what's more important is that it gave them a philosophically satisfying vantage point from which to assess and reject opposing arguments, while insisting that the failure to reject them would be literally fatal.
As far as I can tell, Hoffman et al. ignore the incredible effort that was made to delegitimize critics on the basis of identity politics, schoolyard taunts, and guilt by fever-dream association, and to represent basic standards for evidence and logic as some sort of Franco-Islamic imposition on the common man. Thanks to this effort, rejecting skepticism was not just psychologically comforting, but also became an act of national defense on a par with patrolling the mean streets of Fallujah, or torturing a detainee with a nail gun.
To treat this campaign as somehow less important than its results is an act of ideological willfulness that rivals any of the behavior this paper describes.
"We refer to this as 'inferred justification,'" says Hoffman "because for these voters, the sheer fact that we were engaged in war led to a post-hoc search for a justification for that war.And for some reason, they settled on the very same justification that the Bush Administration advanced before invading Iraq, by means of "false information and innuendo." What are the odds?
Since I'm nothing if not fair-minded, I'll let the author have the last word:
"The argument here is that people get deeply attached to their beliefs," Hoffman says.
3 comments:
I'm going to quibble a little bit and say that it's only relative fucking nonsense.
I'm with them on "elaborate rationalization," but with you on "an impressive form of creativity." Reflexive, fer sure.
I think what's at issue here (without having read the damned link) isn't how people came to these erroneous beliefs in the first place, but why they do persist in believing these things despite years of evidence to the contrary.
Having arrived at a belief, irrespective of how much sense or nonsense it contains, one becomes emotionally committed to that belief, and deeply loath to change it. Most especially in the face of a vocal opposition. Nobody ever wants to be wrong. Convince someone that they are wrong, and you have very likely made a life-long enemy. That is the power of emotional commitment to belief.
Am I swaying you at all?
Jzb the unpersuasive trombonist
I'm with them on "elaborate rationalization," but with you on "an impressive form of creativity." Reflexive, fer sure.
I don't think we really disagree that much. I think everything you're saying about rationalization and emotional commitment is exactly right. But how people come to believe things in the first place is an issue, because it affects the extent to which they feel driven to persist in those beliefs. Granted, no one wants to be wrong, but far fewer people want to be wrong about a life-or-death issue like war.
So my point, as far as the initial formation of opinion goes, is basically that the Administration took the issues you're raising into account, and exploited them. And because this was such a monumental betrayal of trust, and implicated the believer in such a disastrous outcome, the need for rationalization was that much more intense.
Which is why I feel like Hoffman's blaming the victim (or seems to be, in this article): He's harping on people's response after the fact, while downplaying the extent to which they'd been targeted at this elemental level by people they either believed they could trust, or believed they had no choice but to trust, and put in a position where admitting to being wrong would open an unusually large abyss at their feet.
And I think this was part of the rationale for demonizing critics, too...not only do you lose if you admit to being wrong, but those people win, which is intolerable. The propaganda was designed to put people in this position, I believe, so treating it as a secondary issue is incoherent, from my standpoint.
Excellent analysis. I am always amazed at how once something is fixed in the popular imagination, it can't be uprooted, no matter what evidence is brought to bear.