The mind plays strange tricks. For instance, consider this article about the threat of al-Qaeda attacking us with nuclear weapons. The headline says that the risk of such an attack is comfortingly remote, but the first two paragraphs tell a very different story:
Of all the clues that Osama bin Laden is after a nuclear weapon, perhaps the most significant came in intelligence reports indicating that he received fresh approval last year from a Saudi cleric for the use of a doomsday bomb against the United States.

Consider for just a moment this deployment of the colorful, inaccurate, and utterly gratuitous phrase "doomsday bomb." Now, let's continue:
For bin Laden, the religious ruling was a milestone in a long quest for an atomic weapon. For U.S. officials and others, it was a frightening reminder of what many consider the ultimate mass-casualty threat posed by modern terrorists. Even a small nuclear weapon detonated in a major American population center would be among history's most lethal acts of war, potentially rivaling the atomic destruction of Hiroshima and Nagasaki.

God save us! Who can forget the horror of the "doomsday bombs" that struck Japan on those fateful, nonconsecutive Doomsdays in 1945? Just imagine what it would be like if bin Laden hit us with nukes! We'd all be killed! Or most of us, anyway.
Here's what's going on with this article. Human brains are greedy for information, but they're inefficient at recognizing irrelevant information, particularly if it has some sort of emotional content. In this article, the details about what might happen if bin Laden got nuclear weapons are vivid and emotionally charged, but they're also completely irrelevant to what the story is actually about.
Nonetheless, the vividness of the opening paragraphs psychologically undercuts the message of the article, which is that a nuclear attack is unlikely. As a result, many readers will come away from this article feeling more disturbed about the possibility of such an attack.
The psychologists Amos Tversky and Daniel Kahneman described how different heuristics, the cognitive rules of thumb that guide decision-making, can affect a subject's assessment of risk. The heuristic at work in the article above is known as the availability heuristic, and the specific fallacy involved is known as "misleading vividness." It's a tactic beloved by the Bush administration and the media, and this article is an exceptionally obvious instance of it. If you've never noticed this sleight-of-hand scaremongering before, I predict you'll be amazed at how often you'll notice it in the future.