Preventing the Formation of False Beliefs: Four Rules for Journalists

By Sam Wang

In this year's mud-filled presidential campaign, journalists have a responsibility to help the public distinguish fact from fiction. Unfortunately, current tenets such as reporting both sides of a story with equal weight can undermine the public's ability to identify the truth - and even mislead us - thanks to the quirky way in which our brains process contradictory information. Understanding those quirks suggests four techniques to help journalists dispel false beliefs.

A recent survey from the Pew Research Center shows that, increasingly, Americans get their news from multiple sources. More than one-third use internet-based sources such as websites, blogs, and even social networking sites to get at least some of their news. Those who still rely entirely on traditional sources are now a minority. Not mentioned in the survey is chain e-mail, an unvetted format that has fed rumors that devout Christian and presidential candidate Senator Barack Obama is actually a Muslim. This proliferation of media and messages creates competitive pressure on journalists to bend their standards in order to get a story quickly. The negative effects of cutting corners - and even some standard practices - can be understood in terms of how the brain works.

We tend to remember news that accords with our worldview, and discount statements that contradict it. In one Stanford study, 48 students, half of whom said they favored capital punishment and half of whom said they opposed it, were presented with equal but opposing evidence, both supporting and contradicting the claim that capital punishment deters crime. Both groups were more convinced by the evidence that supported their initial position, a phenomenon known as biased assimilation. In a diverse media environment, biased assimilation can easily help false ideas to stick.

The human brain also does not save information permanently, as do computer drives and printed pages. Recent research suggests that as a piece of information is recalled, it may be "written" down again as part of the process of strengthening it. Along the way, the fact is gradually separated from its original context. For example, you know that the capital of Massachusetts is Boston, but you probably don’t remember how you learned it.

This phenomenon, known as source amnesia, can cause people to forget where they first heard a statement - and even whether the statement is true. In the months it takes to reprocess memories from short-term storage in brain structures of the temporal lobe to longer-term storage in the cerebral cortex, a false statement from a noncredible source that is at first not believed can gain credibility. Source amnesia could explain why, during the 2004 presidential campaign, it took some weeks for the Swift Boat Veterans for Truth campaign against Senator John Kerry to have an effect on his standing in the polls.

The opening line "I think I read somewhere" or a reference to a specific source is a classic way of backing up a false belief. In another Stanford study, a group of students was exposed repeatedly to an unsubstantiated claim taken from a Web site that Coca-Cola is an effective paint thinner. Students who read the statement five times were nearly one-third more likely than those who read it only twice to attribute it to Consumer Reports (rather than the National Enquirer, their other choice), giving it a gloss of credibility.

Finally, memory formation is aided by emotions such as fear and disgust, which activate evolutionarily ancient brain regions such as the amygdala and insula. Psychologists have suggested that legends propagate by striking an emotional chord, so that ideas can spread by emotional selection rather than by their factual merits. Moral disgust played a role in 2000, when Bush campaign operatives spread false rumors that Senator John McCain had fathered a mixed-race child. This loaded statement had a devastating effect on McCain's support among southern Republican primary voters.

When journalists cover controversial statements, the following four rules suggested by neuroscience and psychology can guide their efforts.

1. State the facts without repeating the falsehood. Untrue content should not be mentioned directly. For example, in covering the controversy over a New Yorker cover caricaturing Barack and Michelle Obama, virtually every major TV journalist repeated the stereotyped charges against the candidate before noting that the beliefs were false. A related mistake is saying that something is newsworthy because "the story is out there." Reporting on coverage by a less credible source such as the Drudge Report, even with disclaimers, reinforces the original story. False statements should not be presented neutrally, since they are prone to be remembered later as being true.

2. Tell the truth with images that match the story. Psychologists have shown that if people aren't given enough time to think, they tend to automatically accept a statement as being true. Images are particularly powerful since nearly half of the brain is dedicated to processing visual information. When images do not match words, viewers tend to remember what they see, not what they hear. Karl Rove has said that campaigns should be run as if the television volume is turned down. Images should be selected with care to convey an accurate impression.

Television journalists breach this standard frequently. On CNN, one recent story on autism was accompanied by images of concerned mothers, vaccines, doctor’s offices, and autistic children - despite the fact that the text of the story concerned a scientific finding debunking any link between vaccines and autism.

3. Provide a compelling storyline or mental framework. An effective debunking tells its own story, replacing the falsehood with positive content. For instance, the rumor about McCain is easily displaced by the story of his adopted Bangladeshi daughter Bridget, thereby accounting for the existence of photographs of him with a dark-skinned child.

4. Discredit the source. Debunking has special staying power if it appeals to the gut. The motives of the purveyors of falsehoods can provide one powerful story hook. A recent example is the press coverage pointing out Obama Nation author Jerome Corsi's motivations, his history of racist Web commentary, and his allegations of Bush Administration complicity in the 9/11 attacks.

In 1919, Oliver Wendell Holmes wrote that "the best test of truth is the power of the thought to get itself accepted in the competition of the market." Holmes erroneously assumed that ideas are more likely to spread if they are true. Our brains do not naturally obey this admirable dictum. But by better understanding the mechanisms of memory, perhaps journalists can help move their modern audience closer to Holmes's ideal.

Sam Wang, an associate professor of neuroscience and molecular biology at Princeton University, is a co-author of “Welcome to Your Brain: Why You Lose Your Car Keys but Never Forget How to Drive and Other Puzzles of Everyday Life.”