Opinion writer
August 4, 2016
How did Donald Trump win the Republican nomination, despite clear evidence that he had misrepresented or falsified key facts throughout the campaign? Social scientists have some intriguing explanations for why people persist in misjudgments despite strong contrary evidence.
Trump is a vivid and, to his critics, a frightening present-day illustration of this perception problem. But it has been studied carefully by researchers for more than 30 years. Basically, the studies show that attempts to refute false information often backfire and lead people to hold on to their misperceptions even more strongly.
This literature about misperception was lucidly summarized by Christopher Graves, the global chairman of Ogilvy Public Relations, in a February 2015 article in the Harvard Business Review, months before Trump surfaced as a candidate. Graves is now at the Rockefeller Foundation’s Bellagio Center in Italy, writing a book about his research.
Graves’s article examined the puzzle of why nearly one-third of U.S. parents believe that childhood vaccines cause autism, despite overwhelming medical evidence that there’s no such link. In such cases, he noted, “arguing the facts doesn’t help — in fact, it makes the situation worse.” The reason is that people tend to accept arguments that confirm their views and discount facts that challenge what they believe.
This “confirmation bias” was outlined in a 1979 article by psychologist Charles Lord, cited by Graves. Lord found that his test subjects, when asked questions about capital punishment, responded with answers shaped by their prior beliefs. “Instead of changing their minds, most will dig in their heels and cling even more firmly to their originally held views,” Graves explained in summarizing the study.
Trying to correct misperceptions can actually reinforce them, according to a 2006 paper by Brendan Nyhan and Jason Reifler, also cited by Graves. They documented what they called a “backfire effect” by showing the persistence of the belief that Iraq had weapons of mass destruction in 2005 and 2006, after the United States had publicly admitted that they didn’t exist. “The results show that direct factual contradictions can actually strengthen ideologically grounded factual belief,” they wrote.
Next Graves examined how attempts to debunk myths can reinforce them, simply by repeating the untruth. He cited a 2005 study in the Journal of Consumer Research on “How Warnings about False Claims Become Recommendations.” It seems that people remember the assertion and forget whether it’s a lie. The authors wrote: “The more often older adults were told that a given claim was false, the more likely they were to accept it as true after several days have passed.”
When critics challenge false assertions — say, Trump’s claim that thousands of Muslims cheered in New Jersey when the twin towers fell on Sept. 11, 2001 — their refutations can threaten people, rather than convince them. Graves noted that if people feel attacked, they resist the facts all the more. He cited a study by Nyhan and Reifler that examined why people misperceived three demonstrable facts: that violence in Iraq declined after President George W. Bush’s troop surge; that jobs have increased during President Obama’s tenure; and that global temperatures are rising.
The study showed two interesting things: People are more likely to accept information if it’s presented unemotionally, in graphs; and they’re even more accepting if the factual presentation is accompanied by “affirmation” that asks respondents to recall an experience that made them feel good about themselves.
Bottom line: Vilifying Trump voters — or, alternatively, parents who don’t want their children vaccinated — won’t convince them they’re wrong. It will probably have the opposite effect.
The final point that emerged from Graves’s survey is that people will resist abandoning a false belief unless they have a compelling alternative explanation. That point was made in “The Debunking Handbook,” a guide by Australian researchers John Cook and Stephan Lewandowsky. They wrote: “Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct.”
Trump’s campaign pushes buttons that social scientists understand. When the GOP nominee paints a dark picture of a violent, frightening America, he triggers the “fight or flight” response that’s hardwired in our brains. For the body politic, it can produce a kind of panic attack.
Screaming back at Trump for these past 12 months may have been satisfying for his critics, but it hasn’t dented his support much. What seems to be hurting Trump in the polls now are self-destructive comments that trouble even his most passionate supporters. Attempts to aggressively “correct” his remaining fans may only deepen their attachment.
David Ignatius writes a twice-a-week foreign affairs column and contributes to the PostPartisan blog.