Monday, July 12, 2010

Evidence And Beliefs. Which comes first?



Digby discusses a series of articles highlighted in the Boston Globe. They are all about our puny brainz malfunctioning in ways which might threaten democracy, at least if democracy is assumed to improve with better-informed voters.

That's because the studies the Globe discusses appear to show something quite depressing:

New research, published in the journal Political Behavior last month, suggests that once those facts — or "facts" — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan's Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren't), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

For the most part, it didn't. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn't backfire, but the readers did still ignore the inconvenient fact that the Bush administration's restrictions weren't total.

Is it time to scatter ashes all over ourselves and rend our hair, those of us who see politics as something that might in fact matter a great deal? I wouldn't go that far, and the reason is that by 2005 we already had Fox News as a major propaganda outlet, one which told outright lies and then called them facts. We also had the blogosphere, both the right and the left one, and several sources on both sides used exaggeration in ways which distorted the truth. And we had Rush and Ann and the whole right-wing talk-show system, as well as the odd left-wing talk show of a similar type.

My point is that IF we were offered carefully checked information in the first place, our worldviews would not build faulty information into their walls and windows and doors, and being corrected on that information wouldn't collapse the house on our heads. Now it does, for those individuals who got their factoids flavored with truthiness, without any real correction, at the initial building stage.

But given the current trend toward several subjective truths, yes, there is some cause for worry. The recommendations in the Globe article focus on the supply side: demanding better fact-checking from the factoid producers. I don't see that happening in the near future. Still, we should keep the pressure on the media to do just that.

Reading those articles made me feel funny (as in funny=peculiar). If I'm suffering from the same "backfire" and "motivated reasoning" problems (as defined in the Globe piece), then how can I write about the piece as if I were outside the problem? Indeed, how can the researchers carry out their studies as if they were outside the problem?

The way to reconcile all this is to assume that the problem is not universal, or at least that there is some way of escaping it. The Globe points to a few possible solutions:

It would be reassuring to think that political scientists and psychologists have come up with a way to counter this problem, but that would be getting ahead of ourselves. The persistence of political misperceptions remains a young field of inquiry. "It's very much up in the air," says Nyhan.

But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you'll listen — and if you feel insecure or threatened, you won't. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

There are also some cases where directness works. Kuklinski's welfare study suggested that people will actually update their beliefs if you hit them "between the eyes" with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.

I also suspect that if the corrections came in a more emotional package they might work better. For instance, if the topic is how much the government should spend on welfare, showing a suffering family getting help, or showing that 1% as a penny (one cent out of each dollar the government spends) rolling off a table and spinning for a while before landing on the floor, might let the message squeak through those erected ramparts. After all, Fox News does this in reverse all the time. Just read the lines crawling along the bottom of the screen! They are full of emotional hints which may slip straight past one's intellectual sensors.
---
A postscript. I should add here that I have had my worldview collapse. It's an extremely painful experience, and how one rebuilds that necessary framework must vary from person to person. Thus, I have some sympathy with those who cling to false information as if it were a life-raft, because that ocean of not-knowing is a frightening place. But if the raft has holes, you are going to have to learn to swim anyway.