Here are some troubling examples summarized in a paper by Edward L. Glaeser and Cass Sunstein:*
- Presented with evidence that Iraq did not have weapons of mass destruction before 2002, liberals became more likely to agree with that thesis, and conservatives became less likely to. In other words, for conservatives, the information provoked a backlash (Nyhan and Reifler 2010).
- Presented with the fact that President G.W. Bush did not ban stem-cell research, liberals did not shift their opinion but continued to believe that he had banned it (ibid.).
- Given the same information about nanotechnology, economic conservatives became more enthusiastic about the new technology and less favorable to regulating it, while economic liberals became more concerned about the technology and more supportive of regulating it (Kahan, 2007).
Glaeser and Sunstein propose two mechanisms to explain these phenomena. I will focus on one, which they call (in a mouthful of a phrase) “asymmetric Bayesianism.”
You reason in a Bayesian way when you use what actually happens to estimate the general probability of its happening. For instance, you reach blindly into an urn and pull out a black marble. You form the tentative hypothesis that the urn is full of black marbles: the probability of another black one is 100%. But after you have pulled out 10 black marbles and 10 white marbles, you adjust the probability to 50%. This is a reasonable and common way of thinking, and it has been particularly fashionable in the early 2000s because computers are very good at it.
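The urn example can be sketched in a few lines of code. This is a minimal illustration, not anything from the Glaeser and Sunstein paper: it tracks a Beta(a, b) belief about the fraction of black marbles and updates it by counting draws. (One small wrinkle: with a uniform prior, the strict Bayesian estimate after a single black marble is 2/3 rather than the 100% "tentative hypothesis" in the text, which is the maximum-likelihood guess; after 10 black and 10 white draws, both agree on 50%.)

```python
# A sketch of Bayesian updating on the urn example, using a Beta(a, b)
# belief about the fraction of black marbles. Names are illustrative.

def update(a, b, draw):
    """Update the Beta(a, b) belief after one draw ('black' or 'white')."""
    if draw == "black":
        return a + 1, b
    return a, b + 1

def estimate(a, b):
    """Posterior mean: estimated probability that the next marble is black."""
    return a / (a + b)

# Start from a uniform prior, Beta(1, 1): no opinion either way.
a, b = 1, 1

# One black marble pulls the estimate toward "mostly black"...
a, b = update(a, b, "black")
print(round(estimate(a, b), 3))  # 2/3, i.e. 0.667

# ...but 10 black and 10 white draws settle it at 50%.
a, b = 1, 1
for draw in ["black"] * 10 + ["white"] * 10:
    a, b = update(a, b, draw)
print(estimate(a, b))  # 0.5
```

The key property is symmetry: every marble, whatever its color, moves the estimate by the same rule. That is exactly what breaks down in the next paragraph.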
Humans, not so much. In the political domain, we generally want the probability to be a certain way. For example, as a liberal Democrat, I want Democratic presidents to do a good job, regulatory policies to work, and neoliberalism to fail. As we pull marbles out of the metaphorical urn, we use the ones that show the expected color to confirm our prior beliefs and strengthen our convictions. When some come out the wrong color, we forget them or dismiss them on various grounds. Ignoring an actual shiny white marble would be idiotic, but rejecting a third-party account of a subtle issue like Iraq’s WMD or Bush’s stem cell policy is easy. You just tell yourself that the messenger is biased or the case is exceptional and irrelevant. Thus, the more marbles we pull out of the urn, no matter their color, the more we shift toward our prior convictions.
As a social phenomenon, this is problematic. CIRCLE has been rigorously evaluating some specific innovations that were attempted during the 2012 election, and I will discuss those results later this spring. None was a slam-dunk success. But clearly, it is possible for public opinion to shift, because history is rife with change. Democrats may generally share certain biases today, but they all believed very different things 50 years ago. We need to understand more about what makes large groups learn.
On an individual level, the message is clear. Unless you want your brain to ossify and your vision to narrow, you must pay special attention to the marbles that come out the wrong color. People like me, who are generally sympathetic to the impulses behind European and South Asian social democracy, need to focus on this kind of awkward fact: South Korea’s mean income rose 10 times faster than India’s between 1950 and 2000, giving South Korea the 12th-highest human development index in the world today and India the 136th-highest (out of 187), despite the fact that their baselines were about the same in 1950. From 1950 to 1980, India was a diverse and pluralist social democracy; South Korea was a corporatist dictatorship. Of course, that’s not the end of the story, and the moral is not to drop our moral objections to dictatorship. But it’s an example of an awkward-colored marble. We mustn’t reject that kind of data as exceptional or irrelevant but must actually use it to adjust our views.
* “Why Does Balanced News Produce Unbalanced Views?” (see its bibliography for the other studies cited above).