Here are some troubling examples summarized in a paper by Edward L. Glaeser and Cass Sunstein:*
- Presented with evidence that Iraq did not have weapons of mass destruction before 2002, liberals became more likely to agree with that thesis, and conservatives became less so. In other words, for conservatives, the information provoked a backlash (Nyhan and Reifler 2010).
- Presented with the fact that President G.W. Bush did not ban stem-cell research, liberals did not shift their opinion but continued to believe that he had banned it (ibid).
- Given the same information about nanotechnology, economic conservatives became more enthusiastic about the new technology and less favorable to regulating it, while economic liberals became more concerned about the technology and more supportive of regulating it (Kahan, 2007).
Glaeser and Sunstein propose two mechanisms to explain these phenomena. I will focus on one, which they call (in a mouthful of a phrase) “asymmetric Bayesianism.”
You reason in a Bayesian way when you use what actually happens to estimate the general probability of its happening. For instance, you reach blindly into an urn and pull out a black marble. You form the tentative hypothesis that the urn is full of black marbles: the probability of another black one is 100%. But after you have pulled out 10 black marbles and 10 white marbles, you adjust the probability to 50%. This is a reasonable and common way of thinking, and it has been particularly fashionable since the early 2000s because computers are very good at it.
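To make the urn arithmetic concrete, here is a minimal sketch in Python (my own illustration, not from the post or the paper) that tracks the running frequency of black marbles as draws accumulate; a fuller Bayesian treatment would put a prior, such as a Beta distribution, on the proportion, but the running frequency is enough to show how evidence moves the estimate:

```python
# A minimal sketch of the urn example (illustrative only, not from the original post).
# We update an estimate of P(black) as marbles come out of the urn, using the
# simple running frequency the paragraph above describes.

def estimate(black, white):
    """Current estimate of the probability that the next marble is black."""
    return black / (black + white)

black, white = 0, 0

# First draw is black: tentative hypothesis that the urn is all black marbles.
black += 1
print(f"After 1 black draw: P(black) = {estimate(black, white):.2f}")  # 1.00

# After 10 black and 10 white draws in total, the estimate settles at 50%.
for marble in ["black"] * 9 + ["white"] * 10:
    if marble == "black":
        black += 1
    else:
        white += 1
print(f"After 10 black and 10 white: P(black) = {estimate(black, white):.2f}")  # 0.50
```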
Humans, not so much. In the political domain, we generally want the probability to be a certain way. For example, as a liberal Democrat, I want Democratic presidents to do a good job, regulatory policies to work, and neoliberalism to fail. As we pull marbles out of the metaphorical urn, we use the ones that show the expected color to confirm our prior beliefs and strengthen our convictions. When some come out the wrong color, we forget them or dismiss them on various grounds. Ignoring an actual shiny white marble would be idiotic, but rejecting a third-party account of a subtle issue like Iraq’s WMD or Bush’s stem cell policy is easy. You just tell yourself that the messenger is biased or the case is exceptional and irrelevant. Thus, the more marbles we pull out of the urn–no matter their color–the more we shift toward our prior convictions.
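By contrast, here is a hedged toy model of the asymmetric pattern (again my own sketch, not Glaeser and Sunstein's formal mechanism): an updater that always counts confirming marbles but usually dismisses contrary ones, so that even perfectly balanced evidence leaves it convinced of its prior.

```python
import random

# An illustrative toy model of "asymmetric Bayesianism" (my own sketch, not
# Glaeser and Sunstein's model): marbles that confirm the prior conviction
# always get counted, while contrary marbles are usually dismissed as
# "biased" or "exceptional."

def asymmetric_estimate(draws, favored="black", accept_contrary=0.2):
    """Estimate P(favored color) while discounting evidence of the other color."""
    favored_count, other_count = 1, 1  # symmetric start; the bias comes from how draws are treated
    for marble in draws:
        if marble == favored:
            favored_count += 1                   # confirming marbles always count
        elif random.random() < accept_contrary:
            other_count += 1                     # most contrary marbles are forgotten
    return favored_count / (favored_count + other_count)

random.seed(0)
mixed_urn = ["black", "white"] * 50  # the evidence is actually 50/50

print(f"Honest frequency:    {mixed_urn.count('black') / len(mixed_urn):.2f}")  # 0.50
print(f"Asymmetric estimate: {asymmetric_estimate(mixed_urn):.2f}")             # well above 0.50
```

Run on a perfectly mixed urn, the honest frequency comes out at 0.50 while the asymmetric estimate stays well above it, which is the “shift toward our prior convictions” described above.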
As a social phenomenon, this is problematic. CIRCLE has been rigorously evaluating some specific innovations that were attempted during the 2012 election, and I will discuss those results later this spring. None was a slam-dunk success. But clearly, it is possible for public opinion to shift, because history is rife with change. Democrats may generally share certain biases today, but they all believed very different things 50 years ago. We need to understand more about what makes large groups learn.
On an individual level, the message is clear. Unless you want your brain to ossify and your vision to narrow, you must pay special attention to the marbles that come out the wrong color. People like me, who are generally sympathetic to the impulses behind European and South Asian social democracy, need to focus on this kind of awkward fact: South Korea’s mean income rose 10 times faster than India’s between 1950 and 2000, giving South Korea the 12th highest human development index in the world today, and India the 136th highest (out of 187), despite the fact that their baselines were about the same in 1950. From 1950 to 1980, India was a diverse and pluralist social democracy; South Korea was a corporatist dictatorship. Of course, that’s not the end of the story, and the moral is not to drop our moral objections to dictatorship. But it’s an example of an awkward-colored marble. We mustn’t reject that kind of data as exceptional or irrelevant but must actually use it to adjust our views.
* “Why Does Balanced News Produce Unbalanced Views?” (see its bibliography for the other studies cited above).
How about situations when people expect regression to the mean? For example, after a family has two baby boys in a row, people assume that the chance of the next baby being a girl is much higher, when it is in fact still 50%. In this case, is it because the person’s mental model clusters around a middle rather than an extreme? Does the research you cite study the “undecided” and their response to evidence?
Coincidentally, with Rob Portman coming out in support of gay rights today as a result of his own family experience, you have an example of something intense enough to burst the cognitive-dissonance bubble. It’s interesting that some of the criticism he’s gotten from the left has been that his rationale for changing his mind is “selfish” rather than “reasoned.” Grassroots organizers would tell you that it is rarely logic, rather than personal experience, that moves people to change their minds on an issue… maybe it’s “soft” rather than “hard” evidence that gets people to change their beliefs.
I like Kevin Drum on Rob Portman today: http://www.motherjones.com/kevin-drum/2013/03/gay-marriage-wins-another-convert (Consistent with your point.)
I’m not sure about reversion to the mean, and the Sunstein piece doesn’t mention independents, but in the experiments we did this fall, the independents did tend to shift their views more. Of course, that doesn’t mean that they’re generally more rational than the partisans: it can be quite rational to affiliate with one of the parties.