For a great proportion of our scientific beliefs, we have to rely on a long-established consensus. … For views on evolution, the Holocaust, whether transfats cause cancer, or carbon dioxide causes global warming, no single person can themselves compile enough evidence. You need to rely on scientists who themselves rely on more scientists.
However, people often form opinions, or choose which ‘consensus’ to trust, on the basis of feelings. This works particularly strongly in a negative way: if you really hate X and X’s group, then the fact that X believes something, and especially that X thinks it important, makes thinking and proclaiming it untrue an enormous pleasure. This happens whether X is some braying red-faced foxhunter or a sanctimonious good-for-nothing leftie student. Most political extremes have convoluted theories for why their opponents think the rubbish that they do.
A very interesting paper in Nature last week supports this, explaining findings on the ways that “ordinary citizens react to scientific evidence”:
People endorse whichever position reinforces their connection to others with whom they share important commitments. As a result, public debate about science is strikingly polarized. The same groups who disagree on 'cultural issues' — abortion, same-sex marriage and school prayer — also disagree on whether climate change is real and on whether underground disposal of nuclear waste is safe.
A process that does account for this distinctive form of polarization is 'cultural cognition'. Cultural cognition refers to the influence of group values — ones relating to equality and authority, individualism and community — on risk perceptions and related beliefs. …
For example, people find it disconcerting to believe that behaviour that they find noble is nevertheless detrimental to society, and behaviour that they find base is beneficial to it. Because accepting such a claim could drive a wedge between them and their peers, they have a strong emotional predisposition to reject it.
Our research suggests that this form of 'protective cognition' is a major cause of political conflict over the credibility of scientific data on climate change and other environmental risks. People with individualistic values, who prize personal initiative, and those with hierarchical values, who respect authority, tend to dismiss evidence of environmental risks, because the widespread acceptance of such evidence would lead to restrictions on commerce and industry, activities they admire. By contrast, people who subscribe to more egalitarian and communitarian values are suspicious of commerce and industry, which they see as sources of unjust disparity. They are thus more inclined to believe that such activities pose unacceptable risks and should be restricted. Such differences, we have found, explain disagreements in environmental-risk perceptions more completely than differences in gender, race, income, education level, political ideology, personality type or any other individual characteristic.
Cultural cognition also causes people to interpret new evidence in a biased way that reinforces their predispositions. As a result, groups with opposing values often become more polarized, not less, when exposed to scientifically sound information.
It’s a fascinating piece.
(Update: Just remembered I came across this paper via Chris (I think), so consider my hat tipped.)