Why Do People Believe Scientifically Untrue Things?

Ronald Bailey, Reason Online

Because to do otherwise would be immoral

You hear a lot about the politicization of science, but the real problem is the moralization of science. The New York University psychologist Jonathan Haidt has made a compelling case that moral differences drive partisan debates over scientific issues. Dan Kahan and others at the Yale Cultural Cognition Project have identified cultural differences that bias how people assimilate information. Together, Haidt’s and Kahan’s research suggests that what you believe about a scientific debate signals to like-minded people that you are on their side and are therefore a good and trustworthy person. Unfortunately, this means that the factual accuracy of beliefs is somewhat incidental to the process of moral signaling.

For an illustration, consider a recent skirmish between Skeptic editor Michael Shermer and Mother Jones writer Chris Mooney. Shermer, whose political views lean toward libertarianism, wrote a column for Scientific American titled “The Liberal War on Science,” noting the left’s tendency to deny human cognitive evolution and the safety of biotech crops and nuclear power. Mooney, author of a book called The Republican War on Science, retorted with a story headlined “There is No Such Thing as a Liberal War on Science.” The right’s denial of evolutionary biology and man-made global warming, Mooney argued, is much more consequential for public policy. While acknowledging that a substantial percentage of Democrats don’t believe in human evolution or man-made global warming either, Mooney took comfort in the fact that “considerably fewer Democrats than Republicans get the science wrong on these issues.”

Kahan identifies the ideological left as people who tend to have egalitarian or communitarian views. Egalitarians want to reduce disparities between people, and communitarians believe that society is obliged to take care of everyone. People holding these cultural values are naturally biased toward collective action to address inequality and the lack of solidarity. When the results of scientific research are perceived to perturb those values, it should be no surprise that left-leaners have a greater tendency to moralize them, to favor government intervention to control them, and to disdain conservatives who resist liberal moralizing.

Haidt’s moral survey data suggests that ideological conservatives have a greater tendency to moralize about purity and sanctity than do liberals. This may be so, but it’s pretty clear that liberals are not immune from concerns about purity and sanctity. While conservatives moralize about the purity and sanctity of sex and reproduction, liberals fret about the moral purity of foods and the sanctity of the natural world.

One particularly powerful moralizing tool that is chiefly deployed by progressives is the precautionary principle. Mooney blandly writes that this “is not an anti-science view, it is a policy view about how to minimize risk.” But beliefs about how much risk people should be allowed to take, or to be exposed to, are moral views. In fact, as Kahan and his colleagues have shown, the strong urge to avoid scientific and technological risk is far more characteristic of people who have egalitarian and communitarian values. The precautionary principle is not a neutral risk-analysis tool; it is an embodiment of left-leaning moral values.

Let’s look at what scientific research says—and does not say—about the moralized issues of climate change, biological evolution, nuclear power, genetically modified crops, exposure to synthetic chemicals, concealed carry of guns, vaccines, video games, fracking, organic foods, and sex education.
