Rejection of science is a huge problem, with many people refusing to get vaccinated and denying the existence of climate change. Researchers Aviva Philipp-Muller of Simon Fraser University, Richard Petty of Ohio State University, and Spike W.S. Lee of the University of Toronto have identified four key reasons people reject scientific information.
Those factors, laid out in a post last month for The Conversation, are:
• The information comes from a source they perceive as non-credible;
• They identify with groups that are anti-science;
• The information contradicts what they believe is true, good, or valuable;
• The information is delivered in a way that conflicts with how they think about things.
Understanding these psychological reasons for being anti-science is critical, the three authors say, because it helps unpack the rejection of science across many domains and points to potential solutions for increasing scientific acceptance.
No Trust in Scientists
The first key reason people are anti-science is that they don’t see scientists as credible. This happens when scientists’ expertise is questioned, when they are deemed untrustworthy, and when they appear biased. Although debate among scientists is a healthy part of the scientific process, many laypeople interpret legitimate scientific debate as a sign that those on one or both sides of the issue are not truly experts on the topic.
Scientists are often distrusted because they are seen as cold and unfeeling. Scientists’ objectivity has also been questioned, as they are seen as being biased against Christian and conservative values.
How can scientists address these credibility gaps? They can communicate to the public that debate is a natural part of the scientific process. To increase trustworthiness, they can convey that their work is motivated by selfless goals.
Clashing With Group Identities
People also tend to reject scientific information when it conflicts with their social identities. For example, gamers have long disputed scientific studies on the harms of playing video games, findings that clash with their identity as gamers.
People may also identify with social groups that reject scientific evidence and are hostile toward scientists or those who accept the science. For example, those who identify with groups skeptical of climate change tend to be quite hostile toward climate change believers.
To tackle this, the authors say, science communicators should find a shared identity with their audience. Research has shown, for example, that when scientists offered their recycled water suggestions to a hostile audience, the audience was more receptive once they found a shared identity. [It also helps to start the conversation with the things that are already most important to that audience—Ed.]
Contradicting Beliefs and Values
People often reject science because it contradicts their beliefs, attitudes, and values. When scientific information contradicts what people believe is true or good, they feel uncomfortable. They resolve this discomfort by simply rejecting the science. For people who have smoked their entire lives, the evidence that smoking kills is uncomfortable because it contradicts their behaviour. It is far easier to trivialize the science regarding smoking than to change a deeply ingrained habit.
Often, scientific information contradicts existing beliefs due to widespread misinformation. Once misinformation has been spread, it is hard to correct, especially when it provides a causal explanation for the issue at hand.
One effective strategy to combat this is prebunking—warning people that they are about to receive a dose of misinformation—and then refuting it so that people will be better at resisting misinformation when they encounter it.
A Mismatch in Delivery
Scientific evidence can also be rejected for reasons beyond the content of the message. When science is delivered in ways that are at odds with how people think about things, they might reject the message. For example, some people find uncertainty hard to tolerate, and when science is communicated in uncertain terms (as it often is), they tend to reject it.
Science communicators should therefore try to figure out how their audiences approach information and then match their style. They can borrow the logic of targeted advertising, framing scientific messages in different ways to persuade different audiences.
The Power of Politics
Political forces are powerful contributors to anti-science attitudes. This is because politics can trigger or amplify all four of the key reasons for being anti-science. Politics can determine which sources seem credible, exposing people with different political ideologies to different scientific information and misinformation.
Politics is also an identity, so when scientific ideas come from one’s own political group, people are more amenable to them.
For example, when a carbon tax is described as being proposed by Republicans, Democrats are more likely to oppose it. When scientific information contradicts people’s politically informed moral values, both conservatives and liberals vehemently oppose it.
Finally, conservatives and liberals differ in their thinking styles and how they generally approach information. For example, conservatives tend to be less tolerant of uncertainty than liberals. These different thinking styles are linked to different degrees of being anti-science.
All in all, the three authors write, these core determinants of anti-science attitudes help us understand what is driving rejection of diverse scientific theories and innovations, ranging from new vaccines to the evidence for climate change.
Fortunately, by understanding these bases for being anti-science, we can also better understand how to target such sentiments and increase scientific acceptance.