Science denial and misinformation are having a bit of a moment. In recent years, the terms post-truth and fake news have been named word of the year by various dictionaries. Flat-earthers achieved newfound exposure—though not necessarily support—with a 2018 Netflix documentary, and the current U.S. president is a climate change denier and anti-vaxxer.
These developments have real-world consequences. Preventable diseases are making a comeback. Policies to mitigate climate change have been delayed for decades. Segments of the population are failing to practice social distancing due to COVID-19 misinformation.
How can science denial be flourishing two decades into the twenty-first century? Ironically, our modern lifestyle is a contributor. Social media have exacerbated the problem, making it much easier to quickly and widely disseminate misinformation. Because social media bypass the traditional gatekeepers of conventional media, an individual can potentially reach as many people as a mainstream media outlet. Further, social media platforms have a financial interest in delivering content that maximizes clicks and interactions. So they facilitate the development of echo chambers by delivering content that conforms to people’s preexisting beliefs.
However, we can’t blame it all on social media. Underneath the facade of modern technology lie psychological challenges that make it more difficult for us to understand and accept scientific reality—and even harder to fix the problem.
Take, for example, an issue such as climate change, which is uniquely difficult for our minds to grasp. Climate change is planetary in scale, spans timeframes of decades to centuries, and is caused collectively by the global human population. In contrast, the human brain has evolved to deal with immediate, short-term dangers such as predators jumping out of bushes.
Ultimately, the solution to science denial lies in psychology and building public resilience against misinformation. But again, we face steep psychological challenges. People think in two different ways: instinctive, fast thinking and more reasoned, slower thinking. Summoning the critical thinking to assess arguments and discern reasoning fallacies requires slow, methodical thinking. This takes great cognitive effort, because most of our thinking is fast and automatic. Our brains are hardwired to escape those predators jumping out of bushes.
So how do we tackle the psychological, technological, and social challenges that make misinformation and science denial such a daunting problem? Science does hold the answers to science denial. But I’ve come to the conclusion that science, while necessary, is insufficient in itself to meet the challenge of science denial. I have taken a long, meandering journey to arrive at that conclusion.
From Science to Art and Back to Science
I began my career studying physics at the University of Queensland in Australia. While completing my honours degree, I found myself drawing cartoons in my spare time in the margins of my lecture notes. Upon graduation, I made the dramatic career change from physics to cartooning. But while my days were spent drawing, I found myself doing science in my spare time. You can take the boy out of the science, but you can’t take the science out of the boy.
I then launched my website SkepticalScience.com. The approach of the website was simple: debunk misinformation about climate change with peer-reviewed science. Coming from a background in physical science, my approach was based on the naive notion that all I had to do to successfully debunk misinformation was provide the scientific facts.
Then out of the blue, I received an email from a cognitive scientist who shared some psychological research into debunking, including a study finding that when you debunk myths in the wrong way, you run the risk of reinforcing the myth rather than refuting it. To my horror, I discovered that I had been debunking myths in the wrong way!
That email proved to be life changing, because it opened my eyes to the science of science communication. I immersed myself in the psychological research into misinformation, eventually embarking on a PhD in psychology researching how to counter misinformation.
Just a Bit of Science Denial
After five years of doctoral research and around twenty published papers, I had found the answer to science denial: inoculation (Cook et al. 2017). For half a century, psychology researchers have shown in experiment after experiment that if you expose people to a weak form of misinformation, they build up immunity, so that stronger misinformation has less influence on them (Compton 2013). This approach is known as inoculation, borrowing the metaphor from medical vaccination and applying it to knowledge. Misinformation is delivered in a weakened form by including two elements: a warning of the danger of being misled and counterarguments explaining the misleading techniques employed by the misinformation.
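The two-element structure of an inoculation message can be made concrete with a minimal sketch. The field names and example text below are illustrative only, not drawn from the cited papers:

```python
from dataclasses import dataclass

@dataclass
class Inoculation:
    """A weakened dose of misinformation, per the two elements described above."""
    myth: str             # the misinformation, presented in weakened form
    warning: str          # forewarning that a persuasion attempt is coming
    counterargument: str  # explanation of the misleading technique employed

    def message(self) -> str:
        # The warning comes first, priming the reader before the myth appears.
        return (f"{self.warning}\n\n"
                f"Myth: {self.myth}\n\n"
                f"Why it misleads: {self.counterargument}")

# Hypothetical example, assembled for illustration:
example = Inoculation(
    myth="Climate has changed naturally before, so humans can't be changing it now.",
    warning="Watch out: some arguments about climate are designed to mislead you.",
    counterargument="This commits a logical fallacy: natural causes in the past "
                    "don't rule out human causes in the present.",
)
```

The ordering matters: the forewarning precedes the weakened myth, so the reader processes the misinformation with their guard already up.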
Having landed upon the solution of inoculation, the next question was how to practically implement this approach to counter climate misinformation. I turned to two critical thinking philosophers, Peter Ellerton and David Kinkead, developing a methodology for systematically deconstructing and analyzing misinformation (Cook et al. 2018). We applied this process to fifty of the most common myths about climate change. In each case, we identified the reasoning errors the myth contained and the logical fallacy it committed.
Once you identify the logical fallacy in a piece of misinformation, what’s the best way to explain how misinformation misleads the public and then inoculate people against it? My philosopher colleagues introduced me to the concept of parallel argumentation: explaining the flaw in an argument by transplanting the same logic into a parallel situation, often an extreme or absurd one. A lightbulb moment for me was the realization that this technique was used every evening by late night comedians. The beauty of parallel argumentation is it debunks false arguments by exposing bad logic, sidestepping the need to provide long, complicated explanations. It can also be highly accessible to the general public—not to mention humorous.
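The mechanics of parallel argumentation can be sketched in a few lines: the same (flawed) logical form is instantiated in two domains, so the absurd parallel exposes the flaw. The template and examples below are my own illustrations, not taken from the cited research:

```python
def parallel_argument(form: str, **cases: dict) -> dict:
    """Instantiate the same logical form in each named domain."""
    return {name: form.format(**slots) for name, slots in cases.items()}

# Cherry picking: drawing a long-term conclusion from a hand-picked short interval.
form = "{interval} showed {observation}, so {conclusion}."

args = parallel_argument(
    form,
    myth=dict(interval="One recent decade",
              observation="little surface warming",
              conclusion="global warming has stopped"),
    parallel=dict(interval="One chilly week in July",
                  observation="falling temperatures",
                  conclusion="summer has been cancelled"),
)
```

Because both arguments share the identical form, a reader who laughs at the parallel has already grasped why the myth fails, with no technical explanation required.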
The Double-Edged Sword of Humor
The exciting potential of parallel argumentation led me into a new line of research: using humor to counter science misinformation. Existing research into comedy as a science communication tool found that the humorous approach was often a double-edged sword; benefits seemed to come with a potential drawback.
For example, communicating about a sober, intimidating topic such as climate change with humor makes the issue less threatening and more accessible (Boykoff and Osnes 2018). Humorous messages are also more effective with disengaged audiences (Brewer and McKnight 2015). However, the downside of humor is that people can come away less concerned about climate change relative to a serious climate message (Bore and Reid 2014).
One negative trait of humorous messages is that they are perceived as less informative than serious messages, even when containing the exact same information (Skurka et al. 2018). But on the positive side, people show less counterarguing against humorous messages. There is some intriguing research finding a sleeper effect in response to a comedy routine, where the message became more persuasive after one week (Nabi et al. 2007).
In the face of these mixed results, I began a series of experiments with my George Mason University colleagues Emily Vraga and Sojung Kim, measuring the impact of using humor to debunk science misinformation. We tested debunking misinformation across different issues, such as climate change, gun control, and vaccination, and found that debunking was most effective on topics where views were softest: vaccination (Vraga et al. 2019).
Our ongoing research has employed eye tracking to determine that both serious and humorous corrections are effective but in different ways. Using mediation analysis, we found that serious refutations reduced the credibility of misinformation because the refutations themselves were seen as more credible. In contrast, by incorporating eye-tracking data into our mediation analysis, we found that humorous cartoons succeeded in discrediting misinformation because people spent more time paying attention to the cartoons.
Outside of the lab, I’ve also put these research findings into practice in the form of a book, Cranky Uncle vs. Climate Change. The book uses cartoons in a variety of forms to make the science both clearer and more engaging. These techniques include anthropomorphism to make natural phenomena more relatable, personification of climate denial in the form of the ubiquitous cranky uncle, and parallel arguments to explain the fallacies in climate misinformation.
Seriously Funny Games
Cartoon humor is also suited to another platform: games. Applying gamification to education is known as “serious games,” taking advantage of a range of gaming tools and techniques that help enhance the education process. For example, achievement rewards offer learning incentives (Blair et al. 2016), while leaderboards and player-to-player features add social and community elements (Paavilainen et al. 2017).
Gamification offers a powerful solution to science denial. Developing public resilience to misinformation requires teaching critical thinking, which is challenging because of the human preference for fast, automatic thinking over effortful, slow thinking. However, gamification incentivizes players to practice critical thinking. Through repeated quizzes where players spot fallacies in bad arguments, a slow thinking process can be turned into a fast thinking heuristic. It may be that the best thing to stop bad heuristics is good heuristics.
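The quiz mechanic described above, with repeated fallacy-spotting turned into an incentivized habit, can be sketched minimally. The quiz items, fallacy labels, and streak-bonus scoring here are hypothetical illustrations, not the actual content or rules of the Cranky Uncle game:

```python
# Each item pairs a short misleading argument with the fallacy it employs.
QUIZ = [
    ("Climate has changed naturally before, so humans can't be changing it now.",
     "false dichotomy"),
    ("One cold winter proves global warming has stopped.",
     "cherry picking"),
    ("A petition signed by thousands of non-climate-scientists refutes the consensus.",
     "fake experts"),
]
FALLACIES = sorted({fallacy for _, fallacy in QUIZ})

def play_round(answer_fn, score: int = 0, streak: int = 0):
    """Run one pass through the quiz, rewarding consecutive correct answers.

    answer_fn(claim, options) -> the player's chosen fallacy label.
    A growing streak bonus incentivizes repeated, accurate practice -- the
    mechanism by which slow fallacy-spotting becomes a fast heuristic.
    """
    for claim, fallacy in QUIZ:
        if answer_fn(claim, FALLACIES) == fallacy:
            streak += 1
            score += 10 * streak  # bonus compounds with each consecutive hit
        else:
            streak = 0            # a miss resets the streak
    return score, streak
```

A perfect player earns 10 + 20 + 30 = 60 points in one pass; a player who misses resets their streak and earns less, which nudges them toward the deliberate attention that fallacy-spotting requires.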
I’ve adapted the research and art that went into the Cranky Uncle vs. Climate Change book into a smartphone game in which Cranky Uncle mentors players in the techniques of science denial so they can become cranky uncles themselves. One danger of serious games is that players can lose motivation if they see the game as all education and no fun. By using a sassy cartoon mentor and humorous examples of logical fallacies, we avoid this pitfall by packaging the game as a seriously funny game.
The Cranky Uncle game has been tested in college classes across the country. Just a half hour of playing a prototype of the game has been found to improve students’ ability to identify a range of different logical fallacies. While the game focuses on climate misinformation, students showed improvement in spotting misleading techniques across a range of topics. Researchers call this the “umbrella of protection”: inoculating against misinformation in one topic conveys resistance across multiple topics (Parker et al. 2012).
Science + Art + Technology
While there has been a great deal of scientific research into understanding and countering misinformation, the scientific community has struggled to keep up with the explosion of misinformation campaigns. Science denial has taken advantage of the tools offered by technological platforms.
Science is necessary but insufficient to counter science denial—it also needs allies. Fortunately, there are two powerful allies in art and technology. Art enables communicators to package scientific information in entertaining formats to attract the attention of disengaged audiences. Technology allows us to disseminate interactive games at a scale commensurate with the problem.
References
Blair, L., C. Bowers, J. Cannon-Bowers, et al. 2016. Understanding the role of achievements in game-based learning. International Journal of Serious Games 3(4): 47–56.
Bore, I.L.K., and G. Reid. 2014. Laughing in the face of climate change? Satire as a device for engaging audiences in public debate. Science Communication 36(4): 454–478.
Boykoff, M., and B. Osnes. 2018. A laughing matter? Confronting climate change through humor. Political Geography.
Brewer, P.R., and J. McKnight. 2015. Climate as comedy: The effects of satirical television news on climate change perceptions. Science Communication 37(5): 635–657.
Compton, J. 2013. Inoculation theory. The SAGE Handbook of Persuasion: Developments in Theory and Practice, 220–236.
Cook, J., P. Ellerton, and D. Kinkead. 2018. Deconstructing climate misinformation to identify reasoning errors. Environmental Research Letters 11(2).
Cook, J., S. Lewandowsky, and U. Ecker. 2017. Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLOS ONE 12(5): e0175799.
Nabi, R.L., E. Moyer-Gusé, and S. Byrne. 2007. All joking aside: A serious investigation into the persuasive effect of funny social issue messages. Communication Monographs 74(1): 29–54.
Paavilainen, J., K. Alha, and H. Korhonen. 2017. A review of social features in social network games. Transactions of the Digital Games Research Association 3(2).
Parker, K.A., B. Ivanov, and J. Compton. 2012. Inoculation’s efficacy with young adults’ risky behaviors: Can inoculation confer cross-protection over related but untreated issues? Health Communication 27(3): 223–233.
Skurka, C., J. Niederdeppe, R. Romero-Canyas, et al. 2018. Pathways of influence in emotional appeals: Benefits and tradeoffs of using fear or humor to promote climate change-related intentions and risk perceptions. Journal of Communication 68(1): 169–193.
Vraga, E.K., S.C. Kim, and J. Cook. 2019. Testing logic-based and humor-based corrections for science, health, and political misinformation on social media. Journal of Broadcasting & Electronic Media 63(3): 393–414.