‘Don’t Trust That Scientist!’

Ralph Barnes and Samuel Draznin-Nagy

Several activities are part of the scientific endeavor. Scientific activities obviously include things such as generating theories, gathering empirical data, and testing hypotheses. However, scientists also communicate their research to one another and to the general public. Scientists try to convince other scientists and the general public of the truth of their claims, but non-scientists also take part in science communication. Journalists and members of the general public, for example, often engage in science communication. Much of science communication doesn’t push an agenda. For instance, a communicator might simply want to inform people of the evidence and various theories related to a certain topic. What we are interested in are persuasive messages about science: messages designed to persuade people that a particular scientific perspective is the correct one.

Persuasive messages about science can be consistent or inconsistent with the scientific mainstream. A scientist promoting the idea that global climate change is real (and influenced by human activities) would be promoting a message consistent with the scientific mainstream. In contrast, a blogger who promotes the idea that vaccines for COVID-19 contain microchips that will allow governments to track the recipients’ movements would be promoting a message inconsistent with the mainstream. For several years, we have been looking at persuasive argumentation related to science issues, and particularly at how those promoting claims inconsistent with the scientific mainstream differ from those promoting claims consistent with it.

We should note that claims that are inconsistent with the scientific mainstream are not necessarily false; humans are imperfect, and as long as science is a human endeavor, mistakes are going to be made. Nonetheless, differences between those who agree with and those who disagree with mainstream scientific ideas may be of interest in their own right. Ralph Barnes and Rebecca Church (2013) compared websites that argued for evolution with those that argued for creationist views. They found that those that argued for evolution tended to frame the issue in terms of the weight of evidence, while those arguing for creationism tended to frame the issue in terms of certainty and proof.

Creationist authors—but not those promoting evolution—tended to claim that they had indisputable “proof” for their claims. Ralph Barnes, Rebecca Church, and Samuel Draznin-Nagy (2017) also compared websites arguing for evolution with those arguing for the creationist perspective. They found that websites arguing for evolution relied primarily on arguments from empirical evidence. In contrast, creationist websites used a wide range of argumentation, including appeal to authority and appeal to reason.

Furthermore, evolution websites differed from creationist ones in the topics they addressed. Most arguments on the websites arguing for evolution were narrowly focused on descent with modification. In contrast, arguments on the websites arguing for creationism tackled a larger range of topics, including whether or not God exists, the age of the universe, and whether or not the biblical creation story is true. In the conclusion of their 2017 article, Barnes, Church, and Draznin-Nagy suggested that it would be worth examining other scientific issues (e.g., AIDS denialism, global warming, and vaccine safety) to see if the differences between those arguing for mainstream and non-mainstream science views were unique to the topic of origins or whether they extended to other science issues.

The argument of central concern for the current study is the ad hominem argument or ad hominem attack. An ad hominem attack may be defined as an argument directed not at the substance of an argument but at the individual promoting that argument. An example of an ad hominem attack would be attempting to undermine Andrew Wakefield’s claim that the MMR vaccine causes autism-like symptoms in children by noting that his medical license has been revoked. In contrast, an argument directly targeting Wakefield’s claims might focus on the small sample size, inaccurate data reporting, and lack of replication of Wakefield’s now-retracted Lancet article.

In the past, ad hominem attacks were often assumed to be fallacious and unreasonable. However, many scholars today feel that ad hominem attacks may be reasonable or not depending on how they are used (Aberdein 2014; Walton 1998). An unreasonable ad hominem argument might include this example: evolution is a lie because a scientist who claims that evolution is true is an atheist. An example of a reasonable ad hominem argument might be this: a scientist’s claim that smoking poses no health risk should be viewed with a healthy dose of skepticism because the scientist is an employee of the tobacco industry. Whether reasonable or unreasonable, there is empirical evidence that ad hominem attacks can be very influential (Barnes et al. 2018; Kaid and Boydston 1987; Yap 2013).

 

The Study

Along with our colleague Zoë Neumann, we looked at four science issues: AIDS denialism, climate change, GMO safety, and vaccine safety. An inconsistent website had to be arguing for one of the following four claims: 1) HIV does not cause AIDS; 2) global climate change is not occurring and/or is not due to human actions; 3) GMOs are not safe; or 4) the health risks of vaccines (e.g., autism) outweigh the benefits. A consistent website had to be arguing against one of these four claims. To select websites containing arguments for or against the mainstream scientific view on each of these topics, we used a Google search strategy very similar to the one we had used in an earlier study (Barnes et al. 2017). All the websites we chose presented lists of arguments for a view that was either consistent or inconsistent with the mainstream scientific view of an issue. We ultimately selected sixty-nine websites for our analysis (11, 28, 15, and 15 respectively for the topics of AIDS denialism, climate change, GMO safety, and vaccine safety; see Barnes et al. 2020).

Once we selected the websites to be analyzed, we then had to identify all examples of ad hominem attacks. After those arguments were located, our next job was to develop a rubric for identifying the particular focus of each ad hominem attack. To this end, we developed two rubrics. We first categorized all ad hominem attacks (see Table 1). If an argument was initially categorized as “Bad action” in Table 1, we then more narrowly categorized it using the rubric presented in Table 2.
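For readers who want to see the two-stage coding logic laid out concretely, here is a minimal sketch in Python. The category names are placeholders invented for illustration; they are not the actual entries of Tables 1 and 2, and the sketch is not the procedure or software used in the study.

```python
from typing import Optional

# Placeholder categories standing in for the actual entries of Tables 1 and 2.
TOP_LEVEL = {"bad action", "bad character", "conflict of interest", "generic abuse"}
ACTION_TYPES = {"information control", "other bad action"}

def code_argument(top_level: str, action_type: Optional[str] = None) -> dict:
    """Record one ad hominem attack, enforcing the two-stage rule:
    only arguments coded 'bad action' at stage one receive a narrower
    action category at stage two."""
    if top_level not in TOP_LEVEL:
        raise ValueError(f"unknown top-level category: {top_level}")
    if top_level == "bad action" and action_type not in ACTION_TYPES:
        raise ValueError("a 'bad action' argument needs a valid action category")
    if top_level != "bad action" and action_type is not None:
        raise ValueError("only 'bad action' arguments receive an action category")
    return {"top_level": top_level, "action_type": action_type}

# Example: an attack accusing opponents of suppressing data.
record = code_argument("bad action", "information control")
```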

All arguments were independently coded by two individuals on the basis of the rubrics found in Tables 1 and 2. The results based on the initial rubric can be found in Table 3. We found that websites promoting views inconsistent with the scientific mainstream were more likely to use ad hominem arguments. This difference was driven primarily by the fact that those websites were much more likely to accuse their opponents of engaging in bad actions. The results based on the action rubric (Table 2) can be found in Table 4. Our data reveal that the most common type of negative action opponents were accused of was information control. Readers interested in the raw data and a more detailed summary of the results (e.g., results presented separately for each of the four topics) may consult the text and supplementary materials of the original article (Barnes et al. 2020).
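As a rough illustration of how coded arguments of this kind are typically summarized, the sketch below tallies categories separately for consistent and inconsistent websites and computes simple percent agreement between two coders. The records and numbers are invented for illustration only; the actual counts and reliability details are reported in Barnes et al. (2020).

```python
from collections import Counter

def tally_by_stance(records):
    """Count ad hominem categories separately for websites consistent
    vs. inconsistent with the scientific mainstream. Each record is a
    dict like {"stance": ..., "category": ...}."""
    tallies = {"consistent": Counter(), "inconsistent": Counter()}
    for r in records:
        tallies[r["stance"]][r["category"]] += 1
    return tallies

def percent_agreement(coder_a, coder_b):
    """Simple percent agreement between two coders' labels for the same
    arguments (a crude stand-in for a formal reliability statistic)."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Invented example data, purely for illustration.
records = [
    {"stance": "inconsistent", "category": "bad action"},
    {"stance": "inconsistent", "category": "bad action"},
    {"stance": "consistent", "category": "conflict of interest"},
]
print(tally_by_stance(records))
print(percent_agreement(["bad action", "bad action"],
                        ["bad action", "conflict of interest"]))  # -> 0.5
```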

 

Discussion

Our study found hundreds of examples of ad hominem attacks in documents arguing both for and against the mainstream science perspective. The results of the present study are consistent with those of earlier studies conducted by Barnes and Church (2013) and Barnes, Church, and Draznin-Nagy (2017). Those earlier studies revealed that websites arguing for the position inconsistent with the scientific mainstream (i.e., creationism) employed an argumentative strategy quite different from that used by websites promoting ideas consistent with the scientific mainstream (i.e., evolution). The present study shows that these differences in argumentation strategy extend beyond the issue of the origin of species. We now have evidence that a different pattern of argumentation based on perspective (i.e., consistent or inconsistent with the scientific mainstream) extends to four additional issues: AIDS denialism, climate change, GMO safety, and vaccine safety.

Many scholars in the field of argumentation science feel that abusive ad hominem attacks are often unreasonable and inappropriate (van Eemeren et al. 2009). If the authors of the websites in our sample agreed with this sentiment and/or felt that their audiences would not be persuaded by abusive ad hominem arguments, then we would predict that abusive ad hominem attacks would be relatively rare. The low frequency of the generic abusive attacks (see Table 3) bears this out.

In our Bad Action rubric (Table 2), one can see that the category of “information control” includes such things as lying and withholding information. We feel that pointing out an action such as dishonesty is a reasonable ad hominem attack. If an individual is known to have repeatedly lied about a science issue in the past, then it is quite sensible to be skeptical of science claims that individual makes in the future. For this reason, it may be no surprise that information control is the most common bad action that scientists are accused of.

When trying to understand what people choose to believe about science issues, we have to be aware that belief is about more than just facts. A wealth of research indicates that trust in sources is at least as important as the scientific facts when it comes to influencing the public (Slovic 1993; Slovic et al. 1991). This study reveals that those who get their science information from the internet may be exposed to a lot of negative information about the individuals making science claims. Why is trust in sources so influential when it comes to science claims? In light of the results presented here, the answer to that question might be that the public has been provided with many reasons to distrust certain sources of science information. Perhaps one reason that anti-vaxxers, climate change denialists, and creationists reject the mainstream science views on those issues is that they have been told that the mainstream scientists are not to be trusted.

References

Aberdein, Andrew. 2014. In defence of virtue: The legitimacy of agent-based argument appraisal. Informal Logic 34(1): 77–93.

Barnes, Ralph, and Rebecca Church. 2013. Proponents of creationism but not proponents of evolution frame the origins debate in terms of proof. Science & Education 22(3): 577–603.

Barnes, Ralph, Zoë Neumann, and Samuel Draznin-Nagy. 2020. Source related argumentation found in science websites. Informal Logic 40(3): 443–473.

Barnes, Ralph, Heather Johnson, Noah MacKenzie, et al. 2018. The effect of ad hominem attacks on the evaluation of claims promoted by scientists. PLOS ONE 13(1): 1–15.

Barnes, Ralph, Rebecca Church, and Samuel Draznin-Nagy. 2017. The nature of the arguments for creationism, intelligent design, and evolution. Science & Education 26(1–2): 27–47.

Kaid, Lynda, and John Boydston. 1987. An experimental study of the effectiveness of negative political advertisements. Communication Quarterly 35(2): 193–201.

Slovic, Paul. 1993. Perceived risk, trust, and democracy. Risk Analysis 13(6): 675–682.

Slovic, Paul, James Flynn, and Mark Layman. 1991. Perceived risk, trust, and the politics of nuclear waste. Science 254(5038): 1603–1607.

Van Eemeren, Frans, Bart Garssen, and Bert Meuffels. 2009. Fallacies and Judgments of Reasonableness: Empirical Research Concerning the Pragma-Dialectical Discussion Rules. Dordrecht, Netherlands: Springer.

Walton, Douglas. 1998. Ad Hominem Arguments. Tuscaloosa, AL: University of Alabama Press.

Yap, Audrey. 2013. Ad hominem fallacies, bias, and testimony. Argumentation 27: 97–109.

Ralph Barnes and Samuel Draznin-Nagy

Ralph Barnes is an educator and researcher. His research interests include reasoning, judgment and decision making, and rhetoric. He earned his PhD from The Ohio State University and is currently a Teaching Professor at Montana State University.

Samuel Draznin-Nagy is a graduate of Montana State University. His research interests include reasoning, scientific thinking, and judgment and decision making.

