The Scientific Attitude: Defending Science from Denial, Fraud, and Pseudoscience. By Lee McIntyre. MIT Press, 2019. ISBN 9780262039833. 296 pp., $27.95.
Science is under attack. The evidence for global warming is overwhelming, yet many reject it in favor of ideology, believing what they want to believe. Vaccine-preventable diseases are rebounding because of the rejection of vaccine science and the circulation of egregious misinformation. Misunderstandings about science abound, even among scientists. In this “post-truth” world, we must be able not only to explain why claims based on scientific evidence have a superior claim to believability but also to persuade the public to accept them over competing claims grounded not in empirical evidence but in ideology or wishful thinking.
Lee McIntyre, a philosopher of science at Boston University, has spent his entire career grappling with this problem. In his new book, The Scientific Attitude: Defending Science from Denial, Fraud, and Pseudoscience, he tries to explain what science is by examining what it is not. He writes, “there is much to learn from those who have forsaken it.” He argues that “the scientific attitude” is the key, even though it is difficult to pin down.
There’s no such thing as “the scientific method”; the simplistic “observe, hypothesize, predict, test, analyze, and revise” model does not describe how most scientific discoveries are actually made. Penicillin was discovered not by systematically applying a “scientific method” but rather by serendipitously profiting from an accidental contamination. The demarcation problem (drawing a line between science and nonscience) has never been satisfactorily solved. Criteria such as falsifiability showed promise but were flawed. McIntyre’s goal is to provide a way to show what is distinctive about science without solving the demarcation problem.
No matter how good the evidence, science can never prove the truth of any empirical theory. Theories are always tentative and subject to revision based on better evidence. But this doesn’t mean we have no grounds for believing a scientific theory. Evidence counts. Predictability is compelling. Good theories don’t require knowing the mechanism; we knew penicillin worked long before we understood how it worked. He draws a distinction between truth and warrant. A theory has warrant if it has a credible claim to believability given the evidence, even though we can’t prove it’s true in any absolute sense. We can be comfortable accepting a warranted theory even though we realize it might eventually be proven wrong. We can celebrate uncertainty, and we can require denialists and conspiracy theorists to defend the warrant of their beliefs rather than simply attacking our science.
Einstein had the true scientific attitude. He said experimental confirmation would not establish the truth of his theory, but he would have to accept that his theory was untenable if it failed certain tests. Specific methodology is less important than honesty and openness. Individuals must be willing to critique their own ideas, but criticism is also a communal and institutional self-correcting enterprise. We try to find failure. Science is value-laden and can never be purely objective; McIntyre writes, “It is a myth that we choose our beliefs and theories based only on the evidence.”
Ignaz Semmelweis had the scientific attitude far ahead of his time; his ideas were met with unreasoned opposition, and McIntyre notes that “Until quite recently in the history of medicine, patients often had just as much to fear from their doctor as they did from whatever disease they were trying to cure.” Benjamin Rush was a strong advocate of bloodletting to balance the humors; perhaps it was fitting that he died of bloodletting when it was used to treat his typhus fever. In his day, patients treated by a doctor for compound fracture had only a 50 percent chance of survival. Gradually doctors accepted that they could learn from experiments. The scientific attitude prevailed, and modern medicine was born.
According to a popular myth, penicillin was discovered by accident—but it was no accident. Fleming responded to a chance contamination with a prepared mind, understood that it showed antibiotic properties, published his findings, and others tested and developed the drug. The individuals involved all had the scientific attitude.
Science can go wrong in many ways: p-hacking, cherry-picking, sloppiness, laziness, cognitive bias, unintentional errors, intentional falsification or fabrication of data (sometimes with “good intentions,” when researchers are sure they are right and want to speed the process along), and deliberate fraud, as in the case of Andrew Wakefield, whose now-retracted study on autism led to vaccine refusals and a resurgence of measles.
The most valuable chapter of the book is “Science Gone Sideways: Denialists, Pseudoscientists, and Other Charlatans.” Those people either misunderstand or don’t care about the standards of scientific evidence—or if they do care, they don’t care enough to modify or abandon their ideological beliefs. He defines denialism as “the refusal to believe in well-warranted scientific theories even when the evidence is overwhelming” and pseudoscience as when someone seeks the mantle of science to promote a fringe theory but refuses to change their beliefs in the face of refuting evidence or methodological criticism. This is dangerous, because people who have an economic, religious, or political interest in contradicting certain scientific findings “have resorted to a public relations campaign that has made great strides in undermining the public’s understanding of and respect for science.”
Denialists and pseudoscientists may believe that they are living up to the highest standards of the scientific attitude, but they are not. When the evidence conflicts with a sacred belief, people who believe they already know the answer will reject science. Superstition and willful ignorance are not new, but what is new is “the extent to which people can find a ready supply of ‘evidence’ to support their conspiracy-based, pseudoscientific, denialist, or other outright irrational beliefs in a community of like-minded people on the Internet.” It is easy to avoid conflicting views and to live in a fact-free alternative reality. Group consensus works well in science as a check against error, but outside of science, charlatans use consensus to reinforce prejudice.
Denialists like to call themselves skeptics and claim to be open to new ideas, but true skeptics reject new ideas that are believed without sufficient evidence and change their minds to adopt new ideas that are based on good evidence. “For denialists, no evidence ever seems sufficient for them to change their minds.” This is not a scientific attitude; it is the opposite of having an open mind. They demand impossibly high standards for disconfirming evidence but demonstrate complete gullibility for any claim that supports their belief system; for instance, they don’t need evidence to believe the claim that the CDC paid the Institute of Medicine to suppress the data on thimerosal.
Denialists and pseudoscientists like to point out that cranks can sometimes get it right. They laughed at Galileo, for example, but Galileo was not a crank; he had evidence to support his assertions. The book tells the story of J. Harlen Bretz, a maverick geologist who found evidence that a massive flood had created the channeled scablands of eastern Washington. He realized that what he saw could not be explained by uniformitarianism, the generally accepted dogma that geologic processes act gradually over time. He didn’t know what could have caused such a mega-flood, but others found the explanation: the failure of a giant ice dam at prehistoric Lake Missoula. He was long considered a heretic, but his evidence was overwhelming. Others could see the evidence for themselves. Bretz was vindicated in 1965 when geologists went on an official field trip and sent him a telegram saying, “We are all catastrophists now.” It took a while, but the scientific attitude eventually triumphed. “The crucial question for pseudoscientists is this: if your theories are true, where is the evidence?” If you have good answers, you will not be persecuted or ignored; science will eventually beat a path to your door.
McIntyre says the core difference between what is science and what is not hinges on the scientific attitude. He hopes that his concept of the scientific attitude will put us in a better place to understand and defend science and to grow it in new fields such as the social sciences. Truth matters, and if we care about empirical evidence and use it to shape our theories, we can avoid remaining, as he puts it, “mired in the ditch of ideology, superstition, and confusion.” Our very survival as a species may depend on it.