Why We Need Science

Harriet Hall

Most patients—and even many medical doctors and scientists—have not grasped how important it is to use rigorous science to evaluate claims for medical treatments. All too often people decide to try a treatment that is irrational, hasn’t been tested, or has been tested and shown not to work. Why do they make those bad decisions?

How can you know whether a medical treatment works? If others say it worked for them, your Aunt Sally swears it cured her, there’s someone in a white lab coat lecturing about it on YouTube, and you try it and your symptoms go away, you can pretty much assume it really works. Right? No, wrong! That’s all the “evidence” most people need, but it’s not evidence at all.

No, you can’t make that assumption, because sometimes we get it wrong. For many centuries, doctors used leeches and lancets to remove blood from their patients. Everyone knew bloodletting worked. When you had a fever and the doctor bled you, you got better. Everyone knew of friends or relatives who had been at death’s door until bloodletting cured them. Doctors could recount thousands of successful cases. George Washington accepted the common wisdom that bloodletting was effective. When he got a bad throat infection, he let his doctors remove so much of his blood that his weakened body couldn’t recover, and he died. When scientists finally tested bloodletting, they found out it did much more harm than good. Patients who got well had been getting well in spite of bloodletting, not because of it. And some patients, including Washington, died unnecessarily.

People can be absolutely certain they are right when they are actually wrong. People can be fooled, including me and you. As Richard Feynman said: “The first principle is that you must not fool yourself—and you are the easiest person to fool.”

Personal experience can be very impressive, but it is unreliable. Testimonials are anecdotes that can suggest which treatments might be worth testing, but they are not reliable evidence of efficacy.

“It Worked for Me”

People say, “It worked for me!” Well, maybe it didn’t. They have no basis for claiming that it “worked.” All they can really claim is that they observed an improvement following the treatment. That could indicate a real effect, an inaccurate observation, or a post hoc ergo propter hoc error—a false assumption that correlation in time meant causation. Such observations are only a starting point: we need to do science to find out what the observations mean. A commenter on the Science-Based Medicine blog wrote this testimonial: “I have witnessed first hand the life that begins to flow through the body upon the removal of a subluxation.”

What does this even mean? Does he expect anyone to believe this just because he says it? Would he believe me if I said I had witnessed firsthand the invisible dragon in Carl Sagan’s garage? Another commenter wrote that advocates of science-based medicine “seem to think that even if they see something with their own eyes that they can’t believe it if there are no double blinded officially published studies to prove that what they saw actually happened.”

Well, yes, that’s pretty much what we do think; and we are appalled that you don’t understand it yet, because it’s the whole reason we have to do science. I would phrase it a bit differently: “Seeing something with my own eyes doesn’t prove it’s true, and it doesn’t preclude the necessity for scientific testing.”

Perception and Interpretation

We can’t believe our own eyes. The process of vision itself is an interpretive construct by the brain. There are blind spots in our field of vision that we aren’t even aware of because our brain fills in the blanks. I saw a magician cut a woman in half on stage—that was an illusion, a false perception. I saw a patient get better after a treatment—but my interpretation that the treatment caused the improvement may have been a mistake, a false attribution.

Sometimes We Get It Wrong

Before we had science to test treatments, sometimes we got it wrong. We were probably more often wrong than right, and even modern doctors sometimes get it wrong. Not long ago, doctors performed an operation for heart disease in which they opened the chest and tied off chest wall arteries to divert more blood flow to the heart. They had an impressive 90 percent success rate. Dr. Leonard Cobb wanted to make sure, so he did a sham surgery experiment in which he just made the incision in the chest and closed it back up without doing anything else. He discovered that just as many patients improved after the fake surgery! Doctors stopped doing that operation.

How could so many people be so wrong? How could they believe something had helped them when it actually had done them more harm than good? There are many reasons people can come to believe an ineffective treatment works.

The disease may have run its natural course. A lot of diseases are self-limiting; the body’s natural healing processes restore people to health after a time. A cold usually goes away in a week or so. To find out if a cold remedy works, you have to keep records of successes and failures for a large enough number of patients to find out if they really get well faster with the remedy than without it.

Many diseases are cyclical; their symptoms wax and wane over time. We all know people with arthritis have bad days and good days. The pain gets worse for a while, then it gets better for a while. If you use a remedy when the pain is bad, it was probably about to start getting better anyway, so the remedy gets credit it doesn’t deserve.

We are all suggestible. If we’re told something is going to hurt, it’s more likely to hurt. If we’re told something is going to make us feel better, it probably will. We all know this; that’s why we kiss our children’s scrapes and bruises. Anything that distracts us from thinking about our symptoms is likely to help. In scientific studies that compare a real treatment to a placebo sugar pill, an average of 35 percent of people say they feel better after taking the sugar pill. The real treatment has to do better than that if we’re going to believe it’s really effective (not to mention spend money on it).

There may have been two treatments and the wrong one got the credit. If your doctor gave you a pill and you also took a home remedy, you may give the credit to the home remedy. Or maybe something else changed in your life at the same time that helped treat the illness and was the real reason you got better.

The original diagnosis or prognosis may have been incorrect. A lot of people have supposedly been cured of cancer who never actually had cancer. Doctors who tell a patient he has only six months to live are only guessing and may be wrong. The best they can do is say the median survival for similar patients is six months—but that means half of the patients live longer than that.

Temporary mood improvement can be confused with cure. If a provider makes you feel optimistic and hopeful, you may think you feel better when the disease is really unchanged.

Psychological needs can affect our behavior and perceptions. When people want to believe badly enough, they can convince themselves they have been helped. People have been known to deny the facts—to refuse to see that the tumor is still getting bigger. If they have invested time and money, they don’t want to admit it was wasted. We see what we want to see; we remember things the way we wish they had happened. When a doctor is sincerely trying to help a patient, the patient may feel a sort of social obligation to please the doctor by reporting improvement.

We confuse correlation with causation. Just because an effect follows an action, that doesn’t necessarily mean the action caused the effect. When the rooster crows and then the sun comes up, we realize it’s not the crowing that made the sun come up. But when we take a pill and then we feel better, we assume it was the pill that made us feel better. We don’t stop to think that we might have felt better for some other reason. We jump to conclusions, like the man in the old story who trained a flea to dance when it heard music, then cut off its legs one by one until it could no longer dance, and concluded that a flea’s organ of hearing must be in its legs!

So, there are a lot of ways we can get it wrong. Luckily, there’s a way we can eventually get it right: by scientific testing. There’s nothing mysterious or complicated about science: it’s just a toolkit of commonsense ways to test beliefs against reality. If you believe you’ve lost weight and you step on the scale to test your belief, that’s science. If you think you have a better way to grow carrots and you test your idea by planting two rows side by side, one with the old method and one with the new method, and see which row produces better carrots, that’s science. To test medicines, we can sort a large number of patients into two equal groups and give one group the treatment we’re testing and give the other group an inert placebo, such as a sugar pill. If the group that got the active treatment does significantly better, the treatment probably really works.

Jacqueline Jones was a fifty-year-old woman who had suffered from asthma since the age of two. She read about a miraculous herbal treatment that supposedly cured a host of ailments, including asthma. She assumed the information was true because it included a lot of testimonials from people who had used it and were able to stop taking their asthma medications. They knew it worked; they had seen it work. Sick of the side effects of conventional drugs, Jones stopped using her three inhalers, steroids, and nebulizer and took the herbal supplement instead. Within two days she had a massive asthma attack with complications that kept her hospitalized for six weeks.

All those people who said that herbal remedy had cured their asthma got it wrong. Asthma symptoms fluctuate. Maybe their symptoms would have improved anyway. Whatever the reason, the remedy had not been tested scientifically and was not effective for treating asthma. Believing those testimonials almost cost Jacqueline her life.

The next time a friend enthusiastically recommends a new treatment, stop and remember that they could be wrong. Remember Jacqueline Jones. Remember George Washington. Sometimes we get it wrong.

An earlier version of this article appeared on the Science-Based Medicine website.

Harriet Hall

Harriet Hall, MD, a retired Air Force physician and flight surgeon, writes and educates about pseudoscientific and so-called alternative medicine. She is a contributing editor and frequent contributor to the Skeptical Inquirer and contributes to the blog Science-Based Medicine. She is author of Women Aren’t Supposed to Fly: Memoirs of a Female Flight Surgeon and coauthor of the 2012 textbook Consumer Health: A Guide to Intelligent Decisions.

