Intellectual Humility: A Guiding Principle for the Skeptical Movement?

Scott O. Lilienfeld

“I beseech you, in the bowels of Christ, think it possible you may be mistaken.”

Oliver Cromwell, to the General Assembly of the Church of Scotland, 1650

In November 2019, attendees of the Committee for Skeptical Inquiry’s CSICon conference received the following inquiry from an anonymous CSI member in their email inboxes:

I’ve been thinking a lot about what it means to be a skeptic and what skepticism means to such people. I had a thought that I’d like your feedback on … (1) What does being a skeptic mean to you? (2) What do you think are the most important elements of skepticism?

The answers to these questions are far from self-evident. Although skepticism has been the subject of numerous conceptual treatments—some from a primarily scientific perspective (Beyerstein 1996; Novella et al. 2018) and others from a primarily philosophical perspective (DeRose and Warfield 1999; Bupp and Kurtz 2012; Pritchard 2019)—the skeptical movement has long struggled to identify a guiding credo. For example, Skeptical Inquirer magazine observes that “Skepticism is … not a cynical rejection of new ideas, as the popular stereotype goes, but rather an attitude of both open mind and critical sense” (skepticalinquirer.org/what-is-skepticism/). From this vantage point, skepticism requires us to adopt an epistemic stance—an approach to knowledge—marked by two seemingly contradictory viewpoints: an open-mindedness to novel claims conjoined with an insistence on subjecting them to rigorous and impartial scrutiny (see also Sagan 1996). Still, this characterization raises an intriguing question echoed by the two queries posed at the outset of this article: Is there a deeper psychological “essence” or “attitude of mind” underlying skepticism? If so, what is it?

An Inherent Tension within the Skeptical Movement

Anyone who has followed the skeptical movement over the past few decades is keenly aware of deep-seated disputes within its ranks regarding how best to deal with proponents of unsubstantiated assertions, such as those regarding the paranormal or complementary and alternative medicine. Should skeptics adopt a “take-no-prisoners” approach, which ruthlessly takes such advocates to task, or a “kinder and gentler” approach, which challenges them firmly but tactfully? Or is a mix of both approaches needed, depending on the situation at hand? At skeptics’ conferences, in skeptical publications, and on skeptics’ blogs, one commonly sees both strategies, spanning the gamut from ridicule to respect, but with scant consensus on which is most useful for changing hearts and minds.

Prominent figures within the ranks of the skeptical movement have long raised alarms regarding its at-times arrogant or elitist tone (Plait 2010). One of skepticism’s leading voices, astronomer and science writer Carl Sagan, expressed these worries decades ago: “The chief deficiency I see in the skeptical movement is its polarization: Us vs. Them—the sense that we have a monopoly on the truth; that those other people who believe in all these stupid doctrines are morons; that if you’re sensible, you’ll listen to us; and if not, to hell with you. This is nonconstructive. It does not get our message across” (Sagan 1996).

We share Sagan’s concerns. At the same time, the question of how much openness or respect to display to proponents of scientifically unsupported claims is far from settled. Few would dispute Sagan’s contention that routinely maligning believers in the paranormal is unlikely to be productive. Yet there may be cases (especially when advocates of dubious claims brazenly exploit vulnerable individuals) in which a hardline approach to debunking claims is warranted or even necessary. We suspect that without a coherent unifying approach to addressing scientifically unjustified assertions and their advocates, the skeptical movement is likely to grapple with these debates for the foreseeable future.

In this article, we propose that intellectual humility is an overarching approach to evidence that the skeptical movement may wish to embrace as a guiding credo. As psychologist William O’Donohue and his colleagues have noted, intellectual humility is a key epistemic virtue: a character disposition that helps us to minimize error in our web of beliefs (Quine and Ullian 1978) and ideally arrive at a closer approximation to the truth (O’Donohue et al. 2017).

Intellectual Humility: Conceptual and Definitional Issues

For centuries, if not millennia, scholars have warned of the hazards of intellectual hubris (Cowan et al. 2019). The sixteenth-century French philosopher Michel de Montaigne (1595) wrote that “the plague of man is boasting of his knowledge.” Perhaps owing to its origins in philosophy (Lynch 2019) and the definitional fuzziness that sometimes accompanies philosophical discussions, the concept of intellectual humility has not readily lent itself to scientific investigation. Nevertheless, in the past few years, intellectual humility has increasingly become a focus of empirically oriented psychologists, owing largely to the construction of several promising measures of this construct (McElroy-Heltzel et al. 2019). Although psychologists have yet to converge on a consensual definition of intellectual humility, most concur that it encompasses an awareness of one’s intellectual limitations and biases (Church and Samuelson 2016; Leary et al. 2017; Lilienfeld and Bowes 2020); an appreciation of alternative perspectives; and an acknowledgement that the evidentiary bases of one’s beliefs are often incomplete (Alfano et al. 2017; Whitcomb et al. 2017).

Intellectual humility is a fundamentally “metacognitive” (thinking about thinking) construct (Zohar and Barzilai 2013), meaning that intellectually humble individuals habitually reflect on their thinking processes, applying the principles of skepticism to their own reasoning. To give readers a flavor of the concept, Table 1 presents representative items from several questionnaire measures of intellectual humility.

Intellectual humility is ostensibly tied to what psychologists term a small bias blind spot. Research suggests that when asked about various ubiquitous cognitive biases, such as confirmation bias (Nickerson 1998) and hindsight bias (Fischhoff 1975), most of us report that other people are prone to these biases while we are largely or entirely immune to them (Pronin et al. 2002); this discrepancy is termed the bias blind spot. Hence, most of us are not merely psychologically blind to some degree but blind to our blindness. In principle, intellectually humble people are more cognizant of their biases than other people, although whether they are less bias-prone in general is unknown.

Some authors also regard intellectual humility as encompassing respect for others’ views (Hook et al. 2017; Krumrei-Mancuso and Rouse 2016), although other scholars see this capacity as a downstream effect of intellectual humility rather than a necessary feature. These definitional disagreements notwithstanding, data suggest that individuals with high levels of self-reported intellectual humility are more likely than those with low levels to regard sharp disagreements as reflecting respectful differences of opinion that are worthy of discussion (Porter and Schumann 2018).

Intellectual humility is moderately correlated with, but independent of, overall humility (Davis et al. 2016). Preliminary evidence also suggests that intellectual humility is modestly associated with general intelligence (Zmigrod et al. 2019). In addition, intellectual humility bears robust associations with certain personality traits, especially those tied to agreeableness, conscientiousness, and openness to new experiences and ideas (Davis et al. 2016; McElroy et al. 2014).

Contrary to what some authors (e.g., Pritchard 2019) have implied, intellectual humility does not imply low self-confidence. Data suggest that intellectual humility is unrelated to, and perhaps even slightly positively correlated with, self-esteem, as well as confidence in one’s beliefs (Meagher et al. 2015; Porter and Schumann 2018). Hence, one can simultaneously be intellectually humble and largely certain of one’s convictions, especially if one has subjected them to searching scrutiny. One likely exemplar of this principle was Carl Sagan. Having once met with Sagan for an hour and a half, the first author of this article found him intensely intellectually challenging, at times even ferocious; Sagan asked penetrating questions and displayed no signs of diffidence. At the same time, Sagan was open to positions that contradicted his own and was more than willing to change his mind when confronted with contrary evidence. Like Sagan, prototypically intellectually humble individuals are not ideological pushovers. Instead, they weigh their convictions proportionally to the strength of the evidence.

Intellectual Humility and Skepticism

Why should readers of Skeptical Inquirer and similar publications be aware of intellectual humility’s implications? There are at least three reasons.

First, at least in principle, intellectual humility may occupy the middle ground between excessive gullibility on the one hand and cynicism on the other (see also Zagzebski 1996). Many authors have identified the golden mean or “sweet spot” between these two extremes as synonymous with skepticism (Beyerstein 1996; Molé 2002; Pigliucci 2003). In the words of Michael Shermer (2002), skepticism reflects “the essential balance between orthodoxy and heresy, between a total commitment to the status quo and the blind pursuit of new ideas.”

Second, the key component of skepticism—extending doubt to claims—does not come naturally to the human mind. Intellectual humility is broadly consistent with this ethos but extends it to doubting one’s own beliefs, not merely those of others. In this regard, Paul Kurtz, one of the giants of skepticism, echoed the views of philosopher and mathematician Charles Sanders Peirce: “There is always the possibility that we may be in error. Our observations may need to be corrected. We may have misinterpreted the data. We may be mistaken” (Kurtz and Shook 2010, 26).

Third, data from our laboratory suggest that intellectual humility is linked to several psychological dispositions that are presumably relevant to skepticism, such as low levels of confirmation bias and dogmatism, as well as to “objectivism,” the tendency to base one’s beliefs on rigorous data and carefully considered reasons rather than on intuitions or subjective experiences (Bowes 2019). In addition, intellectual humility appears to be somewhat protective against susceptibility to an intriguing construct termed “pseudoprofound BS” (Pennycook et al. 2015), which reflects a tendency to perceive superficially deep but vacuous assertions, such as “We are in the midst of a high-frequency blossoming of interconnectedness that will give us access to the quantum soup itself,” as meaningful (Bowes 2019). Intellectual humility is also associated with lower levels of endorsement of conspiracy theories and lower levels of belief in the discredited causal link between vaccines and autism (Bowes, Costello, et al. 2020). Potentially consistent with these findings, acceptance of conspiracy theories is positively associated with narcissism (Cichocka et al. 2016), which is a marker of low intellectual humility (Alfano et al. 2017). Moreover, intellectual humility predicts higher levels of belief in global warming as well as greater openness to scientific arguments challenging one’s views regarding global warming (Bowes, Blanchard, et al. 2020).

Still, much of this evidence is provisional and arguably flawed. Most of the data we have reviewed derive from self-report measures of intellectual humility, and it may be paradoxical to ask individuals about their own levels of humility (Lilienfeld and Bowes 2020). To the extent that the essence of humility is an awareness of one’s weaknesses, many intellectually humble participants may substantially underestimate their levels of this trait. Conversely, we can all think of highly immodest individuals—perhaps a few prominent politicians come to mind—who greatly overestimate their levels of intellectual humility. This “paradox of humility” may pose a formidable challenge to measuring intellectual humility and allied traits (Christen et al. 2014). In future research, it will be essential to circumvent this limitation by supplementing self-reports of intellectual humility with informant reports and behavioral observations, including objective indicators of how often individuals have changed their minds when confronted with persuasive contrary evidence (Rodriguez et al. 2019).

Intellectual Humility and Scientific Thinking

In many respects, we can think of science as an organized prescription for intellectual humility (Lilienfeld et al. 2017; McFall 1996). In our chosen field of psychology, for example, the procedural controls demanded of researchers—such as control groups, blinding of observations, and elimination of potential confounding variables—constitute efforts to combat confirmation bias and other cognitive errors that can be fueled by intellectual hubris. They are methodological bulwarks against fooling ourselves and fooling others; they are also among the features that best distinguish science from pseudoscience (Bunge 1984). As Carl Sagan and Ann Druyan observed, “Science … is forever whispering in our ears, ‘Remember, you’re very new at this. You might be mistaken. You’ve been wrong before’” (Sagan 1996, 34–35).

Similarly, in the words of social psychologists Carol Tavris and Elliott Aronson (2007), science is a method of “arrogance control” (109), because it implicitly acknowledges that scientists frequently make mistakes, whether in their theorizing, methods, analyses, or interpretations. Many of the institutionalized safeguards of science, including formalized peer review and what Helen Longino (1990) termed transformative interrogation (ongoing scrutiny of fellow scientists’ assertions at conferences, during informal conversations, on blogs, and so on) are essential defenses against human error. Hence, even though individual scientists are not necessarily intellectually humble, the scientific community relentlessly pushes back on their overstatements, operating as a means of collective error detection and error correction (Oreskes 2019).

At the risk of overgeneralization, it seems safe to conclude that scientists are more often rewarded for trumpeting their positive findings than for admitting mistakes or acknowledging shortcomings in their results (Diamandis and Bouras 2018). Yet as neuroscientist Stuart Firestein (2015) reminded us in a trenchant analysis, failure is an essential driver of scientific progress, as disconfirmation of our cherished hypotheses tends to be the most efficient means of weeding out errors in our belief systems (O’Donohue 2013; Popper 1959). As scientists, science communicators, and science educators, we should enthusiastically seek out, welcome, and publicize our failed predictions. We should come to regard erroneous hypotheses as our friends.

Still, as scientists, we sorely need more role models of intellectual humility. One of our favorite examples derives from Charles Darwin’s autobiography. In reflecting on his approach to recording data, Darwin wrote:

I had, also, during many years followed a golden rule, namely, that whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favourable ones. (Darwin 1898, 123)

This lovely quotation captures much of the essence of intellectual humility, including its contemporary conceptualization as a small bias blind spot (Lilienfeld and Bowes 2020). Darwin conceded his propensity toward confirmation bias and explained how he undertook concerted efforts to compensate for it. He may have been just as prone to biases as were other life scientists, but he was better prepared to counteract them.

A more recent example comes from Princeton University psychologist Daniel Kahneman, recipient of the 2002 Nobel Prize in Economic Sciences. In his book, Thinking, Fast and Slow, Kahneman (2011) touted the robustness of several highly counterintuitive psychological findings based on social priming paradigms. In one famous experiment, researchers reported that students primed to think of old age by unscrambling words related to the elderly (such as Florida and wrinkle) were later more likely to walk down the hall slowly than were non-primed students (Bargh et al. 1996). When it comes to these and other surprising findings, Kahneman wrote, “Disbelief is not an option. The results are not made up, nor are they statistical flukes. You have no choice but to accept that the major conclusions of these studies are true” (Kahneman 2011, 56). Yet as unsuccessful efforts to replicate social priming findings mounted, so did Kahneman’s doubts and those of other psychologists. Even though Kahneman and his collaborator Amos Tversky had conducted classic work on the law of small numbers, which demonstrated that most people draw unjustifiably strong inferences from data originating from small samples (Tversky and Kahneman 1971), Kahneman himself had apparently disregarded this cardinal principle when appraising the merits of social priming studies, many of which were conducted on modest numbers of participants.

When a colleague pointed out this inconsistency to Kahneman in a blog posting, how did he respond? Kahneman wrote:

What the blog gets absolutely right is that I placed too much faith in underpowered studies. As pointed out in the blog, and earlier by Andrew Gelman, there is a special irony in my mistake because the first paper that Amos Tversky and I published was about the belief in the “law of small numbers,” which allows researchers to trust the results of underpowered studies with unreasonably small samples. … Our article was written in 1969 and published in 1971, but I failed to internalize its message. (Kahneman 2017)

If only all scientists and politicians were this willing to own up to their errors!

A final refreshing example of intellectual humility from our own field is the Loss-of-Confidence Project, a website founded in 2018 that encourages psychologists to share examples of their published results that they no longer believe (see lossofconfidence.com/). As of this writing, this project has attracted statements from about a dozen researchers who say that they have lost confidence in one or more of their findings. Several of them confessed that, in retrospect, they had “massaged” the data in several ways before obtaining evidence for their hypotheses (Rohrer et al. 2018). We hope that other psychological researchers, as well as those in other scientific disciplines, will soon follow suit in owning up to doubts regarding their own findings.

Concluding Thoughts

Research on intellectual humility is in its relative infancy, and many conceptual and empirical questions remain to be resolved. Still, we are reasonably confident that intellectual humility holds promise as a lodestar for the skeptical movement. In our view, all skeptics should seek to become more aware of their cognitive limitations, including their biases, and acknowledge that the evidentiary bases of their beliefs are often fallible. In addition, although skeptics should not shy away from adopting forceful positions regarding blatantly unsubstantiated claims, they should be reluctant to dismiss strong assertions prior to careful inquiry. As Carl Sagan (1996) warned us, skeptics should be wary of assuming that they possess a monopoly on the truth.

If our analysis has merit, all skeptics should strive to inculcate a thoroughgoing sense of intellectual humility in themselves and others and avoid the tempting allure of intellectual arrogance. But of course, we might be wrong.

 


References

  • Alfano, M., K. Iurino, P. Stey, et al. 2017. Development and validation of a multi-dimensional measure of intellectual humility. PLoS ONE 12(8).
  • Bargh, J., M. Chen, and L. Burrows. 1996. Automaticity of social behavior: Direct effects of trait construct and stereotype activation on action. Journal of Personality and Social Psychology 71(2): 230–244.
  • Beyerstein, B. 1996. Distinguishing Science from Pseudoscience. The Centre for Curriculum and Professional Development. Victoria, Canada: Simon Fraser University.
  • Bowes S.M. 2019. Expanding the Nomological Network of Intellectual Humility: An Examination of Personality Traits, Cognitive Styles, Critical-Thinking, and Self-Perception. Master’s Thesis, Emory University, Atlanta, Georgia.
  • Bowes, S.M., M. Blanchard, T.H. Costello, et al. 2020. Stepping Outside the Echo Chamber: Is Intellectual Humility Associated with Reduced Political Bias? Poster presented at the Annual Meeting of the Society for Personality and Social Psychology, New Orleans, LA.
  • Bowes, S.M., T.H. Costello, W. Ma, et al. 2020. Conspiratorial Thinking and Personality Traits: Disconfirming the Null Hypothesis. Paper presentation at the University of Miami Conspiracy Theory Conference in Miami, FL.
  • Bunge, M. 1984. What is pseudoscience? Skeptical Inquirer 9(1): 36–47.
  • Bupp, N., and P. Kurtz. 2012. Meaning and Value in a Secular Age: Why Eupraxsophy Matters—The Writings of Paul Kurtz. Amherst, NY: Prometheus Books.
  • Christen, M., M. Alfano, and B. Robinson. 2014. The semantic neighborhood of intellectual humility. Proceedings of the European Conference on Social Intelligence 1283: 40–49.
  • Church, I., and P. Samuelson. 2016. Intellectual Humility: An Introduction to the Philosophy and Science. London: Bloomsbury Academic.
  • Cichocka, A., M. Marchlewska, and A. de Zavala. 2016. Does self-love or self-hate predict conspiracy beliefs? Narcissism, self-esteem, and the endorsement of conspiracy theories. Social Psychological and Personality Science 7(2): 157–166.
  • Cowan, N., E. Adams, S. Bhangal, et al. 2019. Foundations of arrogance: A broad survey and framework for research. Review of General Psychology 23(4): 425–443.
  • Cromwell, O. 1650. Letter 129 – 3rd August 1650 to the General Assembly of the Kirk of Scotland (letter). The Cromwell Association (August 3). Available online at http://www.olivercromwell.org/Letters_and_speeches/letters/Letter_129.pdf.
  • Darwin, C. 1898. The Life and Letters of Charles Darwin. Including an Autobiographical Chapter. Volumes I and II. New York: D. Appleton & Company.
  • Davis, D., K. Rice, S. McElroy, et al. 2016. Distinguishing intellectual humility and general humility. The Journal of Positive Psychology 11(3): 215–224.
  • de Montaigne, Michel. 1595. Michel de Montaigne—The Complete Essays. London: Penguin Classics.
  • DeRose, K., and T. Warfield. 1999. Skepticism: A Contemporary Reader. Oxford, U.K.: Oxford University Press.
  • Diamandis, E., and N. Bouras. 2018. Hubris and sciences. F1000Research 7(133).
  • Firestein, S. 2015. Failure: Why Science Is So Successful. Oxford, U.K.: Oxford University Press.
  • Fischhoff, B. 1975. Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance 1(3): 288–299.
  • Haggard, M., W. Rowatt, J. Leman, et al. 2018. Finding middle ground between intellectual arrogance and intellectual servility: Development and assessment of the limitations-owning intellectual humility scale. Personality and Individual Differences 124: 184–193.
  • Hook, J., J. Farrell, K. Johnson, et al. 2017. Intellectual humility and religious tolerance. The Journal of Positive Psychology 12(1): 29–35.
  • Kahneman, D. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
  • ———. 2017. Reconstruction of a train wreck: How priming research went off the rails (blog entry response). Replicability-Index (February 14). Available online at https://replicationindex.com/2017/02/02/reconstruction-of-a-train-wreck-how-priming-research-went-of-the-rails/comment-page-1/.
  • Krumrei-Mancuso, E., and S. Rouse. 2016. The development and validation of the comprehensive intellectual humility scale. Journal of Personality Assessment 98(2): 209–221.
  • Kurtz, P., and J. Shook. 2010. Exuberant Skepticism. Amherst, NY: Prometheus Books.
  • Leary, M., K. Diebels, E. Davisson, et al. 2017. Cognitive and interpersonal features of intellectual humility. Personality and Social Psychology Bulletin 43(6): 793–813.
  • Lilienfeld, S.O., and S.M. Bowes. 2020. Intellectual humility: Ten unresolved questions. In P. Graf (ed.), The State of the Art in Applied Psychology. New York: Wiley.
  • Lilienfeld, S.O., S.J. Lynn, W. O’Donohue, et al. 2017. Epistemic humility: An overarching educational philosophy for clinical psychology programs. Clinical Psychologist 70(2): 6–14.
  • Longino, H.E. 1990. Science as Social Knowledge: Values and Objectivity in Scientific Inquiry. Princeton, N.J.: Princeton University Press.
  • Lynch, M.P. 2019. Know-It-All Society: Truth and Arrogance in Political Culture. New York: Liveright Publishing.
  • McElroy, S., K. Rice, and D. Davis. 2014. Intellectual humility: Scale development and theoretical elaborations in the context of religious leadership. Journal of Psychology and Theology 42(1): 19–30.
  • McElroy-Heltzel, S., D. Davis, C. DeBlaere, et al. 2019. Embarrassment of riches in the measurement of humility: A critical review of 22 measures. The Journal of Positive Psychology 14(3): 393–404.
  • McFall, R.M. 1996. Making psychology incorruptible. Applied and Preventive Psychology 5(1): 9–15.
  • Meagher, B., J. Leman, J. Bias, et al. 2015. Contrasting self-report and consensus ratings of intellectual humility and arrogance. Journal of Research in Personality 58: 35–45.
  • Molé, P. 2002. Are skeptics cynical? Popular misunderstandings of skeptics. Skeptical Inquirer 26(6): 44–48.
  • Nickerson, R. 1998. Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology 2(2): 175–220.
  • Novella, S., et al. 2018. The Skeptics’ Guide to the Universe: How to Know What’s Really Real in a World Increasingly Full of Fake. New York: Grand Central Publishing.
  • O’Donohue, W. 2013. Clinical Psychology and the Philosophy of Science. Berlin/Heidelberg: Springer Science + Business Media.
  • O’Donohue, W., J. Casas, D. Szoke, et al. 2017. Scientific progress in clinical psychology and epistemically virtuous research. Behavior and Philosophy 45: 45–63.
  • Oreskes, N. 2019. Why Trust Science? Princeton, N.J.: Princeton University Press.
  • Pennycook, G., J. Cheyne, N. Barr, et al. 2015. On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making 10(6): 549–563.
  • Pigliucci, M. 2003. The sin of scientism. Skeptical Inquirer 27(6): 21–22.
  • Plait, P. 2010. Don’t Be a Dick (speech). The James Randi Educational Foundation. Available online at https://vimeo.com/13704095.
  • Popper, K. 1959. The Logic of Scientific Discovery. Abingdon-on-Thames: Routledge.
  • Porter, T., and K. Schumann. 2018. Intellectual humility and openness to the opposing view. Self and Identity 17(2): 139–162.
  • Pritchard, D. 2019. Scepticism: A Very Short Introduction. Oxford, U.K.: Oxford University Press.
  • Pronin, E., D. Yin, and L. Ross. 2002. The bias blind spot: Perceptions of bias in self versus others. Personality and Social Psychology Bulletin 28(3): 369–381.
  • Quine, W., and J.S. Ullian. 1978. The Web of Belief. New York: McGraw-Hill Education.
  • Rodriguez, D., J.N. Hook, J.E. Farrell, et al. 2019. Religious intellectual humility, attitude change, and closeness following religious disagreement. The Journal of Positive Psychology 14(2): 133–140.
  • Rohrer, J., W. Tierney, E. Uhlmann, et al. 2018. Putting the self in self-correction. Available online at https://psyarxiv.com/exmb2/.
  • Sagan, C. 1996. The Demon-Haunted World: Science as a Candle in the Dark. London, U.K.: Headline Book Publishing.
  • Shermer, M. 2002. Skepticism as a virtue. Scientific American 286(4) (April): 37.
  • Tavris, C., and E. Aronson. 2007. Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. San Diego, CA.: Harcourt.
  • Tversky, A., and D. Kahneman. 1971. Belief in the law of small numbers. Psychological Bulletin 76(2): 105–110.
  • Whitcomb, D., H. Battaly, J. Baehr, et al. 2017. Intellectual humility: Owning our limitations. Philosophy and Phenomenological Research 94(3): 509–539.
  • Zagzebski, L. 1996. Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge. Cambridge, U.K.: Cambridge University Press.
  • Zmigrod, L., S. Zmigrod, P.J. Rentfrow, et al. 2019. The psychological roots of intellectual humility: The role of intelligence and cognitive flexibility. Personality and Individual Differences 141: 200–208.
  • Zohar, A., and S. Barzilai. 2013. A review of research on metacognition in science education: Current and future directions. Studies in Science Education 49(2): 121–169.

 


Adele N. Strother, BS, is the lab manager for the Lilienfeld Laboratory at Emory University. Her research interests include intellectual humility, cognitive bias, unusual beliefs, pseudoscience, morality, interpersonal communication, and evidence-based practice.

Shauna M. Bowes, MS, is an advanced graduate student in the clinical psychology doctoral program at Emory University. Her areas of professional interest include abnormal and normal personality traits, cognitive styles, intellectual humility, rationality, and cognitive bias management strategies.

Thomas H. Costello, MS, is an advanced graduate student in psychology at Emory University. His interests include the implications of personality for political attitudes and behaviors, differences and similarities in authoritarianism across the political Right and Left, and psychopathic personality.


Scott O. Lilienfeld, PhD, is a professor of psychology at Emory University. He is coeditor of the book Science and Pseudoscience in Clinical Psychology, Second Edition (2014) and author of several other books about science and pseudoscience in psychology.