
Late in life, atheist philosopher Bertrand Russell received this challenge: if, after death, he found himself face to face with God, what would he say? Russell replied, “I probably would ask, ‘Sir, why did you not give me better evidence?’”¹

Theists contend that the evidence for God is both present and sufficient, but that bias can fog even brilliant minds like Russell’s. Bias might indeed explain Russell’s atheism, but is the accusation merely an ad hominem counterargument? We often assume that human beliefs arise from the application of reason to facts and experience; that we are, in effect, Homo rationalis (rational man). If Russell remained objectively rational after considering all the evidence, then his defense is valid: his unbelief would signify failure on God’s part.

Homo rationalis is widely embraced and resonates with our self-perception. We naturally assume our own beliefs are grounded in facts, reason, and experience.

Social scientists in the 1970s broadly accepted two ideas about human nature. First, people are generally rational, and their thinking is normally sound. Second, emotions such as fear, affection, and hatred explain most of the occasions on which people depart from rationality.²

However, the Christian Scriptures reject the doctrine of Homo rationalis, instead predicting that people would refuse to believe in the face of overwhelming evidence. In a parable recorded in Luke 16, Jesus says, “If they do not hear Moses and the prophets, neither will they be persuaded though one rise from the dead.” And in Romans 1:20, Paul wrote, “For since the creation of the world His invisible attributes are clearly seen, being understood by the things that are made, even His eternal power and Godhead, so that they are without excuse.”

In recent decades, researchers from a range of disciplines have investigated the nature of human belief. The results of this research enable us to test which is more correct, Homo rationalis or the biblical perspective.

Finding #1: Relying on Heuristics

Humans routinely sift through mountains of information to make even simple decisions. Ideally, a person would take accurate, complete data and apply reason to reach a logical and correct conclusion. Reality is not so cooperative: we often lack both the time and the desire for exhaustive analysis, even when complete information is available. Instead, we make the best decisions we can from imperfect, incomplete data.

Heuristics are the mental shortcuts people use to decide as efficiently as possible with the information on hand. We all use them, several times a day, and they are often quite helpful. If you encounter a shadowy figure in a dark alley holding something shaped like a gun, the “representativeness” heuristic recommends avoidance. Careful logical analysis would be useless until you had established beyond all doubt that (1) yes, it was a gun, and (2) the bearer had malicious intent—by which time it could be too late.

Unfortunately, heuristics are often wrong, and they are often used as a substitute for thoughtful reflection. In his book Thinking, Fast and Slow, renowned psychologist Daniel Kahneman offers a comprehensive portrayal of how our minds work and how an expanding catalog of cognitive biases and faulty heuristics routinely and predictably leads us astray. Heuristics are automatic, quick, and effortless; Kahneman labels this “System 1” thinking. Thoughtful reflection (“System 2” thinking) yields better decisions at the cost of time and effort. Kahneman and his collaborators found that our minds are naturally lazy, so we rely on System 1 as much as possible: “System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.”³

Cognitive biases are tendencies deeply embedded in our subconscious that lead us to err in predictable ways. Almost two hundred have been described in the literature. Many serve to enhance our self-image or minimize emotional tension. For example, confirmation bias is the tendency to assign greater significance to evidence that supports our preexisting opinion. Heuristics and biases are closely intertwined: heuristics are a shortcut to decision making but are neutral regarding outcome, while biases push those decisions in certain (somewhat) predictable directions. Having invested a lifetime researching heuristics and biases, Kahneman concluded that “the human mind is not bound to reality.”⁴

Finding #2: Emotional Influences

It would be a sorry state of affairs if we regarded tragedy and suffering with cold indifference. But to what extent do emotions determine our beliefs? Are they merely an occasional exception, or do they undermine the validity of Homo rationalis? In recent decades a clear picture has emerged. It began with the observation that patients with specific brain injuries lost all capacity for emotion. The surprising consequence was that such patients also lost the ability to make decisions: they could analyze a problem all day long without ever forming a conclusion. Dr. Antoine Bechara summarized the outcome of this research in 2004:⁵

“The studies of decision-making in neurological patients who can no longer process emotional information normally suggest that people make judgments not only by evaluating the consequences and their probability of occurring, but also and even sometimes primarily at a gut or emotional level.” (emphasis added)

Now, this is far from saying that every decision is purely or primarily emotional, or that emotions inevitably lead to flawed conclusions. But when it comes to objective analysis or honest truth-seeking, emotions may not merely impede our progress; they can propel us right off the cliff. Consider the emotional fervor over certain political, social, religious, and even scientific issues. It is easy to believe the issues inflame our passions; more often it is our passions that inflame the issues. Despite the evidence, few will admit to thinking emotionally rather than logically. Most likely we don’t even know we’re doing it.

In 2015, Jennifer Lerner of Harvard University reviewed 35 years of research on the role of emotions in judgment and decision making:⁶

“The research reveals that emotions constitute potent, pervasive, predictable, sometimes harmful and sometimes beneficial drivers of decision making. Across different domains, important regularities appear in the mechanisms through which emotions influence judgments and choices.”

Finding #3: Social Influences

If Homo rationalis existed, then we could completely trust expert opinion. But there are two obvious problems. First, experts often disagree. Second, recent history shows that experts sometimes fail spectacularly. The bandwagon effect inclines people to conform their opinions to the perceived majority position. This may occur either out of a desire for conformity and social acceptance or because one sincerely (perhaps naively) trusts the wisdom of the majority.

When formulating an opinion on a complex subject, people rarely rely solely on their own analysis. Professor B, for example, will treat Professor A’s opinion as additional data, which sometimes prompts Professor B to reach the opposite conclusion from the one he would have reached independently. Professor C then comes along and, rather than seeing disagreement between Professors A and B, sees unanimity. If she trusts her colleagues, the pull toward agreement grows stronger still. This is the mechanism by which information cascades develop. In an information cascade, the early deciders have a disproportionate influence over equally qualified experts who arrive later. Once a cascade has occurred, the majority viewpoint of 100 experts may be the complete opposite of what those same 100 experts would conclude if each analyzed the data independently, blinded to the opinions of their colleagues. The short simulation below illustrates how easily this can happen.

Finding #4: Intelligence and Religiosity

There is no evidence that more intelligent or better educated individuals transcend their own emotions and biases or are less susceptible to peer pressure. In Kahneman’s collaborative research, it didn’t matter whether the subjects were average high school students or Ivy League undergrads. Highly intelligent and educated people are more confident,⁷ making them less likely to doubt their opinions or change their minds. Rather than pursuing truth wherever it may be found, smarter people channel their energy toward arguing and reinforcing their preexisting opinions.⁸

Belief Formation Research Supports Scripture

While Bertrand Russell and many others may attribute unbelief to a lack of evidence, the Bible declares that belief is a choice. Research on human decision making has demonstrated that we are heavily influenced by nonrational factors that can lead to faulty decisions and incorrect belief (or unbelief). It seems the Bible’s view is well supported. To paraphrase Solzhenitsyn, the dividing line between fact and fancy cuts through the mind of every person, believer and skeptic alike.

(Jointly posted at Reasons to Believe on June 14, 2019.)

Endnotes
  1. Leo Rosten, “Bertrand Russell and God: A Memoir,” The Saturday Review, February 23, 1974, 26.
  2. Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011), 8.
  3. Kahneman, Thinking, Fast and Slow, 81.
  4. Kahneman, Thinking, Fast and Slow, 365.
  5. Antoine Bechara, “The Role of Emotion in Decision-Making: Evidence from Neurological Patients with Orbitofrontal Damage,” Brain and Cognition 55, no. 1 (June 2004): 30–40, doi:10.1016/j.bandc.2003.04.001.
  6. Jennifer Lerner et al., “Emotion and Decision Making,” Annual Review of Psychology 66 (January 2015): 799–823, doi:10.1146/annurev-psych-010213-115043.
  7. David G. Robson, The Intelligence Trap: Why Smart People Make Dumb Mistakes (London: Hodder and Stoughton, 2019).
  8. Dan M. Kahan, “Ideology, Motivated Reasoning, and Cognitive Reflection: An Experimental Study,” Judgment and Decision Making 8, no. 4 (July 2013): 407–24.

About the Author

Steven Willing MD, MBA

Dr. Steven Willing received his medical degree from the Medical College of Georgia, completed an internship in pediatrics at the University of Virginia, and went on to a residency in diagnostic radiology at the Medical College of Georgia and a fellowship in neuroradiology at the University of Alabama at Birmingham. Dr. Willing spent 20 years in academic medicine at the University of Louisville, the University of Alabama at Birmingham, and Indiana University-Purdue University Indianapolis (IUPUI). He also earned an MBA from the University of Alabama at Birmingham in 1997.

During his academic career, Dr. Willing published over 50 papers in radiology, informatics, and management. He is the author of Atlas of Neuroradiology, published by W. B. Saunders in 1995.

Now retired from clinical practice, Dr. Willing serves as a radiology consultant to Tenwek Hospital in Bomet, Kenya, both remotely and on-site. He is presently the Alabama State Director for the American Academy for Medical Ethics, an adjunct Professor of Divinity at Regent University, and a Visiting Scholar for Reasons to Believe.

