Tag Archive : apologetics


Most of us think we’re smarter than most of us! In a recent large survey, 65% of Americans rated themselves more intelligent than average.[1] [Sounds of unrestrained laughter, barking, and howling – the Spaniel and pals]. Believing we’re very smart, we assume we’re usually right. But is that confidence warranted?

“Do you see a man wise in his own eyes?
There is more hope for a fool than for him.”

Proverbs 26:12

In the course of my medical career, I have known brilliant physicians of many different faiths. Among the most committed adherents, it is safe to say that all were quite sure regarding the truth of their particular faith. But each tradition contradicts all others in one or more matters. They could all be wrong in part or in whole; they cannot all be right. Logically, we must conclude that not only is it possible to be brilliant, certain, and wrong, but that it is common.

In the previous post, we looked at several nonrational factors that can lead to false beliefs: heuristics and biases, emotions, and social influences. We noted that education and intelligence are unreliable predictors of rational thinking.

Yet false beliefs are but one side of the coin. The other side, of equal or even greater importance, is the level of certainty attached to those beliefs. Confidence is our estimate of the probability that we are correct. It is a belief concerning our belief—metacognition, in psychological parlance.

The Illusion of Certainty

Ideally, our confidence should be roughly proportional to the mathematical probability that we are correct. In other words, if we are 90% certain, we should be right 90% of the time. But studies repeatedly show that our degree of certainty consistently exceeds our accuracy. For example, people who are “99% sure” are wrong 50% of the time. This disparity both defines and demonstrates the phenomenon of overconfidence. Our unwarranted certainty could be blamed on misplaced trust; that is, on placing too much credence in an unreliable source. However, since we tend to favor sources we already agree with (confirmation bias), excess certainty usually reflects an excessive faith in ourselves (pride).
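
The calibration idea can be made concrete with a toy calculation. The sketch below uses invented data (the numbers are illustrative assumptions, not figures from any study): it groups answers by stated confidence and compares each group's stated confidence with its actual hit rate. The gap is the overconfidence the studies describe.

```python
# Toy calibration check (made-up data, for illustration only).
# Each tuple: (stated confidence, whether the answer was actually correct).
from collections import defaultdict

answers = [(0.90, True), (0.90, True), (0.90, False), (0.90, False),
           (0.99, True), (0.99, False), (0.99, False), (0.99, False)]

buckets = defaultdict(list)
for confidence, correct in answers:
    buckets[confidence].append(correct)

# A well-calibrated judge would show a gap near zero in every bucket.
for confidence in sorted(buckets):
    accuracy = sum(buckets[confidence]) / len(buckets[confidence])
    gap = confidence - accuracy          # positive gap = overconfidence
    print(f"stated {confidence:.0%}, actual {accuracy:.0%}, gap {gap:+.0%}")
```

With these invented numbers, the "99% sure" bucket is right only 25% of the time, a gap of 74 points: the pattern, if not the exact figures, that the overconfidence literature reports.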

In his 2008 tome On Being Certain, neuroscientist Robert Burton argued that certainty is not a state of reason but of feeling, influenced by unconscious physiologic processes.[2] Certainty is mostly illusion, Burton argued, and there is considerable evidence supporting this hypothesis.

Overconfidence has been demonstrated and measured in many domains besides intelligence: driving ability, economic forecasting, and medicine, for example. In almost every domain studied to date, significant majorities express a confidence in their abilities far beyond what is warranted, or even mathematically possible. [“Like my distant cousin who somehow still thinks he can catch a car” – the Spaniel].

Sometimes, the least competent people are the most confident, whereas the most skilled and knowledgeable people slightly underestimate their ability. This phenomenon has been dubbed the “Dunning-Kruger” effect, after the original researchers whose landmark paper, “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments,” not only opened a new avenue of research but has prompted many a smile from those who sensed its ring of truth.[3]

The Intelligence Trap

Highly intelligent people constitute another group with an elevated risk of overconfidence. Intelligent people know they are intelligent, making them less likely to doubt themselves, respect other opinions, or change their minds. They are also every bit as attuned, if not more so, to social influences that motivate belief.[4]

Highly intelligent people can and do believe crazy things. Sir Arthur Conan Doyle, creator of the ruthlessly logical Sherlock Holmes, was a devout believer in spiritualism and fairies. [“I once knew a Border Collie who claimed he’d been abducted by penguins” – the Spaniel]. Albert Einstein expressed a naïve and unshakeable optimism concerning Lenin, Stalin, and the Soviet Union:

I honor Lenin as a man who completely sacrificed himself and devoted all his energy to the realization of social justice. I do not consider his methods practical, but one thing is certain: men of his type are the guardians and restorers of humanity.[5]

In The Intelligence Trap, science writer David Robson informs us that:

  • College graduates are more likely than nongraduates to believe in ESP and psychic healing
  • People with IQs over 140 are more likely to max out their credit cards
  • High IQ individuals consume more alcohol and are more likely to smoke or take illegal drugs[6]

While the popular perception is that intelligent people are naturally skeptical, in fact all humans are believing machines. We drift with the cultural tides, embracing popular ideas on the flimsiest of evidence, then clutch those beliefs tenaciously to protect our egos, strut our virtue, justify our actions, and advertise loyalty to our in-group. This view may seem cynical, but it is well-validated.

There are many strategies for overcoming the “intelligence trap.” They include cognitive reflection, actively open-minded thinking, curiosity, emotional awareness and regulation, having a growth mindset, distrusting the herd, and consistent skepticism. However, one habit of mind undergirds all the others: an attitude of intellectual humility.

Knowing Our Limits

Intellectual humility could be defined as merely having a realistic view of our mental processing; viz., that our knowledge is inevitably limited, our thinking is unavoidably biased, and that even the smartest among us are prone to error.[7]

In recent decades, psychology has embraced a model of personality based on the “big five”: openness, conscientiousness, extraversion, agreeableness, and neuroticism. The more recent version adds a sixth measure: HH, for honesty-humility. Researchers have demonstrated that HH shows a consistent negative correlation with all three elements of the “dark triad”: psychopathy, narcissism, and Machiavellianism.[8] [“We just call that 'being a cat'” – the Spaniel]. On the other hand, HH correlates positively with healthier traits such as cooperation and self-control.

In a 2018 paper from UC Davis, researchers showed that intellectual humility is associated with openness during disagreement, and that promoting a growth mindset served to enhance intellectual humility.[9] Intellectual humility also helps to reduce polarization and conflict.[10] In one study, it was even superior to general intelligence in predicting academic achievement.[11]

Research Affirms Scripture

According to most theologians in the Judeo-Christian tradition, pride is the deadliest sin. Humility is its opposite. It may be tempting to assume this peril concerns only the skeptic, but it’s not just about “them.” It’s about all of us. And the greater the visibility or the higher one’s position in Christian circles, the greater the problem is likely to be.

“Do not be wise in your own conceits.”

Romans 12:16, KJV

Scripture repeatedly warns against unwarranted confidence in our own wisdom. Decades of research in cognitive science show this to be a common human problem, one that only worsens with intelligence. The antidote begins with intellectual humility, an ancient virtue whose wisdom has been validated by the latest empirical data.

Article also posted (without canine commentary) at Reasons to Believe on August 9, 2018

Endnotes

1. Patrick R. Heck, Daniel J. Simons, and Christopher F. Chabris, “65% of Americans Believe They Are above Average in Intelligence: Results of Two Nationally Representative Surveys,” PLoS ONE 13, no. 7 (July 3, 2018): e0200103, doi:10.1371/journal.pone.0200103.

2. Robert Burton, On Being Certain: Believing You Are Right Even When You’re Not (New York: St. Martin’s Press, 2008).

3. Justin Kruger and David Dunning, “Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments,” Journal of Personality and Social Psychology 77, no. 6 (December 1999): 1121–34, doi:10.1037//0022-3514.77.6.1121.

4. Dan M. Kahan, “Ideology, Motivated Reasoning, and Cognitive Reflection,” Judgment and Decision Making 8, no. 4 (July 2013): 407–24.

5. Lewis Samuel Feuer, Einstein and the Generations of Science, 2nd ed. (New Brunswick, NJ: Transaction Publishers, 1989), 25.

6. David Robson, The Intelligence Trap: Why Smart People Make Dumb Mistakes (New York: W. W. Norton & Company, 2019).

7. Peter C. Hill et al., “A Few Good Measures: Colonel Jessup and Humility,” in Everett L. Worthington Jr., Don E. Davis, and Joshua N. Hook, eds., Handbook of Humility: Theory, Research, and Implications (New York: Routledge, 2017).

8. Joseph Leman et al., “Personality Predictors and Correlates of Humility,” in Worthington, Davis, and Hook, eds., Handbook of Humility.

9. Tenelle Porter and Karina Schumann, “Intellectual Humility and Openness to the Opposing View,” Self and Identity 17, no. 2 (August 9, 2017): 139–62, doi:10.1080/15298868.2017.1361861.

10. Porter and Schumann, “Intellectual Humility.”

11. Bradley P. Owens, Michael D. Johnson, and Terence R. Mitchell, “Expressed Humility in Organizations: Implications for Performance, Teams, and Leadership,” Organization Science 24, no. 5 (February 12, 2013): 1517–38, doi:10.1287/orsc.1120.0795.



In this podcast recorded at Reasons to Believe in May 2019, Philosopher-Theologian Ken Samples and I discuss the nature of belief, pride, humility, and the life of the mind.

Topics:
- My personal journey from early atheism to Christian faith
- Are people rational?
- The role of emotions in belief formation
- Intellectual pride and humility
- "The Intelligence Trap" by David Robson
- "The God Delusion" by Richard Dawkins
- Henry IV and Pope Gregory VII
- Tenwek Hospital
- "As an atheist, I truly believe Africa needs God," Matthew Parris, The Sunday Times, December 27, 2008
- Responding to skeptics

Late in life, atheist philosopher Bertrand Russell received this challenge: if, after death, he found himself face to face with God, what would he say? Russell replied, “I probably would ask, ‘Sir, why did you not give me better evidence?’”¹

Theists contend that though evidence for God is both present and sufficient, bias can fog even brilliant minds like Russell’s. It is possible that bias explains Russell’s atheism, but is the accusation of bias merely an ad hominem counterargument? We often assume that human beliefs arise from the application of reason to facts and experience; that we are, in effect, Homo rationalis (rational man). If Russell were objectively rational after considering all the evidence, then his defense is valid, and his unbelief would signify failure on God’s part.

Homo rationalis is widely embraced and resonates with our self-perception. We always think our own beliefs are based on facts, reason, and experience.

Social scientists in the 1970s broadly accepted two ideas about human nature. First, people are generally rational, and their thinking is normally sound. Second, emotions such as fear, affection, and hatred explain most of the occasions on which people depart from rationality.²

However, the Christian Scriptures reject the doctrine of Homo rationalis, instead predicting that people would refuse to believe in the face of overwhelming evidence. In a parable recorded in Luke 16, Jesus says, “If they do not hear Moses and the prophets, neither will they be persuaded though one rise from the dead.” And in Romans 1:21, Paul writes, “Because, although they knew God, they did not glorify Him as God, nor were thankful, but became futile in their thoughts, and their foolish hearts were darkened.”

In recent decades, researchers from a range of disciplines have investigated the nature of human belief. The results of this research enable us to test which is more correct, Homo rationalis or the biblical perspective.

Finding #1: Relying on Heuristics

Humans routinely sift through mountains of information to make even simple decisions. Ideally, a person would take accurate, complete data and apply reason to reach a logical and correct conclusion. Reality is not so cooperative; we often lack both the time and the desire for exhaustive analysis, even when perfect information is available. Instead, we make the best possible decisions based on imperfect, incomplete data.

Heuristics are those mental shortcuts people use for deciding as efficiently as possible given the information on hand. We all use them, several times a day. Heuristics are quite helpful, actually. If you encounter a shadowy figure in a dark alley with something shaped like a gun in his hand, the “representativeness” heuristic would recommend avoidance. Logic would be useless until you determined beyond all doubt that (1) yes, it was a gun, and (2) the bearer had malicious intent—which could be too late.

Unfortunately, heuristics are often wrong and used as a substitute for thoughtful reflection. In his book Thinking, Fast and Slow, renowned psychologist Daniel Kahneman offers a comprehensive portrayal of how our minds work and how an expanding catalog of cognitive biases and faulty heuristics routinely and predictably lead us astray. Heuristics are automatic, quick, and effortless. Kahneman labels this “System 1” thinking. Thoughtful reflection (“System 2” thinking) yields better decisions at the cost of time and effort. What Kahneman and his collaborators found was that our minds are naturally lazy, so we rely on System 1 as much as possible: “System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.”³

Cognitive biases are tendencies deeply embedded in our subconscious that lead us to err in predictable ways. Almost two hundred have been described in the literature. Many serve to enhance our own self-image or minimize emotional tension. For example, confirmation bias is the tendency to assign greater significance to evidence that supports our preexisting opinion. Heuristics and biases are closely intertwined. One way to understand the connection is that heuristics represent a shortcut to decision making, but are neutral regarding outcome. Biases push those decisions in certain (somewhat) predictable directions. Having invested a lifetime researching heuristics and biases, Kahneman concluded that “the human mind is not bound to reality.”⁴

Finding #2: Emotional Influences

It would be a sorry state of affairs if we regarded tragedy and suffering with cold indifference. But to what extent do emotions determine our beliefs? Is it merely an occasional exception or do emotions undermine the validity of Homo rationalis? In recent decades, a clear picture has emerged. It began with the observation that patients with specific brain injuries lost all capacity for emotion. The surprising consequence, though, was that such patients also lost the ability to make decisions. They could analyze a problem all day long without ever forming a conclusion. Dr. Antoine Bechara summarized the outcome of this research in 2004:⁵

"The studies of decision-making in neurological patients who can no longer process emotional information normally suggest that people make judgments not only by evaluating the consequences and their probability of occurring, but also and even sometimes primarily at a gut or emotional level." (emphasis added)

Now, this is far from saying that every decision is purely or primarily emotional nor that emotions inevitably lead to flawed conclusions. But when it comes to objective analysis or honest truth-seeking, emotions may not merely impede our progress; they can propel us right off the cliff. Consider the emotional fervor over certain political, social, religious, and even scientific issues. It is easy to believe the issues inflame our passion; more often it is our passions that inflame the issue. Despite the evidence, few will admit to thinking emotionally rather than logically. Most likely we don’t even know we’re doing it.

In 2015, Jennifer Lerner of Harvard University reviewed 35 years of research on the role of emotions in judgment and decision making:⁶

"The research reveals that emotions constitute potent, pervasive, predictable, sometimes harmful and sometimes beneficial drivers of decision making. Across different domains, important regularities appear in the mechanisms through which emotions influence judgments and choices."

Finding #3: Social influences

If Homo rationalis existed, then we could completely trust expert opinion. But there are two obvious problems. First, experts often disagree. Second, recent history shows that experts sometimes fail spectacularly. The bandwagon effect inclines people to conform their opinions to the perceived majority position. This may occur either to gain social acceptance through conformity, or because one sincerely (perhaps naively) trusts the wisdom of the majority.

When formulating an opinion on a complex subject, people rarely rely solely on their own analysis. Professor B, for example, will treat Professor A’s published opinion as additional data, which can sometimes push Professor B to the opposite conclusion from the one he would have reached independently. Professor C then comes along and, rather than seeing disagreement between Professors A and B, sees unanimity. If she trusts her colleagues, the inclination toward agreement grows ever stronger. This is the mechanism by which information cascades develop: the earliest deciders exert disproportionate influence over equally qualified experts who arrive later. Once a cascade has occurred, the majority viewpoint of 100 experts may be the complete opposite of what those same 100 experts would have concluded analyzing the data independently, blinded to the opinions of their colleagues.
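
The cascade mechanism is easy to simulate. The toy model below is a standard illustration, not anything from the research cited here; the 70% signal accuracy, the panel size, and the "defer once the crowd outweighs one signal" rule are all my assumptions. Each expert receives a private signal that is right 70% of the time, but goes with the running majority of earlier verdicts whenever that majority outweighs a single signal.

```python
# Toy information-cascade simulation (illustrative assumptions: 70%-accurate
# private signals; an expert defers to predecessors once their vote margin
# exceeds the weight of one private signal).
import random

def run_panel(n_experts, signal_accuracy=0.7, follow_crowd=True, seed=1):
    random.seed(seed)
    truth = True                              # the (unknown) right answer
    verdicts = []
    for _ in range(n_experts):
        private = truth if random.random() < signal_accuracy else not truth
        margin = sum(1 if v else -1 for v in verdicts)   # public vote margin
        if follow_crowd and abs(margin) > 1:
            verdict = margin > 0              # crowd outweighs one signal
        else:
            verdict = private                 # rely on one's own signal
        verdicts.append(verdict)
    return sum(verdicts)                      # number of correct verdicts

# Sequential, crowd-following experts lock in early and vote near-unanimously;
# the same experts judging independently land near the 70% base rate.
print(run_panel(100, follow_crowd=True))
print(run_panel(100, follow_crowd=False))
```

Depending on the first few signals, the crowd-following panel converges on near-unanimity for one answer or the other, while the independent panel hovers near its individual accuracy. Unanimity, in other words, can measure the cascade rather than the evidence.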

Finding #4: Intelligence and Religiosity

There is no evidence that more intelligent or better educated individuals transcend their own emotions and biases or are less susceptible to peer pressure. In Kahneman’s collaborative research, it didn’t matter whether the subjects were average high school students or Ivy League undergrads. Highly intelligent and educated people are more confident,⁷ making them less likely to doubt their opinions or change their minds. Rather than pursuing truth wherever it may be found, smarter people channel their energy toward arguing and reinforcing their preexisting opinions.⁸

Belief Formation Research Supports Scripture

While Bertrand Russell, and many others, may attribute unbelief to lack of evidence, the Bible declares that belief is a choice. Research on human decision making has demonstrated that we are heavily influenced by nonrational factors that can lead to faulty decisions and incorrect belief (or unbelief). It seems the Bible’s view is well supported. To paraphrase Solzhenitsyn, the dividing line between fact and fancy cuts through the mind of every person, believer and skeptic alike.

(Jointly posted at Reasons to Believe on June 14, 2019.)

Endnotes
  1. Leo Rosten, “Bertrand Russell and God: A Memoir,” The Saturday Review, February 23, 1974, 26.
  2. Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011), 8.
  3. Kahneman, Thinking, Fast and Slow, 81.
  4. Kahneman, Thinking, Fast and Slow, 365.
  5. Antoine Bechara, “The Role of Emotion in Decision-Making: Evidence from Neurological Patients with Orbitofrontal Damage,” Brain and Cognition 55, no. 1 (June 2004): 30–40, doi:10.1016/j.bandc.2003.04.001.
  6. Jennifer Lerner et al., “Emotion and Decision Making,” Annual Review of Psychology 66 (January 2015): 799–823, doi:10.1146/annurev-psych-010213-115043.
  7. David G. Robson, The Intelligence Trap: Why Smart People Make Dumb Mistakes (London: Hodder and Stoughton, 2019).
  8. Dan M. Kahan, “Ideology, Motivated Reasoning, and Cognitive Reflection: An Experimental Study,” Judgment and Decision Making 8, no. 4 (July 2013): 407–24.

Most of us think we are better than most of us. As a consequence, we are sometimes tempted to advertise our moral superiority through public expression. This is known as “moral grandstanding.” There is nothing wrong with issuing public moral pronouncements; it is not merely a right, but sometimes a duty. The issue here is motive. It is grandstanding when the main motive is to flaunt our virtue. (With human behavior, more than one motive is usually in play.)

One of the many drawbacks of moral grandstanding is that it can lead to “ramping up.” If a conversation partner makes a moral statement as virtuous as our own position, pride may push us to a more extreme position in order to maintain our moral superiority. Ramping up is one of many prideful behaviors that contribute to extreme polarization on issues.

Both moral grandstanding and ramping up are pervasive in social and political spheres. The phenomenon is often called “virtue signaling,” but that expression is a misnomer with a life of its own. This is not an invitation to point fingers. When we accuse specific people of moral grandstanding, we cross a line: we cannot know the sincerity of their motives. It is like accusing a person of lying when, for all we know, they actually believe what they are saying. In the broader social context, though, we can identify it, criticize it, and ignore it - but above all, we should avoid doing it ourselves.

Naturally, being the spiritual people we are, none of us would stoop to moral grandstanding, would we? Christians wouldn’t try to outdo one another in their spirituality, would they?

The “Dilbert” comic strip by Scott Adams features an intermittent character named “Topper.” Topper has a single distinguishing trait: whenever any other character makes a statement in his presence, he must counter with a competing claim that is more impressive. It’s always something outrageous, or it wouldn’t be funny. But the point of satire is to poke fun at things real people do. Have you ever played Christian Topper? It goes like this:

Sam: How was your trip?

Loretta: What a blessing! We were in Haiti for two weeks.

Topper: I spent 37 years in Borneo living off grass and beetles!

Loretta: We did get a stomach bug. I lost 3 pounds.

Topper: I survived malaria, dengue fever, Zika, and Ebola!

Loretta: But we prayed for a quick recovery and were soon back on our feet.

Topper: I was eaten by cannibals and prayed my way back to life!

Or closer to home:

Bob: We watched “Harry Potter” last night and had a good conversation with our kids this morning about the themes of courage and sacrifice.

Leah: We won’t let our kids watch “Harry Potter”. Didn’t you know that because of those stories millions of Christian kids turned to Satanism?

Topper: I’ve never watched a movie my entire life! In fact, I walk everywhere with my eyes shut so I can’t even see an advertisement!

Now, most Christians who eschewed the Harry Potter franchise probably were quite sincere in their belief that it was harmful (even if a disturbing number fell for the hoax about witchcraft and Satanism, mistaking parody for fact). Making a public display over it is more problematic. Moral grandstanding comes so naturally to us, it can be unconscious and quite unintentional.

Christians play this game even with the Bible. At one extreme some compete to see how simplistically they can interpret it without hitting 10 on the nuttiness scale. (Snakes, anyone?) At the other extreme they compete to see how much of the Bible they can jettison and still pretend to be Christian. It’s hard to grandstand from inside the mainstream where no one notices or pays much attention to you. Unfortunately, if enough people jump in, the ramping up effect can either move the mainstream, split it, or (usually) both.

Jesus knew moral grandstanding when he saw it, and warned his disciples:

"Take heed that you do not do your charitable deeds before men, to be seen by them. Otherwise you have no reward from your Father in heaven. Therefore, when you do a charitable deed, do not sound a trumpet before you as the hypocrites do in the synagogues and in the streets, that they may have glory from men. Assuredly, I say to you, they have their reward. But when you do a charitable deed, do not let your left hand know what your right hand is doing, that your charitable deed may be in secret; and your Father who sees in secret will Himself reward you openly." (Matthew 6:1-4, NKJV)

We play Christian Topper with an endless list of theological and moral issues, but it’s a poor Christian witness. Better we follow the admonition of our Lord and let our character speak - softly:

"Let your light so shine before men, that they may see your good works and glorify your Father in heaven." (Matthew 5:16 NKJV).

Excerpted from the upcoming book "Superbia" by Steven Willing, MD

Lost in Math

October 30, 2018 | book reviews

Lost in Math: How Beauty Leads Physics Astray 
by Sabine Hossenfelder. Basic Books, June 2018. Reviewed by Steven Willing, MD

“Physical laws should have mathematical beauty.” - Paul Dirac, Nobel Laureate*
Way back in 1973, the world of theoretical physics reached a dead end. That year marked the last successful prediction of any elementary particles - the top and bottom quarks - which were experimentally verified in 1995 and 1977, respectively. (The Higgs boson, finally detected in 2012, had been predicted in the 1960s.) Since then, there has been no successful prediction that would supersede the standard model.

In the intervening decades, dozens of additional particles have been predicted; not one has been found. String theory evolved and gained wide acceptance, without a shred of experimental verification. Proton decay has been sought but never observed. Dark matter remains dark to our own investigations. The search for a grand unified theory, based on the holy grail of supersymmetry, has gone nowhere. Eighty years of effort have failed to combine general relativity with the standard model. More exotic concepts - the multiverse, wormholes, extra dimensions, mini black holes - have eluded observation and may never be testable. Some ideas are untestable, even in theory.

Not that we are lacking in achievement. What physicists call the “standard model” - in which all matter and forces except for gravity are accounted for by 25 elementary particles and forces - has been wildly successful both in experimental validation and in its predictive power. The same is true of quantum theory. There’s only one problem. Physicists hate them both. Nature, it seems, is too unnatural for their tastes.

The standard model has been denigrated as “ugly and contrived” (Michio Kaku), “ugly and ad hoc” (Stephen Hawking), “ugly and baroque” (Brian Greene), with “the air of unfinished business” (Paul Davies). What troubles them so? Fine-tuning. Too many improbable coincidences. Too hard to understand or explain. Quantum mechanics is “magic.” Too many arbitrary constants. (In the standard model, there are at least 19 unique constants that cannot be predicted by the model; they can only be determined by measurement.)

The mass of the Higgs boson serves as a case in point. Its mass depends on the contribution from quantum fluctuations multiplied by the fermion/boson sum. Quantum fluctuations contribute an amount to the mass of the Higgs boson some 10¹⁵ times greater than what is measured. To achieve the measured mass, the quantum fluctuation effect must be offset to within one part in 10¹⁵ - a cancellation precise to some fourteen digits. In the eyes of physicists, such fine-tuning is not “natural.” It is an improbable coincidence. Fine-tuning is “a badge of shame” (Lisa Randall), “a sickness” (Howard Baer). It seems to demand an explanation. It is “ugly.” There are other trouble spots of fine-tuning: the cosmological constant, the “strong CP problem,” and the great disparity between gravity and the other forces (the “hierarchy problem”).
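
The arithmetic of that cancellation is easy to illustrate. The numbers in the sketch below are schematic stand-ins, not actual Higgs-sector values: if one term is about 10¹⁵ times the net result, the offsetting term has to match it through roughly the first fifteen significant digits.

```python
# Schematic illustration of the fine-tuning arithmetic (stand-in numbers,
# not actual physics values).
quantum_contribution = 1.0e15        # huge raw contribution
measured_mass = 1.0                  # tiny net result, in the same units
offset = quantum_contribution - measured_mass

# The offset agrees with the raw contribution to one part in 10**15;
# i.e., their leading ~15 significant digits coincide.
relative_mismatch = (quantum_contribution - offset) / quantum_contribution
print(relative_mismatch)             # about 1e-15
```

Change the fifteenth digit of the offset even slightly and the net result changes by a factor comparable to itself: that sensitivity is what physicists mean by fine-tuning.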

Theoretical physicist Sabine Hossenfelder suggests one reason physicists have hit a wall: in their philosophical quest for “Beauty,” the world of theoretical physics has gotten “Lost in Math.” The popular perception of a scientist is of one driven by cold, hard, objective, unswerving logic. Despite the stereotype, the theoretical physicists interviewed and cited by Hossenfelder - all leaders in their field - seek, hope for, even insist upon solutions that are aesthetically satisfying. To them, the ultimate explanation for everything should reveal elegance, naturalness, symmetry - all shrouded in mathematical beauty. Yet there is a danger in this approach. If our present laws of nature were not beautiful, would we ever have found them? Surely an ugly explanation beats no explanation at all. If a more fundamental theory is not “beautiful,” will we fail to find it? Or even look for it? What if ultimate reality is “ugly”?

There are other barriers to progress. As high energy experiments from the Large Hadron Collider eliminate from consideration various testable hypotheses, successive hypotheses must assume even higher energy levels which may not be testable, ever: 

If we wanted to directly reach Planckian energies, we’d need a particle collider about the size of the Milky Way. Or if we wanted to measure a quantum of the gravitational field - a graviton - the detector would have to be the size of Jupiter….Clearly, these aren’t experiments we’ll get funded anytime soon.

To escape the current predicament, there are calls to abandon the scientific method by eliminating the requirement of experimental verification.  Physicist, philosopher, and string theory proponent Richard Dawid is advocating “non-empirical theory assessment”. With declining prospects of empirical validation, Dawid concludes that “the scientific method must be amended so that hypotheses can be evaluated on purely theoretical grounds.” But “if we can’t test it, is it science?” asks Hossenfelder. 

Hossenfelder is at various times lively, comic, and probing. She quips, “Theoretical physicists used to explain what was observed. Now they try to explain why they can’t explain what was not observed…There are many ways to not explain something.”

In her journey through the rarefied world of particle physicists and cosmologists, Hossenfelder voices concern for how hostility to the idea of a God on the part of some harms the public image of science. In the course of their conversation, cosmologist George Ellis recalls his review of a book by Victor Stenger claiming that science disproves the existence of God:

“I opened this book with great anticipation, waiting to see what was the experimental apparatus that gave the result and what did the data points look like and was it a three-sigma or five-sigma result? Of course, there is no such experiment. These are scientists who haven’t understood basic philosophy.” (God: The Failed Hypothesis by Victor Stenger, reviewed by George Ellis in Physics World)

“Lost in Math” portrays a community of researchers in philosophical crisis. The esteemed physicists interviewed in this book and its impressive author are to be congratulated on their efforts and their honesty. The genuine achievements of science are acknowledged and celebrated, while the limitations of science and of scientists are admitted frankly. Scientists are human, after all. 

Naturalness, beauty, simplicity are aesthetic and philosophical concepts, not scientific ones. While aggressive proponents of secularism accuse believers of irrationality for believing in a God that - they claim - cannot be proven, their rear guard is crumbling. The field of theoretical physics faces a headwall where empirical validation of foundational theories may no longer be possible. More foundational theories may ultimately be embraced on faith alone - so long as the mathematics is beautiful! 

In the world of physics, we find fine-tuning and mystery from the subatomic to the cosmic scale with rapidly diminishing prospects of natural explanation. It is possible we may never see deeper than we are currently able, that we have reached our limit of comprehension regarding the essence of underlying reality. Meanwhile, what can be proven is distressingly improbable. God must be smiling.

*In his biography of Paul Dirac, historian Helge Kragh noted that in the last 49 years of his life Dirac “largely failed to produce physics of lasting value.”