“The fool doth think he is wise, but the wise man knows himself to be a fool.”
William Shakespeare, As You Like It, Act 5, Scene 1


When the Oracle of Delphi declared Socrates the wisest of men, he was skeptical enough to investigate. He interviewed politicians, poets, and craftsmen — everyone reputed to be wise — and found the same pattern everywhere: people who confidently claimed knowledge they did not possess. His conclusion was precise rather than modest: “I know that I know nothing” meant not that knowledge is impossible, but that wisdom begins with accurate awareness of where your knowledge actually ends.

Twenty-five centuries later, cognitive scientists have a technical term for what Socrates was describing: metacognitive sensitivity — the ability to calibrate your confidence to your actual accuracy, to know when you know and when you don’t. A recent study in Personality and Individual Differences has now established something the ancient world could only observe: that this capacity is not merely a philosophical virtue but a measurable cognitive one, and that it lies at the heart of intellectual humility itself.

Key Terms

The study distinguishes three things that are easy to conflate.

Cognition is what you believe about the world. Metacognition is what you believe about your own believing — your sense of how confident, accurate, or reliable your judgments are. In Superbia, I emphasized that both can be distorted by pride.

Within metacognition, the researchers draw a further distinction. Metacognitive sensitivity — which they call meta-d’ — is the degree to which your confidence tracks your actual accuracy on a trial-by-trial basis. High sensitivity means you feel confident when you’re right and uncertain when you’re wrong. Low sensitivity means your confidence floats free of reality; you feel equally sure whether you’re correct or not.

Metacognitive bias is your overall confidence level regardless of accuracy — whether you tend, as a person, to feel certain or to feel doubtful. This is what most people picture when they imagine intellectual humility: a chronic sense of uncertainty, a reflexive tentativeness. The study’s most counterintuitive finding is that this picture is wrong.
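The difference between sensitivity and bias can be made concrete with a toy calculation. The study itself estimates sensitivity with meta-d’, a signal-detection measure; the sketch below uses a much simpler proxy — the gap between mean confidence on correct and incorrect answers — and invented data, purely to illustrate how two judges can have identical overall confidence (bias) yet very different calibration (sensitivity).

```python
# Illustrative sketch only. The study uses meta-d' (signal detection theory);
# this simpler proxy and all data below are invented for illustration.

def sensitivity(correct, confidence):
    """Proxy for metacognitive sensitivity: how much higher is
    confidence on correct trials than on incorrect ones?"""
    right = [c for ok, c in zip(correct, confidence) if ok]
    wrong = [c for ok, c in zip(correct, confidence) if not ok]
    return sum(right) / len(right) - sum(wrong) / len(wrong)

def bias(confidence):
    """Metacognitive bias: overall confidence, regardless of accuracy."""
    return sum(confidence) / len(confidence)

# A calibrated judge: confident when right, doubtful when wrong.
calibrated_correct    = [True, True, False, True, False, True]
calibrated_confidence = [90,   85,   55,    95,   60,    88]

# A miscalibrated judge: same average confidence, but it floats
# free of accuracy.
floating_correct    = [True, True, False, True, False, True]
floating_confidence = [78,   80,   79,    77,   80,    79]

print(sensitivity(calibrated_correct, calibrated_confidence))  # large positive gap
print(sensitivity(floating_correct, floating_confidence))      # near zero
print(bias(calibrated_confidence), bias(floating_confidence))  # identical averages
```

Both judges land at the same overall confidence level, so bias alone cannot tell them apart; only sensitivity reveals that the first judge's confidence tracks reality while the second's does not.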

What the Study Did

Fischer, Kause, and Huff recruited 999 American adults, quota-matched to the national population by age, gender, and geography. Before anything else, participants completed a validated psychological questionnaire designed to measure intellectual humility as a stable personality trait — not a mood or a momentary posture, but a characteristic way of relating to one’s own beliefs. Questions probed things like willingness to revise opinions, openness to other viewpoints, and awareness of personal fallibility. This established each participant’s humility score independently, before any performance data existed, which is what allows the researchers to treat intellectual humility as a genuine predictor of what followed rather than merely a description of it.

Participants then read four short summaries of fictitious studies on renewable energy and answered questions designed to test whether they could distinguish accurate from inaccurate interpretations of the evidence. After each answer, they rated their confidence on a scale from 50% (pure guessing) to 100% (certain).

The topic was chosen deliberately. Climate change is one of the most politically polarized subjects in American public life — exactly the kind of domain where motivated reasoning runs hot and accurate thinking runs cold. If intellectual humility makes a cognitive difference, this is where you’d want to look for it.

What the Study Found

The study yielded three essential findings.

First, more intellectually humble participants were simply better at distinguishing correct from incorrect interpretations of the contested evidence. Humility predicted accuracy. This is the study’s most straightforward result, and reflects the core theme in Chapter 3 of Superbia: “Unshakeable confidence in one’s own beliefs is an invincible barrier against correction. People who understand and admit their own fallibility have taken the first giant step away from pride and toward humility.”

Second — and here is where the findings become genuinely surprising — more intellectually humble people displayed higher metacognitive sensitivity. They reported higher confidence when their interpretations were correct and lower confidence when they were wrong. Their confidence was calibrated. It moved with their actual performance rather than floating at a fixed level regardless of what they actually got right or wrong.

This is a precise cognitive portrait of what the classical tradition calls docility — the virtue of being easily taught. (This must be distinguished from a second meaning: being easily led or manipulated.) Thomas Aquinas identified docility as the first act of prudence: the disposition to learn, which requires first knowing what you do not know. The Fischer study puts numbers to it.

Third — and most counterintuitively — intellectually humble people were not less confident overall. They were actually more confident on average. But that higher confidence was entirely explained by their superior performance. They were more confident because they were more often right. The study authors put it plainly: higher confidence among the intellectually humble “was, in fact, justified — rooted in objectively superior task performance.”

This dismantles a common misconception. Intellectual humility is not mental insecurity. It is not the habit of hedging every sentence or expressing uncertainty about things you genuinely know. It is the capacity to match your confidence to your competence — to hold strong views where the evidence warrants, and to hold them loosely where it doesn’t.

In Superbia, I cited research showing these principles in the political realm:

Subjects coming from both political extremes showed less insight into the accuracy of their decisions and were less likely to change their decision in response to contradictory evidence… what we could call a failure of metacognition.

The Fischer study is the positive side of that principle. Where the political study showed that low humility tracks with low metacognitive sensitivity, this one shows that high humility tracks with high metacognitive sensitivity — and the effect held even after controlling for political attitudes, age, gender, and education.

1 Corinthians 8:2 captures this principle in a single sentence: “If anyone thinks he knows anything, he knows nothing yet as he ought to know.” Paul is not describing epistemological skepticism. He is describing miscalibration — the state of being confident without being accurate. That is metacognitive failure, and it is, in Paul’s account, the default condition of the unreflective mind.

The Fischer study also tested whether intellectual humility might simply be a proxy for being smarter. It is not. The researchers separated metacognitive sensitivity from task performance statistically, and the relationship held independently. Humble people weren’t just better performers whose better performance made them better calibrated. Their calibration was a feature of their humility in its own right.

The Theological Payoff

The ancient Greek diagnosis, as we have seen, was observational. Socrates found that people who claimed knowledge actually lacked it. He could describe the problem with surgical precision. What he could not supply was an account of why this is the human default — why we systematically overestimate our accuracy, why confidence so readily outstrips competence.

Scripture supplies the explanation. Pride is not merely a bad habit or a cognitive quirk. It is a moral and spiritual condition rooted in the Fall, a fundamental orientation away from God and toward self that distorts every faculty, including the intellect. As I wrote in Superbia: “Our problem is not merely one of wrong beliefs. The far greater problem is that we are so sure we’re right.”

The Fischer study delivers empirical confirmation of what Proverbs 26:12 said three thousand years earlier: “Do you see a man wise in his own eyes? There is more hope for a fool than for him.” The problem is not ignorance — it is the confidence that masquerades as knowledge and forecloses the correction that wisdom requires.

Intellectual humility, then, is not a personal disposition toward indecisiveness. It is a cognitive virtue with measurable consequences for how well we navigate reality. It is the foundational principle for sound Christian thinking. Increasingly, research is vindicating the ancient notion that the epistemically virtuous person — the one who knows what she knows and knows what she doesn’t — reasons more accurately. And the advantage is greatest in politically charged terrain where accurate reasoning is hardest.

“When pride comes, then comes shame; But with the humble is wisdom.” Proverbs 11:2