
Thursday, 1 September 2022

Why intellectual humility matters

We should all nurture the ability to recognise our own cognitive biases and to admit when we’re wrong, writes JEMIMA KELLY in The FT


What makes some people believe in conspiracy theories and false news reports more than others? Is it their political or religious perspective? Is it a lack of formal education? Or is it more about their age, gender or socio-economic background? 

A recently published study suggests that more important than any of these factors is another characteristic: the extent to which someone has — or does not have — intellectual humility. 

Intellectual humility can be thought of as a willingness to recognise our own cognitive limitations and biases, to admit when we’re wrong, and to be more interested in understanding the truth of an issue than in being right. Its spirit is captured nicely by the quote often attributed (probably wrongly) to John Maynard Keynes: “When the facts change, I change my mind — what do you do, sir?” 

In their study, Marco Meyer and Mark Alfano — academics who specialise in social epistemology, a field at the intersection of philosophy and psychology — found that those who possess this virtue are much better at differentiating between accurate news reports and false ones. They suggest that intellectual humility is a better predictor of someone’s ability to resist fake news than any of the other factors they looked at. 

In another study published last year, Meyer and Alfano found a strong correlation between “epistemic vice” (the lack of intellectual humility) and belief in false information about Covid-19, with a coefficient of 0.76. The next strongest link, with religiosity, was a moderate 0.46. And while they did find a weak correlation between intelligence — measured by exam results, education level and performance on a cognitive reflection test — and belief in false information, they say there is no link between intelligence and intellectual humility. 

“When you’re intelligent, you can actually be more susceptible to certain kinds of disinformation, because you’re more likely to be able to rationalise your beliefs,” says Meyer, who is based at the University of Hamburg. Intellectual humility is, he suggests, “super-important . . . as a counterweight, almost, against intelligence.” 

You might think such a virtue would be almost impossible to measure, but Meyer and Alfano’s work suggests that self-reported intellectual humility — based on asking respondents to rate the extent to which they agree with statements such as “I often have strong opinions about issues I don’t know much about” — is quite effective. And other studies have shown positive correlations between self-reported and peer-reported intellectual humility, with the former generally seen as a more accurate gauge. 

You might also worry that, given the liberal over-representation in academia, the examples used in these studies would skew towards rightwing falsehoods or conspiracies. But the researchers say they were careful to ensure balance. In the case of Covid misinformation, they asked participants about claims that have been widely debunked, such as the idea that hand dryers are effective in killing the virus, rather than more contested questions such as the effectiveness of masks and lockdowns, or the origins of the virus. 

Intellectual humility is important not just in preventing the spread of misinformation. Other studies have found that it is associated with so-called “mastery behaviours” such as seeking out challenging work and persisting after failures, and it is also linked to less political “myside bias”. 

However, this quality is not easy to cultivate. A recent study suggests that repeatedly exposing students to their own errors, such as by getting them involved in forecasting tournaments, could be effective. I have argued before that social media platforms such as Twitter should institute a “challenger mode” that exposes us to beliefs we don’t normally come across; another trick might be to implement a practice of “steelmanning”, a term that appears to have been coined by the blogger Chana Messinger. She describes it as “the art of addressing the best form of the other person’s argument, even if it’s not the one they presented” — the opposite of a straw-man, in other words. 

Of course, there are limits to intellectual humility: beyond a certain point it becomes self-indulgent and can render us indecisive. Running a country — writing a column, even — requires a level of conviction, and sometimes that means faking it a bit and hoping for the best. So we should cultivate other virtues too, such as courage and the ability to take action. 

But fostering an environment in which we reward uncertainty and praise those who acknowledge their errors is vital. Saying “I was wrong”, and explaining why, is often much more valuable than insisting “I was right”.
