“Blessed is the nation that doesn’t need heroes.” – Goethe. “Hero-worship is strongest where there is least regard for human freedom.” – Herbert Spencer
Wednesday 29 June 2022
Being smart makes our bias worse: Life is Poker not Chess - 3
Abridged and adapted from Thinking in Bets by Annie Duke
We bet based on what we believe about the world. This is very good news: part of the skill in life comes from learning to be a better belief calibrator, using experience and information to more objectively update our beliefs to more accurately represent the world. The more accurate our beliefs, the better the foundation of the bets we make. However, there is also some bad news: our beliefs can be way, way off.
Hearing is believing
We form beliefs in a haphazard way, believing all sorts of things based just on what we hear out in the world but haven’t researched for ourselves.
This is how we think we form abstract beliefs:
We hear something
We think about it and vet it, determining whether it is true or false; only after that
We form our belief
It turns out though, that we actually form abstract beliefs this way:
We hear something
We believe it to be true
Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
These belief-formation methods evolved out of a need for efficiency, not accuracy. In fact, questioning what you see or hear could get you eaten in the jungle. However, although we no longer live in the jungle, we have failed to develop the high degree of scepticism needed to deal with the material available in the modern social-media age. This general belief-formation process may affect our decision-making in areas that can have significant consequences.
If we were good at updating our beliefs based on new information, our haphazard belief formation process might cause relatively few problems. Sadly, we form beliefs without vetting most of them, and maintain them even after receiving clear, corrective information.
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
The stubbornness of beliefs
Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information processing pattern is called motivated reasoning.
Fake news works because people who already hold beliefs consistent with the story generally won’t question the evidence. Disinformation is even more powerful because the confirmable facts in the story make it feel like the information has been vetted, adding to the power of the narrative being pushed.
Fake news isn’t meant to change minds. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them. The Internet is a playground for motivated reasoning. It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available. Yet, we gravitate towards sources that confirm our beliefs. Every flavour is out there, but we tend to stick with our favourite.
Even when directly confronted with facts that disconfirm our beliefs, we don’t let facts get in the way.
Being smart makes it worse
Surprisingly, being smart can actually make bias worse. The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.
Blind spot bias is an irrationality whereby people are better at recognising biased reasoning in others but are blind to bias in themselves. Research has found that blind spot bias is greater the smarter you are. Furthermore, people who were aware of their own biases were no better able to overcome them.
Dan Kahan discovered that the more numerate people made more mistakes interpreting data on emotionally charged topics than the less numerate subjects sharing the same beliefs. It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.
Wanna bet?
Imagine taking part in a conversation with a friend about the movie Chintavishtayaya Shyamala: best film of all time, one that introduced a bunch of new techniques by which directors could contribute to storytelling. ‘Obviously, it won the national award,’ you gush, as part of a list of superlatives the film unquestionably deserves.
Then your friend says, ‘Wanna bet?’
Suddenly, you are not so sure. That challenge puts you on your heels, causing you to back off your declaration and question the belief that you just declared with such assurance.
Remember the order in which we form abstract beliefs:
We hear something
We believe it to be true
Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
‘Wanna bet?’ triggers us to engage in that third step that we only sometimes get to. Being asked if we're willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs. The more objective we are, the more accurate our beliefs become. And the person who wins bets over the long run is the one with the more accurate beliefs.
Of course, in most instances, the person offering to bet isn’t actually looking to put any money on it. They are just making a point - a valid point that perhaps we overstated our conclusion or made our statement without including relevant caveats.
I’d imagine that if you went around challenging everyone with ‘Wanna bet?’ it would be difficult to make friends and you’d lose the ones you have. But that doesn’t mean we can’t change the framework for ourselves in the way we think about our biases, decisions and opinions.
Monday 30 May 2022
On Fact, Opinion and Belief
Annie Duke in 'Thinking in Bets'
What exactly is the difference between fact, opinion and belief?
Fact: (noun): A piece of information that can be backed up by evidence.
Belief: (noun): A state or habit of mind, in which trust or confidence is placed, in some person or thing. Something accepted or considered to be true.
Opinion: (noun): A view, judgement or appraisal formed in the mind about a particular matter.
The main difference here is that we can verify facts, but opinions and beliefs are not verifiable. Until relatively recently, most people would count as facts things like numbers, dates and photographic accounts that we can all agree upon.
More recently, it has become commonplace to question even the most mundane objective sources of fact, like eyewitness accounts, and credible peer-reviewed science, but that is a topic for another day.
How we think we form our beliefs:
We hear something;
We think about it and vet it, determining whether it is true or false; only after that
We form our belief
Actually, we form our beliefs:
We hear something;
We believe it to be true;
Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
As psychology professor Daniel Gilbert puts it: “People are credulous creatures who find it very easy to believe and very difficult to doubt.”
Our default is to believe that what we hear and read is true. Even when that information is clearly presented as being false, we are still likely to process it as true.
For example, some people believe that we use only 10% of our brains. If you hold that belief, did you ever research it for yourself?
People usually say it is something they heard, but they have no idea where or from whom. Yet they are confident that it is true. That should be proof enough that the way we form beliefs is foolish. And we actually use all parts of our brain.
Our beliefs drive the way we process information. We form beliefs without vetting most of them, and maintain them even after receiving clear, corrective information.
Once a belief is lodged, it becomes difficult to dislodge it from our thinking. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief. We rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning.
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information.
Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
Fake news works because people who already hold beliefs consistent with the story generally won’t question the evidence. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them. Social media is a playground for motivated reasoning. It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available. Yet, we gravitate towards sources that confirm our beliefs, that agree with us. Every flavour is out there, but we tend to stick with our favourite.
Even when directly confronted with facts that disconfirm our beliefs, we don’t let facts get in the way of our opinions.
Being Smart Makes It Worse
The popular wisdom is that the smarter you are, the less susceptible you are to fake news or disinformation. After all, smart people are more likely to analyze and effectively evaluate where the information is coming from, right? Part of being ‘smart’ is being good at processing information, parsing the quality of an argument and the credibility of the source.
Surprisingly, being smart can actually make bias worse. The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.
Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of the instances where being smart and aware of our capacity for irrationality alone doesn’t help us refrain from biased reasoning. As with visual illusions, we can’t make our minds work differently than they do no matter how smart we are. Just as we can’t unsee an illusion, intellect or willpower alone can’t make us resist motivated reasoning.
Wanna Bet?
Imagine taking part in a conversation with a friend about the movie Citizen Kane: best film of all time, one that introduced a bunch of new techniques by which directors could contribute to storytelling. “Obviously, it won the best picture Oscar,” you gush, as part of a list of superlatives the film unquestionably deserves.
Then your friend says, “Wanna bet?”
Suddenly, you’re not so sure. That challenge puts you on your heels, causing you to back off your declaration and question the belief that you just declared with such assurance.
When someone challenges us to bet on a belief, signalling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us.
How do I know this?
Where did I get this information?
Who did I get it from?
What is the quality of my sources?
How much do I trust them?
How up to date is my information?
How much information do I have that is relevant to the belief?
What other things like this have I been confident about that turned out not to be true?
What are the other plausible alternatives?
What do I know about the person challenging my belief?
What is their view of how credible my opinion is?
What do they know that I don’t know?
What is their level of expertise?
What am I missing?
Remember the order in which we form our beliefs:
We hear something;
We believe it to be true;
Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
“Wanna bet?” triggers us to engage in that third step that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs.
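This kind of calibration can be made concrete with a small Bayesian sketch. The numbers below are entirely hypothetical, chosen only to illustrate how a confident challenge should shift a well-calibrated belief:

```python
# Toy illustration of belief calibration via Bayes' rule.
# All probabilities here are made up for illustration.

def update_belief(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability that the belief is true,
    given new evidence (here: a friend's confident challenge)."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# We start out 90% sure our claim is right...
belief = 0.9

# ...then a well-informed friend says "Wanna bet?". Such a friend is far
# more likely to challenge us when we are wrong (0.8) than when we are
# right (0.1), so the challenge itself is evidence.
belief = update_belief(belief, likelihood_if_true=0.1, likelihood_if_false=0.8)

print(round(belief, 2))  # prints 0.53
```

On these assumed numbers, confidence drops from 90% to roughly 53% — "suddenly, you're not so sure" is exactly what a rational updater should feel.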
A lot of good can result from someone saying, “Wanna bet?” Offering a wager brings the risk out in the open, making explicit what is implicit (and frequently overlooked). The more we recognise that we are betting on our beliefs (with our happiness, attention, health, money, time or some other limited resource), the more likely we are to temper our statements, getting closer to the truth as we acknowledge the risk inherent in what we believe.
Once we start doing this (at the risk of losing friends), we are more likely to recognise that there is always a degree of uncertainty, that we are generally less sure than we thought we were, that practically nothing is black and white, 0% or 100%. And that’s a pretty good philosophy for living.
Friday 4 June 2021
Have you seen Groupthink in action?
Tim Harford in The FT
Sunday 20 May 2018
Why the 'Right to Believe' Is Not a Right to Believe Whatever You Want
Do we have the right to believe whatever we want to believe? This supposed right is often claimed as the last resort of the wilfully ignorant, the person who is cornered by evidence and mounting opinion: ‘I believe climate change is a hoax whatever anyone else says, and I have a right to believe it!’ But is there such a right?
We do recognise the right to know certain things. I have a right to know the conditions of my employment, the physician’s diagnosis of my ailments, the grades I achieved at school, the name of my accuser and the nature of the charges, and so on. But belief is not knowledge.
Beliefs are factive: to believe is to take to be true. It would be absurd, as the analytic philosopher G.E. Moore observed in the 1940s, to say: ‘It is raining, but I don’t believe that it is raining.’ Beliefs aspire to truth – but they do not entail it. Beliefs can be false, unwarranted by evidence or reasoned consideration. They can also be morally repugnant. Among likely candidates: beliefs that are sexist, racist or homophobic; the belief that proper upbringing of a child requires ‘breaking the will’ and severe corporal punishment; the belief that the elderly should routinely be euthanised; the belief that ‘ethnic cleansing’ is a political solution, and so on. If we find these morally wrong, we condemn not only the potential acts that spring from such beliefs, but the content of the belief itself, the act of believing it, and thus the believer.
Such judgments can imply that believing is a voluntary act. But beliefs are often more like states of mind or attitudes than decisive actions. Some beliefs, such as personal values, are not deliberately chosen; they are ‘inherited’ from parents and ‘acquired’ from peers, acquired inadvertently, inculcated by institutions and authorities, or assumed from hearsay. For this reason, I think, it is not always the coming-to-hold-this-belief that is problematic; it is rather the sustaining of such beliefs, the refusal to disbelieve or discard them that can be voluntary and ethically wrong.
If the content of a belief is judged morally wrong, it is also thought to be false. The belief that one race is less than fully human is not only a morally repugnant, racist tenet; it is also thought to be a false claim – though not by the believer. The falsity of a belief is a necessary but not sufficient condition for a belief to be morally wrong; neither is the ugliness of the content sufficient for a belief to be morally wrong. Alas, there are indeed morally repugnant truths, but it is not the believing that makes them so. Their moral ugliness is embedded in the world, not in one’s belief about the world.
‘Who are you to tell me what to believe?’ replies the zealot. It is a misguided challenge: it implies that certifying one’s beliefs is a matter of someone’s authority. It ignores the role of reality. Believing has what philosophers call a ‘mind-to-world direction of fit’. Our beliefs are intended to reflect the real world – and it is on this point that beliefs can go haywire. There are irresponsible beliefs; more precisely, there are beliefs that are acquired and retained in an irresponsible way. One might disregard evidence; accept gossip, rumour, or testimony from dubious sources; ignore incoherence with one’s other beliefs; embrace wishful thinking; or display a predilection for conspiracy theories.
I do not mean to revert to the stern evidentialism of the 19th-century mathematical philosopher William K Clifford, who claimed: ‘It is wrong, always, everywhere, and for anyone, to believe anything upon insufficient evidence.’ Clifford was trying to prevent irresponsible ‘overbelief’, in which wishful thinking, blind faith or sentiment (rather than evidence) stimulate or justify belief. This is too restrictive. In any complex society, one has to rely on the testimony of reliable sources, expert judgment and the best available evidence. Moreover, as the psychologist William James responded in 1896, some of our most important beliefs about the world and the human prospect must be formed without the possibility of sufficient evidence. In such circumstances (which are sometimes defined narrowly, sometimes more broadly in James’s writings), one’s ‘will to believe’ entitles us to choose to believe the alternative that projects a better life.
In exploring the varieties of religious experience, James would remind us that the ‘right to believe’ can establish a climate of religious tolerance. Those religions that define themselves by required beliefs (creeds) have engaged in repression, torture and countless wars against non-believers that can cease only with recognition of a mutual ‘right to believe’. Yet, even in this context, extremely intolerant beliefs cannot be tolerated. Rights have limits and carry responsibilities.
Unfortunately, many people today seem to take great licence with the right to believe, flouting their responsibility. The wilful ignorance and false knowledge that are commonly defended by the assertion ‘I have a right to my belief’ do not meet James’s requirements. Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas; that Barack Obama is Muslim; that the Earth is flat; or that climate change is a hoax. In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.
Believing, like willing, seems fundamental to autonomy, the ultimate ground of one’s freedom. But, as Clifford also remarked: ‘No one man’s belief is in any case a private matter which concerns himself alone.’ Beliefs shape attitudes and motives, guide choices and actions. Believing and knowing are formed within an epistemic community, which also bears their effects. There is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs – and that ethic both generates and limits our right to believe. If some beliefs are false, or morally repugnant, or irresponsible, some beliefs are also dangerous. And to those, we have no right.
Thursday 28 December 2017
I used to think people made rational decisions. But now I know I was wrong
It’s been coming on for a while, so I can’t claim any eureka moment. But something did crystallise this year. What I changed my mind about was people. More specifically, I realised that people cannot be relied upon to make rational choices. We would have fixed global warming by now if we were rational. Instead, there’s a stubborn refusal to let go of the idea that environmental degradation is a “debate” in which there are “two sides”.
Debating is a great way of exploring an issue when there is real room for doubt and nuance. But when a conclusion can be reached simply by assembling a mountain of known facts, debating is just a means of pitting the rational against the irrational.
Humans like to think we are rational. Some of us are more rational than others. But, essentially, we are all slaves to our feelings and emotions. The trick is to realise this, and be sure to guard against it. It’s something that, in the modern world, we are not good at. Authentic emotions are valued over dry, dull authentic evidence at every turn.
I think that as individuality has become fetishised, our belief in our right to make half-formed snap judgments, based on little more than how we feel, has become problematically unchallengeable. When Uma Thurman declared that she would wait for her anger to abate before she spoke about Harvey Weinstein, it was, I believe, in recognition of this tendency to speak first and think later.
Good for her. The value of calm reasoning is not something that one sees acknowledged very often at the moment. Often, the feelings and emotions that form the basis of important views aren’t so very fine. Sometimes humans understand and control their emotions so little that they sooner or later coagulate into a roiling soup of anxiety, fear, sadness, self-loathing, resentment and anger which expresses itself however it can, finding objects to project its hurt and confusion on to. Like immigrants. Or transsexuals. Or liberals. Or Tories. Or women. Or men.
Even if the desire to find living, breathing scapegoats is resisted, untrammelled emotion can result in unwise and self-defeating decisions, devoid of any rationality. Rationality is a tool we have created to govern our emotions. That’s why education, knowledge, information is the cornerstone of democracy. And that’s why despots love ignorance.
Sometimes we can identify and harness the emotions we need to get us through the thing we know, rationally, that we have to do. It’s great when you’re in the zone. Even negative emotions can be used rationally. I, for example, use anger a lot in my work. I’m writing on it at this moment, just as much as I’m writing on a computer. I’ll stop in a moment. I’ll reach for facts to calm myself. I’ll reach for facts to make my emotions seem rational. Or maybe that’s just me. Whatever that means.
It’s a fact that I can find some facts to back up my feelings about people. Just writing that down helps me to feel secure and in control. The irrationality of humans has been considered a fact since the 1970s, when two psychologists, Amos Tversky and Daniel Kahneman, showed that human decisions were often completely irrational, not at all in their own interests and based on “cognitive biases”. Their ideas were a big deal, and also formed the basis of Michael Lewis’s book, The Undoing Project.
More recent research – or more recent theory, to be precise – has made even Tversky and Kahneman’s ideas about the unreliability of the human mind seem overly generous about our rationality.
Chasing the Rainbow: The Non-Conscious Nature of Being is a research paper from University College London and Cardiff University. Its authors, David Oakley and Peter Halligan, argue “that ‘consciousness’ contains no top-down control processes and that ‘consciousness’ involves no executive, causal, or controlling relationship with any of the familiar psychological processes conventionally attributed to it”.
Which can only mean that even when we think we’re being rational, we’re not even really thinking. That thing we call thinking – we don’t even know what it really is.
When I started out in journalism, opinion columns weren’t a big thing. Using the word “I” in journalism was frowned upon. The dispassionate dissemination of facts was the goal to be reached for.
Now so much opinion is published, in print and online, and so many people offer their opinions about the opinions, that people in our government feel comfortable in declaring that experts are overrated, and the president of the United States regularly says that anything he doesn’t like is “fake news”.
So, people. They’re a problem. That’s what I’ve decided. I’m part of a big problem. All I can do now is get my message out there.
Sunday 15 May 2016
How Little Do Experts Know – On Ranieri and Leicester, One Media Expert Apologises
Marcus Christenson in The Guardian
No one likes to be wrong. It is much nicer to be right. In life, however, it is not possible to be right all the time. We all try our best but there are times when things go horribly wrong.
I should know. In July last year I sat down to write an article about Claudio Ranieri. The 63-year-old had just been appointed the new manager of Leicester City and I decided, in the capacity of being the football editor at the Guardian, that I was the right person to write that piece.
I made that decision based on the following: I have lived and worked as a journalist in Italy and have followed Ranieri’s career fairly closely since his early days in management. I also made sure that I spoke to several people in Greece, where Ranieri’s last job before replacing Nigel Pearson at Leicester had ended in disaster, with the team losing against the Faroe Islands and the manager getting sacked.
It was quite clear to me that this was a huge gamble by Leicester and that it was unlikely to end well. And I was hardly the only one to be sceptical. Gary Lineker, the former Leicester striker and now Match of the Day presenter, tweeted “Claudio Ranieri? Really?” and followed it up by saying: “Claudio Ranieri is clearly experienced, but this is an uninspired choice by Leicester. It’s amazing how the same old names keep getting a go on the managerial merry-go-round.”
I started my article by explaining what had gone wrong in Greece (which was several things) before moving on to talk about the rest of his long managerial career, pointing out that he had never won a league title in any country, nor had he stayed at any club for more than two seasons since being in charge of Chelsea at the beginning of the 2000s.
I threw in some light-hearted “lines”, such as the fact that he was the manager in charge of Juventus when they signed Christian Poulsen (not really a Juventus kind of player) and proclaimed that the appointment was “baffling”.
I added: “In some ways, it seems as if the Leicester owners went looking for the anti-Nigel Pearson. Ranieri is not going to call a journalist an ostrich. He is not going to throttle a player during a match. He is not going to tell a supporter to ‘fuck off and die’, no matter how bad the abuse gets.”
Claudio Ranieri instructs his players during Greece’s defeat by the Faroe Islands, the Italian’s last game in charge of the Euro 2004 winners. Photograph: Thanassis Stavrakis/AP
Rather pleased with myself – thinking that I was giving the readers a good insight into the man and the manager – I also put a headline on the piece, which read: “Claudio Ranieri: the anti-Pearson … and the wrong man for Leicester City?”
I did not think much more of the piece until a few months later when Leicester were top of the league and showing all the signs of being capable of staying there.
After a while, the tweets started to appear from people pointing out that I may not have called this one right. As the season wore on, these tweets became more and more frequent, and they have been sent to me after every Leicester win since the turn of the year.
At some point in February I decided to go back and look at the piece again. It made for uncomfortable reading. I had said that describing his spell in charge of Greece as “poor” would be an understatement. I wrote that 11 years after being given the nickname “Tinkerman” because he changed his starting XI so often when in charge of Chelsea, he was still an incorrigible “Tinkerman”.
It gets worse. “Few will back him to succeed but one thing is for sure: he will conduct himself in an honourable and humble way, as he always has done,” the article said. “If Leicester wanted someone nice, they’ve got him. If they wanted someone to keep them in the Premier League, then they may have gone for the wrong guy.”
Ouch. Reading it back again I was faced with a couple of uncomfortable questions, the key one being “who do you think you are, writing such a snobbish piece about a dignified man and a good manager?”
The second question was a bit easier to answer. Was this as bad as the “In defence of Nicklas Bendtner” article I wrote a couple of years ago? (The answer is “no”, by the way, few things come close to an error of judgment of that scale).
I would like to point out a few things though. I did get – as a very kind colleague pointed out – 50% of that last paragraph right. He clearly is a wonderful human being and when Paolo Bandini spoke to several of his former players recently one thing stood out: the incredible affection they still feel for this gentle 64-year-old.
All in all, though, there is no point defending the indefensible: I could not have got it more wrong.
At the start of this piece I said that no one likes to be wrong. Well, I was wrong about that too. I’ve enjoyed every minute of being embarrassingly wrong this season. Leicester is the best story that could have happened to football in this country, their triumph giving hope to all of us who want to start a season dreaming that something unthinkable might happen.
So thank you Leicester and thank you Claudio, it’s been quite wonderful.
Thursday 14 May 2015
The troubling flaws in forensic science
What’s needed are additional safeguards to shield forensic examiners against irrelevant information that might skew their judgement. A first step is to ensure they aren’t given irrelevant information, such as knowing that witnesses have placed the suspect at the crime scene, or that he has previous convictions for similar crimes. Another safeguard is to reveal the relevant information sequentially – and only when it is needed. “We need to give them the information that they need to do their job when they need it, but not extra information that’s irrelevant to what they’re doing and which could influence their perception and judgement,” says Dror.