Abridged and adapted from Thinking in Bets by Annie Duke
We bet based on what we believe about the world. This is very good news: part of the skill in life comes from learning to be a better belief calibrator, using experience and information to update our beliefs more objectively so that they represent the world more accurately. The more accurate our beliefs, the better the foundation of the bets we make. However, there is also some bad news: our beliefs can be way, way off.
Hearing is believing
We form beliefs in a haphazard way, believing all sorts of things based just on what we hear out in the world but haven’t researched for ourselves.
This is how we think we form abstract beliefs:
We hear something
We think about it and vet it, determining whether it is true or false; only after that
We form our belief
It turns out, though, that we actually form abstract beliefs this way:
We hear something
We believe it to be true
Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
These belief-formation methods evolved for efficiency, not accuracy. In fact, questioning what you see or hear could get you eaten in the jungle. But while most of us no longer live in a jungle, we have failed to develop the degree of scepticism needed to deal with the material available in the modern social-media age. This general belief-formation process can affect our decision making in areas with significant consequences.
If we were good at updating our beliefs based on new information, our haphazard belief formation process might cause relatively few problems. Sadly, we form beliefs without vetting most of them, and maintain them even after receiving clear, corrective information.
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
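To make "altering our beliefs to fit new information" concrete (this illustration is mine, not the book's), the normative benchmark is Bayes' rule, which prescribes exactly how far a piece of evidence should move a belief. A minimal sketch in Python, with made-up numbers:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability that a belief is true, given how likely
    the observed evidence would be under each hypothesis (Bayes' rule)."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Hypothetical numbers: we start out 70% sure a claim is true, then see
# evidence that is three times more likely if the claim is false.
posterior = bayes_update(prior=0.70,
                         p_evidence_if_true=0.30,
                         p_evidence_if_false=0.90)
print(round(posterior, 2))  # 0.44 -- the belief should drop sharply
```

Motivated reasoning is, in effect, refusing to run the update: whatever the evidence says, we keep the 0.70.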
The stubbornness of beliefs
Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence that confirms the belief, rarely challenge the validity of that confirming evidence, and ignore or work hard to actively discredit information that contradicts it. This irrational, circular information-processing pattern is called motivated reasoning.
Fake news works because people who already hold beliefs consistent with the story generally won’t question the evidence. Disinformation is even more powerful because the confirmable facts in the story make it feel like the information has been vetted, adding to the power of the narrative being pushed.
Fake news isn’t meant to change minds. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them. The Internet is a playground for motivated reasoning. It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available. Yet, we gravitate towards sources that confirm our beliefs. Every flavour is out there, but we tend to stick with our favourite.
Even when directly confronted with facts that disconfirm our beliefs, we don’t let facts get in the way.
Being smart makes it worse
Surprisingly, being smart can actually make bias worse. The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.
Blind-spot bias is an irrationality in which people are better at recognising biased reasoning in others but are blind to bias in themselves. Researchers have found that blind-spot bias is greater the smarter you are. Furthermore, people who were aware of their own biases were not better able to overcome them.
Dan Kahan discovered that more numerate people made more mistakes interpreting data on emotionally charged topics than less numerate subjects who shared the same beliefs. It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.
Wanna bet?
Imagine taking part in a conversation with a friend about the movie Chintavishtayaya Shyamala. The best film of all time, you insist, one that introduced a bunch of new techniques by which directors could contribute to storytelling. 'Obviously, it won the national award,' you gush, as part of a list of superlatives the film unquestionably deserves.
Then your friend says, ‘Wanna bet?’
Suddenly, you are not so sure. That challenge puts you on your heels, causing you to back off from your declaration and question the belief you just stated with such assurance.
Remember the order in which we form abstract beliefs:
We hear something
We believe it to be true
Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
‘Wanna bet?’ triggers us to engage in that third step that we only sometimes get to. Being asked if we're willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs. The more objective we are, the more accurate our beliefs become. And the person who wins bets over the long run is the one with the more accurate beliefs.
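As a toy illustration of that last claim (my sketch, not Duke's): suppose someone repeatedly bets a dollar at even money on events, choosing a side based on their own probability estimates. Under the assumption that beliefs are the true probability plus some noise, the better-calibrated bettor ends up ahead over the long run:

```python
import random

def long_run_profit(estimate_error, n=100_000, seed=0):
    """Bet $1 at even money on each event: take 'yes' when our
    estimated probability is above 0.5, 'no' otherwise.
    estimate_error models how far our beliefs stray from the truth."""
    rng = random.Random(seed)
    profit = 0
    for _ in range(n):
        q = rng.random()                                  # true probability
        belief = q + rng.uniform(-estimate_error, estimate_error)
        outcome = rng.random() < q                        # what actually happens
        bet_yes = belief > 0.5
        profit += 1 if bet_yes == outcome else -1
    return profit

print(long_run_profit(estimate_error=0.05))  # near the best achievable
print(long_run_profit(estimate_error=0.45))  # markedly worse
```

The mechanism is simple: miscalibrated beliefs put the bettor on the wrong side of close calls, and those mistakes compound over many bets.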
Of course, in most instances, the person offering to bet isn’t actually looking to put any money on it. They are just making a point - a valid point that perhaps we overstated our conclusion or made our statement without including relevant caveats.
I’d imagine that if you went around challenging everyone with ‘Wanna bet?’ it would be difficult to make friends, and you’d lose the ones you have. But that doesn’t mean we can’t change the framework for ourselves in the way we think about our biases, decisions, and opinions.