Monday 30 May 2022

On Fact, Opinion and Belief

 Annie Duke in 'Thinking in Bets'


What exactly is the difference between fact, opinion and belief?

Fact: (noun): A piece of information that can be backed up by evidence.

Belief: (noun): A state or habit of mind in which trust or confidence is placed in some person or thing; something accepted or considered to be true.

Opinion: (noun): A view, judgement or appraisal formed in the mind about a particular matter.

The main difference here is that facts can be verified, while opinions and beliefs cannot. Until relatively recently, most people would have considered things like numbers, dates and photographic records to be facts that we can all agree upon.

More recently, it has become commonplace to question even the most mundane sources of fact, like eyewitness accounts and credible peer-reviewed science, but that is a topic for another day.

 How we think we form our beliefs:

  1. We hear something;

  2. We think about it and vet it, determining whether it is true or false; only after that

  3. We form our belief

How we actually form our beliefs:

  1. We hear something;

  2. We believe it to be true;

  3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.


As psychology professor Daniel Gilbert puts it, “People are credulous creatures who find it very easy to believe and very difficult to doubt”.


Our default is to believe that what we hear and read is true. Even when that information is clearly presented as being false, we are still likely to process it as true.

For example, some people believe that we use only 10% of our brains. If you hold that belief, did you ever research it for yourself?

People usually say it is something they heard, but they have no idea where or from whom. Yet they are confident that it is true. That should be proof enough of how carelessly we form beliefs. And, for the record, we actually use all parts of our brain.

Our beliefs drive the way we process information. We form beliefs without vetting or testing most of them, and we maintain them even after receiving clear, corrective information.

Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence that confirms our belief. We rarely challenge the validity of confirming evidence, while we ignore, or work hard to actively discredit, information that contradicts the belief. This irrational, circular information-processing pattern is called motivated reasoning.

Truth Seeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information.

Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.

Fake news works because people who already hold beliefs consistent with the story generally won’t question the evidence. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them. Social media is a playground for motivated reasoning. It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available. Yet, we gravitate towards sources that confirm our beliefs, that agree with us. Every flavour is out there, but we tend to stick with our favourite. 

Even when directly confronted with facts that disconfirm our beliefs, we don’t let facts get in the way of our opinions.


Being Smart Makes It Worse


The popular wisdom is that the smarter you are, the less susceptible you are to fake news or disinformation. After all, smart people are more likely to analyze and effectively evaluate where the information is coming from, right? Part of being ‘smart’ is being good at processing information, parsing the quality of an argument and the credibility of the source.


Surprisingly, being smart can actually make bias worse. The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.


Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of the instances where being smart and aware of our capacity for irrationality alone doesn’t help us refrain from biased reasoning. As with visual illusions, we can’t make our minds work differently than they do no matter how smart we are. Just as we can’t unsee an illusion, intellect or will power alone can’t make us resist motivated reasoning.


Wanna Bet?


Imagine taking part in a conversation with a friend about the movie Citizen Kane. Best film of all time, you say: it introduced a whole raft of new techniques by which directors could contribute to storytelling. “Obviously, it won the best picture Oscar,” you gush, as part of a list of superlatives the film unquestionably deserves.


Then your friend says, “Wanna bet?”


Suddenly, you’re not so sure. That challenge puts you on your heels, causing you to back off your declaration and question the belief you just stated with such assurance.


When someone challenges us to bet on a belief, signalling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us.


  • How do I know this?

  • Where did I get this information?

  • Who did I get it from?

  • What is the quality of my sources?

  • How much do I trust them?

  • How up to date is my information?

  • How much information do I have that is relevant to the belief?

  • What other things like this have I been confident about that turned out not to be true?

  • What are the other plausible alternatives?

  • What do I know about the person challenging my belief?

  • What is their view of how credible my opinion is?

  • What do they know that I don’t know?

  • What is their level of expertise?

  • What am I missing?


Remember the order in which we form our beliefs:


  1. We hear something;

  2. We believe it to be true;

  3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.


“Wanna bet?” triggers us to engage in that third step that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs.


A lot of good can result from someone saying, “Wanna bet?” Offering a wager brings the risk out in the open, making explicit what is implicit (and frequently overlooked). The more we recognise that we are betting on our beliefs (with our happiness, attention, health, money, time or some other limited resource), the more likely we are to temper our statements, getting closer to the truth as we acknowledge the risk inherent in what we believe.


Once we start doing this (at the risk of losing friends), we are more likely to recognise that there is always a degree of uncertainty, that we are generally less sure than we thought we were, and that practically nothing is black and white, 0% or 100%. And that's a pretty good philosophy for living.
