
Monday 30 May 2022

On Fact, Opinion and Belief

 Annie Duke in 'Thinking in Bets'


What exactly is the difference between fact, opinion and belief?

Fact: (noun): A piece of information that can be backed up by evidence.

Belief: (noun): A state or habit of mind in which trust or confidence is placed in some person or thing. Something accepted or considered to be true.

Opinion: (noun): A view, judgement or appraisal formed in the mind about a particular matter.

The main difference here is that facts can be verified, while opinions and beliefs cannot. Until relatively recently, most people counted as facts things we can all agree upon: numbers, dates, photographic accounts.

More recently, it has become commonplace to question even the most mundane objective sources of fact, like eyewitness accounts and credible peer-reviewed science, but that is a topic for another day.

 How we think we form our beliefs:

  1. We hear something;

  2. We think about it and vet it, determining whether it is true or false; only after that

  3. We form our belief.

How we actually form our beliefs:

  1. We hear something;

  2. We believe it to be true;

  3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.


As psychology professor Daniel Gilbert puts it, “People are credulous creatures who find it very easy to believe and very difficult to doubt.”


Our default is to believe that what we hear and read is true. Even when that information is clearly presented as being false, we are still likely to process it as true.

For example, some people believe that we use only 10% of our brains. If you hold that belief, did you ever research it for yourself?

People usually say it is something they heard, but they have no idea where or from whom. Yet they are confident that it is true. That should be proof enough that the way we form beliefs is foolish. (And, for the record, we actually use all parts of our brain.)

Our beliefs drive the way we process information. We form beliefs without vetting or testing most of them, and we maintain them even after receiving clear, corrective information.

Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief. We rarely challenge the validity of confirming evidence, and we ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning.

Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information.

Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
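As an aside (this sketch is mine, not Duke's): in probabilistic terms, "altering our beliefs to fit new information" is what Bayes' rule prescribes. Here is a minimal Python illustration, with invented numbers, of how confidence should move when evidence points the other way:

```python
# Hypothetical illustration, not from the book: rational belief updating
# via Bayes' rule. All numbers are invented for the example.

def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return the posterior probability of a belief after seeing evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# We start out 90% sure of a claim, then meet evidence that is three times
# more likely if the claim is false (0.6) than if it is true (0.2).
posterior = bayes_update(prior=0.90, p_evidence_if_true=0.2, p_evidence_if_false=0.6)
print(f"{posterior:.0%}")  # 75% - confidence should drop, not hold firm at 90%
```

Motivated reasoning is, in effect, a refusal to make that downward revision: we keep the 90% and reinterpret the evidence instead.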

Fake news works because people who already hold beliefs consistent with the story generally won’t question the evidence. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them. Social media is a playground for motivated reasoning. It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available. Yet, we gravitate towards sources that confirm our beliefs, that agree with us. Every flavour is out there, but we tend to stick with our favourite. 

Even when directly confronted with facts that disconfirm our beliefs, we don’t let facts get in the way of our opinions.


Being Smart Makes It Worse


The popular wisdom is that the smarter you are, the less susceptible you are to fake news or disinformation. After all, smart people are more likely to analyze and effectively evaluate where the information is coming from, right? Part of being ‘smart’ is being good at processing information, parsing the quality of an argument and the credibility of the source.


Surprisingly, being smart can actually make bias worse. The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.


Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of the instances where being smart and aware of our capacity for irrationality doesn't, by itself, help us refrain from biased reasoning. As with visual illusions, we can't make our minds work differently than they do, no matter how smart we are. Just as we can't unsee an illusion, intellect or willpower alone can't make us resist motivated reasoning.


Wanna Bet?


Imagine taking part in a conversation with a friend about the movie Citizen Kane: the best film of all time, one that introduced a bunch of new techniques by which directors could contribute to storytelling. “Obviously, it won the best picture Oscar,” you gush, as part of a list of superlatives the film unquestionably deserves.


Then your friend says, “Wanna bet?”


Suddenly, you’re not so sure. That challenge puts you on your heels, causing you to back off your declaration and question the belief that you just declared with such assurance.


When someone challenges us to bet on a belief, signalling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us.


  • How do I know this?

  • Where did I get this information?

  • Who did I get it from?

  • What is the quality of my sources?

  • How much do I trust them?

  • How up to date is my information?

  • How much information do I have that is relevant to the belief?

  • What other things like this have I been confident about that turned out not to be true?

  • What are the other plausible alternatives?

  • What do I know about the person challenging my belief?

  • What is their view of how credible my opinion is?

  • What do they know that I don’t know?

  • What is their level of expertise?

  • What am I missing?


Remember the order in which we form our beliefs:


  1. We hear something;

  2. We believe it to be true;

  3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.


“Wanna bet?” triggers us to engage in that third step that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs.


A lot of good can result from someone saying, “Wanna bet?” Offering a wager brings the risk out in the open, making explicit what is implicit (and frequently overlooked). The more we recognise that we are betting on our beliefs (with our happiness, attention, health, money, time or some other limited resource), the more likely we are to temper our statements, getting closer to the truth as we acknowledge the risk inherent in what we believe.
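To make that explicit risk concrete, here is a minimal sketch (my illustration, not from the book) of the arithmetic a wager forces on us. The stake and the payout determine a break-even confidence; accepting the bet below that level loses money on average:

```python
# Hypothetical illustration, not from the book: the confidence level a bet
# implicitly demands, from basic expected-value arithmetic.

def break_even_probability(stake: float, payout: float) -> float:
    """Minimum probability of being right at which risking `stake` to win `payout` breaks even."""
    return stake / (stake + payout)

print(f"{break_even_probability(10, 10):.0%}")  # 50%: an even-money bet needs >50% confidence
print(f"{break_even_probability(40, 10):.0%}")  # 80%: longer odds demand much more certainty
```

An even-money “Wanna bet?” is thus an implicit demand that you be more than 50% sure, which is exactly the stock-taking the questions above are meant to trigger.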


Once we start doing this (at the risk of losing friends), we are more likely to recognise that there is always a degree of uncertainty, that we are generally less sure than we thought we were, and that practically nothing is black and white, 0% or 100%. And that's a pretty good philosophy for living.

Sunday 30 June 2019

The science of influencing people: six ways to win an argument

Hidebound views on subjects such as the climate crisis and Brexit are the norm – but the appliance of science may sway stubborn opinions writes David Robson in The Guardian 

 


“I am quite sure now that often, very often, in matters of religion and politics a man’s reasoning powers are not above the monkey’s,” wrote Mark Twain.

Having written a book about our most common reasoning errors, I would argue that Twain was being rather uncharitable – to monkeys. Whether we are discussing Trump, Brexit, or the Tory leadership, we have all come across people who appear to have next to no understanding of world events – but who talk with the utmost confidence and conviction. And the latest psychological research can now help us to understand why.

Consider the “illusion of explanatory depth”. When asked about government policies and their consequences, most people believe that they could explain their workings in great detail. If put to the test, however, their explanations are vague and incoherent. The problem is that we confuse a shallow familiarity with general concepts for real, in-depth knowledge.

Besides being less substantial than we think, our knowledge is also highly selective: we conveniently remember facts that support our beliefs and forget others. When it comes to understanding the EU, for instance, Brexiters will know the overall costs of membership, while remainers will cite its numerous advantages. Although the overall level of knowledge is equal on both sides, there is little overlap in the details.


Politics can also scramble our critical thinking skills. Psychological studies show that people fail to notice the logical fallacies in an argument if the conclusion supports their viewpoint; if they are shown contrary evidence, however, they will be far more critical of the tiniest hole in the argument. This phenomenon is known as “motivated reasoning”.

A high standard of education doesn’t necessarily protect us from these flaws. Graduates, for instance, often overestimate their understanding of their degree subject: although they remember the general content, they have forgotten the details. “People confuse their current level of understanding with their peak knowledge,” Prof Matthew Fisher of Southern Methodist University in Dallas, Texas, says. That false sense of expertise can, in turn, lead them to feel that they have the licence to be more closed-minded in their political views – an attitude known as “earned dogmatism”.

Little wonder that discussions about politics can leave us feeling that we are banging our heads against a brick wall – even when talking to people we might otherwise respect. Fortunately, recent psychological research also offers evidence-based ways towards achieving more fruitful discussions.

Ask ‘how’ rather than ‘why’
Thanks to the illusion of explanatory depth, many political arguments will be based on false premises, spoken with great confidence but with a minimal understanding of the issues at hand. For this reason, a simple but powerful way of deflating someone’s argument is to ask for more detail. “You need to get the ‘other side’ focusing on how something would play itself out, in a step by step fashion”, says Prof Dan Johnson at Washington and Lee University in Lexington, Virginia. By revealing the shallowness of their existing knowledge, this prompts a more moderate and humble attitude.


Anti-Brexit protester Steve Bray and a pro-Brexit protester face off outside parliament earlier this year. Photograph: Jack Taylor/Getty Images

In 2013, Prof Philip Fernbach at the University of Colorado, Boulder, and colleagues asked participants to describe in depth how cap-and-trade schemes – designed to limit companies' carbon emissions – worked. Subjects initially took strongly polarised views, but after the limits of their knowledge were exposed, their attitudes became more moderate and less biased.

It’s important to note that simply asking why people supported or opposed the policy – without requiring them to explain how it works – had no effect, since those reasons can be shallow (“It helps the environment”) and short on detail. You need to ask how something works to get the effect.

If you are debating the merits of a no-deal Brexit, you might ask someone to describe exactly how the UK’s international trade would change under WTO terms. If you are challenging a climate emergency denier, you might ask them to describe exactly how their alternative theories can explain the recent rise in temperatures. It’s a strategy that the broadcaster James O’Brien employs on his LBC talk show – to powerful effect.

Fill their knowledge gap with a convincing story
If you are trying to debunk a particular falsehood – like a conspiracy theory or fake news – you should make sure that your explanation offers a convincing, coherent narrative that fills all the gaps left in the other person’s understanding.

Consider the following experiment by Prof Brendan Nyhan of the University of Michigan and Prof Jason Reifler of the University of Exeter. Subjects read stories about a fictional senator allegedly under investigation for bribery who had subsequently resigned from his post. Written evidence – a letter from prosecutors confirming his innocence – did little to change the participants’ suspicions of his guilt. But when offered an alternative explanation for his resignation – to take on another role – participants changed their minds. The same can be seen in murder trials: people are more likely to accept someone’s innocence if another suspect has also been accused, since that fills the biggest gap in the story: whodunnit.


Boris Johnson, Jeremy Hunt, Michael Gove, Sajid Javid and Rory Stewart taking part in a BBC TV debate earlier this month. Photograph: Jeff Overs/BBC/PA

The persuasive power of well-constructed narratives means that it’s often useful to discuss the sources of misinformation, so that the person can understand why they were being misled in the first place. Anti-vaxxers, for instance, may believe a medical conspiracy to cover up the supposed dangers of vaccines. You are more likely to change minds if you replace that narrative with an equally cohesive and convincing story – such as Andrew Wakefield’s scientific fraud, and the fact that he was set to profit from his paper linking autism to MMR vaccines. Just stating the scientific evidence will not be as persuasive.

Reframe the issue
Each of our beliefs is deeply rooted in a much broader and more complex political ideology. Climate crisis denial, for instance, is now inextricably linked to beliefs in free trade, capitalism and the dangers of environmental regulation.

Attacking one issue may therefore threaten to unravel someone’s whole worldview – a feeling that triggers emotionally charged motivated reasoning. It is for this reason that highly educated Republicans in the US deny the overwhelming evidence for the climate crisis.

You are not going to alter someone’s whole political ideology in one discussion, so a better strategy is to disentangle the issue at hand from their broader beliefs, or to explain how the facts can still be accommodated into their worldview. A free-market capitalist who denies global warming might be far more receptive to the evidence if you explain that the development of renewable energies could lead to technological breakthroughs and generate economic growth.

Appeal to an alternative identity

If the attempt to reframe the issue fails, you might have more success by appealing to another part of the person’s identity entirely.

Someone’s political affiliation will never completely define them, after all. Besides being a conservative or a socialist, a Brexiter or a remainer, we associate ourselves with other traits and values – things like our profession, or our role as a parent. We might see ourselves as a particularly honest person, or someone who is especially creative. “All people have multiple identities,” says Prof Jay Van Bavel at New York University, who studies the neuroscience of the “partisan brain”. “These identities can become active at any given time, depending on the circumstances.”


It’s natural that when talking about politics, the salient identity will be our support for a particular party or movement. But when people are asked to first reflect on their other, nonpolitical values, they tend to become more objective in discussion on highly partisan issues, as they stop viewing facts through their ideological lens.

You could try to use this to your advantage during a heated conversation, with subtle flattery that appeals to another identity and its set of values; if you are talking to a science teacher, you might try to emphasise their capacity to appraise evidence even-handedly. The aim is to help them recognise that they can change their mind on certain issues while staying true to other important elements of their personality.

Persuade them to take an outside perspective

Another simple strategy to encourage a more detached and rational mindset is to ask your conversation partner to imagine the argument from the viewpoint of someone from another country. How, for example, would someone in Australia or Iceland view Boris Johnson as our new prime minister?

Prof Ethan Kross at the University of Michigan, and Prof Igor Grossmann at the University of Waterloo in Ontario, Canada, have shown that this strategy increases “psychological distance” from the issue at hand and cools emotionally charged reasoning so that you can see things more objectively. During the US presidential elections, for instance, their participants were asked to consider how someone in Iceland would view the candidates. They were subsequently more willing to accept the limits of their knowledge and to listen to alternative viewpoints; after the experiment, they were even more likely to join a bipartisan discussion group.


The front pages of two New York newspapers on Friday 2 June 2017, as Donald Trump pledged to withdraw the US from the Paris climate agreement. Photograph: Richard B Levine/Alamy

This is only one way to increase someone’s psychological distance, and there are many others. If you are considering policies with potentially long-term consequences, you could ask them to imagine viewing the situation through the eyes of someone in the future. However you do it, encouraging this shift in perspective should make your friend or relative more receptive to the facts you are presenting, rather than simply reacting with knee-jerk dismissals.

Be kind
Here’s a lesson that certain polemicists in the media might do well to remember – people are generally much more rational in their arguments, and more willing to own up to the limits of their knowledge and understanding, if they are treated with respect and compassion. Aggression, by contrast, leads them to feel that their identity is threatened, which in turn can make them closed-minded.

Assuming that the purpose of your argument is to change minds, rather than to signal your own superiority, you are much more likely to achieve your aims by arguing gently and kindly rather than belligerently, and affirming your respect for the person, even if you are telling them some hard truths. As a bonus, you will also come across better to onlookers. “There’s a lot of work showing that third-party observers always attribute high levels of competence when the person is conducting themselves with more civility,” says Dr Joe Vitriol, a psychologist at Lehigh University in Bethlehem, Pennsylvania. As Lady Mary Wortley Montagu put it in the 18th century: “Civility costs nothing and buys everything.”

Thursday 26 July 2018

How do beliefs arise – Thinking in Bets: Making smarter decisions when you don’t have all the facts – by Annie Duke, Part 2

Excerpts


How we think we form abstract beliefs:
1. We hear something.
2. We think about it and vet it, determining whether it is true or false; only after that
3. We form our belief.

We actually form abstract beliefs this way:
1. We hear something.
2. We believe it to be true.
3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.

If we were good at updating our beliefs based on new information, our haphazard belief formation process might cause us relatively few problems. Sadly, this is not the way it works. We form beliefs without vetting most of them and maintain them even after receiving clear, corrective information.

Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.

“We do not simply react to an event…. We behave according to what we bring to the occasion” (Hastorf and Cantril). Our beliefs affect how we process all new things, “whether the thing is a football game, a presidential candidate, Communism or spinach.”

Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning.