
Wednesday 29 June 2022

Being smart makes our bias worse: Life is Poker not Chess - 3

 

Abridged and adapted from Thinking in Bets by Annie Duke


We bet based on what we believe about the world. This is very good news: part of the skill in life comes from learning to be a better belief calibrator, using experience and information to more objectively update our beliefs so that they more accurately represent the world. The more accurate our beliefs, the better the foundation of the bets we make. However, there is also some bad news: our beliefs can be way, way off.


Hearing is believing


We form beliefs in a haphazard way, believing all sorts of things based just on what we hear out in the world but haven’t researched for ourselves.


This is how we think we form abstract beliefs:


  1. We hear something

  2. We think about it and vet it, determining whether it is true or false; only after that

  3. We form our belief


It turns out, though, that we actually form abstract beliefs this way:


  1. We hear something

  2. We believe it to be true

  3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.


These belief-formation methods evolved out of a need for efficiency, not accuracy. In fact, questioning what you see or hear could get you eaten in the jungle. But although most of us no longer live in a jungle, we have failed to develop the degree of scepticism needed to deal with the material available in the modern social media age. This general belief-formation process can affect our decision making in areas with significant consequences.


If we were good at updating our beliefs based on new information, our haphazard belief formation process might cause relatively few problems. Sadly, we form beliefs without vetting most of them, and maintain them even after receiving clear, corrective information.


Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.


The stubbornness of beliefs


Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information processing pattern is called motivated reasoning.


Fake news works because people who already hold beliefs consistent with the story generally won’t question the evidence. Disinformation is even more powerful because the confirmable facts in the story make it feel like the information has been vetted, adding to the power of the narrative being pushed.


Fake news isn’t meant to change minds. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them. The Internet is a playground for motivated reasoning. It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available. Yet, we gravitate towards sources that confirm our beliefs. Every flavour is out there, but we tend to stick with our favourite. 


Even when directly confronted with facts that disconfirm our beliefs, we don’t let facts get in the way.


Being smart makes it worse


Surprisingly, being smart can actually make bias worse. The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.


Blind spot bias is an irrationality whereby people are good at recognising biased reasoning in others but are blind to bias in themselves. Research has found that blind spot bias is actually greater the smarter you are. Furthermore, people who were aware of their own biases were not better able to overcome them.


Dan Kahan discovered that the more numerate people made more mistakes interpreting data on emotionally charged topics than the less numerate subjects sharing the same beliefs. It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.


Wanna bet?


Imagine taking part in a conversation with a friend about the movie Chintavishtayaya Shyamala. Best film of all time, introduced a bunch of new techniques by which directors could contribute to storytelling. ‘Obviously, it won the national award,’ you gush, as part of a list of superlatives the film unquestionably deserves.


Then your friend says, ‘Wanna bet?’


Suddenly, you are not so sure. That challenge puts you on your heels, causing you to back off your declaration and question the belief that you just declared with such assurance.


Remember the order in which we form abstract beliefs:


  1. We hear something

  2. We believe it to be true

  3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.


‘Wanna bet?’ triggers us to engage in that third step that we only sometimes get to. Being asked if we're willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs. The more objective we are, the more accurate our beliefs become. And the person who wins bets over the long run is the one with the more accurate beliefs.
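That long-run claim is easy to sanity-check with a toy simulation. The sketch below is my own illustration, not anything from the book: it pits a hypothetical well-calibrated bettor against a sloppier one over many bets priced halfway between their estimates. The betting scheme and noise parameters are assumptions made up purely for the example.

```python
"""Toy illustration (not from Thinking in Bets): over many bets,
the bettor whose probability estimates are closer to the truth
ends up with the money."""
import random

random.seed(42)


def estimate(true_p, noise):
    """Return a probability estimate: the truth blurred by `noise`."""
    return min(0.99, max(0.01, true_p + random.uniform(-noise, noise)))


def simulate(n_events=100_000, sharp_noise=0.05, sloppy_noise=0.30):
    """Price each bet halfway between the two bettors' estimates.

    Whoever thinks the event is more likely than the agreed price buys
    (wins 1 - price if the event happens, loses price if it doesn't);
    the other bettor takes the opposite side, so the bet is zero-sum.
    """
    sharp_profit = 0.0
    for _ in range(n_events):
        true_p = random.random()                   # unknown state of the world
        p_sharp = estimate(true_p, sharp_noise)    # well-calibrated belief
        p_sloppy = estimate(true_p, sloppy_noise)  # haphazard belief
        price = (p_sharp + p_sloppy) / 2           # odds both sides accept
        outcome = 1 if random.random() < true_p else 0
        side = 1 if p_sharp > price else -1        # +1: sharp buys, -1: sharp sells
        sharp_profit += side * (outcome - price)   # the sloppy bettor loses this amount
    return sharp_profit


if __name__ == "__main__":
    profit = simulate()
    print(f"Better-calibrated bettor's profit over 100,000 bets: {profit:+.1f} units")
```

Run repeatedly, the better-calibrated bettor's profit stays firmly positive, which is the sense in which the person with the more accurate beliefs wins the bets over the long run.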


Of course, in most instances, the person offering to bet isn’t actually looking to put any money on it. They are just making a point - a valid point that perhaps we overstated our conclusion or made our statement without including relevant caveats.


I’d imagine that if you went around challenging everyone with ‘Wanna bet?’ it would be difficult to make friends and you’d lose the ones you have. But that doesn’t mean we can’t change the framework for ourselves in the way we think about our biases, decisions and opinions.


Monday 30 May 2022

On Fact, Opinion and Belief

 Annie Duke in 'Thinking in Bets'


What exactly is the difference between fact, opinion and belief?

Fact (noun): A piece of information that can be backed up by evidence.

Belief (noun): A state or habit of mind in which trust or confidence is placed in some person or thing; something accepted or considered to be true.

Opinion (noun): A view, judgement or appraisal formed in the mind about a particular matter.

The main difference here is that we can verify facts, but opinions and beliefs are not verifiable. Until relatively recently, most people would count as facts things like numbers, dates and photographic accounts that we can all agree upon.

More recently, it has become commonplace to question even the most mundane objective sources of fact, like eyewitness accounts, and credible peer-reviewed science, but that is a topic for another day.

 How we think we form our beliefs:

  1. We hear something;

  2. We think about it and vet it, determining whether it is true or false; only after that

  3. We form our belief

Actually, we form our beliefs:

  1. We hear something;

  2. We believe it to be true;

  3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.


As psychology professor Daniel Gilbert puts it, “People are credulous creatures who find it very easy to believe and very difficult to doubt”.


Our default is to believe that what we hear and read is true. Even when that information is clearly presented as being false, we are still likely to process it as true.

For example, some people believe that we use only 10% of our brains. If you hold that belief, did you ever research it for yourself?

People usually say it is something they heard, but they have no idea where or from whom. Yet they are confident that it is true. That should be proof enough that the way we form beliefs is foolish. And, for the record, we actually use all parts of our brain.

Our beliefs drive the way we process information. We form beliefs without vetting or testing most of them, and we maintain them even after receiving clear, corrective information.

Once a belief is lodged, it becomes difficult to dislodge it from our thinking. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief. We rarely challenge the validity of confirming evidence, and we ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning.

Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information.

Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.

Fake news works because people who already hold beliefs consistent with the story generally won’t question the evidence. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them. Social media is a playground for motivated reasoning. It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available. Yet, we gravitate towards sources that confirm our beliefs, that agree with us. Every flavour is out there, but we tend to stick with our favourite. 

Even when directly confronted with facts that disconfirm our beliefs, we don’t let facts get in the way of our opinions.


Being Smart Makes It Worse


The popular wisdom is that the smarter you are, the less susceptible you are to fake news or disinformation. After all, smart people are more likely to analyze and effectively evaluate where the information is coming from, right? Part of being ‘smart’ is being good at processing information, parsing the quality of an argument and the credibility of the source.


Surprisingly, being smart can actually make bias worse. The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.


Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of the instances where being smart and aware of our capacity for irrationality alone doesn’t help us refrain from biased reasoning. As with visual illusions, we can’t make our minds work differently than they do no matter how smart we are. Just as we can’t unsee an illusion, intellect or will power alone can’t make us resist motivated reasoning.


Wanna Bet?


Imagine taking part in a conversation with a friend about the movie Citizen Kane. Best film of all time, introduced a bunch of new techniques by which directors could contribute to storytelling. “Obviously, it won the best picture Oscar,” you gush, as part of a list of superlatives the film unquestionably deserves.


Then your friend says, “Wanna bet?”


Suddenly, you’re not so sure. That challenge puts you on your heels, causing you to back off your declaration and question the belief that you just declared with such assurance.


When someone challenges us to bet on a belief, signalling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us.


  • How do I know this?

  • Where did I get this information?

  • Who did I get it from?

  • What is the quality of my sources?

  • How much do I trust them?

  • How up to date is my information?

  • How much information do I have that is relevant to the belief?

  • What other things like this have I been confident about that turned out not to be true?

  • What are the other plausible alternatives?

  • What do I know about the person challenging my belief?

  • What is their view of how credible my opinion is?

  • What do they know that I don’t know?

  • What is their level of expertise?

  • What am I missing?


Remember the order in which we form our beliefs:


  1. We hear something;

  2. We believe it to be true;

  3. Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.


“Wanna bet?” triggers us to engage in that third step that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs.


A lot of good can result from someone saying, “Wanna bet?” Offering a wager brings the risk out in the open, making explicit what is implicit (and frequently overlooked). The more we recognise that we are betting on our beliefs (with our happiness, attention, health, money, time or some other limited resource), the more likely we are to temper our statements, getting closer to the truth as we acknowledge the risk inherent in what we believe.


Once we start doing this (at the risk of losing friends), we are more likely to recognise that there is always a degree of uncertainty, that we are generally less sure than we thought we were, and that practically nothing is black and white, 0% or 100%. And that's a pretty good philosophy for living.

Friday 4 June 2021

Have you seen Groupthink in action?

Tim Harford in The FT 

In his acid parliamentary testimony last week, Dominic Cummings, the prime minister’s former chief adviser, blamed a lot of different people and things for the UK’s failure to fight Covid-19 — including “groupthink”. 

Groupthink is unlikely to fight back. It already has a terrible reputation, not helped by its Orwellian ring, and the term is used so often that I begin to fear that we have groupthink about groupthink. 

So let’s step back. Groupthink was made famous in a 1972 book by psychologist Irving Janis. He was fascinated by the Bay of Pigs fiasco in 1961, in which a group of perfectly intelligent people in John F Kennedy’s administration made a series of perfectly ridiculous decisions to support a botched coup in Cuba. How had that happened? How can groups of smart people do such stupid things? 

An illuminating metaphor from Scott Page, author of The Difference, a book about the power of diversity, is that of the cognitive toolbox. A good toolbox is not the same thing as a toolbox full of good tools: two dozen top-quality hammers will not do the job. Instead, what’s needed is variety: a hammer, pliers, a saw, a choice of screwdrivers and more. 

This is obvious enough and, in principle, it should be obvious for decision-making too: a group needs a range of ideas, skills, experience and perspectives. Yet when you put three hammers on a hiring committee, they are likely to hire another hammer. This “homophily” — hanging out with people like ourselves — is the original sin of group decision-making, and there is no mystery as to how it happens. 

But things get worse. One problem, investigated by Cass Sunstein and Reid Hastie in their book Wiser, is that groups intensify existing biases. One study looked at group discussions about then-controversial topics (climate change, same-sex marriage, affirmative action) by groups in left-leaning Boulder, Colorado, and in right-leaning Colorado Springs. 

Each group contained six individuals with a range of views, but after discussing those views with each other, the Boulder groups bunched sharply to the left and the Colorado Springs groups bunched similarly to the right, becoming both more extreme and more uniform within the group. In some cases, the emergent view of the group was more extreme than the prior view of any single member. 

One reason for this is that when surrounded with fellow travellers, people became more confident in their own views. They felt reassured by the support of others. 

Meanwhile, people with contrary views tended to stay silent. Few people enjoy being publicly outnumbered. As a result, a false consensus emerged, with potential dissenters censoring themselves and the rest of the group gaining a misplaced sense of unanimity. 

The Colorado experiments studied polarisation but this is not just a problem of polarisation. Groups tend to seek common ground on any subject from politics to the weather, a fact revealed by “hidden profile” psychology experiments. In such experiments, groups are given a task (for example, to choose the best candidate for a job) and each member of the group is given different pieces of information. 

One might hope that each individual would share everything they knew, but instead what tends to happen is that people focus, redundantly, on what everybody already knows, rather than unearthing facts known to only one individual. The result is a decision-making disaster. 

These “hidden profile” studies point to the heart of the problem: group discussions aren’t just about sharing information and making wise decisions. They are about cohesion — or, at least, finding common ground to chat about. 

Reading Charlan Nemeth’s No! The Power of Disagreement In A World That Wants To Get Along, one theme is that while dissent leads to better, more robust decisions, it also leads to discomfort and even distress. Disagreement is valuable but agreement feels so much more comfortable. 

There is no shortage of solutions to the problem of groupthink, but to list them is to understand why they are often overlooked. The first and simplest is to embrace decision-making processes that require disagreement: appoint a “devil’s advocate” whose job is to be a contrarian, or practise “red-teaming”, with an internal group whose task is to play the role of hostile actors (hackers, invaders or simply critics) and to find vulnerabilities. The evidence suggests that red-teaming works better than having a devil’s advocate, perhaps because dissent needs strength in numbers. 

A more fundamental reform is to ensure that there is a real diversity of skills, experience and perspectives in the room: the screwdrivers and the saws as well as the hammers. This seems to be murderously hard. 

When it comes to social interaction, the aphorism is wrong: opposites do not attract. We unconsciously surround ourselves with like-minded people. 

Indeed, the process is not always unconscious. Boris Johnson’s cabinet could have contained Greg Clark and Jeremy Hunt, the two senior Conservative backbenchers who chair the committees to which Dominic Cummings gave his evidence about groupthink. But it does not. Why? Because they disagree with him too often. 

The right groups, with the right processes, can make excellent decisions. But most of us don’t join groups to make better decisions. We join them because we want to belong. Groupthink persists because groupthink feels good.

Sunday 20 May 2018

Why the 'Right to Believe' Is Not a Right to Believe Whatever You Want

Daniel De Nicola in The Wire.In


Do we have the right to believe whatever we want to believe? This supposed right is often claimed as the last resort of the wilfully ignorant, the person who is cornered by evidence and mounting opinion: ‘I believe climate change is a hoax whatever anyone else says, and I have a right to believe it!’ But is there such a right?

We do recognise the right to know certain things. I have a right to know the conditions of my employment, the physician’s diagnosis of my ailments, the grades I achieved at school, the name of my accuser and the nature of the charges, and so on. But belief is not knowledge.
Beliefs are factive: to believe is to take to be true. It would be absurd, as the analytic philosopher G.E. Moore observed in the 1940s, to say: ‘It is raining, but I don’t believe that it is raining.’ Beliefs aspire to truth – but they do not entail it. Beliefs can be false, unwarranted by evidence or reasoned consideration. They can also be morally repugnant. Among likely candidates: beliefs that are sexist, racist or homophobic; the belief that proper upbringing of a child requires ‘breaking the will’ and severe corporal punishment; the belief that the elderly should routinely be euthanised; the belief that ‘ethnic cleansing’ is a political solution, and so on. If we find these morally wrong, we condemn not only the potential acts that spring from such beliefs, but the content of the belief itself, the act of believing it, and thus the believer.

Such judgments can imply that believing is a voluntary act. But beliefs are often more like states of mind or attitudes than decisive actions. Some beliefs, such as personal values, are not deliberately chosen; they are ‘inherited’ from parents and ‘acquired’ from peers, acquired inadvertently, inculcated by institutions and authorities, or assumed from hearsay. For this reason, I think, it is not always the coming-to-hold-this-belief that is problematic; it is rather the sustaining of such beliefs, the refusal to disbelieve or discard them that can be voluntary and ethically wrong.

If the content of a belief is judged morally wrong, it is also thought to be false. The belief that one race is less than fully human is not only a morally repugnant, racist tenet; it is also thought to be a false claim – though not by the believer. The falsity of a belief is a necessary but not sufficient condition for a belief to be morally wrong; neither is the ugliness of the content sufficient for a belief to be morally wrong. Alas, there are indeed morally repugnant truths, but it is not the believing that makes them so. Their moral ugliness is embedded in the world, not in one’s belief about the world.

‘Who are you to tell me what to believe?’ replies the zealot. It is a misguided challenge: it implies that certifying one’s beliefs is a matter of someone’s authority. It ignores the role of reality. Believing has what philosophers call a ‘mind-to-world direction of fit’. Our beliefs are intended to reflect the real world – and it is on this point that beliefs can go haywire. There are irresponsible beliefs; more precisely, there are beliefs that are acquired and retained in an irresponsible way. One might disregard evidence; accept gossip, rumour, or testimony from dubious sources; ignore incoherence with one’s other beliefs; embrace wishful thinking; or display a predilection for conspiracy theories.

I do not mean to revert to the stern evidentialism of the 19th-century mathematical philosopher William K Clifford, who claimed: ‘It is wrong, always, everywhere, and for anyone, to believe anything upon insufficient evidence.’ Clifford was trying to prevent irresponsible ‘overbelief’, in which wishful thinking, blind faith or sentiment (rather than evidence) stimulate or justify belief. This is too restrictive. In any complex society, one has to rely on the testimony of reliable sources, expert judgment and the best available evidence. Moreover, as the psychologist William James responded in 1896, some of our most important beliefs about the world and the human prospect must be formed without the possibility of sufficient evidence. In such circumstances (which are sometimes defined narrowly, sometimes more broadly in James’s writings), one’s ‘will to believe’ entitles us to choose to believe the alternative that projects a better life.

In exploring the varieties of religious experience, James would remind us that the ‘right to believe’ can establish a climate of religious tolerance. Those religions that define themselves by required beliefs (creeds) have engaged in repression, torture and countless wars against non-believers that can cease only with recognition of a mutual ‘right to believe’. Yet, even in this context, extremely intolerant beliefs cannot be tolerated. Rights have limits and carry responsibilities.

Unfortunately, many people today seem to take great licence with the right to believe, flouting their responsibility. The wilful ignorance and false knowledge that are commonly defended by the assertion ‘I have a right to my belief’ do not meet James’s requirements. Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas; that Barack Obama is Muslim; that the Earth is flat; or that climate change is a hoax. In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.

Believing, like willing, seems fundamental to autonomy, the ultimate ground of one’s freedom. But, as Clifford also remarked: ‘No one man’s belief is in any case a private matter which concerns himself alone.’ Beliefs shape attitudes and motives, guide choices and actions. Believing and knowing are formed within an epistemic community, which also bears their effects. There is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs – and that ethic both generates and limits our right to believe. If some beliefs are false, or morally repugnant, or irresponsible, some beliefs are also dangerous. And to those, we have no right.

Thursday 28 December 2017

I used to think people made rational decisions. But now I know I was wrong

Deborah Orr in The Guardian

It’s been coming on for a while, so I can’t claim any eureka moment. But something did crystallise this year. What I changed my mind about was people. More specifically, I realised that people cannot be relied upon to make rational choices. We would have fixed global warming by now if we were rational. Instead, there’s a stubborn refusal to let go of the idea that environmental degradation is a “debate” in which there are “two sides”.

Debating is a great way of exploring an issue when there is real room for doubt and nuance. But when a conclusion can be reached simply by assembling a mountain of known facts, debating is just a means of pitting the rational against the irrational. 

Humans like to think we are rational. Some of us are more rational than others. But, essentially, we are all slaves to our feelings and emotions. The trick is to realise this, and be sure to guard against it. It’s something that, in the modern world, we are not good at. Authentic emotions are valued over dry, dull authentic evidence at every turn.

I think that as individuality has become fetishised, our belief in our right to make half-formed snap judgments, based on little more than how we feel, has become problematically unchallengeable. When Uma Thurman declared that she would wait for her anger to abate before she spoke about Harvey Weinstein, it was, I believe, in recognition of this tendency to speak first and think later.

Good for her. The value of calm reasoning is not something that one sees acknowledged very often at the moment. Often, the feelings and emotions that form the basis of important views aren’t so very fine. Sometimes humans understand and control their emotions so little that they sooner or later coagulate into a roiling soup of anxiety, fear, sadness, self-loathing, resentment and anger which expresses itself however it can, finding objects to project its hurt and confusion on to. Like immigrants. Or transsexuals. Or liberals. Or Tories. Or women. Or men.

Even if the desire to find living, breathing scapegoats is resisted, untrammelled emotion can result in unwise and self-defeating decisions, devoid of any rationality. Rationality is a tool we have created to govern our emotions. That’s why education, knowledge, information is the cornerstone of democracy. And that’s why despots love ignorance.

Sometimes we can identify and harness the emotions we need to get us through the thing we know, rationally, that we have to do. It’s great when you’re in the zone. Even negative emotions can be used rationally. I, for example, use anger a lot in my work. I’m writing on it at this moment, just as much as I’m writing on a computer. I’ll stop in a moment. I’ll reach for facts to calm myself. I’ll reach for facts to make my emotions seem rational. Or maybe that’s just me. Whatever that means.


“‘Consciousness’ involves no executive or causal relationship with any of the psychological processes attributed to it” – David Oakley and Peter Halligan


It’s a fact that I can find some facts to back up my feelings about people. Just writing that down helps me to feel secure and in control. The irrationality of humans has been considered a fact since the 1970s, when two psychologists, Amos Tversky and Daniel Kahneman, showed that human decisions were often completely irrational, not at all in their own interests and based on “cognitive biases”. Their ideas were a big deal, and also formed the basis of Michael Lewis’s book, The Undoing Project.

More recent research – or more recent theory, to be precise – has rendered even Tversky and Kahneman’s ideas about the unreliability of the human mind overly rational.

Chasing the Rainbow: The Non-Conscious Nature of Being is a research paper from University College London and Cardiff University. Its authors, David Oakley and Peter Halligan, argue “that ‘consciousness’ contains no top-down control processes and that ‘consciousness’ involves no executive, causal, or controlling relationship with any of the familiar psychological processes conventionally attributed to it”.

Which can only mean that even when we think we’re being rational, we’re not even really thinking. That thing we call thinking – we don’t even know what it really is.

When I started out in journalism, opinion columns weren’t a big thing. Using the word “I” in journalism was frowned upon. The dispassionate dissemination of facts was the goal to be reached for.

Now so much opinion is published, in print and online, and so many people offer their opinions about the opinions, that people in our government feel comfortable in declaring that experts are overrated, and the president of the United States regularly says that anything he doesn’t like is “fake news”.

So, people. They’re a problem. That’s what I’ve decided. I’m part of a big problem. All I can do now is get my message out there.

Sunday 15 May 2016

How Little do Experts Know - On Ranieri and Leicester, One Media Expert Apologises

In July of last year I may have written an article suggesting that the Italian was likely to get Leicester City relegated from the Premier League

 
Leicester City manager Claudio Ranieri lifts the Premier League trophy. Photograph: Carl Recine/Reuters


Marcus Christenson in The Guardian


No one likes to be wrong. It is much nicer to be right. In life, however, it is not possible to be right all the time. We all try our best but there are times when things go horribly wrong.
I should know. In July last year I sat down to write an article about Claudio Ranieri. The 63-year-old had just been appointed the new manager of Leicester City and I decided, in my capacity as football editor at the Guardian, that I was the right person to write that piece.







I made that decision based on the following: I have lived and worked as a journalist in Italy and have followed Ranieri’s career fairly closely since his early days in management. I also made sure that I spoke to several people in Greece, where Ranieri’s last job before replacing Nigel Pearson at Leicester had ended in disaster, with the team losing against the Faroe Islands and the manager getting sacked.

It was quite clear to me that this was a huge gamble by Leicester and that it was unlikely to end well. And I was hardly the only one to be sceptical. Gary Lineker, the former Leicester striker and now Match of the Day presenter, tweeted “Claudio Ranieri? Really?” and followed it up by saying: “Claudio Ranieri is clearly experienced, but this is an uninspired choice by Leicester. It’s amazing how the same old names keep getting a go on the managerial merry-go-round.”

I started my article by explaining what had gone wrong in Greece (which was several things) before moving on to talk about the rest of his long managerial career, pointing out that he had never won a league title in any country, nor had he stayed at any club for more than two seasons since being in charge of Chelsea at the beginning of the 2000s.

I threw in some light-hearted “lines”, such as the fact that he was the manager in charge of Juventus when they signed Christian Poulsen (not really a Juventus kind of player) and proclaimed that the appointment was “baffling”.

I added: “In some ways, it seems as if the Leicester owners went looking for the anti-Nigel Pearson. Ranieri is not going to call a journalist an ostrich. He is not going to throttle a player during a match. He is not going to tell a supporter to ‘fuck off and die’, no matter how bad the abuse gets.”


Claudio Ranieri instructs his players during Greece’s defeat by the Faroe Islands, the Italian’s last game in charge of the Euro 2004 winners. Photograph: Thanassis Stavrakis/AP

Rather pleased with myself – thinking that I was giving the readers a good insight into the man and the manager – I also put a headline on the piece, which read: “Claudio Ranieri: the anti-Pearson … and the wrong man for Leicester City?”

I did not think much more of the piece until a few months later when Leicester were top of the league and showing all the signs of being capable of staying there.

After a while, the tweets started to appear from people pointing out that I may not have called this one right. As the season wore on, these tweets became more and more frequent, and they have been sent to me after every Leicester win since the turn of the year.

At some point in February I decided to go back and look at the piece again. It made for uncomfortable reading. I had said that describing his spell in charge of Greece as “poor” would be an understatement. I wrote that 11 years after being given the nickname “Tinkerman” because he changed his starting XI so often when in charge of Chelsea, he was still an incorrigible “Tinkerman”.

It gets worse. “Few will back him to succeed but one thing is for sure: he will conduct himself in an honourable and humble way, as he always has done,” the article said. “If Leicester wanted someone nice, they’ve got him. If they wanted someone to keep them in the Premier League, then they may have gone for the wrong guy.”

Ouch. Reading it back again I was faced with a couple of uncomfortable questions, the key one being: “Who do you think you are, writing such a snobbish piece about a dignified man and a good manager?”

The second question was a bit easier to answer. Was this as bad as the “In defence of Nicklas Bendtner” article I wrote a couple of years ago? (The answer is “no”, by the way, few things come close to an error of judgment of that scale).

I would like to point out a few things though. I did get – as a very kind colleague pointed out – 50% of that last paragraph right. He clearly is a wonderful human being and, when Paolo Bandini spoke to several of his former players recently, one thing stood out: the incredible affection they still feel for this gentle 64-year-old.

All in all, though, there is no point defending the indefensible: I could not have got it more wrong.


At the start of this piece I said that no one likes to be wrong. Well, I was wrong about that too. I’ve enjoyed every minute of being embarrassingly wrong this season. Leicester is the best story that could have happened to football in this country, their triumph giving hope to all of us who want to start a season dreaming that something unthinkable might happen.

So thank you Leicester and thank you Claudio, it’s been quite wonderful.

Thursday 14 May 2015

The troubling flaws in forensic science

by Linda Geddes in BBC Future

“It has long been an axiom of mine that the little things are infinitely the most important.” So said the fictional detective, Sherlock Holmes. Armed with his finely honed skills of backwards reasoning, his trademark ability to solve unsolvable crimes often hinged on his revealing evidence too small to be noticed.
Holmes was an inspiration for the very founders of modern day forensic science. As the decades passed and the tools in their armoury grew, so too did the sheen of invincibility that surrounded their discipline. But there was a crucial chink in their methods that had been overlooked: subjectivity.
While the likes of Holmes’s successors in detective fiction may lead us to believe that forensic evidence is based on precise deduction, all too often it relies on a scientist’s personal opinion, rather than hard fact.
Science on trial
Consider the following case. In December 2009, Donald Gates walked out of his Arizona prison with $75 and a bus ticket to Ohio. After serving 28 years for a rape and murder he didn’t commit, he was a free man. Now the spotlight began to shift to the forensic technique that put him there: microscopic hair analysis.
Human hair is one of the most common types of evidence found at crime scenes. During the 80s and 90s, forensic analysts in the US and elsewhere often looked to the physical differences between hairs to determine whether those found at a crime scene matched hairs from a suspect – like Donald Gates.
When he stood trial in 1982, an FBI analyst called Michael Malone testified that hairs found on the body of the murder victim – a Georgetown University student called Catherine Schilling – were consistent with Donald Gates’ hairs. He added that the probability they came from anyone else was one in 10,000.
“That’s very compelling evidence, particularly when it comes from a witness wearing a white laboratory coat,” says Peter Neufeld, co-founder of the Innocence Project, a New York-based non-profit organisation that uses DNA evidence to overturn wrongful convictions.
The FBI is now reviewing several thousand cases as DNA testing sheds new light on the truth (Credit: Getty Images)
However, hair analysis is not purely objective; I might think two hairs look identical, but you might disagree. Even if we agree that two hairs match, no-one has ever figured out how many other hairs might be similarly indistinguishable from one another. “When a person says that the probability is one-in-10,000, that’s simply a made-up number,” says Neufeld. “There’s no data to support it.”
Donald Gates was finally exonerated when DNA testing revealed that the hairs didn’t belong to him after all. Two similar exonerations followed soon afterwards. As a result of these cases, the FBI is now reviewing several thousand cases in which its scientists may have offered similarly misleading testimony. Last month, it announced that of the 268 cases reviewed so far that went to trial, 96% involved scientifically invalid testimony or other errors by FBI agents. Among those convicted, 33 received death sentences, and nine have already been executed.
The FBI’s review won’t necessarily overturn the convictions, but it does mean that they need to be reconsidered carefully. Lawyers scrutinising these cases must work out what other evidence was presented in court; if convictions hinged on flawed hair testimony, retrials and exonerations may follow. In cases where the original physical evidence still exists, DNA testing may shed new light on the truth.
Damning report
Even trusted lines of evidence, such as fingerprint analysis, are not water-tight. Research has shown that the same fingerprint expert can reach a different conclusion about the same fingerprints depending on the context they’re given about a case.
Based in part on these findings, in 2009 the National Academy of Sciences in the US published a report on the state of forensic science. Commissioned in response to a string of laboratory scandals and miscarriages of justice, its conclusions were damning. “Testimony based on faulty forensic science analyses may have contributed to the wrongful conviction of innocent people,” it said. “In a number of disciplines, forensic science professionals have yet to establish either the validity of their approach or the accuracy of their conclusions.”
The report was a wake-up call, not just for forensic scientists in the US, but around the world. “What it exposed were significant scientific deficiencies across many of the different methods that we use, both to examine and interpret different types of evidence,” says Nic Daeid, a professor of forensic science at the University of Dundee in Scotland. 
Of all lines of forensic evidence, DNA analysis was considered to be the most objective. Resting on complex chemical analysis, it seems stringently scientific – a gold-standard for how forensic science should be done. Yet perhaps juries should not be too quick to trust the DNA analyses they see in court.
Even trusted lines of evidence, such as fingerprint analysis, are not water-tight (Credit: Thinkstock)
In 2010, while working as a reporter for New Scientist magazine, I teamed up with Itiel Dror from University College London, and Greg Hampikian from Boise State University in Idaho, to put this idea of DNA’s objectivity to the test.
We took DNA evidence from a real-life case – a gang-rape in Georgia, US – and presented it to 17 experienced analysts working in the same accredited government lab in the US.
In the original case, two analysts from the Georgia Bureau of Investigation concluded that the man who was ultimately convicted of the crime, Kerry Robinson, "could not be excluded" from the crime scene sample, based on his DNA profile. But when the evidence was shown to our 17 analysts, they reached very different conclusions; just one analyst agreed that Robinson "cannot be excluded". Four analysts said the evidence was inconclusive and 12 said he could be excluded.
Yet just because forensic science is subjective, this doesn’t mean it should be disregarded; it can still yield vital clues that can help to catch and convict murderers, rapists, and other criminals. “Subjectivity isn’t a bad word,” says Dror. “It doesn’t mean that the evidence isn’t reliable, but it is open to bias and contextual influences.”
Blind judgement
What’s needed are additional safeguards to shield forensic examiners against irrelevant information that might skew their judgement. A first step is to ensure they aren’t given irrelevant information, such as knowing that witnesses have placed the suspect at the crime scene, or that he has previous convictions for similar crimes. Another safeguard is to reveal the relevant information sequentially – and only when it is needed. “We need to give them the information that they need to do their job when they need it, but not extra information that’s irrelevant to what they’re doing and which could influence their perception and judgement,” says Dror.
In the US at least, this is starting to happen: a national commission on forensic science has been established, with the goal of strengthening the field – and this includes looking at human factors like cognitive bias. But similar strategies are needed elsewhere if forensic science is to rebuild its tattered reputation.
When it comes to deduction and proof, there is still much we can learn from Arthur Conan Doyle’s hero. As Sherlock Holmes also once said: "Eliminate all other factors, and the one which remains must be the truth."