Nobody’s political opinions are just the pure, objective, unvarnished truth. Except yours, obviously
When it comes to the important issues, I’m pretty sure my opinions are just right. Of course I am: if I thought they were wrong, I’d trade them in for some different ones. But in reality, there’s plenty of evidence to suggest that we’re all at least somewhat subject to bias – that my support for stricter gun control laws here in the US, for example, is partly based on wanting to support my team. Tell Republicans that some imaginary policy is a Republican one, as the psychologist Geoffrey Cohen did in 2003, and they’re much more likely to support it, even if it runs counter to Republican values. But ask them why they support it, and they’ll deny that party affiliation played a role. (Cohen found something similar for Democrats. Maybe I mentioned Republicans more prominently because I’m biased?)
Surely, though, if you tell people you’re giving them biased information – if you specifically draw their attention to the risk of being led astray by bias – they’ll begin to question their own objectivity? Nope: even then, they’ll insist they’re reaching an unbiased conclusion, if a new paper by five Princeton researchers (which I found via Tom Jacobs at Pacific Standard) is anything to go by.
Emily Pronin and her colleagues asked Princeton students, and other people recruited online, to look at 80 paintings, and to give each a score from 1 to 10 based on their artistic merit. Half of the subjects weren’t told the artists’ identities. The other half were allowed to see a name, purportedly that of the painter of each picture. In fact, those names were a mixture of famous artists and names pulled from the phone book. As you’d predict, those who saw the names were biased in favour of famous artists. But even though they acknowledged the risk of bias, when asked to assess their own objectivity, they didn’t view their judgments as any more biased as a result.
Even when the risk of bias was explicitly pointed out to them, people remained confident that they weren’t susceptible to it; indeed, they actually rated their performance as more objective than they’d predicted it would be at the start of the test. “Even when people acknowledge that what they are about to do is biased,” the researchers write, “they still are inclined to see their resulting decisions as objective.”
This is more evidence for the “bias blind spot”, a term coined by Pronin for the head-spinning fact that we have a cognitive bias that leads us to believe we’re uniquely immune to cognitive biases. Take the famous better-than-average effect, or Lake Wobegon effect, whereby the majority of people think they’re above average on any number of measures – their driving skills, their popularity, the quality of their relationships – when clearly they can’t all be right. It turns out the bias also applies to bias. In other words, we’re convinced that we’re better than most at not falling victim to bias. We seem to imagine we’re transparent to ourselves: that when we turn our attention within, we can clearly see all the factors influencing our decisions. The study participants “used a strategy that they thought was biased,” the researchers note, “and thus they probably expected to feel some bias when using it. The absence of that feeling may have made them more confident in their objectivity.”
This helps explain, for example, why it’s often better for companies to hire people, or colleges to admit students, using objective checklists, rather than interviews that rely on gut feelings. As Jacobs notes, it’s also why some orchestras ask musicians to audition from behind screens, so that only their music can be judged. It’s no good relying on the judges’ sincere confidence that they’d never let sexism or other biases get in the way. They may really believe it – but they’re probably wrong. Bias spares nobody. Except me, of course.