
Saturday, 3 October 2015

How to blame less and learn more

Matthew Syed in The Guardian

Accountability. We hear a lot about it. It’s a buzzword. Politicians should be accountable for their actions; social workers for the children they are supervising; nurses for their patients. But there’s a catastrophic problem with our concept of accountability.

Consider the case of Peter Connelly, better known as Baby P, a child who died at the hands of his mother, her boyfriend and her boyfriend’s brother in 2007. The perpetrators were sentenced to prison. But the media focused its outrage on a different group: mainly his social worker, Maria Ward, and Sharon Shoesmith, director of children’s services. The local council offices were surrounded by a crowd holding placards. In interviews, protesters and politicians demanded their sacking. “They must be held accountable,” it was said.

Many were convinced that the social work profession would improve its performance in the aftermath of the furore. This is what people think accountability looks like: a muscular response to failure. It is about forcing people to sit up and take responsibility. As one pundit put it: “It will focus minds.”

But what really happened? Did child services improve? In fact, social workers started leaving the profession en masse. The numbers entering the profession also plummeted. In one area, the council had to spend £1.5m on agency social work teams because it didn’t have enough permanent staff to handle a jump in referrals.

Those who stayed in the profession found themselves with bigger caseloads and less time to look after the interests of each child. They also started to intervene more aggressively, terrified that a child under their supervision would be harmed. The number of children removed from their families soared. £100m was needed to cope with new child protection orders.

Crucially, defensiveness started to infiltrate every aspect of social work. Social workers became cautious about what they documented. The bureaucratic paper trails got longer, but the words were no longer about conveying information; they were about back-covering. Precious information was concealed out of sheer terror of the consequences.

Almost every commentator estimates that the harm done to children by this attempt to “increase accountability” was severe. Performance collapsed. The number of children killed at the hands of their parents increased by more than 25% in the year following the outcry and remained higher for each of the next three years.

Let us take a step back. One of the most well-established human biases is called the fundamental attribution error. It is about how the sense-making part of the brain blames individuals, rather than systemic factors, when things go wrong. When volunteers are shown a film of a driver cutting across lanes, for example, they infer that he is selfish and out of control. And this inference may indeed turn out to be true. But the situation is not always as cut-and-dried.

After all, the driver may have the sun in his eyes or be swerving to avoid another car. To most observers looking from the outside in, these factors do not register. It is not that they think such possibilities are irrelevant; it is that often they don’t even consider them. The brain just sees the simplest narrative: “He’s a homicidal fool!”

Even in an absurdly simple event like this, then, it pays to pause, to look beneath the surface and to challenge the most reductionist narrative. This is what aviation, as an industry, does. When mistakes are made, investigations are conducted. A classic example comes from the 1940s, when a series of seemingly inexplicable accidents involved B-17 bombers. Pilots were pressing the wrong switches: instead of pressing the switch to lift the flaps, they were pressing the switch to lift the landing gear.

Should they have been penalised? Or censured? The industry commissioned an investigator to probe deeper. He found that the two switches were identical and side by side. Under the pressure of a difficult landing, pilots were pressing the wrong switch. It was an error trap, an indication that human error often emerges from deeper systemic factors. The industry responded not by sacking the pilots but by attaching a rubber wheel to the landing-gear switch and a small flap shape to the flaps control. The buttons now had an intuitive meaning, easily identified under pressure. Accidents of this kind disappeared overnight.

This is sometimes called forward accountability: the responsibility to learn lessons so that future people are not harmed by avoidable mistakes.

But isn’t this soft? Won’t people get sloppy if they are not penalised for mistakes? The truth is quite the reverse. If, after proper investigation, it turns out that a person was genuinely negligent, then punishment is not only justifiable but imperative. Professionals themselves demand this. In aviation, pilots are the most vocal in calling for punishments for colleagues who get drunk or demonstrate gross carelessness. And yet this kind of justifiable blame does not undermine openness, because management takes the time to find out what really happened before assigning it, giving professionals the confidence that they can speak up without being penalised for honest mistakes.

In 2001, the University of Michigan Health System introduced open reporting, guaranteeing that clinicians would not be pre-emptively blamed. As previously suppressed information began to flow, the system adapted. Reports of drug administration problems led to changes in labelling. Surgical errors led to redesigns of equipment. Malpractice claims dropped from 262 to 83. The number of claims against the University of Illinois Medical Centre fell by half in two years following a similar change. This is the power of forward accountability.

High-performance institutions, such as Google, aviation and pioneering hospitals, have grasped a precious truth. Failure is inevitable in a complex world. The key is to harness these lessons as part of a dynamic process of change. Kneejerk blame may look decisive, but it destroys the flow of information. World-class organisations interrogate errors, learn from them, and only blame after they have found out what happened.

And when Lord Laming reported on the Baby P case in 2009, was the blame heaped on social workers justified? There were allegations that the report’s findings had been prejudged. Even the investigators seemed terrified of what might happen to them if they did not appease the appetite for a scapegoat. It was final confirmation of how grotesquely distorted our concept of accountability has become.