
Friday 23 October 2020

The power of negative thinking

Tim Harford in The FT 


For a road sign to be a road sign, it needs to be placed in proximity to traffic. Inevitably, it is only a matter of time before someone drives into the pole. If the pole is sturdy, the results may be fatal. 

The 99% Invisible City, a delightful new book about the under-appreciated wonders of good design, explains a solution. The poles that support street furniture are often mounted on a “slip base”, which joins an upper pole to a mostly buried lower pole using easily breakable bolts. 

A car does not wrap itself around a slip-based pole; instead, the base gives way quickly. Some slip bases are even set at an angle, launching the upper pole into the air over the vehicle. The sign is easily repaired, since the base itself is undamaged. Isn’t that clever?

There are two elements to the cleverness. One is specific: the detailed design of the slip-base system. But the other, far more general, is a way of thinking which anticipates that things sometimes go wrong and then plans accordingly.

That way of thinking was evidently missing in England’s stuttering test-and-trace system, which, in early October, failed spectacularly. Public Health England revealed that 15,841 positive test results had neither been published nor passed on to contact tracers. 

The proximate cause of the problem was reported to be the use of an outdated file format in an Excel spreadsheet. Excel is flexible, and any idiot can use it, but it is not the right tool for this sort of job. It could fail in several disastrous ways; in this case, the spreadsheet simply ran out of rows to store the data, because the legacy .xls format caps each worksheet at 65,536 rows.
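To make that failure mode concrete, here is a minimal Python sketch, not the actual PHE pipeline: the point is simply that an export step can check a format's hard row cap and fail loudly instead of truncating silently. The function export_results and its signature are illustrative assumptions; only the 65,536-row limit of the legacy .xls format is a known fact.

```python
XLS_MAX_ROWS = 65_536  # 2**16 rows per worksheet in the legacy .xls format

def export_results(rows, max_rows=XLS_MAX_ROWS):
    """Refuse to write more rows than the target format can hold."""
    if len(rows) > max_rows:
        raise OverflowError(
            f"{len(rows):,} rows exceed the {max_rows:,}-row limit; "
            "switch formats or split the file rather than dropping data"
        )
    return rows  # safe to hand off to the writer

# A batch larger than the cap trips the guard instead of losing the overflow:
if __name__ == "__main__":
    try:
        export_results([{"case_id": i} for i in range(70_000)])
    except OverflowError as err:
        print(err)
```

A check like this is exactly the "what if we run out of rows?" question asked in advance: the system still fails, but it fails noisily, before anyone is told the missing results do not exist.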

But the deeper cause seems to be that nobody with relevant expertise had been invited to consider the failure modes of the system. What if we get hacked? What if someone pastes the wrong formula into the spreadsheet? What if we run out of numbers? 

We should all spend more time thinking about the prospect of failure and what we might do about it. It is a useful mental habit but it is neither easy nor enjoyable. 

We humans thrive on optimism. Without the capacity to banish worst-case scenarios from our minds, we could hardly live life at all. Who could marry, try for a baby, set up a business or do anything else that matters while obsessing about what might go wrong? It is more pleasant and more natural to hope for the best. 

We must be careful, then, when we allow ourselves to stare steadily at the prospect of failure. Stare too long, or with eyes too wide, and we will be so paralysed with anxiety that success, too, becomes impossible. 

Care is also needed in the steps we take to prevent disaster. Some precautions cause more trouble than they prevent. Any safety engineer can reel off a list of accidents caused by malfunctioning safety systems: too many backups add complexity and new ways to fail. 

My favourite example — described in the excellent book Meltdown by Chris Clearfield and András Tilcsik — was the fiasco at the Academy Awards of 2017, when La La Land was announced as the winner of the Best Picture Oscar that was intended for Moonlight. The mix-up was made possible by the existence of duplicates of each award envelope — a precaution that triggered the catastrophe.

But just because it is hard to think productively about the risk of failure does not mean we should give up. One gain is that of contingency planning: if you anticipate possible problems, you have the opportunity to prevent them or to prepare the ideal response. 

A second advantage is the possibility of rapid learning. When the aeronautical engineer Paul MacCready was working on human-powered aircraft in the 1970s, his plane — the Gossamer Condor — was designed to be easily modified and easily repaired after the inevitable crashes. (At one stage, the tail flap was adjusted by taping a Manila folder to it.) 

Where others had spent years failing to win the prestigious Kremer prize for human-powered flight, MacCready’s team succeeded in months. One secret to their success was that the feedback loop of fly → crash → adapt was quick and cheap.

Not every project is an aeroplane but there are plenty of analogies. When we launch a new project we might think about prototyping, gathering data, designing small experiments and avidly searching for feedback from the people who might see what we do not. 

If we expect that things will go wrong, we design our projects to make learning and adapting part of the process. When we ignore the possibility of failure, the failure that eventually comes is likely to be expensive and hard to learn from.

The third advantage of thinking seriously about failure is that we may turn away from projects that are doomed from the outset. From the invasion of Iraq to the process of Brexit, seriously exploring the daunting prospect of disaster might have provoked the wise decision not to start in the first place. 

But I have strayed a long way from the humble slip base. It would be nice if all failure could be anticipated so perfectly and elegantly. Alas, the world is a messier place. All around us are failures — of business models, of pandemic planning, even of our democratic institutions. It is fanciful to imagine designing slip bases for everything. 

Still: most things fail, sooner or later. Some fail gracefully, some disgracefully. It is worth giving that some thought.
