
Saturday 22 June 2024

In Broken Britain, even the statistics don’t work

 Tim Harford in The FT 


From the bone-jarring potholes to the human excrement regularly released into British rivers, the country’s creaking infrastructure is one of the most visceral manifestations of the past 15 years of stagnation. To these examples of the shabby neglect of the essential underpinnings of modern life, let me add another: our statistical infrastructure. 

In her new book, How Infrastructure Works, engineering professor Deb Chachra argues that infrastructure is an extraordinary collective achievement and a triumph of long-term thinking. She adds that a helpful starting point for defining infrastructure is “all of the stuff that you don’t think about”. 

Statistical infrastructure certainly matches those descriptions. The idea that someone needs to decide what information to gather, and how to gather it, rarely crosses our minds, any more than we give much thought to what we flush down the toilet, or to the fact that clean water comes from taps and electricity from the flick of a switch. 

As a result, the UK’s statistical system, administrative databases, and evidence base for policy are suffering the same depredations as the nation’s roads, prisons and sewers. Easiest to measure are the inputs: the Office for National Statistics faces a 5 per cent real-terms cut to its budget this year, has been losing large numbers of experienced staff, and is hiring dramatically fewer people than it was five years ago. 

But it is more instructive to consider some of the problems. The ONS has struggled to produce accurate estimates of something as fundamental as the unemployment rate, as it tries to divide resources between the traditional-but-foundering Labour Force Survey and a streamlined-but-delayed replacement that has been in the pipeline since 2017. 

That is an embarrassment, but the ONS can’t be held responsible for other gaps in our statistical system. A favourite example of Will Moy, chief executive of the Campbell Collaboration, a non-profit producer of systematic reviews of evidence in social science, is that we know more about the nation’s golfing habits than about trends in robbery or rape. This is because the UK’s survey of sporting participation is larger than the troubled Crime Survey for England and Wales, recently stripped of its status as an official National Statistic because of concerns over data quality. Surely nobody made a deliberate decision to establish those curious statistical priorities, but they are the priorities nonetheless. They exemplify the British state’s haphazard approach to deciding what to measure and what to neglect. 

This is foolishness. The government spends more than £1,200bn a year — nearly £18,000 for each person in the country — and without solid statistics, that money is being spent with eyes shut. 

For an example of the highs and lows of statistical infrastructure, consider the National Tutoring Programme, which was launched in 2020 in an effort to offset the obvious harms caused by the pandemic’s disruption to the school system. When the Department for Education designed the programme, it was able to turn to the Education Endowment Foundation for a solid, practical evidence base for what type of intervention was likely to work well. The answer: high-quality tutoring in small groups. 

This was the statistical system, in its broadest sense, working as it should: the EEF is a charity backed by the Department for Education, and when the crisis hit it had already gathered the evidence base to provide solutions. Yet, as the Centre for Public Data recently lamented, the DfE lacked the most basic data needed to evaluate its own programme: how many disadvantaged pupils were receiving tutoring, how good the tutoring was, and what difference it made. The National Tutoring Programme could have gathered this information from the start, collecting evidence by design. But it did not. And as a result, we are left guessing whether this was money well spent. 

Good data is not cheap to collect — but it is good value, especially when thoughtfully commissioned or built into policymaking by default. One promising avenue is support for systematic research summaries such as those produced by the Cochrane Collaboration for medicine and the Campbell Collaboration for social science and policy. If you want to understand how to promote literacy in primary schools, or whether neighbourhood policing is effective, a good research synthesis will tell you what the evidence says. Just as important, by revealing the gaps in our knowledge it provides a basis for funding new research. 

Another exciting opportunity is for the government to gather and link the administrative data we all produce as a byproduct of our interactions with officialdom. A well-designed system can safeguard personal privacy while unlocking all manner of insights. 

But fundamentally, policymakers need to take statistics seriously. These numbers are the eyes and ears of the state. If we neglect them, waste and mismanagement are all but inevitable. 

Chachra writes, “We should be seeing [infrastructure systems], celebrating them, and protecting them. Instead, these systems have been invisible and taken for granted.” 

We have taken a lot of invisible systems for granted over the past 20 years. The Resolution Foundation has estimated that in this period, UK public investment has lagged the OECD average by a cumulative half a trillion pounds. That is a lot of catching up to do. The next government will need some quick wins. Investing in better statistical infrastructure might be one of them.
