
Tuesday, 29 August 2023

A level Economics: How to Improve Economic Forecasting

 Nicholas Gruen in The FT 


Today’s four-day weather forecasts are as accurate as one-day forecasts were 30 years ago. Economic forecasts, on the other hand, aren’t noticeably better. Former Federal Reserve chair Ben Bernanke should ponder this in his forthcoming review of the Bank of England’s forecasting. 

There’s growing evidence that we can improve. But myopia and complacency get in the way. Myopia is an issue because economists think technical expertise is the essence of good forecasting when, actually, two things matter more: forecasters’ understanding of the limits of their expertise and their judgment in handling those limits. 

Enter Philip Tetlock, whose 2005 book on geopolitical forecasting showed how little experts added to forecasting done by informed non-experts. To compare forecasts between the two groups, he forced participants to drop their vague weasel words — “probably”, “can’t be ruled out” — and specify exactly what they were forecasting and with what probability.  

That started sorting the sheep from the goats. The simple “point forecasts” provided by economists — such as “growth will be 3.0 per cent” — are doubly unhelpful in this regard. They’re silent about what success looks like. If I have forecast 3.0 per cent growth and actual growth comes in at 3.2 per cent — did I succeed or fail? Such predictions also don’t tell us how confident the forecaster is. 

By contrast, “a 70 per cent chance of rain” specifies a clear event with a precise estimation of the weather forecaster’s confidence. Having rigorously specified the rules of the game, Tetlock has since shown how what he calls “superforecasting” is possible and how diverse teams of superforecasters do even better.  
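Tetlock's tournaments score such probability forecasts with the Brier score, which rewards both getting the event right and being honest about one's confidence. A minimal sketch of the idea (illustrative only, not from the article; the forecasts shown are invented):

```python
def brier_score(forecast_prob: float, outcome: int) -> float:
    """Squared error between a probability forecast and what happened.
    outcome is 1 if the event occurred, 0 if it did not.
    0.0 is perfect; 2.0 is the worst possible in this two-outcome form."""
    # Sum of squared errors over both outcomes (event happens / doesn't happen).
    return (forecast_prob - outcome) ** 2 + ((1 - forecast_prob) - (1 - outcome)) ** 2

# "A 70 per cent chance of rain" — and it rained.
print(brier_score(0.70, 1))  # 0.18
# An overconfident 95 per cent forecast that turned out wrong.
print(brier_score(0.95, 0))  # 1.805
```

A vague point forecast such as "growth will be 3.0 per cent" cannot be scored this way at all, which is the article's point: without a specified event and a stated probability, there is nothing to hold the forecaster to.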

What qualities does Tetlock see in superforecasters? As well as mastering necessary formal techniques, they’re open-minded, careful, curious and self-critical — in other words, they’re not complacent. Aware, like Socrates, of how little they know, they’re constantly seeking to learn — from unfolding events and from colleagues. 

Superforecasters actively resist the pull to groupthink, which is never far away in most organisations — or indeed, in the profession of economics as a whole, as practitioners compensate for their ignorance by keeping close to the herd. The global financial crisis is just one example of an event that economists collectively failed to warn the world about. 

There are just five pages referencing superforecasting on the entire Bank of England website — though that's more than you'll find on other central banks' sites. 

Bernanke could recommend that we finally set about the search for economic superforecasters. He should also propose that the BoE lead the world by open sourcing economic forecasting.  

In this scenario, all models used would be released fully documented and a “prediction tournament” would focus on the key forecasts. Outsiders would be encouraged to enter the tournament — offering their own forecasts, their own models and their own reconfiguration or re-parameterisation of the BoE’s models. Prizes could be offered for the best teams and the best schools and universities.  

The BoE’s forecasting team(s) should also compete. The BoE could then release its official forecasts using the work it has the most confidence in, whether it is that of its own team(s), outsiders or some hybrid option. Over time, we’d be able to identify which ones were consistently better.  
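One hedged sketch of how "consistently better" might be judged in such a tournament: score each team's probability forecasts over successive rounds and rank them by average Brier score. The team names and numbers below are invented for illustration:

```python
from statistics import mean

# Hypothetical Brier scores per forecasting round (lower is better); data is invented.
scores = {
    "BoE staff model": [0.32, 0.28, 0.41, 0.30],
    "University team": [0.25, 0.35, 0.22, 0.27],
    "Hybrid ensemble": [0.21, 0.24, 0.26, 0.23],
}

# Rank teams by mean score across rounds; the best track record would
# inform which work feeds the official forecast.
for team, team_scores in sorted(scores.items(), key=lambda kv: mean(kv[1])):
    print(f"{team}: mean Brier score {mean(team_scores):.3f}")
```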

Using this formula, I predict that the Bank of England’s official forecasts would find their way towards the top of the class — in the UK, and the world.
