Showing posts with label imperfect information. Show all posts

Wednesday 12 October 2016

Nobel prize winners’ research worked out a theory on worker productivity – then Amazon and Deliveroo proved it wrong

Ben Chu in The Independent


Financial incentives are important. We all know that’s true. If you were offered a job that paid £10 an hour and then someone else came up offering to pay you £11 an hour for identical work, which one would you choose?

Most of us would also accept that well-designed employment contracts can get more out of us. If we could take home more money for working harder (or more effectively), most of us would.

Bengt Holmstrom won the Nobel economics prize this week for his theoretical research on the optimum design for a worker’s contract to encourage the individual to work as productively as possible.

The work of Holmstrom and his fellow Nobel laureate, Oliver Hart, is subtle, recognising that the complexity of the world can cause simplistic piece-rate contracts or bonus systems to yield undesirable results.

For instance, if you pay teachers more based on exam results, you will find they “teach to the test” and neglect other important aspects of children’s education. If you reward CEOs primarily based on the firm’s share price performance you will find that they focus on boosting the short-term share price, rather than investing for the long-term health of the company.

Holmstrom and Hart also grappled with the problem of imperfect information. It is hard to measure an individual worker’s productivity, particularly when they are engaged in complex tasks.

So how can you design a contract based on individual performance? Holmstrom’s answer was that where measurement is impossible, or very difficult, pay contracts should be biased towards a fixed salary rather than variable payment for performance.

Yet when information on an employee’s performance is close to perfect, there can also be problems.

The information problem seems to be on the way to resolution in parts of the low-skill economy. Digital technology allows much closer monitoring of workers’ performance than in the past. Pickers at Amazon’s Swansea warehouse are issued with personal satnav computers which direct them around the giant warehouse on the most efficient routes, telling them which goods to collect and place in their trolleys. The devices also monitor the workers’ productivity in real time – and those who don’t meet the required output targets are “released” by the management.

The so-called “gig economy” is at the forefront of what some are labelling “management by algorithm”. The London-founded cycling food delivery service app Deliveroo recently tried to implement a new pay scale for riders. The company’s London boss said this new system based on fees per delivery would increase pay for the most efficient riders. UberEats – Uber's own meal delivery service – attempted something similar.

Yet the digital productivity revolution is encountering some resistance. The proposed changes by UberEats and Deliveroo provoked strikes from their workers. And there is a backlash against Amazon’s treatment of warehouse workers.

It is possible that some of this friction is as much about employment status as contract design and pay rates. One of the complaints of the UberEats and Deliveroo couriers is that they are not treated like employees at all.

It may also reflect the current state of the labour market. If people don’t want to work in inhuman warehouses or for demanding technology companies, why don’t they take a job somewhere else? But if there are not enough jobs in a particular region, people may have no choice. The employment rate is at an all-time high, but there’s still statistical evidence that many workers would like more hours if they could get them.

Yet the new technology does pose tough questions about worker treatment. And there is no reason why these techniques of digital monitoring of employees should be confined to the gig economy or low-skill warehouse jobs.

One US tech firm called Percolata installs sensors in shops that measure the volume of customers and then compare that with the sales per employee. This allows managements to make a statistical adjustment for the fact that different shops have different customer footfall rates – it fills in the old information blanks. The result is a closer reading of an individual shop worker’s productivity.
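The kind of footfall adjustment described above can be sketched roughly. The function and all its parameters below (including `baseline_rate`) are invented for illustration; this is not Percolata's actual method.

```python
def adjusted_productivity(sales, hours_worked, footfall, baseline_rate):
    """Hypothetical sketch of a footfall adjustment: scale each worker's
    raw sales per hour by how busy their shop was, so that staff in quiet
    shops aren't penalised for circumstances beyond their control.
    baseline_rate is an assumed chain-wide average of customers per hour."""
    busyness = footfall / baseline_rate   # >1 means a busier-than-average shop
    return (sales / hours_worked) / busyness

# Two workers with identical raw sales per hour; once footfall is taken
# into account, the one in the quieter shop scores higher.
quiet = adjusted_productivity(sales=400, hours_worked=8, footfall=30, baseline_rate=60)
busy = adjusted_productivity(sales=400, hours_worked=8, footfall=120, baseline_rate=60)
```

The point of the adjustment is exactly the one in the article: it "fills in the old information blanks" by separating a worker's effort from their shop's traffic.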

Workers who do better can be rewarded with more hours. “It creates this competitive spirit – if I want more hours, I need to step it up a bit,” Percolata’s boss told the Financial Times.

It’s possible to envisage these kinds of digital monitoring techniques and calculations being rolled out in a host of jobs and bosses making pay decisions on the basis of detailed productivity data. But one doesn’t have to be a neo-Luddite to feel uncomfortable with these trends. It’s not simply the potential for tracking mistakes by the computers and flawed statistical adjustments that is problematic, but the issue of how this could transform the nature of the workplace.

Financial incentives matter, yet there is rather more to the relationship between a worker and employer than a pay cheque. Factors such as trust, respect and a sense of common endeavour matter too – and can be important motivators of effort.

If technology meant we could design employment contracts whereby every single worker was paid exactly according to his or her individual productivity, it would not follow that we necessarily should.

Thursday 4 February 2016

The age of deference to doctors and elites is over. Good riddance

Mary Dejevsky in The Independent

There was something about the story of five-year-old Ashya King that went beyond the plight of this one small, sick child, wrapped up in his blanket and connected to a drip. It was not just the public relations savvy of his family: the elaborate preparations for their flight, recorded and posted on the internet that drew such all-consuming public interest. Nor was it only the drama of the police chase across Europe, and the nights spent in a Spanish prison. It was much more.

There was a profound clash of principles here at a junction of extremes: a child with a terminal brain tumour, a fixed medical consensus, and parents who hoped, believed, there could be another way.

You probably remember – I certainly do – how forcefully Ashya’s father, Brett, argued his case. He had, he said, set about learning all he could about treatment possibilities for his son’s condition on the internet and in medical journals and concluded that a particular form of therapy was superior to the one being offered by the NHS.

Now, 18 months on, Ashya King’s story has a sequel beyond the so-far happy ending of his recovery announced last March. The sequel is that the treatment his family fought for so hard has indeed been found to be superior to that generally offered by the NHS, and in precisely the ways that the Kings had argued. A study published in The Lancet Oncology – an offshoot of The Lancet – concluded that proton beam therapy of the kind Ashya eventually obtained in Prague was as effective as conventional radiotherapy, but less likely to cause damage to hearing, brain function and vital organs, especially in children.

In one way, that should perhaps come as no surprise, given King’s claim to have scoured the literature. There is also room for caution. This was a relatively small study conducted in the US. There was no control group – with children, this is deemed (rightly) to be unethical – and harmful side-effects were reduced, not eliminated. But such is the nature of medical research, and the treatment decisions based on it. Things are rarely cut and dried; it is more a balance of probability.

This may be one reason why the Lancet findings had less resonance than might have been expected, given the original hue and cry about Ashya’s case. But my cynical bet is that if the study had shown there was essentially no difference between the two treatments, or that proton beams were a quack therapy potentially hyped for commercial advantage, sections of the NHS establishment would have been out there day and night, warning parents who might be tempted to follow the Kings’ path how wrong-headed they were, and stressing how the doctors had been vindicated.

Instead, there were low-key interviews with select specialists, who noted that three NHS centres providing the therapy would be open by April 2018. Until then, those (few) children assessed as suitable for proton beam treatment would continue to go to the United States at public expense. (Why the US rather than Prague or elsewhere in Europe was chosen is not explained.)

It may just be my imagination, but I sensed an attempt to avoid reigniting the passions that had flared over Ashya’s treatment at the time, and especially not to raise other parents’ expectations. But I don’t think the controversy should be allowed to rest so easily. The King family’s experience raised serious questions about the practice of medicine in the UK and the attitudes of the professionals to their patients. And these latest research findings on proton therapy mean that it still does.

When Brett King presented his arguments, he did so not just with understandable emotion, but with enviable lucidity. He patently understood what he was talking about. This treatment was there; he wanted to give it a go, and he was prepared to raise the funds to pay for it. To the medics, he may well have come across as difficult, and there were those who genuinely felt that he was acting against the best interests of his son. In that case, the arguments should have gone to court – as they had done with eight-year-old Neon Roberts and his contested cancer treatment half a year before. That the Kings are Jehovah’s Witnesses may also have cued particular caution.

However, what many, especially in the medical establishment, seem reluctant to recognise is that change is afoot in relations between the professional elite and the rest – and not only because the so-called “age of deference” is dead.

Increasingly, it seems, we lay people are invited to make choices, only to be censured, or worse, for making the “wrong” one. Lawyers, for instance, will repeatedly tell you that they offer only advice; it is up to us to act on it, or not. So it is, increasingly, in the NHS. 

In theory, you can choose your GP, your hospital, your consultant – and, within reason, your treatment. In practice, it is more complicated. You may live too far away, the professionals may try to protect their patch, and the actual consultant is not there.

In the crucial matter of information, however, things have been evening up. The internet-nerd who turns up at the GP surgery convinced he is mortally ill may be a time-consuming nuisance, but such self-interested diligence can also help to point a time-strapped GP in the right direction. Not all are hypochondriacs. Patients may have more time and motive to research new treatments than their doctor. We old-fashioned scribes may have misgivings about the rise of citizen-journalism. But not all challenges to professional expertise are ignorant – or wrong.

In the case of Ashya King, everyone behaved questionably, even as they genuinely believed they were acting in the child’s very best interests.

But the days when the professionals – for all their years of training – had the field to themselves are gone. In medicine, we lay people are getting used to that. Are they?

Saturday 23 January 2016

Silence from big six energy firms is deafening

If this were a competitive market, our fuel bills would be £850 a year instead of £1,100

Patrick Collinson in The Guardian


 
UK consumers are not seeing their tariffs cut despite the fall in wholesale gas and oil prices. Photograph: Alamy


You cannot hope to bribe or twist the British journalist (goes the old quote from Humbert Wolfe) “But, seeing what the man will do unbribed, there’s no occasion to.” Much the same could be said about Britain’s energy companies. You cannot call them a cartel. But seeing what they do without actively colluding, there’s no occasion to.

Almost every day the price of oil and gas falls on global markets. But this has been met with deafening inactivity from the big six energy giants. Their standard tariffs remain stubbornly high, bar tiny cuts by British Gas last year and E.ON this week.

If this were a competitive market, which reflected the 45% fall in wholesale prices seen over the last two years, the average dual-fuel consumer in Britain would be paying £850 or so a year, rather than the £1,100 charged to most customers on standard tariffs.
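The arithmetic behind that £850 figure can be roughly checked. The calculation below assumes wholesale energy makes up about half of a dual-fuel bill and that the 45% wholesale fall would be passed on in full; both are assumptions for illustration, not figures from the article.

```python
def fair_bill(current_bill, wholesale_share, wholesale_fall):
    """Back-of-envelope estimate of what a bill would be if a fall in
    wholesale prices were fully passed through. Only the wholesale
    portion of the bill falls; network costs, levies and margins
    (the rest of the bill) are assumed unchanged."""
    return current_bill * (1 - wholesale_share * wholesale_fall)

# £1,100 standard tariff, wholesale assumed ~50% of the bill, 45% wholesale fall
estimate = fair_bill(current_bill=1100, wholesale_share=0.5, wholesale_fall=0.45)
```

Under those assumptions the estimate lands close to the article's £850, which suggests the quoted figure rests on a pass-through calculation of roughly this shape.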

But it is not a competitive market. The energy giants know that around 70% of customers rarely switch, so they can be very effectively milked through the pricey standard tariff, which is, itself, set at peculiarly similar levels across the big providers. The advent of paperless billing probably helps the companies, too, with busy householders failing to spot that they are paying way over the odds.

The gap between the standard tariffs and the low-cost tariffs is now astounding – £1,100 a year vs £775 a year. Yes, the 30% of households who regularly switch can, and do, benefit. But why must we have a business model where seven out of 10 customers lose out, while three out of 10 gain?

The vast majority would rather have an honest tariff deal where their energy company passes on reductions in wholesale prices without having to go through the rigmarole of switching.

Instead, we have a regulatory set-up which believes that the problem is that not enough of us switch. It thinks that it will be solved by getting that 30% figure up to 50% or more. Unfortunately, many regulators also have a mindset that is almost ideologically attuned to a belief in the efficacy of markets and the benefits of competition. If competition is not working, then they think the answer is simply more competition.

What would benefit consumers in these natural monopoly markets would be less competition and more regulation. We now have decades of evidence of how privatised former monopolies behave, and what it tells us is that they are there to benefit shareholders and bonus-seeking management, rather than customers.

In March we will hear from the Competition and Markets Authority about the results of its investigation into the energy market. Maybe it will conclude that privatisation and competition have failed, but my guess is that it won’t. The clue is in the name of the authority.

• A final word about home insurance. Last week I said every insurer is in on the game, happy to rip off loyal customers, particularly older ones. I received a letter from a 90-year-old householder in Richmond Upon Thames who, for 20 years, has bought home and contents cover from the Ecclesiastical Insurance company.

After seeing my coverage, he nervously checked his premiums, as he had been letting them go through on direct debit for years without scrutiny.

To his delight, he discovered that Ecclesiastical had, unprompted, been cutting his insurance premiums.

One company, at least, doesn’t think it should skin an elderly customer just because it can probably get away with it. We should perhaps praise the lord there is an insurer out there with a conscience.

Is Ecclesiastical the only “ethical” insurer, or are there any others who are not “in on the game”, asks our reader from Richmond. Let me know!

Thursday 21 January 2016

Arguing the toss

Nathan Leamon in Cricinfo


Will awarding the toss to the away team even up the playing field and deliver more away Test wins, or is this yet another case of received cricketing wisdom not stacking up with the facts?


You will rarely be criticised for choosing to bat. Batting is the default setting; bowling first is seen as the gamble © Getty Images



On the first morning of the first Test between Pakistan and England in Abu Dhabi, three events came to mind. One current, one recent, one infamous. The first was the conversation between Michael Atherton and both captains at the toss and the unanimity of all concerned. The second, the recent proposal from Ricky Ponting and Michael Holding amongst others, that the toss be done away with in Test cricket and the choice given instead to the away captain. The other was Brisbane 2002, and Nasser Hussain choosing to bowl first on a day almost as hot as the one in Abu Dhabi.

Let's start with the second. The suggestion of awarding the toss to the away captain was made by Ponting as a possible solution to the perceived problem of home teams tailoring wickets to suit their strengths. And the resulting domination of home teams. "It has never been harder to win away from home", we are told repeatedly.

Ironically, the decline of away wins is one of those facts that is assumed to be true without often, it would seem, being checked. In fact, it has never been easier to win on the road. More Tests are won by the away team now than at any time in recent history.


AWAY WINS IN TESTS

Decade     Win%
2010s        28.8
2000s        28.4
1990s        23.1
1980s        21.1
1970s        22.7
1960s        21.5


This is largely down to the decline in the draw. There have been more and more results in Tests, and although the proportion of those results going the way of the visitors has shifted slightly in favour of the home team, the overall increase in results has still produced a significant rise in away wins.

That said, there are other factors that suggest the balance of power is shifting slightly towards the home team. The gap between averages at home and averages away is growing, for example. So let's assume for now that the premise is true, and that home teams are increasingly dominant.

Holding and Ponting have suggested giving the toss to the visiting captain to prevent home teams stacking the conditions in their favour. I don't know whether this is a good idea or not. But there are three reasons that we should question whether it would achieve its aims.

Firstly, it assumes groundsmen can reliably bake certain characteristics into a pitch. In practice, pitch preparation seems to be an inexact science. I have stood before Test matches around the world and listened to groundsmen describe how the pitch is going to play, only to watch it do something completely different half an hour later.

It also presupposes that the interests of groundsman and home team are aligned, which is often not the case. In England for example, venues are heavily incentivised to maximise revenues from the Tests they host by ensuring five full days' play. So groundsmen, understandably, often pay less attention to the needs of the visiting circus than to the people who pay their salary for the other 51 weeks of the year.

Secondly, there is a law of unintended consequences in sporting rule changes that can often produce the opposite result to the one intended. If a home captain had control over the pitch, the framers of this law are assuming he would back away from tilting it in his favour. Is it not just as likely that he would go the other way and seek to produce a pitch so favourable that the toss was taken out of the equation? This, after all, is what MS Dhoni openly sought to do when England and Australia each last toured: produce pitches that turn big from ball one, and so take the toss out of the equation. Equally, you could imagine England or Australia producing genuine green-tops that would be as helpful to the quicks on day four as day one.

But lastly, and most importantly, it assumes that captains are able to use the toss to their advantage. This is not in any way proven. In fact the evidence suggests it just isn't the case.

At the time of writing, 1,048 Tests have been played since January 1990. During that period, the side that won the toss has lost slightly more (377) matches than it has won (374). Winning the toss in the modern era appears to give a side no advantage at all.
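A quick back-of-envelope check supports reading 374 wins against 377 losses as "no advantage at all". Treating each decisive match as an independent coin flip (a simplifying assumption, since Tests and teams are not independent), we can ask how far the observed split sits from an even 50/50:

```python
import math

def toss_edge_z(wins, losses):
    """How many standard errors the toss-winner's share of decisive
    results sits from 50%, under the null hypothesis that winning the
    toss confers no advantage (each decisive match a fair coin flip)."""
    n = wins + losses
    p_hat = wins / n
    se = math.sqrt(0.25 / n)   # standard error of a proportion under p = 0.5
    return (p_hat - 0.5) / se

z = toss_edge_z(374, 377)      # the figures quoted in the article
```

The result is a tiny fraction of one standard error from 50/50, i.e. far smaller than any conventional threshold for a real effect: the toss winner's record is statistically indistinguishable from chance.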

It wasn't always so. On uncovered pitches, batting first in almost all instances was a robustly successful strategy. If it rained during the match, the pitch would deteriorate, affecting the side batting second disproportionately. Until 1970, the side batting first in a Test won 36 per cent of matches, and lost 28 per cent.

But in the modern era, the advantage of winning the toss seems to have disappeared. This is, of course, stunningly counterintuitive.

Test cricket is an asymmetric game. One team bats first, then the other. And the two teams' chances of winning are not equal. The team batting first has different requirements for victory to the team batting second, and the pitch changes over the course of the match, affecting the balance of power between bat and ball. Therefore, we would assume, teams that win the toss can choose the best conditions and so gain an advantage. But they don't. How can that possibly be?

Dropped catches and a sickening injury to Simon Jones didn't help Nasser Hussain after he chose to bowl in Brisbane in 2002 © Getty Images





Sometimes, a perfectly reasonable response to current circumstances becomes a habit, then a tradition, then an article of faith that outlives the circumstances that created it. We rarely question what we know to be self-evidently true. And so the bias towards batting first seems to have outlived the circumstances that created it by several decades.

"If you win the toss, nine times out of ten you should bat. On the tenth occasion you should think about bowling and then bat."

That was a very successful strategy to adopt for the first century of Test cricket. And one that is still the default setting for most captains. In the 700 Tests played since January 2000, nearly twice as many captains have batted first as have chosen to bowl. Is it still successful?

In a word, no. In that period, the side batting first has won 36 per cent of those Tests, the side bowling first 39 per cent. The bat-first bias at the toss would seem to be neutral at best, and probably counter-productive.


It is still hard to believe that captains aren't able to use the toss to their advantage. There are venues where the evidence is stark. Some pitches clearly favour the side batting first, some the side batting second. In the 40 Tests played in Lahore, the team batting first has won just three. Adelaide, by contrast, is a classic bat-first venue. It starts as a batsman's paradise, but by the fifth day can be very tricky to bat on, with considerable turn for the spinners. In the 74 Tests played at the ground, the side batting first have won 35, the side batting second 19. Since 1990, averages are 44.6 in the first innings, 38.9 in the second, 30.1 in the third and 27.1 in the fourth. As you would expect, in that period 25 out of 26 captains have chosen to bat first, gaining a considerable advantage in doing so.

These are not isolated cases. Many pitches have similarly skewed characteristics. Galle and Old Trafford for example, both have similar records to Adelaide. Karachi is as bowl-first friendly as Lahore.



****



Captains' behaviour at the toss seems to be yet another example of received cricketing wisdom not concurring with the evidence, another case where what teams do doesn't seem to maximise their chances of winning. Why is this the case?

Well, part of the story involves how our brains handle information. There has been a great deal of research into memory and perception, and the results are both surprising and illuminating when it comes to our decision-making in sport. For a start, our memories don't work as you might expect. They are not akin to a videotape; we don't record a series of events and then play them back as and when they are needed.

The disturbing truth is that our unaided recall is not very good. The human brain encodes less than 10 per cent of what we experience; the rest it simply makes up. Our minds construct a narrative around the encoded memories we do have, filling in the gaps with a plausible story. Faced with a huge number of random or near-random events (a cricket match, for instance) our brains pattern-spot, even when there is no pattern. Our minds look for those events that they can form into a pattern or story, and that becomes the meaning or lesson that we take away from the match. Even if the vast number of events that occurred didn't fit the pattern, we disproportionately remember the ones that did.

At their best, then, our memories seem to work along the lines of Albert Camus's description of fiction: they are the lie through which we tell the truth. What we remember didn't actually happen; what we remember is a story that our brains have fabricated, but one that we hope contains the essential truth of what happened in a way that we can understand and retain.

Our fallible memories are only part of the reason captains and coaches behave the way they do. There is another, far more powerful reason to make the choices they make and one which is harder to argue against. For this we need to go back to Brisbane in 2002, and Nasser Hussain choosing to bowl.


"The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function."
- F. Scott Fitzgerald



It was the first Test of the Ashes, an Australian team were at the peak of their powers and playing at home in 'Fortress Brisbane', the hardest ground in the world to win at as an away team. No visiting team had won in the last 26 Tests played at the 'Gabbattoir'. Hussain won the toss and chose to bowl, Australia were 364-2 by the close of play and went on to win comfortably.

It is no use looking back with hindsight and using that to determine whether a decision was right or wrong. I am sure that if Nasser had known that choosing to bowl first would bring a host of dropped chances, the loss of a bowler to injury and Australia piling up the first-innings runs, he would have chosen to have a look behind door B and strapped his pads on.

But he didn't know, and in evaluating a past decision, we shouldn't know either. We need to remain behind the veil of ignorance, aware of all the potential paths the match could have taken, but ignorant of the one that it did.

One way we can do that is to simulate the match. There are various models that allow us to simulate matches given the playing strengths of the two sides and give probabilities for the outcome. When we do this for that Brisbane Test, we get the following probabilities for England:


Decision                  Win                  Draw                 Lose
Bat First                   4%                     3%                   93%
Bowl First                 4%                   10%                   86%
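The article doesn't describe the model behind those probabilities, but the idea of simulating a match given the two sides' strengths can be sketched with a toy Monte Carlo. Every number below (the baseline score, the spread, the innings decay factors, the crude draw rule) is invented for illustration and is not the model used for the Brisbane figures.

```python
import random

def simulate_match(bat_first_strength, bat_second_strength,
                   innings_decay=(1.00, 0.95, 0.80, 0.70),
                   n_sims=20_000, seed=1):
    """Toy Monte Carlo for a four-innings Test. Each innings total is
    drawn from a normal distribution scaled by team strength and a
    pitch-decay factor (later innings are assumed harder to bat in).
    Returns (win, draw, loss) probabilities for the side batting first."""
    random.seed(seed)
    wins = draws = losses = 0
    for _ in range(n_sims):
        strengths = [bat_first_strength, bat_second_strength] * 2
        totals = [max(0, random.gauss(300 * s * d, 90))
                  for s, d in zip(strengths, innings_decay)]
        first = totals[0] + totals[2]   # bat-first side's two innings
        second = totals[1] + totals[3]
        if abs(first - second) < 20:    # crude stand-in for a draw
            draws += 1
        elif first > second:
            wins += 1
        else:
            losses += 1
    return wins / n_sims, draws / n_sims, losses / n_sims

# A markedly stronger side batting first against a weaker one
w, d, l = simulate_match(bat_first_strength=1.15, bat_second_strength=0.85)
```

Even a model this crude reproduces the qualitative point of the table: when one side is heavily outgunned, its win probability barely moves whatever happens at the toss, because the mismatch in strengths dominates the simulation.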



Every batsman in Australia's top seven for that match finished his career averaging over 45 (three averaged 50-plus), none of the English players did, only two averaged 40. England had a decent bowling attack. Australia had Warne, McGrath and Gillespie with 1,000 wickets between them already.

England were a pretty good side, they'd won four, lost two of their previous 10 matches. But they were hopelessly outgunned, and in alien conditions. Steve Waugh, the Australian captain, was also going to bowl if he had won the toss. If he had done then Australia would almost certainly have won the match as well. Australia were almost certainly going to win regardless of who did what at the toss.

But none of that made any difference. Hussain's decision to bowl first was castigated by the public and press of both countries. Wisden described it as "one of the costliest decisions in Test history". One senior journalist wrote that the decision should prompt the England captain "to summon his faithful hound, light a last cigarette and load a single bullet into the revolver".

For Nasser in Brisbane, read Ricky Ponting at Edgbaston in 2005, another decision to insert the opposition that has never been lived down. Yet, if either of them had batted first and lost, no one would ever remember their decision at the toss. You will rarely if ever be criticised for choosing to bat. Batting is the default setting; bowling first is seen as the gamble. And remember, the side that bats first loses significantly more than it wins.

Test cricket is one of the greatest contests in sport, a brilliant, multi-faceted contest for mind and body. But it is also a game of numbers. If you can tilt the numbers slightly in your favour, get them working for you, not against you, plot a slightly more efficient path to victory, then you are always working slightly downhill rather than toiling against the slope.

As I write this, Pakistan are about to go out and bowl for the fourth consecutive day of England's first innings in Abu Dhabi on a pitch that you could land light aircraft on. They have home advantage, have made the orthodox decision, played well, and yet there is only one team that can win the match from here, and it isn't them. If this is what home advantage and winning the toss looks like then they are welcome to it.

It is all but certain that if they had ended up batting second they would now be in a considerably better position. Reverse the first innings as they have happened and Pakistan would now be batting past an exhausted England side and about to put them under the pump for a difficult last three sessions. And in the alternative scenarios where one side or the other got a first innings lead, as we have seen, those work disproportionately in favour of the side batting second.

But, we all do it. We look at a pristine wicket, flat, hard and true, and batting seems the only option. It is written into our cricketing DNA. The evidence may suggest there is a small marginal gain in bowling. But small margins be damned. If the marginal gain erodes your credibility and authority, then that is probably not an exchange you are willing to make. There are tides you can't swim against.

Which brings us back to Alastair Cook and Misbah-ul-Haq, standing in Abu Dhabi in the baking heat. Both are men of considerable character; brave, implacable and preternaturally determined to win. Each has withstood the slings and arrows of captaining their country through some fairly outrageous fortunes. Each is ready to bat first without a second thought. Because while they are certainly brave, they are not stupid. And you would have to be really stupid to make the right decision.

And there of course you have the central problem of much decision-making in cricket. This pitch is slightly different to all the other pitches that there have ever been. And you don't know for certain how it is going to play, or how that will influence the balance of power in the match. There are those who would argue that this is why stats are useless, or at best very limited.

I would agree entirely that stats are never sufficient to make a decision. There is nuance and subtlety to weigh; the brain and eye have access to information that the laptop doesn't. The feel and instincts of coaches and players, the hard-wired learning from decades in the game, contains incredibly valuable information and will always be the mainstay of decision-making that must be flexible and fluid through changing match situations. But if we are honest, we must also accept that the sheer weight and tonnage of what we don't know about how cricket works would sink a battleship. To use stats and nothing else to make decisions would be incredibly foolish, and as far as I am aware no one ever has. But equally, to insist on making decisions on incomplete information, without ever reviewing the effectiveness of those decisions would seem almost equally perverse.

I'm not saying that everyone was wrong in Abu Dhabi. I'm not saying that Misbah should have bowled. The weight of opprobrium heaped on him doesn't bear thinking about. It's the sort of decision that ends captaincies. No, Misbah had only one option and he took it. But maybe, just maybe, one day there will come a time when it isn't such an obvious choice.

Thursday 22 October 2015

Why too much choice is stressing us out

Stuart Jeffries in The Guardian

Once upon a time in Springfield, the Simpson family visited a new supermarket. Monstromart’s slogan was “where shopping is a baffling ordeal”. Product choice was unlimited, shelving reached the ceiling, nutmeg came in 12lb boxes and the express checkout had a sign reading, “1,000 items or less”. In the end the Simpsons returned to Apu’s Kwik-E-Mart.

In doing so, the Simpsons were making a choice to reduce their choice. It wasn’t quite a rational choice, but it made sense. In the parlance of economic theory, they were not rational utility maximisers but, in Herbert Simon’s term, “satisficers” – opting for what was good enough, rather than becoming confused to the point of inertia in front of Monstromart’s ranges of products.

This comes to mind because Tesco chief executive Dave Lewis seems bent on making shopping in his stores less baffling than it used to be. Earlier this year, he decided to scrap 30,000 of the 90,000 products from Tesco’s shelves. This was, in part, a response to the growing market shares of Aldi and Lidl, which only offer between 2,000 and 3,000 lines. For instance, Tesco used to offer 28 tomato ketchups while in Aldi there is just one in one size; Tesco offered 224 kinds of air freshener, Aldi only 12 – which, to my mind, is still at least 11 too many.

Now Lewis is doing something else to make shopping less of an ordeal and thereby, he hopes, reducing Tesco’s calamitous losses. He has introduced a trial in 50 stores to make it easier and quicker to shop for the ingredients for meals. Basmati rice next to Indian sauces, tinned tomatoes next to pasta.

What Lewis is doing to Tesco is revolutionary. Not just because he recognises that customers are time constrained, but because he realises that increased choice can be bad for you and, worse, result in losses that upset his shareholders.


Scrapping 30,000 products ... Tesco chief executive Dave Lewis is streamlining the supermarket experience. Photograph: Neil Hall/Reuters

But the idea that choice is bad for us flies in the face of what we’ve been told for decades. The standard line is that choice is good for us, that it confers on us freedom, personal responsibility, self-determination, autonomy and lots of other things that don’t help when you’re standing before a towering aisle of water bottles, paralysed and increasingly dehydrated, unable to choose. That wasn’t how endless choice was supposed to work, argues American psychologist and professor of social theory Barry Schwartz in his book The Paradox of Choice. “If we’re rational, [social scientists] tell us, added options can only make us better off as a society. This view is logically compelling, but empirically it isn’t true.”

Consider posh jams. In one study cited by Schwartz, researchers set up two displays of jams at a gourmet food store; customers who tried a sample were given a coupon for a dollar off if they bought a jar. One display held six jams, the other 24: 30% of people exposed to the smaller selection bought a jam, but only 3% of those exposed to the larger selection did.

Now consider – and there’s no easy way to say this – your pension options. Schwartz found that a friend’s accounting firm was offering 156 different retirement plans. Schwartz noted that there was a shift of responsibilities from employer to employee in this seemingly benign transfer of choice: “When the employer is providing only a few routes to retirement security, it seems important to take responsibility for the quality of those routes. But when the employer takes the trouble to provide many routes, then it seems reasonable to think that the employer has done his or her part. Choosing wisely among those options becomes the employee’s responsibility.”


Posh jam conundrum ... too many options can be baffling. Photograph: Graham Turner for the Guardian

But that’s the problem. Which of us, really, feels competent to choose between 156 varieties of pension plan? Who wouldn’t rather choose to lie in a bath of biscuits playing Minecraft? And yet, at the same time, we are certain that the decision about our workplace pension is an important one to get right. But instead of making that choice, Schwartz says, many defer it endlessly. One of his colleagues got access to the records of Vanguard, a gigantic mutual-fund company, and found that for every 10 mutual funds the employer offered, the rate of participation went down 2% – even though by not participating, employees were passing up as much as $5,000 a year in matching contributions from an employer who would happily match them.

But even if we do make a choice, Schwartz argues, “we end up less satisfied with the result of the choice than we would be if we had fewer options to choose from”. When there are lots of alternatives to consider, it is easy to imagine the attractive features of the alternatives you rejected, which leaves you less satisfied with the alternative you chose.

Increased choice, then, can make us miserable because of regret, self-blame and opportunity costs. Worse, increased choice has created a new problem: the escalation in expectations. Consider jeans. Once there was only one kind, says Schwartz – the ill-fitting sort that, fingers crossed, would get less ill-fitting once he wore and washed them repeatedly. Now, what with all the options (stone-washed, straight-leg, boot-fit, distressed, zip fly, button fly, slightly distressed, very distressed, knee-holed, thigh-holed, knee and thigh-holed, pretty much all holes and negligible denim), Schwartz feels entitled to expect that there is a perfect pair of jeans for him. Inevitably, though, when he leaves the store, he is likely to be less satisfied now than when there were hardly any options.


In the good old days there was just one kind of jeans ... Photograph: Ben Margot/AP

Schwartz’s suggestion is that, at a certain point, choice shifts from having a positive relationship with happiness to an inverse one. So, what’s the answer? “The secret to happiness is low expectations,” he says, sensibly.

No wonder, then, we aren’t happy. In the 10 years since Schwartz wrote his book, the ideology of unlimited choice has expanded into unlikely areas – schools, sex, parenting, TV – and expectations have risen as a result. Equally importantly, new tactics have developed to help consumers deal with the downsides of choice. For instance, Schwartz notes, there is an increasing reliance on recommendation engines to help people cope with choice. “The internet hath created a problem that it is now trying to solve,” he says.

One of the areas affected is dating. Relationships are being treated like any other product – online we can browse and compare prospective sexual partners.

“I think dating sites are now the most common path for meeting romantic partners, and the overwhelming amount of choice that dating sites have created is a real problem,” says Schwartz. One of those problems was noted by the comedian Aziz Ansari in his book Modern Romance. In it, a woman recounts meeting a man on the dating app Tinder, then spending the journey to their first date swiping through the service to see if anyone better was available. Failure to commit to a date or a relationship can itself be a choice – indeed, the sociology professor who helped Ansari with his book, Eric Klinenberg, wrote Going Solo: The Extraordinary Rise and Surprising Appeal of Living Alone to account for those who have stepped off the treadmill of dating, the nightmare of having more choice but less reason to choose. Hence, too, Japan’s soshoku danshi or herbivore men who, so corrupted by the endless choices offered by online pornography, are no longer interested in real sex or romantic relationships. Psychologist Philip Zimbardo fears that, as online pornography offers ever more choice and becomes more interactive and immersive, real-life romantic relationships will become even less appealing.


BT vying with Sky for football coverage is good news, right? Not necessarily ... Photograph: Rex Shutterstock

There’s another problem with choice: it can be more apparent than real – that is, a seeming increase in choice masks the fact that you’re paying more for the same stuff you had before. My Guardian colleague Barney Ronay identified this when he considered football on TV recently. The ostensibly good news is that BT Sport is competing with Sky for football rights. It now exclusively shows European Champions League football, which should mean more choice, lower customer outlay and more joy, shouldn’t it? But if you are already a Sky Sports subscriber (or, perhaps more pertinently, watched the free-to-air games on ITV), it means the opposite. If Barney wants to watch the same amount of football as last year, he will now have to pay more.

That sort of phenomenon repeats itself across TV more generally. To watch all the good stuff on TV now involves paying money in the form of monthly subscriptions to Amazon Prime, Netflix, Sky, BT and Blinkbox, as well as having a Freeview box. But who can afford that kind of outlay? A decade ago everything you could ever wish to watch was on Sky (if you were prepared, admittedly, to pay a monthly subscription to Murdoch). A decade before that, all good TV was on terrestrial, so once you had paid for the telly and the licence you were set. What is sold to us as increased choice has thus made us poorer and, if Ronay’s experience is anything to go by, more disappointed. As Ronay says: “for the captive consumer this isn’t really a proper choice at all, but an opportunity to spend the same and get less, or alternatively spend more and get the same.”

Anger at this state of affairs is comprehensible to anyone who lives in an advanced western society in 2015 and has to choose between mobile phone plans, schools, and water, gas and electricity suppliers – not to mention minimally distinguishable prospective dates. Admittedly these are the choices typical of decadent westerners in the era of late capitalism, but that thought doesn’t make the burden of choice any easier to bear.

Consider electricity, says Professor Renata Salecl, author of The Tyranny of Choice. “Privatisation of electricity did not bring the desired outcome – lesser prices, better service – however, it did contribute to the anxiety and feeling of guilt on the side of the consumers. We feel that it is our fault we are paying too much and we are anxious that a better deal is just around the corner. However, while we are losing valuable time doing research on which provider to choose, we then stop short of actually making the choice.”

So we do nothing, and corporations profit from this inertia.

All of this confounds the idea that human beings act in such a way that they maximise their wellbeing and minimise their pain. “People often act against their wellbeing,” says Salecl. “They also rarely make choices in a rational way.”

The political idea of allowing parents to choose between schools was to apply the presumed rigour of the market to education so that underperforming schools would improve or close. Standards would rise and formerly illiterate brats in key stage II would relax after double quantum physics by dancing around the maypole singing settings of Horace’s verse in Latin, just like in Michael Gove’s dreams.



Education has become a consumer good, and could all go wrong ... Illustration from Charles Dickens’ Oliver Twist. Mansell/Time Life Pictures/Getty Images

So education has become a consumer good, and my daughter’s is something I’m encouraged to think about as though it were a trip to the shops to buy shoes. Can I buy the best education for my daughter, possibly by moving house, lying about my real address or selling a kidney for private schooling? I’m not sure, but one thing I am becoming increasingly convinced of is what Salecl says: “Ideology that convinces us that everyone can make it if only he or she makes the right choice relies on blindness – we do not see that social constraints stop us making out of our lives what we wish for. And when we think about choice as a primarily individual matter, we also become blind about broader social, political choices.”

What she means, I think, is that the ideology of choice makes us forget that some things shouldn’t be bought and sold, and they are the most important things of all. What’s more, having made a decision, we don’t want to hear we have got our choices wrong. “We are constantly under the impression that life choices we made after careful planning should bring us expected results – happiness, security, contentment – and that with better choices, traumatic feelings that we have when dealing with loss, risk and uncertainty can be avoided.” No wonder, then, that Salecl’s most recent work is on the power of denial and ignorance. “When people are overwhelmed by choice and when they are anxious about it, they often turn to denial, ignorance and wilful blindness.”

Schwartz demurs, arguing that some extensions of choice can be a good thing. When I tell him about the preponderance of academies and free schools that, seemingly, increase choice for British parents, he says a similar phenomenon, charter schools, has arisen in the US. “There is something good about this, since public education in much of the US is dreadful, and competition might make it better. But there is no doubt it is stressing parents out big time.”

As with choosing a pension, choosing a school leaves scope for regret, shame and fear of missing out. And, in extremis, the terrifying sense that I might inadvertently choose an option that will mess up my daughter’s future.


Challenging the rhetoric of choice ... Jeremy Corbyn. Photograph: Mary Turner/Getty Images

In 2015, though, there are counter-tendencies to the stress-inducing extension of choice. Not only is Tesco reducing its number of products, but the new leader of the Labour party has just been elected on a political platform that, in part, challenges the rhetoric of choice. Jeremy Corbyn proposes to renationalise not just the rail network but public utilities (gas, electricity and water), partly in the hope that the reduction of choice will provide a fairer, less anxiety-inducing experience for their users.

Perhaps, Corbyn’s political philosophy suggests, what we need is not more choice, but less; not more competition but more monopolies. But before you counter with something along the lines of “Why don’t you go and live in North Korea, pinko?” consider this: PayPal founder Peter Thiel argues that monopolies are good things and that competition, often, doesn’t help either businesses or customers. “In the real world outside economic theory, every business is successful exactly to the extent that it does something others cannot. Monopoly is therefore not a pathology or an exception. Monopoly is the condition of every successful business.” Competition, in short, is for losers.

That, of course, doesn’t mean that successful capitalists like Thiel would be supporting Corbyn in his plan to recreate the state monopolies of yore or submit schools once more to local education control, but it does mean the rhetoric of choice and competition is at least being challenged and not only from the political left.

“At least we are talking about a political and economic choice,” says Salecl, “and are not simply following the ‘desires’ of the market.” Perhaps: if she’s right about that, then we are opting for something we haven’t done for a long time.

Thursday 1 October 2015

Right to 30-day refund becomes law

Brian Milligan in BBC News


New consumer protection measures - including longer refund rights - have come into force under the Consumer Rights Act.

For the first time anyone who buys faulty goods will be entitled to a full refund for up to 30 days after the purchase.

Previously consumers were only entitled to refunds for a "reasonable time".

There will also be new protection for people who buy digital content, such as ebooks or online films and music.

They will be entitled to a full refund, or a replacement, if the goods are faulty.

The Act also covers second-hand goods, when bought through a retailer.

People buying services - like a garage repair or a haircut - will also have stronger rights.

Under the new Act, providers who do not carry out the work with reasonable care, as agreed with the consumer, will be obliged to put things right.

Or they may have to give some money back.

'Fit for purpose'

"The new laws coming in today should make it easier for people to understand and use their rights, regardless of what goods or services they buy," said Gillian Guy the chief executive of Citizens Advice.

When disputes occur, consumers will now be able to take their complaints to certified Alternative Dispute Resolution (ADR) providers, a cheaper route than going through the courts.

The Consumer Rights Act says that goods:

- must be of satisfactory quality, based on what a reasonable person would expect, taking into account the price

- must be fit for purpose. If the consumer has a particular purpose in mind, he or she should make that clear

- must meet the expectations of the consumer


The Act has been welcomed by many consumer rights groups.

"Now, if you buy a product - whether physical or digital - and discover a fault within 30 days you'll be entitled to a full refund," said Hannah Maundrell, the editor of money.co.uk. "The party really is over for retailers that try to argue the point."

The Act also enacts a legal change that will enable British courts to hear US-style class action lawsuits, where one or several people can sue on behalf of a much larger group.

It will make it far easier for groups of consumers or small businesses to seek compensation from firms that have fixed prices and formed cartels.

Wednesday 30 September 2015

How the banks ignored the lessons of the crash

Joris Luyendijk in The Guardian

Ask people where they were on 9/11, and most have a memory to share. Ask where they were when Lehman Brothers collapsed, and many will struggle even to remember the correct year. The 158-year-old Wall Street bank filed for bankruptcy on 15 September 2008. As the news broke, insiders experienced an atmosphere of unprecedented panic. One former investment banker recalled: “I thought: so this is what the threat of war must feel like. I remember looking out of the window and seeing the buses drive by. People everywhere going through a normal working day – or so they thought. I realised: they have no idea. I called my father from the office to tell him to transfer all his savings to a safer bank. Going home that day, I was genuinely terrified.”

A veteran at a small credit rating agency who spent his whole career in the City of London told me with genuine emotion: “It was terrifying. Absolutely terrifying. We came so close to a global meltdown.” He had been on holiday in the week Lehman went bust. “I remember opening up the paper every day and going: ‘Oh my God.’ I was on my BlackBerry following events. Confusion, embarrassment, incredulity ... I went through the whole gamut of human emotions. At some point my wife threatened to throw my BlackBerry in the lake if I didn’t stop reading on my phone. I couldn’t stop.”

Other financial workers in the City, who were at their desks after Lehman defaulted, described colleagues sitting frozen before their screens, paralysed – unable to act even when there was easy money to be made. Things were looking so bad, they said, that some got on the phone to their families: “Get as much money from the ATM as you can.” “Rush to the supermarket to hoard food.” “Buy gold.” “Get everything ready to evacuate the kids to the country.” As they recalled those days, there was often a note of shame in their voices, as if they felt humiliated by the memory of their vulnerability. Even some of the most macho traders became visibly uncomfortable. One said to me in a grim voice: “That was scary, mate. I mean, not film scary. Really scary.”

I spent two years, from 2011 to 2013, interviewing about 200 bankers and financial workers as part of an investigation into banking culture in the City of London after the crash. Not everyone I spoke to had been so terrified in the days and weeks after Lehman collapsed. But the ones who had phoned their families in panic explained to me that what they were afraid of was the domino effect. The collapse of a global megabank such as Lehman could cause the financial system to come to a halt, seize up and then implode. Not only would this mean that we could no longer withdraw our money from banks, it would also mean that lines of credit would stop. As the fund manager George Cooper put it in his book The Origin of Financial Crises: “This financial crisis came perilously close to causing a systemic failure of the global financial system. Had this occurred, global trade would have ceased to function within a very short period of time.” Remember that this is the age of just-in-time inventory management, Cooper added – meaning supermarkets have very small stocks. With impeccable understatement, he said: “It is sobering to contemplate the consequences of interrupting food supplies to the world’s major cities for even a few days.”

These were the dominoes threatening to fall in 2008. The next tile would be hundreds of millions of people worldwide all learning at the same time that they had lost access to their bank accounts and that supplies to their supermarkets, pharmacies and petrol stations had frozen. The TV images that have come to define this whole episode – defeated-looking Lehman employees carrying boxes of their belongings through Wall Street – have become objects of satire. As if it were only a matter of a few hundred overpaid people losing their jobs: Look at the Masters of the Universe now, brought down to our level!

In reality, those cardboard box-carrying bankers were the beginning of what could very well have been a genuine breakdown of society. Although we did not quite fall off the edge after the crash in the way some bankers were anticipating, the painful effects are still being felt in almost every sector. At this distance, however, seven years on, it’s hard to see what has changed.

A typical education in the west leaves you with more insight into ancient Rome or Egypt than into our financial system – and while there are plenty of books and DVDs for lay people about, say, quantum mechanics, evolution or the human genome, before the crash there was virtually nothing to explain finance to outsiders in accessible language. The City, as John Lanchester put it in his book about the 2008 crash, Whoops!, is still “a far-off country of which we know little”.

As a result, ordinary people trying to form an opinion about finance over the past decades have had very little to go on, and many seem to have latched on to the images provided by films and TV. 


The British stereotype of the boring banker began to change in the 80s when finance was deregulated. Following Ronald Reagan’s dictum, “Government is not the solution to the problem, it is the problem”, banks were allowed to combine under one roof activities that regulation had previously required to be divided between separate firms and banks. They were able to grow to sizes many times bigger than a country’s GDP – the assumption being that the market would be self-regulating. The changes also meant that bankers became immensely powerful. Hollywood provided the City with a new hero: the financier Gordon Gekko, from Oliver Stone’s 1987 film Wall Street, who brought us the phrase “Greed is good”. In his portrait of bond traders’ raging ambition, The Bonfire of the Vanities, novelist Tom Wolfe coined the term “Masters of the Universe”.

It all seemed innocent entertainment, before 2008: tales about a far-off country where boys behaved badly, scandalously even, but above all, reassuringly far away from the comfort and safety of our own homes, something like watching a Quentin Tarantino film or an episode of The Sopranos. This was the era when a Labour chancellor, Gordon Brown, could give a speech to a gathering of bankers and asset managers and tell them: “The financial services sector in Britain, and the City of London at the centre of it, is a great example of a highly skilled, high value-added, talent-driven industry that shows how we can excel in a world of global competition. Britain needs more of the vigour, ingenuity and aspiration that you already demonstrate that is the hallmark of your success.”

Those words were spoken in 2007, and a year later the world found itself in the middle of the biggest financial panic since the 1930s. In the end, it was only through a combination of pure luck, extremely expensive nationalisations and bailouts, the lowest interest rates in recorded history plus an ongoing experiment in mass money printing, that total meltdown was averted.

The post-Lehman panic was followed by a wave of investigations and reconstructions by journalists, writers and politicians. More than 300 books have been published about the crash in English alone. Every western country held extensive hearings and produced detailed recommendations. Everything you need to know about what is wrong with finance and the banks today is in their reports; the problem is that there is so much more that needs to be explained.

Most areas inside banking had little or nothing to do with the crash, while many players outside banking bore a heavy responsibility, too, including insurers, credit rating agencies, accountancy firms, financial law firms, central banks, regulators and politicians. Investors such as pension funds had been egging the banks on to make more profits by taking more risk. Unless you had a firm understanding of finance, the causes of the crash were very unclear, and this must be part of the reason why the clearest and most urgent lesson of all would get lost or buried: the financial system itself had become dangerously flawed.

After the crash of 2008, ignorance among the general public, reticence among complicit mainstream politicians and a deeply skewed and sensationalist portrayal of finance in the mass media conspired to create the narrative that the crash was caused by greed or by some other character flaw in individual bankers: psychopathy, gambling addiction or cocaine use. (A whole genre of City memoirs sprang up with titles such as Binge Trading: The Real Inside Story of Cash, Cocaine and Corruption in the City. Gordon Gekko returned for a sequel, Wall Street: Money Never Sleeps, and Leonardo DiCaprio scored an immense hit playing the title role in The Wolf of Wall Street, about a whoring and cocaine-snorting financial fraudster.)

From there it was a small step to the notion that we can fix finance by getting rid of the “jerks”, as the plain-speaking former Barclays CEO Bob Diamond put it. When Diamond was forced to resign in July 2012 over a scandal involving interest rate rigging by his traders, his successor, Antony Jenkins, also promised to focus on changing the culture. And so the same banks that brought us the mess of 2008 eagerly embraced the need for cultural change – which alone should arouse our suspicions. If there is one recurring theme in the many conversations I had with City insiders, it was the need for structural rather than cultural change; not so much different bankers, but a different system.

“Sometimes I feel as if finance has reacted to the crisis the way a motorist might after a near-accident,” said the City veteran at a small credit rating agency whose wife had almost chucked his phone into a lake at the height of the panic. “There is the adrenaline surge directly after the lucky escape, followed by the huge shock when you realise what could have happened. But as the journey continues and the scene recedes in the rear-view mirror, you tell yourself: maybe it wasn’t that bad. The memory of your panic fades, and you even begin to misremember what happened. Was it really that bad?”

He was a soft-spoken man, the sort to send a text message if he is going to be five minutes late to a meeting. But now he was really angry: “If you had told people at the height of the crisis that years later we’d have had no fundamental changes, nobody would have believed you. Such was the panic and fear. But here we are. It’s back to business as usual. We went from ‘We nearly died from this’ to ‘We survived this’.”


The City is governed by a code of silence and fear of publicity; those caught talking to the press without a PR officer present could be sacked or sued. But once I had persuaded City insiders to talk (always and only on condition of anonymity), they were remarkably forthcoming.

“I have the wrong accent and I went to a shit school,” said one City veteran, after explaining that for many years he had made millions at a top bank only to move to an even better-paying hedge fund. “Forty years ago, I wouldn’t even have been given an interview in the City. Finance today is fiercely meritocratic. Doesn’t matter if you’re gay or black or working class, if you can do something better than the other person, you’ll move up.”

He was a mathematician by training, and his direct manner reminded me of stallholders at the biggest open market in my hometown of Amsterdam – tough guys with a highly developed mistrust of pretentiousness. He fuelled himself with Diet Coke and coffee and teased me for ordering cranberry juice. Before he was recruited by the bank in the early 1990s, he had taught at a university; his only idea of an investment bank was based on two books he had read: Liar’s Poker by Michael Lewis and Barbarians at the Gate by Bryan Burrough and John Helyar. “Traders as loud, crass, bad-mouthed, macho dickheads. The sort of guys with red braces who shout ‘buy, buy, sell, sell’ into their phones and have eating competitions.” Many outsiders still believe that these are the people occupying the top positions in big banks, he said, and taking the biggest risks. “That’s over,” he told me. “Some of the best traders are now women. Totally unassuming, cerebral and talented. Trading is no longer a balls job. It’s a brains job. To be sure, the kind of maths traders now have to be able to do is not of the wildly hard variety. But it requires real skills in that area.”

He described the basic flaw in the banking system as it has evolved over the past decades: other people’s money. Until deregulation began to liberate finance from the constraints placed on it after the last major crash in the 1930s, risky banking in the City was carried out in firms that were organised as partnerships, which were not listed on the stock exchange. The partners owned and ran the firm – and when things went wrong, they were liable. Hence the system of bonuses: if you put your personal fortune on the line and things go well, it stands to reason that you deserve a big bonus. Because if things go the other way, you are personally liable for the losses.

Back in the days when his bank was still a partnership, the former banker had drawn on his gift for maths to build a complex financial product that he thought was very clever. “I was very new and maybe a bit cocky,” he said. “So I went over to the head of trading and showed it to him, saying, ‘Look, we can make a lot of money with this.’ The head of trading was a partner in the traditional sense. He looked at me and replied: ‘Don’t forget, this is my money you’re fucking with.’”

The problem with the way banks are now organised is not that they take risks – that is their job. The problem with today’s banks is that those who accept the risks are no longer those who get stuck with the bill.

A bank that is listed on the stock market loses control to the new owners; that is, the shareholders. When these shareholders, which can include insurers, wealthy dynasties or pension funds, start demanding ever greater profits, then greater profits are what you have to deliver. In 2007, in an inadvertent moment of candour, the then CEO of the megabank Citigroup, Charles O Prince, summarised this relationship: “As long as the music is playing, you’ve got to get up and dance.”

This dynamic became all the more dangerous as globalisation began to create a single market for finance. Not only were partnerships allowed to be listed on the stock exchange or taken over by a publicly listed bank, they were also allowed to go on a global shopping spree. Wave after wave of mergers and acquisitions produced banks with balance sheets bigger than the GDPs of their host countries, resulting in the institutions that we now know as “too big to fail” – so big that if they go bust, they can bring down the system with them. When excessive risk turns sour, it’s the taxpayer who suffers.

In a functioning free market system, incompetence and recklessness are punished by failure and bankruptcy. But there is currently no functioning free market at the centre of the global free market system. I heard City workers scoff at the employees of banks that cannot be allowed to fail – calling them overpaid civil servants who play a game they cannot lose. Risk-taking at a bank that will always be saved, they said, is like playing Russian roulette with someone else’s head.

In the old days, veterans told me, there was an office party almost every Friday: celebrating the anniversary of someone who had stayed with the firm for 20 years or longer. That is all over now, and in its place has come a hire-and-fire culture characterised by an absence of loyalty on either side. Employment in the City is now a purely transactional affair. It is exceedingly rare to find people who have stayed with the same bank for their entire career.

Many of the insiders I spoke to had stories about abrupt sackings. You get a call from a colleague, saying: “Look, could you do me a favour and get my coat and bag?” She is standing outside with a blocked security pass. Or one morning you swipe your pass only to hear a beep and find your entrance barred. You turn to the receptionist, who says, after a glance at her computer screen: “Would you please have a seat over there until somebody comes to fetch you?”

In the City, sudden dismissals of this kind have a name: “executions”. Add to these the quarterly “waves” when headquarters decides to reduce headcount and a certain percentage of staff worldwide are given the sack, all on one day. Some banks operate a “cull”. Every year, prestigious banks such as Goldman Sachs and JP Morgan routinely fire their least profitable staff. “When the cull comes ...” people would say, or: “Oh yes, we cull.”

“When you can be out of the door in five minutes, your horizon becomes five minutes,” one City worker told me. Another asked: “Why would I treat my bank any better than my bank treats me?”

If the threat of being culled influenced bankers’ behaviour through fear, there were also powerful inducements. Deregulation has built perverse incentives into the very fabric of global finance. People face immense temptations to take risks with their bank’s capital or reputation, knowing that if they don’t act on them, the colleague across the desk will.

Before the deregulation of the 80s and 90s, the City was far from perfect: it was a snobbish, antisemitic and misogynistic place. But the City – and Wall Street – of old was a world that Gus Levy, head of Goldman Sachs in the 70s, famously described as “long term greedy”: you made money with your client and your firm. Because partners were personally liable, they had an interest in keeping their firm on a manageable scale and making sure their employees told them of any risks. In a few decades, this system has evolved into one of short-term greed: you make money at the expense of the client, of the bank, of the shareholder or of the taxpayer. This did not happen because bankers suddenly became evil, but because the incentives fundamentally changed.

Until the mid 80s, the London Stock Exchange’s motto was dictum meum pactum, “my word is my bond”. These days the governing principle is caveat emptor, or “buyer beware” – it is effectively up to the professional investor to figure out what the bank is offering. As one builder of complex financial products explained to me: “You have got to read the small print. You need to bring in a lawyer who explains it to you before you buy these things.”


Perhaps the most terrifying interview of all the 200 I recorded was with a senior regulator. It was not only what he said but how he said it: as if the status quo was simply unassailable. Ultimately, he explained, regulators – the government agencies that ensure the financial sector is safe and compliant – rely on self-declaration: what a bank’s internal management presents to them. The trouble, he said with a calm smile, is that a bank’s internal management often doesn’t know what’s going on, because banks today are so vast and complex. He did not think he had ever been deliberately lied to, although he acknowledged that, obviously, he couldn’t know for sure. “The real threat is not a bank’s management hiding things from us, it’s the management not knowing themselves what the risks are.”

He talked about the culture of fear and how people are not managing their actions for the benefit of their bank. Instead, “they are managing their career”. He believed that the crash had been more “cock-up than conspiracy”. Bank management is in conflict, he pointed out: “What is good for the long term of the bank or the country may not be what is best for their own short-term career or bonus.”

If the problem with finance is perverse incentives, then the insistence on greed as the cause of the crash is part of the problem. There is a lot of greed in the City, as there is elsewhere in society. But if you blame the crash on character flaws in individuals, you imply that the system itself is fine: all we need to do is smoke out the crooks, the gambling addicts, the coke-snorters, the sexists, the psychopaths. Human beings always have at least some scope for choice, hence the differences in culture between banks. Still, human behaviour is largely determined by incentives, and in the current set-up, these are sending individual bankers, desks or divisions within banks – as well as the banks themselves – in the wrong direction.

How hard would it be to change those incentives? From the viewpoint of those I interviewed, not hard at all. First of all, banks could be chopped up into units that can safely go bust – meaning they could never blackmail us again. Banks should not have multiple activities going on under one roof with inherent conflicts of interest. Banks should not be allowed to build, sell or own overly complex financial products – clients should be able to comprehend what they buy and investors understand the balance sheet. Finally, the penalty should land on the same head as the bonus, meaning nobody should have more reason to lie awake at night worrying over the risks to the bank’s capital or reputation than the bankers themselves. You might expect all major political parties to have come out by now with their vision of a stable and productive financial sector. But this is not what has happened.

Not that there has been no reform. Banks are taxed when they get beyond a certain size, for example, and all banks must now finance a larger part of their risks with equity rather than borrowed money. American banks are banned from using their own capital to speculate and invest in the markets, and the European commission, or national governments, have forced a few banks to shrink or sell off their investment bank activities – the Dutch bank ING, for example, was told to sell off its insurance arm, ING Direct. But change has been largely cosmetic, leaving the sector’s basic architecture intact. If a bank collapses, the new European banking union – set up in 2012 to transfer banking policy from a national to a European level – is meant to step in and wind it down in an orderly fashion. But who is propping up that European banking union, if several banks should fail at the same time? The taxpayer. A bonus cap in banking was introduced by the EU, so instead of paying widely publicised million-pound bonuses, banks now simply offer higher salaries.

Perhaps the most promising change in the UK is the so-called “senior person regime” that makes it possible to prosecute bankers for reckless behaviour – but only after they have wrecked their bank. Virtually all big banks remain publicly listed or are doing everything they can to get back on the stock exchange. They have never allowed staff to talk openly about what went wrong before 2008 and why. The code of silence remains intact. The banks have not sacked the accountancy firms or credit rating agencies that failed to raise the alarm over the erroneous or misleading items on their balance sheets. Banks have certainly not joined hands to fight for a globally enforced increase in capital buffers (the minimum capital they are required to hold), which could help them absorb and survive severe losses. Indeed, they have spent millions lobbying to keep any increase in buffers as low as possible.

“Back to business as usual.” This is how many interviewees described the post-crash atmosphere in the City. As the senior regulator put it with chilling equanimity: “Is the sector fixed, after the crisis? I don’t think so.” What we have now, he added, is “what you get with free-market capitalism – consolidation of all wealth into fewer and fewer banks, which end up dividing up the market as a cartel.”


When it comes to global finance, the most startling news isn’t news at all; the important facts have been known for a long time among insiders. The problem goes much deeper: the sector has become immune to exposure.

“If I had a million pounds for every time I have heard a possible reform opposed because ‘it wouldn’t have prevented Northern Rock or Lehman Brothers going bust’, I might now have enough money to bail out a bank,” the Financial Times columnist John Kay wrote in 2013. “The objective of reform is not to prevent Northern Rock or Lehman going bust ... The problem revealed by the 2007-08 crisis was not that some financial services companies collapsed, but that there was no means of handling their failure without endangering the entire global financial system.”

Only last year Andrew Haldane, chief economist at the Bank of England, told the German magazine Der Spiegel that the balances of the big banks are “the blackest of black holes”. Haldane is responsible for the stability of the financial sector as a whole, yet here he was telling a journalist that he couldn’t possibly know what the banks have on their books. And? Nothing happened.

It made sense in 2008 for those in the know not to deepen the panic by talking about it. Indeed, one of the most powerful figures in the EU in 2008, the almost supernaturally levelheaded Herman van Rompuy, waited until 2014 to acknowledge in an interview that he had seen the system come within “a few millimetres of total implosion”.

But because the general public was left in the dark, there was never enough political capital to take on the banks. Compare this to the 1930s in the US, when the crash was allowed to play out, giving Franklin D Roosevelt the chance to bring in simple and strong new laws that kept the financial sector healthy for many decades – until Reagan and Thatcher undid one part, and Clinton and Blair the other.

Tony Blair is now making a reported £2.5m a year as adviser to JP Morgan, while the former US Treasury secretary Timothy Geithner and the former secretary of state Hillary Clinton have been paid upwards of $100,000 a speech to address small audiences at global banks. It is tempting to see corruption in all this, but it seems more likely that, over the past decades, politicians as well as regulators have come to identify themselves with the financial sector they are supposed to be regulating. The term here is “cognitive capture”, a concept popularised by the economist and former Financial Times columnist Willem Buiter, who described it as over-identification between the regulator and the regulated – or “excess sensitivity of the Fed to financial market and financial sector concerns and fears”.

With corruption, you are given money to do something you would not have done otherwise. Capture is more subtle and no longer requires a transfer of funds – since the politician, academic or regulator has come to believe that the world works in the way that bankers say it does. Sadly, Buiter never wrote a definitive account of capture; he no longer works in academia or journalism. He has moved to the megabank Citigroup.

The European commission president, Jean-Claude Juncker, memorably said in 2013 that European politicians know very well what needs to be done to save the economy. They just don’t know how to get elected after doing it. A similar point could be made about the major parties in this country: they know very well what needs to be done to make finance safe again. They just don’t know where their campaign donations and second careers are going to come from once they have done it.

Still, the complicity of mainstream politicians is not the whole story. Finance today is global, while democratically legitimate politics operates on a national level. Banks can play off one country or bloc of countries against another, threatening to pack up and leave if a piece of regulation is introduced that doesn’t suit them. And they do, shamelessly. “OK, let us assume our country takes on its financial sector,” a mainstream European politician told me. “In that case, our banks and financial firms simply move elsewhere, meaning we will have lost our voice in international forums. Meanwhile, globally, nothing has changed.”

This then opens up the most difficult question of all: how is the global financial sector to be brought back under control if there is no global political authority capable of challenging it?

Seven years after the collapse of Lehman Brothers, it is often said that nothing was learned from the crash. This is too optimistic. The big banks have surely drawn a lesson from the crash and its aftermath: that in the end there is very little they will not get away with.

Tuesday 19 May 2015

Why I choose to have less choice

Shopping around is the mantra of the modern era. But who really benefits from our befuddlement?

Tim Lott in The Guardian

Once, when I was suffering a fit of depression, I walked into a supermarket to buy a packet of washing powder. Confronted by a shelf full of different possibilities, I stood there for 15 minutes staring at them, then walked out without buying any washing powder at all.
I still feel echoes of that sensation of helplessness. If I want to buy just one item but discover that buying three will save me half the unit price, I find myself assailed by choice paralysis.
I hate making consumer choices at the best of times, because I have this uncomfortable suspicion that big companies are trying to gull me out of as much money as possible, using sophisticated techniques designed by people who are smarter than I am.
For instance, when I buy an insurance product, how can I decide whether I should just buy the cheapest, or the best? The best is the one most likely to pay out without penalty or fuss, but that information is much harder to find out than factors such as cost, extent of cover, etc. It’s complicated. So I often try not to make choices – by just putting my payments for insurance with my usual insurers on direct debit, for example, which means I don’t have to think about shopping around.
This issue of choice and complexity lies at the heart of the experience of being modern. It penetrates commerce, politics and our personal lives. It may even be connected to the fact that there are higher levels of depression in society than ever before.
This idea was suggested by Barry Schwartz in his book The Paradox of Choice. Choice oppresses us. Why? Because there are too many choices and they are often too complex for us to be confident that we are making the right one.
When there are 200 models of a particular style of camera to choose from, it is difficult to feel sure you have chosen the right one – even if you spend an inordinate amount of time trying to make a rational decision. Or you may see the same model being sold more cheaply two weeks after you’ve bought it. When there was less choice and fewer types of camera, this kind of experience was rare. Our capacity for hindsight has become a means of punishing ourselves.
Complexity is not entirely accidental. Late capitalism solves the dilemma of competition (for the producer) through complexity. Choosing a mortgage, a pension or a computer requires a tremendous amount of application, so we become relatively easy to gull. Whether it is a power company or a loan company, we struggle to understand tariffs, terms and the small print. Exhausted, we just take a stab and hope for the best, or we succumb to inertia and choose what we have always chosen. Consumers are thrown back on simple cues that are advantageous to the producers, such as brand recognition.
Complexity also impacts on politics. Once it was pretty clear who to vote for – your class position, on the whole, made it a simple matter of self-interest for most voters. Now that we have become closer to what is ironically the democratic ideal – ie choice-making actors – voting is more of a challenge than it once was. Do you really have a good enough grasp of economic theory to judge whether it is best to spend or save in a recession? Do you understand the complexities of private provision in the NHS enough to rule it out? Do you know enough about international affairs to support a reduction in defence spending, or a retreat from the EU? Most people don’t – so, again, they make snap judgments based on loyalty and sentiment.
This problem of choice and complexity is ubiquitous. It applies in medicine. If I am ill and asked to make a choice about treatment, I would often rather leave the choice to the doctor, if only because, should the wrong choice be made, I am not going to feel nearly so bad about it. I had a prostate cancer scare recently, and I just wanted to be told what to do – not decide whether, say, I should choose an operation that would guarantee impotence in order to stave off a 5% chance of cancer. The burden of choice was too big.
In the field of education a similar dilemma applies. Once your child went to the local primary or secondary. Now you have to choose from a bewildering number of types of school. In the personal realm, once, you stayed married for life. Now, if you are in an unhappy marriage you have to decide whether to stay or not. These may all be positive developments, but they come at a cost – the potential for regret.
So how should one react to complexity? Schwartz suggests we should limit choice, not extend it. If you are shopping for food, go to supermarkets that price simply and carry a limited range, such as Aldi or Lidl. Recognise and accept complexity – which means accepting that you can never be sure that you’ve made the right choice.
Above all, don’t fall for the old trope of only wanting “the best”. Schwartz calls such people “maximisers” – people who are never happy, because they have expectations that can never be met, since in a world of complexity and unlimited choice there is always a better option. Be a “satisficer” instead – people who are happy to say “that’s good enough”, or “it’ll do”.
This may not work in politics – saying the Conservatives “will do” when you wanted the Green party is not very satisfactory – but as a consumer, and in life generally, it’s a pretty good formula. It’ll do, anyway.