Search This Blog

Monday 5 October 2015

You can print money, so long as it’s not for the people

Zoe Williams in The Guardian

In its broadest sense, the phrase “there’s no magic money tree” is just a variation on “money doesn’t grow on trees”, a thing you say to children to indicate that wealth comes not from the beneficence of a magical universe, but from hard graft in a corporeal reality. The pedantic child might point to the discrepant amounts of work required to yield a given amount of money, and say that its value is a social construction.

Over time, that loose, rather weak-minded meaning has given way to a specific economic critique: Jeremy Corbyn – along with anyone who challenges the prevailing fiscal narrative – is dangerous and wrong, since he wants to print money. Money cannot be created from nowhere, because there’s no magic money tree. End of.

The flaw in that argument is that all money is created from nowhere. In normal circumstances, it is created from nowhere as credit, by private banks, and lent to us, usually (85% of the time) in the form of a mortgage on an existing residential property. Decades of credit extension have perverted the housing market to turn a mortgage into a lifetime’s bonded servitude. The economists Jordà, Schularick and Taylor argued convincingly last year that the causes of this economic crisis, the next and the one before are all, fundamentally, the extension of credit and its impact on house prices. So the magic money tree isn’t gushing cash in a socially responsible fashion (if it were used responsibly, it wouldn’t be magic). But the idea that we have a centrally planned, carefully stewarded monetary policy, with finite creation and demonstrable long-term aims, which some loonie leftie wants to come along and unravel, is simply wrong.

In abnormal circumstances, such as the ones we’ve lived through since the financial crisis, central banks are also magic money trees. In the bizarre construction of current economic orthodoxy, you’re not allowed to say so, even though the Bank of England has created £375bn in quantitative easing (QE); the Federal Reserve bought $1.25tn worth of mortgage-backed securities in its first round of QE; the European Central Bank had as a core principle that it couldn’t create money until, suddenly, in awesome amounts, it could; the Bank of Korea has a stimulus package, as does the People’s Bank of China; and Japan started it. Central banks typically justify money creation on the basis that it’s temporary, it’s unfortunate, it’s driven by the crisis and it will ultimately get back to normal.

None of that alters the fact that no bank had that money in savings. I recently said out loud, “we do have a magic money tree, it’s called the Bank of England”, in a Newsnight debate with a former adviser to Blair, John McTernan. He made a face like a politician accidentally talking to a member of the public, but what the camera didn’t catch was Evan Davis, who stuck his tongue out, like a cat taking a pill. It was days ago, and people are still tweeting me pictures of the Zimbabwean dollar and the Weimar Republic, saying “is this what you want? IS IT?”

Quantitative easing is bizarrely unapproachable, even though it’s happening right across the world and its unwinding will dominate the economic picture for years to come; one is allowed to reference QE, so long as one maintains at all times a technocratic tone, to indicate that one understands and approves of it as nothing more than a lever to create stability. It was the best idea ever, until you suggest something similar could be done for a social purpose, and then it’s the most perilous idea ever. To interrogate why the benefit must always go to the existing asset-holding class, why human ingenuity can’t devise anything more productive and equitable, is to reveal the shaming depth of your incomprehension. It’s not that you don’t understand money; it’s that you don’t understand the exigencies of the debate, which are that you sign up to a number of false principles before you start.

It turned out that the “no money tree” brigade meant: “If you create money infinitely, that will cause inflation.” That is a really curious argument against Corbyn’s people’s QE, like going up to someone eating a banana and saying: “If you eat limitless bananas, you will give yourself potassium poisoning.” There’s a secondary argument about the independence of central banks from governments, which is actually rather an elegant example of our dishevelled politics: if the government issues no directive to the Bank of England, and all the gains of QE go to the wealthiest, that’s “independent”. If the government had said, “invest this in, say, the green economy”, that would have been independence lost. It has become normal to see upwards redistribution as a law of the physical universe, and anything else as the interference of a heavy-handed state.

None of this is to say that people’s QE is straightforward and unproblematic; Corbyn is talking about spending on infrastructure (housing, broadband), whereas that phrase as it was coined described helicopter money, or overt money financing, literally getting money into the economy by randomly giving it to people. They’re two discrete propositions – overt money financing and green and social investment – and rolling them into one doesn’t do much to promote understanding on this terrain.

However, the real barrier to debate is, as with so much in the realm of debt and austerity, that it’s conducted in bad faith, with infantilising aphorisms, aimed not at deepening understanding but at shooing away public interest with unavoidable economic realities. As a tactic, this has reached the end of its plausibility.

Saturday 3 October 2015

The Art Of Fear-Mongering

Uri Avnery in Outlook India


"WE HAVE nothing to fear but fear itself," said President Franklin Delano Roosevelt. He was wrong.

Fear is a necessary condition for human survival. Most animals in nature possess it. It helps them to respond to dangers and evade or fight them. Human beings survive because they are fearful.

Fear is both individual and collective. Since its earliest days, the human race has lived in collectives. This is both a necessary and a desired condition. Early humans lived in tribes. The tribe defended its territory against all “strangers” — neighboring tribes — in order to safeguard its food supply and security. Fear was one of the uniting factors.

Belonging to one's tribe (which after many evolutions became a modern nation) is also a profound psychological need. It, too, is connected with fear — fear of other tribes, fear of other nations.

But fear can grow and become a monster.

RECENTLY I received a very interesting article by a young scientist, Yoav Litvin, dealing with this phenomenon.

It described, in scientific terms, how easily fear can be manipulated. The science involved is brain research, based on experiments with laboratory animals like mice and rats.

Nothing is easier than to create fear. For example, mice were given an electric shock while exposed to rock music. After some time, the mice showed reactions of extreme fear when the rock music was played, even without being given a shock. The music alone produced fear.

This could be reversed. For a long time, the music was played for them without the pain. Slowly, very slowly, the fear abated. But not completely: when, after a long time, a shock was again delivered with the music, the full symptoms of fear re-appeared immediately. Once was enough.

APPLY THIS to human nations, and the results are the same.

The Jews are a perfect laboratory specimen. Centuries of persecution in Europe taught them the value of fear. Smelling danger from afar, they learned to save themselves in time — generally by flight.

In Europe, the Jews were an exception, inviting victimization. In the Byzantine (East Roman) Empire, Jews were normal. All over the empire, territorial peoples turned into ethnic-religious communities. A Jew in Alexandria could marry a Jewess in Antioch, but not the girl next door, if she happened to be an Orthodox Christian.

This "millet" system endured all through the Islamic Ottoman Empire, the British Mandate and still lives happily in today's State of Israel. An Israeli Jew cannot legally marry an Israeli Christian or Muslim in Israel.

This was the reason for the absence of anti-Semitism in the Arab world, apart from the detail that the Arabs are Semites themselves. Jews and Christians, the "peoples of the book", have a special status in an Islamic state (like Iran today), in some ways second-class, in some ways privileged (they do not have to serve in the army). Until the advent of Zionism, Arab Jews were no more fearful than most other human beings.

The situation in Europe was quite different. Christianity, which split off from Judaism, harbored a deep resentment towards the Jews from the start. The New Testament contains profoundly anti-Jewish descriptions of Jesus' death, which every Christian child learns at an impressionable age. And the fact that the Jews in Europe were the only people (apart from the gypsies) who had no homeland made them appear all the more suspect and fear-inspiring.

The continued suffering of the Jews in Europe implanted a continuous and deep-seated fear in every European Jew. Every Jew was on continuous alert, consciously, unconsciously or subconsciously, even in times and countries which seemed far from any danger — like the Germany of my parents' youth.

My father was a prime example of this syndrome. He grew up in a family that had lived in Germany for generations. (My father, who had studied Latin, always insisted that our family had come to Germany with Julius Caesar.) But when the Nazis came to power, it took my father just a few days to decide to flee, and a few months later my family arrived happily in Palestine.

ON A personal note: my own experience with fear was also interesting. For me, at least.

When the Hebrew-Arab war of 1948 broke out, I naturally enlisted for combat duty. Before my first battle I was — literally — convulsed by fear. During the engagement, which happily was a light one, the fear left me, never to return. Just so. Disappeared.

In the following 50 or so engagements, including half a dozen major battles, I felt no fear.

I was very proud of this, but it was a stupid thing. Near the end of the war, when I was already a squad leader, I was ordered to take over a position which was exposed to enemy fire. I went to inspect it, walking almost upright in broad daylight, and was at once hit by an Egyptian armor-piercing bullet. Four of my soldiers, volunteers from Morocco, bravely got me out under fire. I arrived at the field hospital just in time to save my life.

Even this did not restore to me my lost fear. I still don't feel it, though I am aware that this is exceedingly stupid.

BACK TO my people.

The new Hebrew community in Palestine, founded by refugees from the pogroms of Moldavia, Poland, Ukraine and Russia, and later reinforced by the remnants of the Holocaust, lived in fear of their Arab neighbors, who revolted from time to time against the immigration.

The new community, called the Yishuv, took great pride in the heroism of its youth, which was quite able to defend itself, its towns and its villages. A whole cult grew up around the new Sabra ("cactus plant"), the fearless, heroic young Hebrew born in the country. When, in the war of 1948, after prolonged and bitter fighting (we lost 6500 young men out of a community of 650,000 people), we eventually won, collective rational fear was replaced by irrational pride.

Here we were, a new nation on new soil, strong and self-reliant. We could afford to be fearless. But we were not.

Fearless people can make peace, reach a compromise with yesterday's enemy, reach out for co-existence and even friendship. This happened — more or less — in Europe after many centuries of continuous wars.

Not here. Fear of the "Arab World" was a permanent fixture in our national life, the picture of "little Israel surrounded by enemies" both an inner conviction and a propaganda ploy. War followed war, and each one produced new waves of anxiety.

This mixture of overweening pride and profound fears, a conqueror's mentality and permanent Angst, is a hallmark of today's Israel. Foreigners often suspect that this is make-believe, but it is quite real.

FEAR IS also the instrument of rulers. Create Fear and Rule. This has been a maxim of kings and dictators for ages.

In Israel, this is the easiest thing in the world. One has just to mention the Holocaust (or Shoah in Hebrew) and fear oozes from every pore of the national body.

Stoking Holocaust memories is a national industry. Children are sent to visit Auschwitz, their first trip abroad. The last Minister of Education decreed the introduction of Holocaust studies in kindergarten (seriously). There is a Holocaust Day — in addition to many other Jewish holidays, most of which commemorate some past conspiracy to kill the Jews.

The historical picture created in the mind of every Jewish child, in Israel as well as abroad, is, in the words of the Passover prayer read aloud every year in every Jewish family: "In every generation they arise against us to annihilate us, but God saves us from their hands!"

PEOPLE WONDER what special quality enables Binyamin Netanyahu to be elected again and again, and to rule practically alone, surrounded by a flock of noisy nobodies.

The person who knew him best, his own father, once declared that "Bibi" could be a good Foreign Minister, but on no account a Prime Minister. True, Netanyahu has a good voice and a real talent for television, but that is all. He is shallow, he has no world vision and no real vision for Israel, his historical knowledge is negligible.

But he has one real talent: fear-mongering. In this he has no equal.

There is hardly any major speech by Netanyahu, in Israel or abroad, without at least one mention of the Holocaust. After that, there comes the latest up-to-date fear-provoking image.

Once it was "international terrorism". The young Netanyahu wrote a book about it and established himself as an expert. In reality, this is nonsense. There is no such thing as international terrorism. It has been invented by charlatans, who build a career on it. Professors and such.

What is terrorism? Killing civilians? If so, the most hideous acts of terrorism in recent history were Dresden and Hiroshima. Killing civilians by non-state fighters? Take your pick. As I have said many times: "freedom fighters" are on my side, "terrorists" are on the other side.

Palestinians, and Arabs in general, are, of course, terrorists. They hate us for taking part of their land away. Obviously, you cannot make peace with perverse people like that. You can only fear and fight them.

When the field of terrorist-fighters became too crowded, Netanyahu switched to the Iranian bomb. There it was — the actual threat to our very existence. The Second Holocaust.

To my mind, this has always been ridiculous. The Iranians will not get a bomb, and even if they did, they would not use it, because their own national annihilation would be guaranteed.

But take the Iranian bomb from Netanyahu, and what remains? No wonder he fought tooth and nail to keep it. But now it has finally been pushed away. What to do?

Don't worry. Bibi will find another threat, more blood-curdling than any before.

Just wait and tremble.

Cricket: How much does captaincy really matter?

Kartikeya Date in Cricinfo


Mike Brearley was fortunate to have captained England when Botham and Willis were arguably at their best © Getty Images



How often have you heard "captaincy" being applauded by professional observers? "Great captaincy", they say. Or "lacklustre captaincy". Mahendra Singh Dhoni has been on the receiving end of both verdicts.

The entire concept is bogus. Have you ever heard of a captain being criticised when his team wins? Or have you heard it said, "We saw some superb captaincy from Clarke today but Australia were just not good enough"? No, when Australia lose, it is the other captain who did well. More consequentially, can you think of a good captain of an inferior team beating a better side purely because of captaincy?

Captaincy seems to be a concept by writers for writers. It exists not because cricket is played but because cricket is written about and argued about. There is a difference between noting the mere existence of a captain as the person who decides bowling changes and field settings, and captaincy as a full-fledged art consequential to the game. It is the latter that is bogus. Every time a captain puts a third slip in and a catch goes there, it doesn't amount to "great captaincy". Since a field is set every over, it's just one choice that worked, among many dozens of choices that didn't.

Historically captaincy has also had social significance. The captain had to be someone from a good background. Who are his parents? What social class does he come from? Which university did he go to? Will he look plausible when dignitaries visit? Until 1952, the captain of England had to be an amateur (someone who could afford to play cricket for fun, not as a job, because he had other sources of income). That year, Len Hutton became the first professional cricketer to captain England, 75 years after England first played a Test. As Osman Samiuddin notes in his history of Pakistan cricket, early Pakistan captains were chosen for their Oxbridge pedigree. Today these colonial markers are no longer fashionable. In their place we have vague notions of "leadership" and other such management-speak.

Mike Brearley (Cambridge University and Middlesex) captained England in 31 of his 39 Tests. He made 1442 runs at 22.88 in 66 innings in Test cricket. He played all his innings in the top seven, most frequently as opener.
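Since the case against Brearley leans on that average, it is worth spelling out how the statistic works: a batting average is runs scored divided by dismissals (innings minus not-outs), not runs per innings. A minimal sketch in Python, with the not-out count back-solved from the quoted figures rather than taken from a scorecard:

```python
# Batting average = runs / (innings - not_outs).
# Brearley: 1442 runs in 66 innings at 22.88 implies ~63 dismissals,
# i.e. roughly 3 not-outs (inferred from the quoted numbers, not sourced).

def batting_average(runs: int, innings: int, not_outs: int) -> float:
    """Runs divided by completed innings (dismissals)."""
    dismissals = innings - not_outs
    if dismissals == 0:
        raise ValueError("average is undefined without a dismissal")
    return runs / dismissals

print(round(batting_average(1442, 66, 3), 2))  # 22.89 (22.88 if truncated)
```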

Brearley was not picked for England as a specialist batsman alone after the Centenary Test in Melbourne in March 1977. He is Exhibit A for the pro-captaincy set - the most prominent member of the very small set of players who were not good enough to make a Test team with bat or ball, but were picked primarily as captain.

Brearley's reputation rests on his career as captain in Ashes Tests. He led England to two Ashes wins at home, in 1977 and 1981. He also led England to an Ashes win against a Packer-affected Australia side in Australia. Apart from this, Brearley led England to victory at home against New Zealand, and against a Pakistan side (also Packer-depleted) in 1978. Add to this an unconvincing 1-0 win at home against an Indian side that had very little fast bowling (Kapil Dev was still raw in 1979) and the tired remnants of their spin quartet. Sunil Gavaskar nearly brought India level at The Oval in that series, despite Brearley's captaincy.

At first glance, it is an impressive record. But in all those series, England were simply the better side, either because they were playing at home or because their opponents were crippled by defections.



Indian captains have usually been termed "aggressive" when they've had quality bowlers at their disposal © AFP


What happened to England against full-strength opposition in that 1977-81 period? They were thumped 3-0 in three Tests in Australia in 1979-80 under Brearley, lost 1-0 to West Indies at home in 1980 and 2-0 away in 1980-81. Brearley did not make the side in the two series against West Indies. In fact, he never faced the strongest team of his era. Without him, under Ian Botham, England did quite well against West Indies in 1980 when you consider what had happened in 1976 and what was to happen in 1984.

It is not uncommon for England captains to be highly successful in England. England have traditionally been very difficult to beat at home. Brearley's successor Bob Willis won six out of nine Tests in England against much stronger India, Pakistan and New Zealand sides that had Kapil, Imran Khan and Richard Hadlee in their prime. England's only loss under Willis came against New Zealand in Leeds in 1983, after Lance Cairns took ten wickets in the match.

Brearley's impact on the English teams he led is questionable. Would they have won just as well with any captain other than Botham in 1981? Was Brearley's value purely that Botham flourished under him? It's clear that under Brearley, Botham was an extraordinary player: he made seven centuries and took 15 five-wicket hauls in the 26 Tests he played under him. But his next best efforts came under Bob Willis, in the 1982-84 period. It is plausible to think that Brearley merely had the benefit of having Botham at his best. Perhaps equally importantly, he had Willis at his peak.

The crucial question about Brearley might be: if he was really such a fantastic captain, why did he continue to pick himself in the XI when it was clear that he wasn't good enough to play at Test level? Can you imagine what would happen if a player with an average of 22 was allowed 39 Tests as a specialist batsman today? Why, think of what happened to Botham in 1980. Poor performances against the best team in the world brought their own pressure. Brearley had no such problems.

The idea that the record of captains depends on the quality of their players is generally accepted. But it is curiously discarded when captaincy itself is discussed. There are, for example, rumblings about Virat Kohli being a more "aggressive" captain compared to Dhoni. There is no basis for thinking this. Given turning tracks and opponents who had little experience of batting on them and no high-quality spinners in their ranks to exploit them, Dhoni's India demolished teams with disdain. West Indies, New Zealand and an Australian team in crisis all answered to this description. When Dhoni had quality bowling, he looked a very aggressive captain and India won handsomely. Just as they did under Rahul Dravid. If India have quality bowling under Kohli, he will be remembered as an "aggressive captain".

Much is made of the fact that India's batting didn't do well in England and Australia under Dhoni in the 2011-12 seasons. But we forget that when India won in England in 2007, India's top seven did not make a single century in the series. It was the bowling, led by Zaheer Khan, that made all the difference. Zaheer's series in England ranks alongside those of McGrath, Warne and Murali in the 21st century. The support he got from RP Singh, Anil Kumble, Sreesanth and Ganguly made it one of India's finest overseas performances ever.


Michael Clarke: a "tactically astute" leader, but what do his numbers say in away Tests? © Getty Images


Let's consider Michael Clarke's record. Clarke is widely regarded as the most tactically astute leader of his generation, but that didn't prevent him from losing 13 out of 28 Tests outside Australia. The only place where Australia have won comfortably under him is the West Indies (where the hosts have won only three out of 17 home Test series against major teams in the 21st century, and lost 11).

Look down the list of captains away from home in the 21st century, and you'll find that some of the most highly regarded captains had losing records - Michael Clarke, Mahela Jayawardene, Nasser Hussain.

All this suggests that captaincy is overrated by observers in cricket. If one were to list the essential cricketing skills for a Test and ODI team from the most important to the least, they would be as follows: fast bowling, spin bowling, allrounders, opening batting, middle-order batting, wicketkeeping, catching, ground fielding, overall fitness, captaincy.

A quality team with a nondescript captain would win way more than a bad team with a "good" captain. This is one of the great features of cricket. Despite being an aristocratic sport, on the field, it is a great leveller. To win, you have to bowl well, bat well and field well. At the highest level, tactics are not a mystery. Every club cricketer knows what the best options (or the best three or four options) are for a given team in a given situation.

Kumble once said of his googly, "They pick it, but they still have to play it." That is what makes a top Test team. The ability to play so well that even when the opposition knows exactly what's coming, they have to play very well to cope.

Perhaps it is better to think of captaincy as one thinks of wicketkeeping. A keeper who makes a lot of mistakes or has bad footwork and is repeatedly caught in bad positions is noticed. Similarly, a keeper who has to keep pulling off brilliant diving takes is also noticed. In the first case, the keeper is poor. In the second, the bowling is poor. Similarly, if a captain is being noticed one way or the other, something is wrong with the team.

How to blame less and learn more

Mathew Syed in The Guardian

Accountability. We hear a lot about it. It’s a buzzword. Politicians should be accountable for their actions; social workers for the children they are supervising; nurses for their patients. But there’s a catastrophic problem with our concept of accountability.

Consider the case of Peter Connelly, better known as Baby P, a child who died at the hands of his mother, her boyfriend and her boyfriend’s brother in 2007. The perpetrators were sentenced to prison. But the media focused its outrage on a different group: mainly his social worker, Maria Ward, and Sharon Shoesmith, director of children’s services. The local council offices were surrounded by a crowd holding placards. In interviews, protesters and politicians demanded their sacking. “They must be held accountable,” it was said.

Many were convinced that the social work profession would improve its performance in the aftermath of the furore. This is what people think accountability looks like: a muscular response to failure. It is about forcing people to sit up and take responsibility. As one pundit put it: “It will focus minds.”

But what really happened? Did child services improve? In fact, social workers started leaving the profession en masse. The numbers entering the profession also plummeted. In one area, the council had to spend £1.5m on agency social work teams because it didn’t have enough permanent staff to handle a jump in referrals.

Those who stayed in the profession found themselves with bigger caseloads and less time to look after the interests of each child. They also started to intervene more aggressively, terrified that a child under their supervision would be harmed. The number of children removed from their families soared. £100m was needed to cope with new child protection orders.

Crucially, defensiveness started to infiltrate every aspect of social work. Social workers became cautious about what they documented. The bureaucratic paper trails got longer, but the words were no longer about conveying information, they were about back-covering. Precious information was concealed out of sheer terror of the consequences.

Almost every commentator agrees that the harm done to children by this attempt to “increase accountability” was severe. Performance collapsed. The number of children killed at the hands of their parents increased by more than 25% in the year following the outcry and remained higher for every one of the next three years.

Let us take a step back. One of the most well-established human biases is called the fundamental attribution error. It is about how the sense-making part of the brain blames individuals, rather than systemic factors, when things go wrong. When volunteers are shown a film of a driver cutting across lanes, for example, they infer that he is selfish and out of control. And this inference may indeed turn out to be true. But the situation is not always as cut-and-dried.

After all, the driver may have the sun in his eyes or be swerving to avoid a car. To most observers looking from the outside in, these factors do not register. It is not that they think such possibilities are irrelevant; it is that often they don’t even consider them. The brain just sees the simplest narrative: “He’s a homicidal fool!”

Even in an absurdly simple event like this, then, it pays to pause to look beneath the surface, to challenge the most reductionist narrative. This is what aviation, as an industry, does. When mistakes are made, investigations are conducted. A classic example comes from the 1940s, when there was a series of seemingly inexplicable accidents involving B-17 bombers. Pilots were pressing the wrong switches: instead of pressing the switch to lift the flaps, they were pressing the switch to lift the landing gear.

Should they have been penalised? Or censured? The industry commissioned an investigator to probe deeper. He found that the two switches were identical and side by side. Under the pressure of a difficult landing, pilots were pressing the wrong switch. It was an error trap, an indication that human error often emerges from deeper systemic factors. The industry responded not by sacking the pilots but by attaching a rubber wheel to the landing-gear switch and a small flap shape to the flaps control. The buttons now had an intuitive meaning, easily identified under pressure. Accidents of this kind disappeared overnight.

This is sometimes called forward accountability: the responsibility to learn lessons so that future people are not harmed by avoidable mistakes.

But isn’t this soft? Won’t people get sloppy if they are not penalised for mistakes? The truth is quite the reverse. If, after proper investigation, it turns out that a person was genuinely negligent, then punishment is not only justifiable, but imperative. Professionals themselves demand this. In aviation, pilots are the most vocal in calling for punishments for colleagues who get drunk or demonstrate gross carelessness. And yet justifiable blame does not undermine openness. Management has the time to find out what really happened, giving professionals the confidence that they can speak up without being penalised for honest mistakes.

In 2001, the University of Michigan Health System introduced open reporting, guaranteeing that clinicians would not be pre-emptively blamed. As previously suppressed information began to flow, the system adapted. Reports of drug administration problems led to changes in labelling. Surgical errors led to redesigns of equipment. Malpractice claims dropped from 262 to 83. The number of claims against the University of Illinois Medical Centre fell by half in two years following a similar change. This is the power of forward accountability.

High-performance institutions, such as Google, aviation and pioneering hospitals, have grasped a precious truth. Failure is inevitable in a complex world. The key is to harness these lessons as part of a dynamic process of change. Kneejerk blame may look decisive, but it destroys the flow of information. World-class organisations interrogate errors, learn from them, and only blame after they have found out what happened.

And when Lord Laming reported on Baby P in 2009? Was blame of social workers justified? There were allegations that the report’s findings were prejudged. Even the investigators seemed terrified about what might happen to them if they didn’t appease the appetite for a scapegoat. It was final confirmation of how grotesquely distorted our concept of accountability has become.

Thursday 1 October 2015

The Tories are setting up their own trade union movement

Jon Stone in The Independent

The Conservative party is launching a new organisation to represent trade unionists who have Tory sympathies, it has said.

Robert Halfon, the Conservatives’ deputy chairman, said his party was now “the party of working people” and that “militant” union leaders were putting workers off existing structures.

“We want to provide a voice for Conservative-minded trade unionists and moderate trade unionists, and this week we will be announcing a new organisation in the Conservative party called the Conservative Workers and Trade Unionists movement, and that is going to be a voice for Conservative trade unionists,” he said in an interview with parliament’s The House magazine.

“We are recreating the Conservative trade union workers’ movement. There will be a new website and people will be able to join. There will be a voice for moderate trade unionists who feel they may have sympathy with the Conservatives or even just feel that they’re not being represented by militant trade union leaders.”

The organisation could act as a caucus within existing workplace trade unions and allow the Tories to stand candidates in internal elections. It will be formally announced at the party’s conference in Manchester next week.


Robert Halfon: Deputy Conservative Chairman

The announcement comes as the Conservative government launches the biggest crackdown on trade unionists for 30 years.

Business Secretary Sajid Javid is moving to criminalise unlawful picketing, as well as to introduce new rules making it harder for workers to strike legally.


New financing rules will also make it far more difficult for trade unions to direct funds to Labour, the political party they founded.

Labour leader Jeremy Corbyn described the claim the Tories were a party for workers as an “absurd lie” in his speech to Labour’s annual conference in Brighton.

He pointed to cuts to tax credits that would leave people in work worse off, even after a steep rise in the minimum wage.

“We’ll fight this every inch of the way and we’ll campaign at the workplace, in every community against this Tory broken promise and to expose the absurd lie that the Tories are on the side of working people, that they are giving Britain a pay rise,” Mr Corbyn said.

Research by the Institute for Fiscal Studies found that the rebranded higher minimum wage introduced by George Osborne in the Budget came “nowhere near” to compensating for benefit cuts.

Other research conducted by the institute found that the sharpest benefit cuts would fall on people with jobs.

Sweden introduces six-hour work day

Hardeep Matharu in The Independent

Sweden is moving to a six-hour working day in a bid to increase productivity and make people happier.

Employers across the country have already made the change, according to the Science Alert website, which said the aim was to get more done in a shorter amount of time and ensure people had the energy to enjoy their private lives.

Toyota centres in Gothenburg, Sweden’s second largest city, made the switch 13 years ago, with the company reporting happier staff, a lower turnover rate, and an increase in profits in that time.

Filimundus, an app developer based in the capital Stockholm, introduced the six-hour day last year.

“The eight-hour work day is not as effective as one would think,” Linus Feldt, the company’s CEO, told Fast Company.

"To stay focused on a specific work task for eight hours is a huge challenge. In order to cope, we mix in things and pauses to make the work day more endurable. At the same time, we are having it hard to manage our private life outside of work."

Mr Feldt has said that staff members are not allowed on social media, that meetings are kept to a minimum, and that other distractions during the day are eliminated - the aim being that staff will be more motivated to work intensely while in the office.

He said the new work day would ensure people have enough energy to pursue their private lives when they leave work – something which can be difficult with eight-hour days.

“My impression now is that it is easier to focus more intensely on the work that needs to be done and you have the stamina to do it and still have the energy left when leaving the office,” Mr Feldt added.

According to Science Alert, doctors and nurses in some hospitals in the country have even made the move to the six-hour day.

A retirement home in Gothenburg made the six-hour switch earlier this year and is conducting an experiment, until the end of 2016, to determine whether the cost of hiring new staff to cover the lost hours is worth the improvements in patient care and the boost to employees’ morale.

Right to 30-day refund becomes law

Brian Milligan in BBC News


New consumer protection measures - including longer refund rights - have come into force under the Consumer Rights Act.

For the first time, anyone who buys faulty goods will be entitled to a full refund for up to 30 days after the purchase.

Previously consumers were only entitled to refunds for a "reasonable time".

There will also be new protection for people who buy digital content, such as ebooks or online films and music.

They will be entitled to a full refund, or a replacement, if the goods are faulty.

The Act also covers second-hand goods, when bought through a retailer.

People buying services - like a garage repair or a haircut - will also have stronger rights.

Under the new Act, providers who do not carry out the work with reasonable care, as agreed with the consumer, will be obliged to put things right.

Or they may have to give some money back.

'Fit for purpose'

"The new laws coming in today should make it easier for people to understand and use their rights, regardless of what goods or services they buy," said Gillian Guy the chief executive of Citizens Advice.

When disputes occur, consumers will now be able to take their complaints to certified Alternative Dispute Resolution (ADR) providers, a cheaper route than going through the courts.

The Consumer Rights Act says that goods:

- must be of satisfactory quality, based on what a reasonable person would expect, taking into account the price

- must be fit for purpose. If the consumer has a particular purpose in mind, he or she should make that clear

- must meet the expectations of the consumer


The Act has been welcomed by many consumer rights groups.

"Now, if you buy a product - whether physical or digital - and discover a fault within 30 days you'll be entitled to a full refund," said Hannah Maundrell, the editor of money.co.uk. "The party really is over for retailers that try to argue the point."

The Act also enacts a legal change that will enable British courts to hear US-style class action lawsuits, where one or several people can sue on behalf of a much larger group.

It will make it far easier for groups of consumers or small businesses to seek compensation from firms that have fixed prices and formed cartels.