
Friday, 24 October 2014

The truth about evil

John Gray in The Guardian
When Barack Obama vows to destroy Isis’s “brand of evil” and David Cameron declares that Isis is an “evil organisation” that must be obliterated, they are echoing Tony Blair’s judgment of Saddam Hussein: “But the man’s uniquely evil, isn’t he?” Blair made this observation in November 2002, four months before the invasion of Iraq, when he invited six experts to Downing Street to brief him on the likely consequences of the war. The experts warned that Iraq was a complicated place, riven by deep communal enmities, which Saddam had dominated for over 35 years. Destroying the regime would leave a vacuum; the country could be shaken by Sunni rebellion and might well descend into civil war. These dangers left the prime minister unmoved. What mattered was Saddam’s moral iniquity. The divided society over which he ruled was irrelevant. Get rid of the tyrant and his regime, and the forces of good would prevail.
If Saddam was uniquely evil 12 years ago, we have it on the authority of our leaders that Isis is uniquely evil today. Until it swept into Iraq a few months ago, the jihadist group was just one of several that had benefited from the campaign being waged by western governments and their authoritarian allies in the Gulf in support of the Syrian opposition’s struggle to overthrow Bashar al-Assad. Since then Isis has been denounced continuously and with increasing intensity; but there has been no change in the ruthless ferocity of the group, which has always practised what a radical Islamist theorist writing under the name Abu Bakr Naji described in an internet handbook in 2006 as “the management of savagery”.
Ever since it was spun off from al-Qaida some 10 years ago, Isis has made clear its commitment to beheading apostates and unbelievers, enslaving women and wiping out communities that will not submit to its ultra-fundamentalist interpretation of Islam. In its carefully crafted internet videos, it has advertised these crimes itself. There has never been any doubt that Isis practises methodical savagery as an integral part of its strategy of war. This did not prevent an abortive attempt on the part of the American and British governments in August of last year to give military support to the Syrian rebels – a move that could have left Isis the most powerful force in the country. Isis became the prime enemy of western governments only when it took advantage of the anarchy these same governments had created when they broke the state of Iraq with their grandiose scheme of regime change.
Against this background, it would be easy to conclude that talk of evil in international conflicts is no more than a cynical technique for shaping public perceptions. That would be a mistake. Blair’s secret – which is the key to much in contemporary politics – is not cynicism. A cynic is someone who knowingly acts against what he or she knows to be true. Too morally stunted to be capable of the mendacity of which he is often accused, Blair thinks and acts on the premise that whatever furthers the triumph of what he believes to be good must be true. Imagining that he can deliver the Middle East and the world from evil, he cannot help having a delusional view of the impact of his policies.

Saddam Hussein fires a rifle
“But the man’s uniquely evil, isn’t he?” Tony Blair said of Saddam Hussein. Photograph: Reuters

Here Blair is at one with most western leaders. It’s not that they are obsessed with evil. Rather, they don’t really believe in evil as an enduring reality in human life. If their feverish rhetoric means anything, it is that evil can be vanquished. In believing this, those who govern us at the present time reject a central insight of western religion, which is found also in Greek tragic drama and the work of the Roman historians: destructive human conflict is rooted in flaws within human beings themselves. In this old-fashioned understanding, evil is a propensity to destructive and self-destructive behaviour that is humanly universal. The restraints of morality exist to curb this innate human frailty; but morality is a fragile artifice that regularly breaks down. Dealing with evil requires an acceptance that it never goes away.
No view of things could be more alien at the present time. Whatever their position on the political spectrum, almost all of those who govern us hold to some version of the melioristic liberalism that is the west’s default creed, which teaches that human civilisation is advancing – however falteringly – to a point at which the worst forms of human destructiveness can be left behind. According to this view, evil, if any such thing exists, is not an inbuilt human flaw, but a product of defective social institutions, which can over time be permanently improved.


Paradoxically, this belief in the evanescence of evil is what underlies the hysterical invocation of evil that has lately become so prominent. There are many bad and lamentable forces in the world today, but it is those that undermine the belief in human improvement that are demonised as “evil”. So what disturbs the west about Vladimir Putin, for example, is not so much the persecution of gay people over which he has presided, or the threat posed to Russia’s neighbours by his attempt to reassert its imperial power. It is the fact that he has no place in the liberal scheme of continuing human advance. As a result, the Russian leader can only be evil. When George W Bush looked into Putin’s eyes at a Moscow summit in May 2002, he reported, “I was able to get a sense of his soul”. When Joe Biden visited the Kremlin in 2011, he had a very different impression, telling Putin: “Mr Prime Minister, I’m looking into your eyes, and I don’t think you have a soul.” According to Biden, Putin smiled and replied, “We understand each other.” The religious language is telling: nine years earlier, Putin had been a pragmatic leader with whom the west could work; now he was a soulless devil.
It’s in the Middle East, however, that the prevailing liberal worldview has proved most consistently misguided. At bottom, it may be western leaders’ inability to think outside this melioristic creed that accounts for their failure to learn from experience. After more than a decade of intensive bombing, backed up by massive ground force, the Taliban continue to control much of Afghanistan and appear to be regaining ground as the American-led mission is run down. Libya – through which a beaming David Cameron processed in triumph only three years ago, after the use of western air power to help topple Gaddafi – is now an anarchic hell-hole that no western leader could safely visit. One might think such experiences would be enough to deter governments from further exercises in regime change. But our leaders cannot admit the narrow limits of their power. They cannot accept that by removing one kind of evil they may succeed only in bringing about another – anarchy instead of tyranny, Islamist popular theocracy instead of secular dictatorship. They need a narrative of continuing advance if they are to preserve their sense of being able to act meaningfully in the world, so they are driven again and again to re-enact their past failures.
Many view these western interventions as no more than exercises in geopolitics. But a type of moral infantilism is no less important in explaining the persisting folly of western governments. Though it is clear that Isis cannot be permanently weakened as long as the war against Assad continues, this fact is ignored – and not only because a western-brokered peace deal that left Assad in power would be opposed by the Gulf states that have sided with jihadist forces in Syria. More fundamentally, any such deal would mean giving legitimacy to a regime that western governments have condemned as more evil than any conceivable alternative. In Syria, the actual alternatives are the survival in some form of Assad’s secular despotism, a radical Islamist regime or continuing war and anarchy. In the liberal political culture that prevails in the west, a public choice among these options is impossible.
There are some who think the very idea of evil is an obsolete relic of religion. For most secular thinkers, what has been defined as evil in the past is the expression of social ills that can in principle be remedied. But these same thinkers very often invoke evil forces to account for humankind’s failure to advance. The secularisation of the modern moral vocabulary that many believed was under way has not occurred: public discourse about good and evil continues to be rooted in religion. Yet the idea of evil that is invoked is not one that features in the central religious traditions of the west. The belief that evil can be finally overcome has more in common with the dualistic heresies of ancient and medieval times than it does with any western religious orthodoxy.

* * *

A radically dualistic view of the world, in which good and evil are separate forces that have coexisted since the beginning of time, was held by the ancient Zoroastrians and Manicheans. These religions did not face the problem with which Christian apologists have struggled so painfully and for so long – how to reconcile the existence of an all-powerful and wholly good God with the fact of evil in the world. The worldview of George W Bush and Tony Blair is commonly described as Manichean, but this is unfair to the ancient religion. Mani, the third-century prophet who founded the faith, appears to have believed the outcome of the struggle was uncertain, whereas for Bush and Blair there could never be any doubt as to the ultimate triumph of good. In refusing to accept the permanency of evil they are no different from most western leaders.

Saint Augustine by Caravaggio. Photograph: The Guardian

The west owes its ideas of evil to Christianity, though whether these ideas would be recognised by Jesus – the dissident Jewish prophet from whose life and sayings St Paul conjured the Christian religion – is an open question. The personification of evil as a demonic presence is not a feature of biblical Judaism, where the figure of Satan appears chiefly as a messenger or accuser sent by God to challenge wrongdoers. Despite the claims of believers and advances in scholarship, not enough is known to pronounce with any confidence on what Jesus may himself have believed. What is clear is that Christianity has harboured a number of quite different understandings of evil.
A convert from Manicheism, St Augustine established a powerful orthodoxy in the fourth century when he tried to distance Christianity from dualism and maintained that evil was not an independent force coeval with good but came into the world when human beings misused the gift of free will. Reflecting Augustine’s own conflicts, the idea of original sin that he developed would play a part in the unhealthy preoccupation with sexuality that appears throughout most of Christianity’s history. Yet in placing the source of evil within human beings, Augustine’s account is more humane than myths in which evil is a sinister force that acts to subvert human goodness. Those who believe that evil can be eradicated tend to identify themselves with the good and attack anyone they believe stands in the way of its triumph.
Augustine had an immense influence, but dualistic views in which evil exists as an independent force have erupted repeatedly as heretical traditions within Christianity. The Cathar movement that developed in parts of Europe in the 13th century revived a Manichean cosmogony in which the world is the work not of a good God but instead of a malevolent angel or demiurge. A rival heresy was promoted by the fourth-century theologian Pelagius, an opponent of Augustine who denied original sin while strongly affirming free will, and believed that human beings could be good without divine intervention. More than any of the ancient Greek philosophers, Pelagius put an idea of human autonomy at the centre of his thinking. Though he is now almost forgotten, this heretical Christian theologian has a good claim to be seen as the true father of modern liberal humanism.
In its official forms, secular liberalism rejects the idea of evil. Many liberals would like to see the idea of evil replaced by a discourse of harm: we should talk instead about how people do damage to each other and themselves. But this view poses a problem of evil remarkably similar to that which has troubled Christian believers. If every human being is born a liberal – as these latter-day disciples of Pelagius appear to believe – why have so many, seemingly of their own free will, given their lives to regimes and movements that are essentially repressive, cruel and violent? Why do human beings knowingly harm others and themselves? Unable to account for these facts, liberals have resorted to a language of dark and evil forces much like that of dualistic religions.


The efforts of believers to explain why God permits abominable suffering and injustice have produced nothing that is convincing; but at least believers have admitted that the ways of the Deity are mysterious. Even though he ended up accepting the divine will, the questions that Job put to God were never answered. Despite all his efforts to find a solution, Augustine confessed that human reason was not equal to the task. In contrast, when secular liberals try to account for evil in rational terms, the result is a more primitive version of Manichean myth. When humankind proves resistant to improvement, it is because forces of darkness – wicked priests, demagogic politicians, predatory corporations and the like – are working to thwart the universal struggle for freedom and enlightenment. There is a lesson here. Sooner or later anyone who believes in innate human goodness is bound to reinvent the idea of evil in a cruder form. Aiming to exorcise evil from the modern mind, secular liberals have ended up constructing another version of demonology, in which anything that stands out against what is believed to be the rational course of human development is anathematised.
The view that evil is essentially banal, presented by Hannah Arendt in her book Eichmann in Jerusalem (1963), is another version of the modern evasion of evil. Arendt suggested that human beings commit atrocities from a kind of stupidity, falling into a condition of thoughtlessness in which they collude in practices that inflict atrocious suffering on other human beings. It was some such moral inertia, Arendt maintained, that enabled Eichmann to take a leading part in perpetrating the Holocaust. Arendt’s theory of the banality of evil tends to support the defence of his actions that Eichmann presented at his trial: he had no choice in doing what he did. She represented Eichmann as a colourless bureaucrat performing a well-defined function in an impersonal bureaucratic machine; but the Nazi state was in fact largely chaotic, with different institutions, departments of government and individuals competing for Hitler’s favour. Careful historical research of the kind that David Cesarani undertook in his book Eichmann: His Life and Crimes (2004) suggests that Eichmann was not a passive tool of the state, but chose to serve it. When he organised the deportation and mass murder of Jews, he wasn’t simply furthering his career in the Nazi hierarchy. What he did reflected his deep-seated antisemitism. Eichmann took part in the Holocaust because he wanted to do so. In this he was no different from many others, though his crimes were larger in scale.
No doubt something like the type of evil that Arendt identified is real enough. Large parts of the population in Germany went along with Nazi policies of racial persecution and genocide from motives that included social conformity and obedience to authority. The number of doctors, teachers and lawyers who refused to implement Nazi policies was vanishingly small. But again, this wasn’t only passive obedience. Until it became clear that Hitler’s war might be lost, Nazism was extremely popular. As the great American journalist William Shirer reported in his eyewitness account of the rise of Hitler, The Nightmare Years:
“Most Germans, so far as I could see, did not seem to mind that their personal freedom had been taken away, that so much of their splendid culture was being destroyed and replaced with a mindless barbarism, or that their life and work were being regimented to a degree never before experienced even by a people accustomed for generations to a great deal of regimentation … On the whole, people did not seem to feel that they were being cowed and held down by an unscrupulous tyranny. On the contrary, they appeared to support it with genuine enthusiasm.”
When large populations of human beings collude with repressive regimes it need not be from thoughtlessness or inertia. Liberal meliorists like to think that human life contains many things that are bad, some of which may never be entirely eliminated; but there is nothing that is intrinsically destructive or malevolent in human beings themselves – nothing, in other words, that corresponds to a traditional idea of evil. But another view is possible, and one that need make no call on theology.
What has been described as evil in the past can be understood as a natural tendency to animosity and destruction, co-existing in human beings alongside tendencies to sympathy and cooperation. This was the view put forward by Sigmund Freud in a celebrated exchange of letters with Albert Einstein in 1931-32. Einstein had asked: “Is it possible to control man’s mental evolution so as to make him proof against the psychosis of hate and destructiveness?” Freud replied that “there is no likelihood of our being able to suppress humanity’s aggressive tendencies”.
Freud suggested that human beings were ruled by impulses or instincts, eros and thanatos, impelling them towards life and creation or destruction and death. He cautioned against thinking that these forces embodied good and evil in any simple way. Whether they worked together or in opposition, both were necessary. Even so, Freud was clear that a major threat to anything that might be called a good life came from within human beings. The fragility of civilisation reflected the divided nature of the human animal itself.

The crowd at the 1936 Olympic Games in Berlin raise their hands in the Nazi salute in tribute to Hitler's arrival at the stadium.
The crowd at the 1936 Olympic Games in Berlin salute Hitler’s arrival at the stadium. Photograph: Bettmann/Corbis

One need not subscribe to Freud’s theory (which in the same letter he describes as a type of mythology) to think he was on to something here. Rather than psychoanalysis, it may be some version of evolutionary psychology that can best illuminate the human proclivity to hatred and destruction. The point is that destructive behaviour of this kind flows from inherent human flaws. Crucially, these defects are not only or even mainly intellectual. No advance in human knowledge can stop humans attacking and persecuting others. Poisonous ideologies like Nazi “scientific racism” justify such behaviour. But these ideologies are not just erroneous theories that can be discarded when their falsehood has been demonstrated. Ideas of similar kinds recur whenever societies are threatened by severe and continuing hardship. At present, antisemitism and ethnic nationalism, along with hatred of gay people, immigrants and other minorities, are re-emerging across much of the continent. Toxic ideologies express and reinforce responses to social conflict that are generically human.
Mass support for despotic regimes has many sources. Without the economic upheavals that ruined much of the German middle class, the Nazis might well have remained a fringe movement. Undoubtedly there were many who looked to the Nazi regime for protection against economic insecurity. But it is a mistake to suppose that when people turn to tyrants, they do so despite the crimes that tyrants commit. Large numbers have admired tyrannical regimes and actively endorsed their crimes. If Nazism had not existed, something like it would surely have been invented in the chaos of interwar Europe.

* * *

When the west aligned itself with the USSR in the second world war, it was choosing the lesser of two evils – both of them evils of a radical kind. This was the view of Winston Churchill, who famously said he would “sup with the devil” if doing so would help destroy “that evil man” Hitler. Churchill’s candid recognition of the nature of the choice he made is testimony to how shallow the discourse of evil has since become. Today, no western politician could admit to making such a decision.
In his profound study On Compromise and Rotten Compromises, the Israeli philosopher Avishai Margalit distinguishes between regimes that rest on cruelty and humiliation, as many have done throughout history, and those that go further by excluding some human beings altogether from moral concern. Describing the latter as radically evil, he argues that Nazi Germany falls into this category. The distinction Margalit draws is not a quantitative one based on the numbers of victims, but categorical: Nazi racism created an immutable hierarchy in which there could be no common moral bonds. Margalit goes on to argue – surely rightly – that in allying itself with the Soviet Union in the struggle against Nazism, the west was making a necessary and justified moral compromise. But this was not because the Nazis were the greater evil, he suggests. For all its oppression, the Soviet Union offered a vision of the future that included all of humankind. Viewing most of the species as less than human, Nazism rejected morality itself.
There should be no doubt that the Nazis are in a class by themselves. No other regime has launched a project of systematic extermination that is comparable. From the beginning of the Soviet system there were some camps from which it was difficult to emerge alive. Yet at no time was there anything in the Soviet gulag akin to the Nazi death camps that operated at Sobibor and Treblinka. Contrary to some in post-communist countries who try to deny the fact, the Holocaust remains a unique crime. Judged by Margalit’s formula, however, the Soviet Union was also implicated in radical evil. The Soviet state implemented a policy of exclusion from society of “former persons” – a group that included those who lived off unearned income, clergy of all religions and tsarist functionaries – who were denied civic rights, prohibited from seeking public office and restricted in their access to the rationing system. Many died of starvation or were consigned to camps where they perished from overwork, undernourishment and brutal treatment.
Considered as a broad moral category, what Margalit defines as radical evil is not uncommon. The colonial genocide of the Herero people in German South-West Africa (now Namibia) at the start of the 20th century was implemented against a background of ersatz-scientific racist ideology that denied the humanity of Africans. (The genocide included the use of Hereros as subjects of medical experiments, conducted by doctors some of whom returned to Germany to teach physicians later implicated in experiments on prisoners in Nazi camps.) The institution of slavery in antebellum America and South African apartheid rested on a similar denial. A refusal of moral standing to some of those they rule is a feature of societies of widely different varieties in many times and places. In one form or another, denying the shared humanity of others seems to be a universal human trait.

An Islamic State fighter in Raqqa, Syria.
An Islamic State fighter in Raqqa, Syria. Photograph: Reuters

Describing Isis’s behaviour as “psychopathic”, as David Cameron has done, represents the group as being more humanly aberrant than the record allows. Aside from the fact that it publicises them on the internet, Isis’s atrocities are not greatly different from those that have been committed in many other situations of acute conflict. To cite only a few of the more recent examples, murder of hostages, mass killings and systematic rape have been used as methods of warfare in the former Yugoslavia, Chechnya, Rwanda, and the Congo.
A campaign of mass murder is never simply an expression of psychopathic aggression. In the case of Isis, the ideology of Wahhabism has played an important role. Ever since the 1920s, the rulers of the Saudi kingdom have promoted this 18th-century brand of highly repressive and exclusionary Sunni Islam as part of the project of legitimating the Saudi state. More recently, Saudi sponsorship of Wahhabi ideology has been a response to the threat posed by the rise of Shia Iran. If the ungoverned space in which Isis operates has been created by the west’s exercises in regime change, the group’s advances are also a byproduct of the struggle for hegemony between Iran and the Saudis. In such conditions of intense geopolitical rivalry there can be no effective government in Iraq, no end to the Syrian civil war and no meaningful regional coalition against the self-styled caliphate.
But the rise of Isis is also part of a war of religion. Nothing is more commonplace than the assertion that religion is a tool of power, which ruling elites use to control the people. No doubt that’s often true. But a contrary view is also true: politics may be a continuation of religion by other means. In Europe religion was a primary force in politics for many centuries. When religion seemed to be in retreat, it renewed itself in political creeds – Jacobinism, nationalism and varieties of totalitarianism – that were partly religious in nature. Something similar is happening in the Middle East. Fuelled by movements that combine radical fundamentalism with elements borrowed from secular ideologies such as Leninism and fascism, conflict between Shia and Sunni communities looks set to continue for generations to come. Even if Isis is defeated, it will not be the last movement of its kind. Along with war, religion is not declining, but continuously mutating into hybrid forms.

Iraqi Yazidis
Iraqi Yazidis, who fled an Islamic State attack on Sinjar, gather to collect bottles of water at the Bajid Kandala camp in Kurdistan’s western Dohuk province. Photograph: Ahmad al-Rubaye/AFP/Getty Images

Western intervention in the Middle East has been guided by a view of the world that itself has some of the functions of religion. There is no factual basis for thinking that something like the democratic nation-state provides a model on which the region could be remade. States of this kind emerged in modern Europe, after much bloodshed, but their future is far from assured and they are not the goal or end-point of modern political development. From an empirical viewpoint, any endpoint can only be an act of faith. All that can be observed is a succession of political experiments whose outcomes are highly contingent. Launched in circumstances in which states constructed under the aegis of western colonialism have broken down under the impact of more recent western intervention, the gruesome tyranny established by Isis will go down in history as one of these experiments.
The weakness of faith-based liberalism is that it contains nothing that helps in the choices that must be made between different kinds and degrees of evil. Given the west’s role in bringing about the anarchy in which the Yazidis, the Kurds and other communities face a deadly threat, non-intervention is a morally compromised option. If sufficient resources are available – something that cannot be taken for granted – military action may be justified. But it is hard to see how there can be lasting peace in territories where there is no functioning state. Our leaders have helped create a situation that their view of the world claims cannot exist: an intractable conflict in which there are no good outcomes. 

Tuesday, 21 October 2014

‘Cleansing the stock’ and other ways governments talk about human beings


Those in power don’t speak of ‘people’ or ‘killing’ – it helps them do their job. And we are picking up their dehumanising euphemisms
Israeli attack on Gaza school
An Israeli strike on a UN school in the northern Gaza Strip in which two children were killed and a dozen other people were injured. 'Mowing the lawn'? Photograph: Mohammed Abed/AFP/Getty Images

To blot people out of existence first you must blot them from your mind. Then you can persuade yourself that what you are doing is moral and necessary. Today this isn’t difficult. Those who act without compassion can draw upon a system of thought and language whose purpose is to shield them from – and blind us to – the consequences.
The contention by Lord Freud, a minister in the UK’s Department for Work and Pensions, that disabled people are “not worth the full wage” isn’t the worst thing he’s alleged to have said. I say “alleged” because what my ears tell me is contested by Hansard, the official parliamentary record. During a debate in the House of Lords, he appeared to describe the changing number of disabled people likely to receive the employment and support allowance as a “bulge of, effectively, stock”. After a furious response by the people he was talking about, this was transcribed by Hansard as “stopped”, rendering the sentence meaningless. I’ve listened to the word several times on the parliamentary video. Like others, I struggle to hear it as anything but “stock”.
If we’re right, he is not the only person at his department who uses this term. Its website describes disabled people entering the government’s work programme for between three and six months as “3/6Mth stock”. Perhaps this makes sense when you remember that they are a source of profit for the companies running the programme. The department’s delivery plan recommends using “credit reference agency data to cleanse the stock of fraud and error”. To cleanse the stock: remember that.
Human beings – by which I mean those anthropoid creatures who do not necessarily receive social security – often live in families. But benefit claimants live in “benefit units”, defined by the government as “an adult plus their spouse (if applicable) plus any dependent children living in the household”. On the bright side, if you die while on a government work programme, you’ll be officially declared a “completer”. Which must be a relief.
A dehumanising system requires a dehumanising language. So familiar and pervasive has this language become that it has soaked almost unnoticed into our lives. Those who do have jobs are also described by the function they deliver to capital. These days they are widely known as “human resources”.
The living world is discussed in similar terms. Nature is “natural capital”. Ecological processes are ecosystem services, because their only purpose is to serve us. Hills, forests and rivers are described in government reports as “green infrastructure”. Wildlife and habitats are “asset classes” in an “ecosystems market”. Fish populations are invariably described as “stocks”, as if they exist only as movable assets from which wealth can be extracted – like disabled recipients of social security. The linguistic downgrading of human life and the natural world fuses in a word a Norwegian health trust used to characterise the patients on its waiting list: biomass.
Those who kill for a living employ similar terms. Israeli military commanders described the massacre of 2,100 Palestinians, most of whom were civilians (including 500 children), in Gaza this summer as “mowing the lawn”. It’s not original. Seeking to justify Barack Obama’s drone war in Pakistan (which has so far killed 2,300 people, only 4% of whom have since been named as members of al-Qaida), Obama’s counter-terrorism adviser Bruce Riedel explained that “you’ve got to mow the lawn all the time. The minute you stop mowing, the grass is going to grow back.” The director of the CIA, John Brennan, claimed that with “surgical precision” his drones “eliminate the cancerous tumour called an al-Qaida terrorist while limiting damage to the tissue around it”. Those who operate the drones describe their victims as bug splats.
During its attack on the Iraqi city of Falluja in November 2004, the US army used white phosphorus to kill or maim people taking shelter in houses or trenches. White phosphorus is fat-soluble. Even small crumbs of it bore through living tissue on contact. It destroys mucous membranes, blinding people and ripping up their lungs. Its use as a weapon is banned by the Chemical Weapons Convention, as the US army knows: one of its battle books observes that “it is against the law of land warfare to employ WP against personnel targets” (personnel targets, by the way, are human beings). But never mind all that. The army has developed a technique it calls Shake ‘n Bake: flush people out with phosphorus, then kill them with high explosives. Shake ‘n Bake is a product made by Kraft Foods for coating meat with breadcrumbs before cooking it.
Terms such as these are designed to replace mental images of death and mutilation with images of something else. Others, such as “collateral damage” (dead or wounded civilians), “kinetic activity” (shooting and bombing), “compounds” (homes) and “extraordinary rendition” (kidnapping and torture by states), are intended to prevent the formation of any mental pictures at all. If you can’t see what is being discussed, you will struggle to grasp the implications. The clearest example is “neutralising”, which neutralises the act of killing it describes.
I doubt many people could kill and wound if their language accurately represented what they were doing. It is notable that those who are most enthusiastic about waging war are the least able to describe what they are talking about without resorting to metaphor and euphemism. Few people have nightmares about squashing insects or mowing the lawn.
The media, instead of challenging public figures to say kill when they mean kill, and people when they mean people, repeats these evasions. Uncontested, their sanitised, trivialised, belittling terms seep into our own mouths, until we also talk about “operatives” or “human capital” or “illegal aliens” without stopping to consider how those words resonate and what they permit us not to see. I wouldn’t be surprised if there are dehumanising metaphors in this article that I have failed to spot.
If we wish to reclaim public life from the small number of people who have captured it, we must also reclaim the language in which it is expressed. To know what we are talking about: this, in more than one sense, is the task of those who want a better world.

A mutiny by World Bank staff when prescribed a dose of its own restructuring medicine

The recommendations of the World Bank/IMF are presented to us, the people of the South, as scientific, objective, necessary, fair, and in the best interests of the countries where they are to be implemented. This is why the episode of rebellion by the Bank’s staff against its own restructuring is so significant


PETER RONALD DESOUZA in The Hindu

In the Financial Times of October 8, the columnist Shawn Donnan reported that the World Bank was facing an internal “mutiny”. Yes, the word mutiny was used. The professional staff were apparently angry about several issues; the discontent was deep, and the rebellion had been brewing for many days. The key issue was the restructuring exercise being undertaken by the President, Jim Yong Kim, to save the sum of $400 million, through both the elimination of benefits to staff on mission and possible lay-offs. The restructuring exercise, staff felt, was deeply flawed both procedurally and substantively. The columnist reported some members saying that this “thing [restructuring] is affecting everything.” “We can’t do business. We don’t have the budget. It’s a mess, ...” Another staff member complained that “nickel and diming” on travel budgets was causing travelling staff to have to pay for their own breakfasts. “It’s really small beer,” she said. “Has anyone ever thought about the impact of these changes on staff morale?”
Resistance against restructuring

To assuage their feelings, before the semi-annual meeting of the Bank and International Monetary Fund (IMF) with Finance Ministers and Central Bankers of member countries, President Jim Yong Kim had to hurriedly convene a “town hall” meeting with the staff to discuss their concerns. The issues that were fuelling their anger were: (i) the cost-cutting exercise, which meant that items of expenditure they had been accustomed to, such as a paid-for breakfast, were being withdrawn; (ii) the secrecy and opacity of the whole exercise, i.e., the appointment of consultants, payment of bonuses to the senior management, hiring of new senior managers, etc; (iii) the award of a “scarce skills premium” of $94,000 as a bonus, over and above his salary of $379,000, to the Chief Financial Officer who was carrying out the exercise; and (iv) the appointment and payment of the huge sum of $12.5 million to external consultants such as McKinsey, Deloitte, and Booz Allen for advice on how to restructure a development Bank, as reported in the Economic Times of October 15, 2014.
For those of us from the Global South, who not only receive but also have to follow the advice of the Bretton Woods twins on how to “restructure” our economies and change our policies, this episode has four very interesting lessons. The recommendations of the World Bank/IMF are presented to us, the people of the South, as scientific, objective, necessary, fair, and in the best interests of the countries where they are to be implemented. The World Bank is the repository of the most authoritative knowledge on development. It annually publishes the flagship World Development Report (WDR), the first of which, in 1978, was titled “Prospects for Growth and Alleviation of Poverty.” Every year since 1978, it has flagged important themes for development, with the 2013 WDR being on “Jobs” and the restructuring required to align them with the new economy. The 2015 WDR is on “Mind and Culture”, and the World Bank website reports its central argument as being “that policy design that takes into account psychological and cultural factors will achieve development goals faster.” This is scholarly knowledge and is used in many university classrooms as part of required reading. This is what positions the World Bank as a premier knowledge institution on development. Then why is the rebellion episode so significant? There are four aspects of it that merit discussion.
The first is the resistance against the restructuring medicine. This is the same medicine used by the World Bank against the rest of the world. The restructuring exercise, which has eliminated jobs within the public sector – whether in government or in the support services required by any public institution, such as subordinate administrative staff – has produced an underclass of workers who, although they are still needed, have been deprived of the welfare and security benefits that the permanent staff enjoy, benefits that had been won by a long history of working-class struggles. So, when security guards, drivers, mess workers, sweepers, the class IV workers, now have to live lives filled with anxiety about illness, unemployment, etc., because they work for labour contractors who provide no such benefits, the anger of the World Bank professional staff who, because of the restructuring, have to pay for their “breakfast” is a little difficult to understand. The restructuring of economies in the global south has produced an underclass whose livelihood insecurity has increased exponentially. The mutiny at the World Bank appears somewhat paradoxical. Not only is the exercise personally dishonest, given the rebellion when the policy is applied at home, but it is also intellectually dishonest when read against the 2015 WDR. Is this the modern performance of the “mutiny on the bounty”?
Control by the few

The second aspect is the process adopted in the internal restructuring. The Reuters and FT reports tell us that the common complaint of the staff was that many aspects of the restructuring exercise, initiated by the president, were non-transparent. There was an opacity to the process. For example, questions such as the following needed to be asked. What was the method followed to give the CFO a “scarce skills premium” of $94,000 over and above his salary? Was the work done outside the normal duty of the CFO? How did the president decide who qualifies for a “scarce skills premium”, and how many persons have qualified for this bonus? These were questions asked at the town hall meeting. If the “scarce skills premium” was based on sound management principles, why did the CFO agree to forego the bonus after the uproar? These are good questions, and they lead one to wonder whether countries have the same option of protesting. Did Greece and Portugal and Ireland and Argentina have the protest option? The interesting lesson from this episode is that restructuring produces pain and distress for the many while it rewards the few, especially those tasked with implementing it. These few have access to political and intellectual power. They control the methods of public justification, producing a discourse in which the restructuring is necessary and will benefit the whole. In many countries of the global south, the few get rewarded while the many pay the price of the restructuring.
Neo-liberal triumph
The third aspect is the use of consultants. This is the most disappointing and alarming aspect of the episode. For an institution such as the World Bank, whose main rationale is that it is a knowledge institution about how to promote development, to now implicitly declare that it does not have the knowledge required to restructure itself is a severe admission of the weakness of its knowledge base and skill sets. How does it then prepare a road map to restructure economies, when restructuring an institution is infinitely easier than restructuring the economy of a country? Restructuring an institution can draw on the interdisciplinary knowledge of the WDR 2015, such as best practice, graduated approaches, evidence-based policies, results-based management, measuring and monitoring, etc. (all the keywords of the World Bank itself), to achieve the result of a better, leaner, more efficient, and fair institution. But the decision to hire outside consultants, paying a whopping fee of $12.5 million, shows that the World Bank either does not believe in its own capability or, worse, does not have this capability. What is alarming is the message that development thinking will, from now on, be done and propagated by the big global consultancies. Is the World Bank announcing that henceforth even its development knowledge will be outsourced? As reported in the Economic Times, one of the protesters said, “What do they know about development and the complexities of what we do?” Indeed, what do they know? But if we look at the economic policy institutions of many countries, we will see a seamless movement of personnel between global consultancies and central banks. Our own development thinking has been outsourced to neo-liberal knowledge institutions, such as global consultancies, ratings agencies and investment banks. We can see this takeover of knowledge production in the area of economic policy, the triumph of the neo-liberal frame, even in India. Look at the key players of our economic policymaking. The World Bank has now given its stamp of approval to this trend. The recolonisation of the Indian mind and of the policy discourse is nearly complete.
The fourth aspect of this troubling episode is the use of words to legitimise the action. In the last few months of the Indian public debate, we have come to see the power of words and the social power the purveyors of these words acquire. The word makes the world. Tagore argued for this philosophical position: that language constructs reality, that we see the beauty of the world through our language, and that outside language there is no beauty. Controlling the word, the Bank decides to reward its CFO with a large bonus while it is reducing the financial package of its other employees; it deploys the justification for this decision as a “scarce skills premium.” The CFO gets the additional money because he has scarce skills. The investment bank fraternity has to be rewarded with huge bonuses because they have scarce skills. Wall Street is built on this justification. This is capitalism’s masterstroke of controlling perception, controlling the public discourse by controlling words. We accept the differentials because we are made to believe it is a “scarce skills premium” to be paid for our own good. Sometimes a typographical error brings out the truth much better. By mistake I typed it as “scare skills premium.” It is.

Cricket: complex, unknowable cricket

 Jon Hotten in Cricinfo


Old Trafford 2005: one in a scarcely imaginable run of four matches carved from gold © Getty Images
This is Martin Amis, writing about chess: "Nowhere in sport, perhaps nowhere in human activity, is the gap between the trier and the expert so astronomical."
Is he right? In the field of human activity, at least, I can think of another arena in which the knowledge gap between amateur and pro is vast - that of theoretical physics. The latest man to try and bridge it is the particle physicist and former keyboard player with D:Ream (best-known hit - "Things Can Only Get Better"), Dr Brian Cox. 
He has a new TV series called The Human Universe, and he kicked it off with an interesting analogy for beginning to understand exactly what theoretical physicists are on about.
"Cricket" he began, is "unfathomable" to those who don't understand it, yet "bewitching" to those that do. "And all of [cricket's] complexity emerged from a fixed set of rules."
He held up the single page of a scorebook on which he'd written down the formula for the Standard Model of Particle Physics and the Theory of Relativity, and then flicked through a copy of the Laws of Cricket. "By this notation at least," he said, "cricket is more complex than the universe."
Maybe that's because Einstein never bothered with an equation to sum the Laws up, yet it's a clever way of illustrating that an understanding of the rules of anything doesn't necessarily lead to an understanding of the subject that those rules govern. To return to chess and Martin Amis, here he is on a match between Garry Kasparov and Nigel Short: "They are trying to hold on to, to brighten and to bring to blossom, a coherent vision which the arrangement of the pieces may or may not contain."
Cricket and chess are superficially simple. Sit someone in front of a chess board and you can explain the basics of the pieces and the moves in a few minutes. Show them a cricket bat and ball and a set of stumps and the idea and aims of the game become apparent. Where the genius lies - and where Cox's analogy holds quite nicely - is in the infinite small variations that these simple structures contain. The complexities mount when the knowledge and ability of the players grows: as Amis said, they are trying to bring about a vision from within the rules that isn't actually there until it happens.
This is obvious in cricket, where the game is built around endless repetitions of the same actions - the ball is bowled, the ball is hit (or not) - which, under different conditions and with various personnel, become increasingly complex as they happen again and again until stories emerge from within them.
The other day I watched a rerun of the 2005 Ashes. Once again, it gripped. Of the 1778 Test matches played before the ones in that series, along came, at random, four in a row that finished in teeth-grinding tension. Taking a wider view, they were barely different from all of the other games of cricket in history: it was simply the appearance of these very tiny complexities, one after the other, that made them what they were.
Sometimes we can feel these patterns emerging, at others we're simply too close to make them out. The brilliance of the design of the game provides a framework that stretches off into the future. We, as players and spectators, are finite, but cricket itself is universal.
Martin Amis was fascinated by chess because he felt it was "unmasterable", and it was, by any individual. "It is a game that's beyond the scope of the human mind," he wrote. And yet the human mind has devised computer programs that are able to analyse any game and beat any player. Theoretically at least, machines have reached "the end" of chess.
Despite the obsession with stats, that fate can't really await cricket. It may not be quite as complex and unknowable as deep space, but it needs a bigger book of Laws. What an invention it is.

------Further Thought by Samir Chopra in Cricinfo

The renewability of cricket


 
The 45-year-old professor, the older version of the once-15-year-old schoolboy, sees a very different game of cricket from his younger counterpart  © Getty Images
My Cordon colleague Jon Hotten writes, in his recent post, "Cricket: complex, unknowable cricket":
The brilliance of the design of the game provides a framework that stretches off into the future. We, as players and spectators, are finite, but cricket itself is universal.
I think Jon must have meant something other than "universal", because otherwise the contrast made here doesn't work. I suspect he wanted to say something like "cricket is infinitely extensible" or "renewable".
Be that as it may, I want to suggest here that "we, as players and spectators" have a great deal to do with the perceived complexity of cricket. Quite simply, this is because we change over time; we do not bring, to our encounters with the game in the middle, a stable, enduring entity, but one subject constantly to a variety of physical, emotional, psychological and, of course, political variations. This object, perennially in flux, brings a variety of lenses to its viewings of cricket; and we do not merely perceive, we interpret and contextualise, we filter and sift. (As John Dewey, the great American pragmatist philosopher, noted, "Thought is intrinsic to experience.") These interpretations and contextualisations change over time.
The 45-year-old man, the professor, the older version of the once-15-year-old schoolboy, sees a very different game of cricket from his younger counterpart. And as he continues to "grow" and change, he will continue to "see" a different game played out in front of him. He will renew cricket, make it extensible and renewable. The seemingly infinite variations possible in a 30-hour, 450-over encounter between 22 other humans, each playing cricket ever so differently from those that have preceded him, will provide ample fodder for this extensibility and renewability.
A game of cricket exists within a larger symbolic order of meaning. When a young spectator sees men in white pick up bat and ball, he understands their activities within a perceptual framework in which active fantasy and wishful longing play an active part. As he grows, matures, acquires a political and aesthetic sense, and hopefully expands his intellectual, emotional and romantic horizons he will revise this, and come to understand the game differently. He may go on to watch umpteen variations on the fourth-innings chase theme, and each one will be uniquely located within this under-construction framework.
Consider, by way of analogy, the reading of the classics, great works of literature, which continue to be read for years and years after their writing. Read Anna Karenina as a youngster, perhaps for a high-school class in literature, well before you have ever dated, or been in a serious romantic relationship; you will experience the heartbreak - and tragedy - of the adults at the centre of its story very differently when, 15 years later, you have acquired a few scars (and perhaps a child) of your own. Of course, Tolstoy's masterwork is a classic precisely because in the face of such changing readings it continues to speak to us, across time and space and idiosyncratic translations. This is why Susan Sontag - like others before her - suggested the classics were worth reading several times over; each reading was likely to be a new one, a co-operative, joint construction of meaning by the reader and the writer.
Cricket's games do not exist in isolation. They are played within larger political and economic realities, ones that affect its spectators and its players; these too change our understanding of the game. The benefactions of empire in the past are now the property of its subjects; witness the turmoil this has caused in our recent relationship to the game. The conflict in the middle can come to be understood very differently in these circumstances. Where the youngster might have seen heroes in the past, he now may see villains.
These remarks above suggest another way in which cricket could continue to renew itself over time: it could be embedded within more cultures and societies; it could be written about, and understood by, a broader cross section of humanity than it has been thus far; the language of description for it could expand beyond its current repertoire. (I have been fortunate enough to talk about cricket in English, Hindi/Urdu, and Punjabi; trust me, the game is viewed very differently through these alternative linguistic lenses.) We might find, too, that the conversations that surround it in new climes and locales enrich our previous understandings of cricket.
Imagine a rich cricket literature in not just English but in other languages too. Who knows what new classics, new understandings of cricket, we might find there?

Sunday, 19 October 2014

Why did Britain’s political class buy into the Tories’ economic fairytale?

Falling wages, savage cuts and sham employment expose the recovery as bogus. Without a new vision we’re heading for social conflict

Demonstrators protesting about austerity in London
Demonstrators among tens of thousands who protested about austerity in London on Saturday. Photograph: Mary Turner/Getty Images

The UK economy has been in difficulty since the 2008 financial crisis. Tough spending decisions have been needed to put it on the path to recovery because of the huge budget deficit left behind by the last irresponsible Labour government, which showered its supporters with social benefit spending. Thanks to the coalition holding its nerve amid the clamour against cuts, the economy has finally recovered. True, wages have yet to make up the lost ground, but it is at least a “job-rich” recovery, allowing people to stand on their own feet rather than relying on state handouts.
That is the Conservative party’s narrative on the UK economy, and a large proportion of the British voting public has bought into it. They say they trust the Conservatives more than Labour by a big margin when it comes to economic management. And it’s not just the voting public. Even the Labour party has come to subscribe to this narrative and tried to match, if not outdo, the Conservatives in pledging continued austerity. The trouble is that when you hold it up to the light this narrative is so full of holes it looks like a piece of Swiss cheese.
First, let’s look at the origins of the deficit. Contrary to the Conservative portrayal of it as a spendthrift party, Labour kept the budget in balance averaged over its first six years in office between 1997 and 2002. Between 2003 and 2007 the deficit rose, but at 3.2% of GDP a year it was manageable.
More importantly, this rise in the deficit between 2003 and 2007 was not due to increased welfare spending. According to data from the Office for National Statistics, social benefit spending as a proportion of GDP was more or less constant at about 9.5% a year during this period. The dramatic climb in the budget deficit from there to an average of 10.7% in 2009-2010 was mostly a consequence of the recession caused by the financial crisis.
First, the recession reduced government revenue by the equivalent of 2.4% of GDP – from 42.1% to 39.7% – between 2008 and 2009-10. Second, it raised social spending (social benefit plus health spending). Economic downturn automatically increases spending on many social benefits, such as unemployment benefit and income support, but it also increases spending on things like disability benefit and healthcare, as increased unemployment and poverty lead to more physical and mental health problems. In 2009-10, at the height of the recession, UK public social spending rose by the equivalent of 3.2% of GDP compared with its 2008 level (from 21.8% to 24%).
When you add together the recession-triggered fall in tax revenue and rise in social spending, they amount to 5.6% of GDP – almost the same as the rise in the deficit between 2008 and 2009-10 (5.7% of GDP). Even though some of the rise in social spending was due to factors other than the recession, such as an ageing population, it would be safe to say that much of the rise in deficit can be explained by the recession itself, rather than Labour’s economic mismanagement.
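(A quick arithmetic check of the paragraph above, sketched in Python purely for illustration; the numbers are the ones quoted in the preceding paragraphs, not a fresh calculation from the ONS series.)

# Figures as quoted above, in percentage points of GDP
revenue_fall = 2.4           # fall in government revenue, 2008 to 2009-10
social_spending_rise = 3.2   # rise in public social spending over the same period
recession_effect = revenue_fall + social_spending_rise
deficit_rise = 5.7           # actual rise in the deficit, 2008 to 2009-10
print(f"recession-driven swing: {recession_effect:.1f} of a {deficit_rise:.1f} point rise")
# -> recession-driven swing: 5.6 of a 5.7 point rise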
When faced with this, supporters of the Tory narrative would say, “OK, but however it was caused, we had to control the deficit because we can’t live beyond our means and accumulate debt”. This is a pre-modern, quasi-religious view of debt. Whether debt is a bad thing or not depends on what the money is used for. After all, the coalition has made students run up huge debts for their university education on the grounds that their heightened earning power will make them better off even after they pay back their loans.
The same reasoning should be applied to government debt. For example, when private sector demand collapses, as in the 2008 crisis, the government “living beyond its means” in the short run may actually reduce public debt faster in the long run, by speeding up economic recovery and thereby more quickly raising tax revenues and lowering social spending. If the increased government debt is accounted for by spending on projects that raise productivity – infrastructure, R&D, training and early learning programmes for disadvantaged children – the reduction in public debt in the long run will be even larger.
Against this, the advocates of the Conservative narrative may retort that the proof of the pudding is in the eating, and that the recovery is the best proof that the government’s economic strategy has worked. But has the UK economy really fully recovered? We keep hearing that national income is higher than at the pre-crisis peak of the first quarter of 2008. However, in the meantime the population has grown by 3.5 million (from 60.5 million to 64 million), and in per capita terms UK income is still 3.4% less than it was six years ago. And this is even before we talk about the highly uneven nature of the recovery, in which real wages have fallen by 10% while people at the top have increased their shares of wealth.
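(To see how the per capita figure follows, here is a minimal sketch. The assumption that total GDP sits roughly 2% above its 2008 peak is mine, for illustration only; the article quotes only the population figures and the 3.4% per capita shortfall.)

pop_2008 = 60.5e6            # population at the pre-crisis peak, as quoted above
pop_2014 = 64.0e6            # population six years later, as quoted above
gdp_ratio = 1.02             # assumed: total GDP about 2% above its 2008 peak
per_capita_ratio = gdp_ratio / (pop_2014 / pop_2008)
print(f"per capita income change: {(per_capita_ratio - 1) * 100:.1f}%")
# -> roughly -3.6%, of the same order as the 3.4% shortfall cited above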
But can we not at least say that the recovery has been “jobs-rich”, creating 1.8m positions between 2011 and 2014? The trouble with this is that, apart from the fact that the current unemployment rate of 6% is nothing to be proud of, many of the newly created jobs are of very poor quality.
The ranks of workers in “time-related underemployment” – doing fewer hours than they wish due to a lack of availability of work – have swollen dramatically. Between 1998 and 2005, only about 1.9% of workers were in such a position; by 2012-13 the figure was 8%.
Then there is the extraordinary increase in self-employment. Its share of total employment, whose historical norm (1984-2007) was 12.6%, now stands at an unprecedented 15%. With no evidence of a sudden burst of entrepreneurial energy among Britons, we may conclude that many are in self-employment out of necessity or even desperation. Even though surveys show that most newly self-employed people say it is their preference, the fact that these workers have experienced a far greater collapse in earnings than employees – 20% against 6% between 2006-07 and 2011-12, according to the Resolution Foundation – suggests that they have few alternatives, not that they are budding entrepreneurs going places.
So, in between the people in underemployment (6.1% of employment) and the precarious newly self-employed (2.4%), 8.5% of British people in work (or 2.6 million people) are in jobs that do not fully utilise their abilities – call that semi-unemployment, if you will.
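(Again, a small consistency check using nothing beyond the shares quoted above; the implied size of the workforce is back-calculated for illustration, not a figure given in the article.)

underemployed_share = 0.061          # share of those in work who are underemployed
precarious_self_employed = 0.024     # share in precarious new self-employment
semi_unemployed_share = underemployed_share + precarious_self_employed
semi_unemployed_people = 2.6e6       # figure quoted above
implied_workforce = semi_unemployed_people / semi_unemployed_share
print(f"{semi_unemployed_share:.1%} of about {implied_workforce / 1e6:.0f} million people in work")
# -> 8.5% of about 31 million people in work, broadly consistent with the size of the UK workforce at the time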
The success of the Conservative economic narrative has allowed the coalition to pursue a destructive and unfair economic strategy, which has generated only a bogus recovery largely based on government-fuelled asset bubbles in real estate and finance, with stagnant productivity, falling wages, millions of people in precarious jobs, and savage welfare cuts.
The country is in desperate need of a counter narrative that shifts the terms of debate. A government budget should be understood not just in terms of bookkeeping but also of demand management, national cohesion and productivity growth. Jobs and wages should not be seen simply as a matter of people being “worth” (or not) what they get, but of better utilising human potential and of providing decent and dignified livelihoods. Ways have to be found to generate economic growth based on rising productivity rather than the continuous blowing of asset bubbles.
Without a new economic vision incorporating these dimensions, Britain will continue on its path of stagnation, financial instability and social conflict.