
Thursday 11 August 2016

How the World Bank’s biggest critic became its president

Andrew Rice in The Guardian

In a shanty town perched in the hilly outskirts of Lima, Peru, people were dying. It was 1994, and thousands of squatters – many of them rural migrants who had fled their country’s Maoist guerrilla insurgency – were crammed into unventilated hovels, living without basic sanitation. They faced outbreaks of cholera and other infectious diseases, but a government austerity program, which had slashed subsidised health care, forced many to forgo medical treatment they couldn’t afford. When food ran short, they formed ad hoc collectives to stave off starvation. A Catholic priest ministering to a parish in the slum went looking for help, and he found it in Jim Yong Kim, an idealistic Korean-American physician and anthropologist.

In his mid-30s and a recent graduate of Harvard Medical School, Kim had helped found Partners in Health, a non-profit organisation whose mission was to bring modern medicine to the world’s poor. The priest had been involved with the group in Boston, its home base, before serving in Peru, and he asked Kim to help him set up a clinic to aid his flock. No sooner had Kim arrived in Lima, however, than the priest contracted a drug-resistant form of tuberculosis and died.

Kim was devastated, and he thought he knew what to blame: the World Bank. Like many debt-ridden nations, Peru was going through “structural adjustment”, a period of lender-mandated inflation controls, privatisations and government cutbacks. President Alberto Fujimori had enacted strict policies, known collectively as “Fujishock”, that made him a darling of neoliberal economists. But Kim saw calamitous trickledown effects, including the tuberculosis epidemic that had claimed his friend and threatened to spread through the parish.

So Kim helped organise a conference in Lima that was staged like a teach-in. Hundreds of shanty town residents met development experts and vented their anger with the World Bank. “We talked about the privatisation of everything: profits and also suffering,” Kim recalls. “The argument we were trying to make is that investment in human beings should not be cast aside in the name of GDP growth.” Over the next half-decade, he would become a vociferous critic of the World Bank, even calling for its abolition. In a 2000 book, Dying for Growth, he was lead author of an essay attacking the “capriciousness” of international development policies. “The penalties for failure,” he concluded, “have been borne by the poor, the infirm and the vulnerable in poor countries that accepted the experts’ designs.”

Kim often tells this story today, with an air of playful irony, when he introduces himself – as the president of the World Bank. “I was actually out protesting and trying to shut down the World Bank,” Kim said one March afternoon, addressing a conference in Maryland’s National Harbor complex. “I’m very glad we lost that argument.”

 
Medical staff treat people with Ebola in Kailahun, Sierra Leone, 2014. Photograph: Carl de Souza/AFP/Getty Images

The line always gets a laugh, but Kim uses it to illustrate a broader story of evolution. As he dispenses billions of development dollars and tees off at golf outings with Barack Obama – the US president has confessed jealousy of his impressive five handicap – Kim is a long way from Peru. The institution he leads has changed too. Structural adjustment, for one, has been phased out, and Kim says the bank can be a force for good. Yet he believes it is only just awakening to its potential – and at a precarious moment.

Last year, the percentage of people living in extreme poverty dropped below 10% for the first time. That’s great news for the world, but it leaves the World Bank somewhat adrift. Many former dependents, such as India, have outgrown their reliance on financing. Others, namely China, have become lenders in their own right. “What is the relevance of the World Bank?” Kim asked me in a recent interview. “I think that is an entirely legitimate question.”

Kim believes he has the existential answers. During his four years at the bank’s monumental headquarters on H Street in Washington, he has reorganised the 15,000-strong workforce to reflect a shift from managing country portfolios to tackling regional and global crises. He has redirected large portions of the bank’s resources (it issued more than $61bn in loans and other forms of funding last year) toward goals that fall outside its traditional mandate of encouraging growth by financing infrastructure projects – stemming climate change, stopping Ebola, addressing the conditions driving the Syrian exodus.

Yet many bank employees see Kim’s ambitions as presumptuous, even reckless, and changes undertaken to revitalise a sluggish bureaucracy have shaken it. There have been protests and purges, and critics say Kim’s habit of enunciating grandiose aspirations comes with a tendency toward autocracy. The former bank foe now stands accused of being an invasive agent, inflicting his own form of shock therapy on his staff. “The wrong changes have been done badly,” says Lant Pritchett, a former World Bank economist.

Pritchett argues that, beyond issues of personality and style, Kim’s presidency has exposed a deep ideological rift between national development, which emphasises institution-building and growth, and what Pritchett terms “humane” development, or alleviating immediate suffering. Kim, however, sees no sharp distinction: he contends that humane development is national development – and if the bank persists in believing otherwise, it could be doomed to obsolescence.

Kim likes to say that as a doctor with experience of treating the poor, his humanitarian outlook is his strongest qualification for his job – an opinion that probably vexes critics who point out that he knew little about lending before arriving at the bank. “Finance and macroeconomics are complicated, but you can actually learn them,” he says. “The hardest thing to learn is mud-between-your-toes, on-the-ground development work. You can’t learn that quickly. You can’t learn that through trips where you’re treated like a head of state. You have to have kind of done that before.”

Kim talks fast and he walks fast. Following him – a lithe, balding 56-year-old surrounded by a deferential, suited entourage – you can easily imagine him in a white coat as a physician making his rounds. He has a doctor’s diagnostic mindset too; he talks about ascertaining “the problem”, or what public-health experts call the “cause of the causes”. He thinks of poverty as an ailment and is trying to devise a “science of delivery”. It’s a philosophy built on a life-long interest in the intersection of science and humanities.

Born in Seoul in 1959 to parents displaced by the Korean war, Kim emigrated to the US with his family when he was a child, eventually ending up in Muscatine, Iowa. His was one of two Asian families in the small town. His mother was an expert in Confucian philosophy, his father a dentist. Kim excelled at his studies while playing quarterback in high school. He attended Brown University in Providence, Rhode Island, where he studied human biology. His father wanted him to be a doctor, but he gravitated toward anthropology. Because Harvard let him pursue a medical degree and a PhD simultaneously, he landed there. Kim struck up a friendship with Paul Farmer, a fellow student, over shared interests in health and justice. In 1987, they formed Partners in Health.

The two came of age when the World Bank’s influence was arguably at its most powerful and controversial. Conceived along with the International Monetary Fund at the 1944 Bretton Woods conference, the bank was meant to rebuild Europe. However, it found its central mission as a source of startup capital for states emerging from the demise of colonial empires. The bank could borrow money cheaply in the global markets, thanks to the creditworthiness of its shareholders (the largest being the US government), then use that money to finance the prerequisites for economic growth – things such as roads, schools, hospitals and power plants. Structural adjustment came about in response to a series of debt crises that culminated in the 1980s. The Bretton Woods institutions agreed to bail out indebted developing states if they tightened their belts and submitted to painful fiscal reforms.

 
A wall dividing shanty towns and rich neighbourhoods in Lima, Peru, where Kim’s journey to the World Bank began. Photograph: Oxfam/EPA

To Kim and Farmer, the moral flaw in the bank’s approach was that it imposed mandates with little concern for how cutting budgets might affect people’s health. They thought that “the problem” in global health was economic inequality, and in Haiti Partners in Health pioneered a grassroots methodology to tackle it: improve the lives of communities by training locals to provide medical care (thus creating jobs) and by expanding access to food, sanitation and other basic necessities. Though hardly insurgents – they were based at Harvard, after all – the friends passionately argued that policy discussions in Geneva and Washington needed to be informed by ground truths, delivered by the people living them.

Farmer’s impressive work ethic and pious demeanour made him famous – and the subject of Tracy Kidder’s acclaimed book Mountains Beyond Mountains – but Kim was the partner with systemic ambitions. “For Paul, the question is, ‘What does it take to solve the problem of giving the best care in the world to my patients?’” Kim says. “But he doesn’t spend all his time thinking about, ‘So how do you take that to scale in 188 countries?’” (Both men, who remain close friends, have won MacArthur Foundation “genius grants” for their work.)

Kim’s desire to shape policy landed him at the World Health Organisation (WHO) in 2003, overseeing its HIV/Aids work. The job required him to relocate to Geneva with his wife – a paediatrician he had met at Harvard – and a son who was just a toddler. (They now have two children, aged 16 and seven.) In the vigorously assertive style that would become his hallmark – going where he wants to go even if he’s not sure how to get there – Kim pledged to meet an audacious goal: treating three million people in the developing world with antiretroviral drugs by 2005, a more than sixfold increase over just two years. The strategy, in Kim’s own words, was, “Push, push, push.” The “three-by-five pledge,” as it was known, ended up being impossible to reach, and Kim publicly apologised for the failure on the BBC. But the world got there in 2007 – a direct result, Kim says, of the pledge’s impact on global-health policymaking: “You have to set a really difficult target and then have that really difficult target change the way you do your work.”

Kim left the WHO in 2006. After a stopover at Harvard, where he headed a centre for health and human rights, he was hired to be president of Dartmouth College in New Hampshire. He arrived in 2009, with little university management experience but characteristically high hopes. With the global recession at its zenith, however, Kim was forced to spend much of his time focused on saving Dartmouth’s endowment.

He hardly knew the difference between hedge funds and private equity, so a venture capitalist on the college’s board would drive up from Boston periodically to give him lessons, scribbling out basic financial concepts on a whiteboard or scratch paper. His tenure soon turned stormy as he proposed slashing $100m from the school’s budget and clashed with faculty members who complained about a lack of transparency. Joe Asch, a Dartmouth alumnus who writes for a widely read blog about the university, was highly critical of Kim. “He is a man who is very concerned about optics and not so concerned about follow-through,” Asch says now. “Everyone’s sense was that he was just there to punch his ticket.” Soon enough, a surprising opportunity arose.

The way Kim tells it, the call came out of the blue one Monday in March 2012. Timothy Geithner, another Dartmouth alumnus who was then the US treasury secretary, was on the line asking about Kim’s old nemesis. “Jim,” Geithner asked, “would you consider being president of the World Bank?”

When the government contacted him, Kim confesses, he had only the foggiest notion of how development finance worked. He had seen enough in his career, however, to know that running the bank would give him resources he scarcely could have imagined during his years of aid work. Instead of agonising over every drop of water in the budgetary bathtub, he could operate a global tap. “When I really saw what it meant to be a bank with a balance sheet, with a mission to end extreme poverty,” Kim says, “it’s like, wow.” His interest was bolstered by the bank’s adoption, partly in response to 1990s-era activists, of stringent “safeguards”, or lending rules intended to protect human rights and the environment in client states.

 
Barack Obama nominates Kim for the presidency of the World Bank, Washington DC, 2012. Photograph: Charles Dharapak/AP

By custom, the World Bank had always been run by an American, nominated by the US president for a five-year term. But in 2012 there was a real international race for the post. Some emerging-market nations questioned deference to the United States, and finance experts from Nigeria and Colombia announced their candidacies. After considering political heavyweights such as Susan Rice, John Kerry and Hillary Clinton – who were all more interested in other jobs – Obama decided he needed an American he could present as an outsider to replace the outgoing president, Robert Zoellick, a colourless former Goldman Sachs banker and Republican trade negotiator. Clinton suggested Kim and “championed Jim as a candidate”, says Farmer. (Partners in Health works with the Clinton Foundation.)

Embedded within the dispute over superpower prerogatives was a larger anxiety about what role the World Bank should play in the 21st century. Extreme poverty had dropped from 37% in 1990 to just under 13% in 2012, so fewer countries needed the bank’s help. With interest rates at record lows, the states that needed aid had more options for borrowing cheap capital, often without paternalistic ethical dictates. New competitors, such as investment banks, were concerned mainly with profits, not safeguards. As a result, whereas the World Bank had once enjoyed a virtual monopoly on the development-finance market, by 2012 its lending represented only about 5% of aggregate private-capital flows to the developing world, according to Martin Ravallion, a Georgetown University economist. And while the bank possessed a wealth of data, technical expertise and analytical capabilities, it was hampered by red tape. One top executive kept a chart in her office illustrating the loan process, which looked like a tangle of spaghetti.

At Kim’s White House interview, Obama still needed some convincing that the global-health expert could take on the task of reinvigorating the bank. When asked what qualified him over candidates with backgrounds in finance, Kim cited Obama’s mother’s anthropology dissertation, about Indonesian artisans threatened by globalisation, to argue that there was no substitute for on-the-ground knowledge of economic policies’ impact. Two days later, Obama unveiled his pick, declaring that it was “time for a development professional to lead the world’s largest development agency”.

Kim campaigned for the job with the zeal of a convert. In an interview with the New York Times, he praised the fact that, unlike in the 1990s, “now the notion of pro-poor development is at the core of the World Bank”. He also embarked on an international “listening tour” to meet heads of state and finance ministers, gathering ideas to shape his priorities in office. Because votes on the bank’s board are apportioned according to shareholding, the US holds the greatest sway, and Obama’s candidate was easily elected. Kim took office in July 2012, with plans to eradicate extreme poverty. Farmer cites a motto carved at the entry to the World Bank headquarters – “Our dream is a world free of poverty” – that activists such as Kim once sniggered at. “Jim said, ‘Let’s change it from a dream to a plan, and then we don’t have to mock it.’”

Kim still had to win over another powerful constituency: his staff. Bank experts consider themselves an elite fraternity. Presidents and their mission statements may come and go, but the institutional culture remains largely impervious. “The bank staff,” says Jim Adams, a former senior manager, “has never fully accepted the governance.” When Robert McNamara expanded the bank’s mission in the late 1960s, doing things such as sending helicopters to spray the African black fly larvae that spread river blindness, many staffers were “deeply distressed to see the institution ‘running off in all directions’”, according to a history published in 1973. When James Wolfensohn arrived in the mid-1990s with plans to move away from structural adjustment and remake the bank like a consulting firm, employees aired their gripes in the press. “Shake-up or cock-up?” asked an Economist headline. Paul Wolfowitz, whose presidency was marred by leaks, was pushed out in 2007 after accusations of cronyism resulted in a damning internal investigation.

Recognising this fraught history, Kim went on a second listening tour: he met representatives of every bank department and obtained what he describes, in anthropologist-speak, as “almost a formal ethnography” of the place. What he lacked in economic knowledge, he made up for in charm. “Dr Kim is personable, Dr Kim is articulate, Dr Kim looks very moved by what he has to say,” says Paul Cadario, a former bank executive who is now a professor at the University of Toronto.

 
The funeral of a woman who died of Aids on the outskirts of Kigali, Rwanda. Photograph: Sean Smith for the Guardian

The initial goodwill, however, vanished when Kim announced his own structural adjustment: a top-to-bottom reorganisation of the bank. It wasn’t so much the idea of change that riled up the staff. Even before Kim took office, respected voices were calling for a shakeup. In 2012, a group of bank alumni published a report criticising an “archaic management structure”; low morale was causing staff turnover, and there was an overreliance on consultants, promotion on the basis of nationality, and a “Balkanisation of expertise”. Where Kim went awry, opponents say, was in imposing his will without first garnering political support. “One famous statement is that the World Bank is a big village,” says Cadario, now a Kim critic. “And if you live in a village, it is a really bad idea to have enemies.”

The bank had been designed around the idea that local needs, assessed by staff assigned to particular countries and regions, should dictate funding; cooperation across geographical lines required internal wrangling over resources. So Kim decided to dismantle existing networks. He brought in the management consulting firm McKinsey & Co, which recommended regrouping the staff into 14 “global practices”, each of which would focus on a policy area, such as trade, agriculture or water. Kim hired outsiders to lead some departments and pushed out several formerly powerful officials with little explanation. To symbolise that he was knocking down old walls, he had a palatial, wood-panelled space on the World Bank’s executive floor retrofitted as a Silicon Valley-style open-plan office, where he could work alongside his top staff.

Kim also announced that he would cut $400m in administrative expenses, and eliminate about 500 jobs – a necessary measure, he said, because low interest rates were cutting into the bank’s profits. Kim says he “made a very conscious decision to let anyone who wanted… air their grievances.” His opponents detected no such tolerance, however, and their criticisms turned ad hominem. Around Halloween in 2014, a satirical newsletter circulated among the staff, depicting Kim as Dr Frankenstein: “Taking random pieces from dead change management theories,” it read, “he and his band of external consultants cobble together an unholy creature resembling no development bank ever seen before.” Anonymous fliers attacking Kim also began to appear around bank headquarters.

Kim portrayed internal dissent as a petty reaction to perks such as travel per diems being cut. “There’s grumbling about parking and there’s grumbling about breakfast,” he told the Economist. Meanwhile, bank staffers whispered about imperial indulgences on Kim’s part, such as chartering a private jet. (Kim says chartering jets is a longstanding practice among bank presidents, and one he resorts to only when there are no other options.)

A French country officer named Fabrice Houdart emerged as a lead dissenter, broadcasting his frustrations with Kim in a blog on the World Bank’s intranet. In one post, he questioned whether “a frantic race to show savings… might lead to irreversible long-term damages to the institution.” (This being the World Bank, his sedition was often illustrated with charts and statistics.) The staff went into open rebellion after Houdart revealed that Bertrand BadrĂ©, the chief financial officer, whom Kim had hired and who was in charge of cutting budgets, had received a nearly $100,000 bonus on top of his $379,000 salary. Kim addressed a raucous town hall meeting in October 2014, where he told furious staffers, “I am just as tired of the change process as all of you are.”

A few months later, Houdart was demoted after being investigated for leaking a privileged document. The alleged disclosure was unrelated to Kim’s reorganisation – it had to do with Houdart’s human rights advocacy, for which he was well known at the bank – and Kim says the investigation began before Houdart’s denunciations of his presidency. Critics, however, portray it as retaliatory. “Fabrice has become a folk hero,” Cadario says, “because he was brave enough to say what many of the people within the bank are thinking.” (Houdart is currently disputing his demotion before an internal administrative tribunal.)

“It’s never fun when large parts of the organisation are criticising you personally,” Kim admits. Yet he maintains that his tough decisions were necessary. “In order to do a real change, you have to put jobs at risk,” he says. “And completely understandably, people hate that.”

In the heat of the staff revolt, Kim was devoting attention to a very different crisis: Ebola. In contrast with the bank’s historically cautious, analytical approach, Kim was pushing it to become more involved in emergency response. He committed $400m to confront the epidemic immediately, a quarter of which he pushed out in just nine days. He dispatched bank employees to afflicted west African countries and reproached the head of the WHO for the organisation’s lack of urgency. “Rather than being tied up in bureaucracy, or saying, ‘We don’t do those things,’ Jim is saying that if poor people’s lives are at risk… then it is our business,” says Tim Evans, whom Kim hired to run the bank’s new global practice for health.

Some bank veterans disagreed, vehemently. Nearly two years on, they still worry that in trying to save the day, Kim runs the risk of diverting the bank from its distinct mission. “Pandemic response is important – but it’s not the WHO, it’s the World Bank,” says Jean-Louis Sarbib, a former senior vice-president who now runs a nonprofit development consultancy. “I don’t think he understands the World Bank is not a very large NGO.” Referencing Kim’s work with Partners in Health, Sarbib adds, “The work of the World Bank is to create a system so that he doesn’t need to come and create a clinic in Haiti.”

In reply to this critique, Kim likes to cite a study co-written by a former World Bank economist, Larry Summers, that found that 24% of full-income growth in developing countries between 2000 and 2011 was attributable to improved public health. Put simply, Kim says, pandemics and other health deficits represent enormous threats to economic development, so they should be the World Bank’s business. The same goes for climate change, which the bank is fighting by funding a UN initiative to expand sustainable energy. As for violent conflicts, rather than waiting until the shooting has stopped and painstakingly preparing a post-conflict assessment – as the bank has done in the past – Kim wants to risk more capital in insecure zones.

“We… bought into this notion that development is something that happens after the humanitarian crisis is over,” Kim said at a recent event called the Fragility Forum, where he sat next to representatives of various aid groups and the president of the Central African Republic in the World Bank’s sun-soaked atrium. “We are no longer thinking that way.”

After the forum, amid a whirlwind day of meetings and speeches, Kim stopped at a hotel cafe with me to unwind for a few minutes. As a counterweight to his life’s demands, he practices Korean Zen-style meditation, but he also seems to blow off steam by brainstorming aloud. Goals and promises came pouring out of him like a gusher. Besides eliminating extreme poverty, which he has now promised will be done by 2030, Kim wants to raise incomes among the bottom 40% of the population in every country. He also wants to achieve universal access to banking services by 2020.

Long past our allotted interview time, Kim told me he had just one more idea: “Another huge issue that I want to bring to the table is childhood stunting.” At the Davos World Economic Forum this year, he explained, everyone was chattering about a “fourth industrial revolution”, which will centre on artificial intelligence, robotics and other technological leaps. But Kim thinks whole countries are starting out with a brainpower deficit because of childhood malnutrition. “These kids have fewer – literally fewer – neuronal connections than their non-stunted classmates,” he said. “For every inch that you’re below the average height, you lose 2% of your income.”

“This is fundamentally an economic issue,” he continued. “We need to invest in grey-matter infrastructure. Neuronal infrastructure is quite possibly going to be the most important infrastructure.”

To World Bank traditionalists, addressing nutrition is an example of the sort of mission creep that makes Kim so maddening. Despite its name and capital, the bank can’t be expected to solve all the world’s humanitarian problems. (“We are not the UN,” is an informal mantra among some staffers.) Poor countries may well prefer the bank to stick to gritty infrastructural necessities, even if Kim and his supporters have splashier goals. “The interests of its rich-country constituencies and its poor-country borrowers are diverging,” Pritchett says. “It’s like the bank has a foot on two boats. Sooner or later, it’s going to have to jump on one boat or the other, or fall in the water. So far, Jim Kim is just doing the splits.”

 
The World Bank headquarters in Washington. Photograph: Lauren Burke/AP

Kim’s defenders insist the bank hasn’t abandoned its core business. In fact, as private investment in emerging markets has contracted recently, due to instability in once-booming economies such as Brazil, countries have found more reasons to turn to the World Bank. Its primary lending unit committed $29.7bn in loans this fiscal year, nearly doubling the amount from four years earlier. “There is so much need in the world that I’m not worried we’re going to run out of projects to finance,” Kim says. He also hopes the worst of the tumult within the bank is over. A few elements of his reorganisation have been scaled back; after the new administrative structure proved unwieldy, the 14 global practices were regrouped into three divisions. Some of his more polarising hires have also left.

A five-year term, Kim says, is hardly sufficient to implement his entire agenda, and he has conveyed his desire to be reappointed in 2017. Though internal controversies have been damaging, and America’s domination of the bank remains a source of tension, the next US president (quite possibly Kim’s friend Hillary Clinton) will have a strong say in the matter. If he keeps his job, Kim wants to show that the World Bank can serve as a link between great powers and small ones, between economics and aid work – retaining its influence as old rules and boundaries are erased and new ones are scribbled into place.

Kim thinks he can succeed, so long as he keeps one foot rooted in his experiences as a doctor with mud between his toes. But he also wants to share his revelations about capital with his old comrades. “I really feel a responsibility to have this conversation with development actors who, like me 10 years ago, didn’t really understand the power of leverage,” Kim says with a guileless air.

“God,” he adds, “it is just such a powerful tool.”

Monday 2 May 2016

Do we want our children taught by humans or algorithms?

Zoe Williams in The Guardian


 
Parents ‘have been galvanised by the … sight of their children in distress’ over the tests. Photograph: Dominic Lipinski/PA



It is incredibly hard for a headteacher to shout “rubbish” in a crowded hall while an authority figure is speaking. It is like asking a lung specialist to smoke a cigarette. Yet that’s what happened when Nicky Morgan addressed the National Association of Head Teachers conference yesterday. They objected partly to her programme of turning all schools into academies by 2020 and partly to her luminously daft insistence that “testing”, “improving” and “educating” are interchangeable words. 

Her government “introduced the phonics check for six-year-olds, and 100,000 more young people are able to read better as a result,” she told the BBC when she first became education secretary, and she has been trotting out the same nonsense ever since. No amount of disagreement from professionals in the field dents her faith or alters her rhetoric. Indeed, since the Michael Gove era, teachers have been treated as recalcitrant by definition, motivated by sloth, their years of experience reframed not as wisdom but as burnout. When they object to a policy, that merely proves what a sorely needed challenge it poses to their cushy lives. When they shout “rubbish” in a conference hall, it is yet more evidence of what a dangerous bunch of trots they are.

On Tuesday, parents enter the fray, with a school boycott organised by Let Our Kids Be Kids, to protest against “unnecessary testing and a curriculum that limits enjoyment and real understanding”. Some have been galvanised by the bizarre and unnecessary sight of their children in distress, others by solidarity with the teachers – who inconveniently continue to command a great deal of respect among people who actually meet with them – and still others, unable to join the boycott because of minor administrative details such as having to go to work, have signed the petition instead. It is the beginning of a new activism – muscular, cooperative and agile because it has to be.


The boycott is in protest against ‘unnecessary testing and a curriculum that limits enjoyment and real understanding’. Photograph: Barry Batchelor/PA

If the only problem is that it causes anxiety to a load of pampered under-10s, shouldn’t they just suck it up? Isn’t that the best way to learn what the world is like? The framing of this debate is precisely wrong. No serious educationalist thinks that the way to drive up standards among children is to make tests more frequent and more exacting. Nor does anybody of any expertise really believe that teachers need to be incentivised by results. It is an incredibly tough, demanding, indifferently remunerated job, which nobody would do except as a vocation. It is not for the profession or the parents to explain what the tests are doing to the kids; it is for the education secretary to explain what these tests are for. 

By coincidence, at the end of last week, Randi Weingarten, head of the American Federation of Teachers, was in London to hand in a petition to Pearson, the education company that provides curriculums and test delivery. The petition protested against over-testing in US schools and alleged profiteering in the global south. The trajectory in US education, from universal public provision with local accountability to mass outsourcing and centralised control, is strikingly similar to what has happened here. It begins with the creation of a failure narrative, “that both the Democrats and the Republican bought into, which is, the sky is falling, the sky is falling, the sky is falling”, Weingarten told me. That creates the rationale for testing, since, without data, you can’t tell whether you’re improving. Those tests are consequential: the results can be used to fire teachers, close down schools, hold pupils back a year. All the most profound decisions in education can suddenly be made by an algorithm, with no human judgment necessary.

Simultaneously, says Weingarten, charter schools were introduced, originally – like academies – “as part of a bigger public school system where you could incubate ideas”, but very soon remodelled as a way to supplant rather than supplement the existing system. “And in between all of this, you started seeing the marketisation and the monetisation.” Until things can be counted, there isn’t much scope to create a market.

I was never fully convinced that academisation and hyper-testing were undertaken to create the market conditions for privatisation down the line; I thought it more plausible that the testing was merely a politician’s wheeze to create data out of humans that could then be stuffed into manifestos to persuade other humans that the policies were going in the right direction. Yet the parallels between the US and England are insistent – it has become impossible to ignore the idea that our government is mimicking theirs for a reason.

Whether all this is a prelude to privatisation or a PR stunt for a chaotic government doesn’t actually matter in the medium term: to put seven-year-olds under intolerable pressure for either of those ends would be equally abhorrent. In the long term, the mutation of schools into joyless exam factories won’t be halted by resistance alone; we also need a proper account of what education is for.

As Weingarten describes, “We have to help kids build relationships. We have to address their life skills, so they can negotiate the world. We have to help kids build resilience. We have to help kids learn how to problem-solve, how to think, how to engage. So tell me, how are any of these things tested on a standardised test?” That’s a test question for the tin-eared secretary of state herself.

Monday 25 May 2015

How to turn a liberal hipster into a capitalist tyrant in one evening

Paul Mason in The Guardian

 
World Factory … how would you cope? Photograph: David Sandison

The choices were stark: sack a third of our workforce or cut their wages by a third. After a short board meeting we cut their wages, assured they would survive and that, with a bit of cajoling, they would return to our sweatshop in Shenzhen after their two-week break.

But that was only the start. In Zoe Svendsen’s play World Factory at the Young Vic, the audience becomes the cast. Sixteen teams sit around factory desks playing out a carefully constructed game that requires you to run a clothing factory in China. How to deal with a troublemaker? How to dupe the buyers from ethical retail brands? What to do about the ever-present problem of clients that do not pay? Because the choices are binary they are rarely palatable. But what shocked me – and has surprised the theatre – is the capacity of perfectly decent, liberal hipsters on London’s south bank to become ruthless capitalists when seated at the boardroom table.

The classic problem presented by the game is one all managers face: short-term issues, usually involving cashflow, versus the long-term challenge of nurturing your workforce and your client base. Despite the fact that a public-address system was blaring out, in English and Chinese, that “your workforce is your vital asset”, our assembled young professionals repeatedly had to be cajoled not to treat their workers like dirt.

And because the theatre captures data on every choice by every team, for every performance, I know we were not alone. The aggregated flowchart reveals that every audience, on every night, veers towards money and away from ethics.

Svendsen says: “Most people who were given the choice to raise wages – having cut them – did not. There is a route in the decision-tree that will only get played if people pursue a particularly ethical response, but very few people end up there. What we’ve realised is that it is not just the profit motive but also prudence, the need to survive at all costs, that pushes people in the game to go down more capitalist routes.”

In short, many people have no idea what running a business actually means in the 21st century. Yes, suppliers – from East Anglia to Shanghai – will try to break your ethical codes; but most of those giant firms’ commitment to good practice, and environmental sustainability, is real. And yes, the money is all important. But real businesses will take losses, go into debt and pay workers to stay idle in order to maintain the long-term relationships vital in a globalised economy.

Why do so many decent people, when asked to pretend they’re CEOs, become tyrants from central casting? Part of the answer is: capitalism subjects us to economic rationality. It forces us to see ourselves as cashflow generators, profit centres or interest-bearing assets. But that idea is always in conflict with something else: the non-economic priorities of human beings, and the need to sustain the environment. Though World Factory, as a play, is designed to show us the parallels between 19th-century Manchester and 21st-century China, it subtly illustrates what has changed.


A worker in a Chinese clothing factory. Photograph: Imaginechina/Corbis

A real Chinese sweatshop owner is playing a losing game against something much more sophisticated than the computer at the Young Vic: an intelligent machine made up of the smartphones of millions of migrant workers on their lunchbreak, plugging digitally into their village networks to find out wages and conditions elsewhere. That sweatshop owner is also playing against clients with an army of compliance officers, themselves routinely harassed by NGOs with secret cameras.

The whole purpose of this system of regulation – from above and below – is to prevent individual capitalists making short-term decisions that destroy the human and natural resources the system needs to function. Capitalism is not just the selfish decisions of millions of people. It is those decisions sifted first through the all-important filter of regulation. It is, as late 20th-century social theorists understood, a mode of regulation, not just of production.

Yet it plays on us a cruel ideological trick. It looks like a spontaneous organism, to which government and regulation (and the desire of Chinese migrants to visit their families once a year) are mere irritants. In reality it needs the state to create and re-create it every day.

Banks create money because the state awards them the right to. Why does the state ram-raid the homes of small-time drug dealers, yet call in the CEOs of the banks whose employees commit multimillion-pound frauds for a stern ticking off over a tray of Waitrose sandwiches? Answer: because a company has limited liability status, created by parliament in 1855 after a political struggle.

Our fascination with market forces blinds us to the fact that capitalism – as a state of being – is a set of conditions created and maintained by states. Today it is beset by strategic problems: debt-ridden, with sub-par growth and low productivity, it cannot unleash the true potential of the info-tech revolution because it cannot imagine what to do with the millions who would lose their jobs.

The computer that runs the data system in Svendsen’s play could easily run a robotic clothes factory. That’s the paradox. But to make a third industrial revolution happen needs something no individual factory boss can execute: the re-regulation of capitalism into something better. Maybe the next theatre game about work and exploitation should model the decisions of governments, lobbyists and judges, not the hapless managers.

Monday 12 November 2012

Management theory was hijacked in the 80s. We're still suffering the fallout.



Financial trading: 'Managers abandoned their previous policy of retaining and reinvesting profits in favour of large dividend and share buyback payouts to shareholders.' Photograph: David Karp/AP
This week the City has been congratulating itself on 20 years of UK corporate governance codes. Since the original Cadbury document in 1992, the UK has basked in its role as governance leader, with 70 other countries having followed its example and adopted similar guidelines.
There's just one problem: is it the right kind of governance? The day the FT carried the story, Incomes Data Services reported that FTSE 100 boardroom pay went up by a median 10% last year, a soaraway trend that the best code in the world has complacently overseen. Nor could it prevent the RBS meltdown, Libor or PPI mis-selling to the tune of £12bn, the biggest rip-off in financial history. It didn't stop phone-hacking or BP taking short cuts. It has sanctioned wholesale offloading of risk, whether individual (pensions, careers) or collective (global and financial warming) on to society, while rejecting any responsibility of its own except to shareholders.
So jerry-built is the corporate economy erected on the scaffolding of the City codes that it can no longer deliver even the material progress by which it justifies its privileges: even with a return to growth, living standards for lower and middle earners may be no higher in 2020 than in 2000, according to the Resolution Foundation. The truth is that UK corporate governance has neither headed off major scandal nor nurtured effective long-term management. In fact the opposite is true.
The irony is that we know what makes companies prosper in the long term. They manage themselves as whole systems, look after their people, use targets and incentives with extreme caution, keep pay differentials narrow (we really are in this together) and treat profits as the score rather than the game. And it's a given that in the long term companies can't thrive unless they have society's interests at heart along with their own.
So why do so many boards and managers, supported by politicians, systematically do the opposite – run companies as top-down dictatorships, pursue growth by merger, destroy teamwork with runaway incentives, attack employment rights and conditions, outsource customer service, treat their stakeholders as resources to be exploited, and refuse wider responsibilities to society?
The answer is that management in the 1980s was subject to an ideological hijack by Chicago economics that put at the heart of governance a reductive "economic man" view of human nature, in which managers must be bribed or whipped into doing their exclusive job of maximising shareholder returns. Embedded in the codes, these assumptions now have the status of unchallenged truths.
The consequences of the hijack have been momentous. The first was to align managers' interests not with their own organisations but with financial outsiders – shareholders. That triggered a senior management pay explosion that continues to this day. The second was that managers abandoned their previous policy of retaining and reinvesting profits in favour of large dividend and share buyback payouts to shareholders.
Ironically, the effect of this stealth revolution was to undercut the foundations of the very shareholder value under whose flag the activists had ridden into battle. Along with corporate welfare and customer service, among the functions squeezed in the shareholder bonanza was research and development. Innovation has stalled since the 1980s, prompting some economists to query whether the era of growth itself is over.
But it's not economics, it's management, stupid. Unsurprisingly, downtrodden and outsourced workers, mis-sold-to customers, exploited suppliers and underpowered innovation cancelled out any gains from ever more ingenious financial engineering – leaving shareholders less well off in the shareholder-value era since 1980 than in previous decades. The great crash of 2008 stripped away any remaining doubt: the economic progress of the last 30 years was a mirage. As Nassim Nicholas Taleb put it in The Black Swan, the profits were illusory, "simply borrowed against destiny with some random payment time."
Over the last decades, misconceived, ideologically based governance has recreated management as a new imperium in which shareholders and managers rule and the real world dances to finance's tune. A worthier anniversary to celebrate is the death seven years ago this month, on 11 November, of Peter Drucker, one of the architects of pre-code management, which he insisted was a "liberal art". Austrian by birth, Drucker was a cultured humanist, one of whose distinctions was having his books burned by the Nazis. In The Practice of Management in 1954 he wrote: "Free enterprise cannot be justified as being good for business. It can be justified only as being good for society".

Monday 19 December 2011

'Freedom' an instrument of oppression

This bastardised libertarianism makes 'freedom' an instrument of oppression

It's the disguise used by those who wish to exploit without restraint, denying the need for the state to protect the 99%
Illustration by Daniel Pudles

Freedom: who could object? Yet this word is now used to justify a thousand forms of exploitation. Throughout the rightwing press and blogosphere, among thinktanks and governments, the word excuses every assault on the lives of the poor, every form of inequality and intrusion to which the 1% subject us. How did libertarianism, once a noble impulse, become synonymous with injustice?

In the name of freedom – freedom from regulation – the banks were permitted to wreck the economy. In the name of freedom, taxes for the super-rich are cut. In the name of freedom, companies lobby to drop the minimum wage and raise working hours. In the same cause, US insurers lobby Congress to thwart effective public healthcare; the government rips up our planning laws; big business trashes the biosphere. This is the freedom of the powerful to exploit the weak, the rich to exploit the poor.

Rightwing libertarianism recognises few legitimate constraints on the power to act, regardless of the impact on the lives of others. In the UK it is forcefully promoted by groups like the TaxPayers' Alliance, the Adam Smith Institute, the Institute of Economic Affairs, and Policy Exchange. Their concept of freedom looks to me like nothing but a justification for greed.

So why have we been so slow to challenge this concept of liberty? I believe that one of the reasons is as follows. The great political conflict of our age – between neocons and the millionaires and corporations they support on one side, and social justice campaigners and environmentalists on the other – has been mischaracterised as a clash between negative and positive freedoms. These freedoms were most clearly defined by Isaiah Berlin in his essay of 1958, Two Concepts of Liberty. It is a work of beauty: reading it is like listening to a gloriously crafted piece of music. I will try not to mangle it too badly.

Put briefly and crudely, negative freedom is the freedom to be or to act without interference from other people. Positive freedom is freedom from inhibition: it's the power gained by transcending social or psychological constraints. Berlin explained how positive freedom had been abused by tyrannies, particularly by the Soviet Union. It portrayed its brutal governance as the empowerment of the people, who could achieve a higher freedom by subordinating themselves to a collective single will.

Rightwing libertarians claim that greens and social justice campaigners are closet communists trying to resurrect Soviet conceptions of positive freedom. In reality, the battle mostly consists of a clash between negative freedoms.

As Berlin noted: "No man's activity is so completely private as never to obstruct the lives of others in any way. 'Freedom for the pike is death for the minnows'." So, he argued, some people's freedom must sometimes be curtailed "to secure the freedom of others". In other words, your freedom to swing your fist ends where my nose begins. The negative freedom not to have our noses punched is the freedom that green and social justice campaigns, exemplified by the Occupy movement, exist to defend.

Berlin also shows that freedom can intrude on other values, such as justice, equality or human happiness. "If the liberty of myself or my class or nation depends on the misery of a number of other human beings, the system which promotes this is unjust and immoral." It follows that the state should impose legal restraints on freedoms that interfere with other people's freedoms – or on freedoms which conflict with justice and humanity.

These conflicts of negative freedom were summarised in one of the greatest poems of the 19th century, which could be seen as the founding document of British environmentalism. In The Fallen Elm, John Clare describes the felling of the tree he loved, presumably by his landlord, that grew beside his home. "Self-interest saw thee stand in freedom's ways / So thy old shadow must a tyrant be. / Thou'st heard the knave, abusing those in power, / Bawl freedom loud and then oppress the free."

The landlord was exercising his freedom to cut the tree down. In doing so, he was intruding on Clare's freedom to delight in the tree, whose existence enhanced his life. The landlord justifies this destruction by characterising the tree as an impediment to freedom – his freedom, which he conflates with the general liberty of humankind. Without the involvement of the state (which today might take the form of a tree preservation order) the powerful man could trample the pleasures of the powerless man. Clare then compares the felling of the tree with further intrusions on his liberty. "Such was thy ruin, music-making elm; / The right of freedom was to injure thine: / As thou wert served, so would they overwhelm / In freedom's name the little that is mine."

But rightwing libertarians do not recognise this conflict. They speak, like Clare's landlord, as if the same freedom affects everybody in the same way. They assert their freedom to pollute, exploit, even – among the gun nuts – to kill, as if these were fundamental human rights. They characterise any attempt to restrain them as tyranny. They refuse to see that there is a clash between the freedom of the pike and the freedom of the minnow.

Last week, on an internet radio channel called The Fifth Column, I debated climate change with Claire Fox of the Institute of Ideas, one of the rightwing libertarian groups that rose from the ashes of the Revolutionary Communist party. Fox is a feared interrogator on the BBC show The Moral Maze. Yet when I asked her a simple question – "do you accept that some people's freedoms intrude upon other people's freedoms?" – I saw an ideology shatter like a windscreen. I used the example of a Romanian lead-smelting plant I had visited in 2000, whose freedom to pollute is shortening the lives of its neighbours. Surely the plant should be regulated in order to enhance the negative freedoms – freedom from pollution, freedom from poisoning – of its neighbours? She tried several times to answer it, but nothing coherent emerged which would not send her crashing through the mirror of her philosophy.

Modern libertarianism is the disguise adopted by those who wish to exploit without restraint. It pretends that only the state intrudes on our liberties. It ignores the role of banks, corporations and the rich in making us less free. It denies the need for the state to curb them in order to protect the freedoms of weaker people. This bastardised, one-eyed philosophy is a con trick, whose promoters attempt to wrongfoot justice by pitching it against liberty. By this means they have turned "freedom" into an instrument of oppression.

Monday 12 December 2011

Population decline is the elephant in the world's living room

The fifth horseman of the apocalypse
By Spengler

(The essay below appears as a preface to my book How Civilizations Die (and Why Islam is Dying, Too) [1].)

Population decline is the elephant in the world's living room. As a matter of arithmetic, we know that the social life of most developed countries will break down within two generations. Two out of three Italians and three of four Japanese will be elderly dependents by 2050. [1] If present fertility rates hold, the number of Germans will fall by 98% over the next two centuries. No pension and health care system can support such an inverted population pyramid. Nor is the problem limited to the industrial nations. Fertility is falling at even faster rates - indeed, at rates never before registered anywhere - in the Muslim world. The world's population will fall by as much as a fifth between the middle and the end of the 21st century, by far the worst decline in human history.
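
The arithmetic behind figures like the 98% is simple compounding: if each generation bears children at only a fraction of the replacement rate, the population shrinks by roughly that fraction every generation. The snippet below is only an illustrative sketch with assumed round numbers (a fertility rate of 1.4 children per woman and a 30-year generation length, not figures taken from the essay); it shows how two centuries of sub-replacement fertility compound into a decline of the order described here.

```python
# Illustrative back-of-the-envelope sketch; the fertility rate and generation
# length below are assumptions for demonstration, not the author's own model.

tfr = 1.4              # assumed total fertility rate (children per woman)
replacement = 2.1      # approximate replacement-level fertility
generation_years = 30  # assumed length of one generation
horizon_years = 200    # "the next two centuries"

generations = horizon_years / generation_years
remaining_share = (tfr / replacement) ** generations

print(f"Generations elapsed: {generations:.1f}")
print(f"Share of the population remaining: {remaining_share:.1%}")
# With these assumptions roughly 7% remains (a decline of about 93%); assume a
# somewhat lower fertility rate and the result approaches the 98% cited above.
```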

The world faces a danger more terrible than the worst Green imaginings. The European environmentalist who wants to shrink the world's population to reduce carbon emissions will spend her declining years in misery, for there will not be enough Europeans alive a generation from now to pay for her pension and medical care. [2] For the first time in world history, the birth rate of the whole developed world is well below replacement, and a significant part of it has passed the demographic point of no return.

But Islamic society is even more fragile. As Muslim fertility shrinks at a rate demographers have never seen before, it is converging on Europe's catastrophically low fertility as if in time-lapse photography. The average 30-year-old Iranian woman comes from a family of six children, but she will bear only one or two children during her lifetime. Turkey and Algeria are just behind Iran on the way down, and most of the other Muslim countries are catching up quickly. By the middle of this century, the belt of Muslim countries from Morocco to Iran will become as gray as depopulating Europe. The Islamic world will have the same proportion of dependent elderly as the industrial countries - but one-tenth the productivity. A time bomb that cannot be defused is ticking in the Muslim world.

Imminent population collapse makes radical Islam more dangerous, not less so. For in their despair, radical Muslims who can already taste the ruin of their culture believe that they have nothing to lose.

Political science is at a loss in the face of demographic decline and its consequences. The wasting away of nations is an insoluble conundrum for modern political theory, which is based on the principle of rational self-interest. At the threshold of extinction, the political scientists' clever models break down. We "do not negotiate with terrorists". But a bank robber holding hostages is a terrorist of sorts, and the police negotiate with such miscreants as a matter of course. And what if the bank robber knows he will die of an incurable disease in a matter of weeks? That changes the negotiation. The simple truth - call it Spengler's Universal Law #1 - is that a man, or a nation, at the brink of death does not have a "rational self-interest".

Conventional geopolitical theory, which is dominated by material factors such as territory, natural resources, and command of technology, does not address how peoples will behave under existential threat. Geopolitical models fail to resemble the real world in which we live, where the crucial issue is the willingness or unwillingness of a people inhabiting a given territory to bring a new generation into the world.

Population decline, the decisive issue of the 21st century, will cause violent upheavals in the world order. Countries facing fertility dearth, such as Iran, are responding with aggression. Nations confronting their own mortality may choose to go down in a blaze of glory. Conflicts may be prolonged beyond the point at which there is any rational hope of achieving strategic aims - until all who wish to fight to the death have taken the opportunity to do so.
Analysis of national interests cannot explain why some nations go to war without hope of winning, or why other nations will not fight even to defend their vital interests. It cannot explain the historical fact that peoples fight harder, accepting a higher level of sacrifice in blood and treasure, when all hope of victory is past. Conventional geopolitical analysis cannot explain the causes of population collapse either, any more than its consequences - for example, under what circumstances strategic reverses (notably the two world wars of the past century) may crush the aspirations of the losers and result in apathy and demographic death.

Why do individuals, groups, and nations act irrationally, often at the risk of self-destruction? Part of the problem lies in our definition of rationality. Under normal circumstances we think it irrational for a middle-aged man to cash in his insurance policy and spend money as fast as possible. But if the person in question has a terminal illness and no heirs, we think it quite reasonable to spend it all quickly, like Otto Kringelein in Grand Hotel or his updated equivalent, Queen Latifah's character in The Last Holiday. And if we know that we shall presently die of rabies, what is to prevent us from biting everyone we dislike? Countries sometimes suffer the equivalent of terminal illness. What seems suicidal to Americans may appear rational to an existentially challenged people confronting its imminent mortality.

Self-immolation of endangered peoples is sadly common. Stone-age cultures often disintegrate upon contact with the outside world. Their culture breaks down, and suicides skyrocket. An Australian researcher writes about "suicide contagion or cluster deaths - the phenomenon of indigenous people, particularly men from the same community taking their own lives at an alarming rate". [3] Canada's Aboriginal Health Foundation reports, "The overall suicide rate among First Nation communities is about twice that of the total Canadian population; the rate among Inuit is still higher - 6 to 11 times higher than the general population." [4] Suicide is epidemic among Amazon tribes. The London Telegraph reported on November 19, 2000,
The largest tribe of Amazonian Indians, the 27,000-strong Guarani, are being devastated by a wave of suicides among their children, triggered by their coming into contact with the modern world. Once unheard of among Amazonian Indians, suicide is ravaging the Guarani, who live in the southwest of Brazil, an area that now has one of the highest suicide rates in the world. More than 280 Guarani have taken their own lives in the past 10 years, including 26 children under the age of 14 who have poisoned or hanged themselves. Alcoholism has become widespread, as has the desire to own radios, television sets and denim jeans, bringing an awareness of their poverty. Community structures and family unity have broken down and sacred rituals come to a halt.
Of the more than 6,000 languages now spoken on the planet, two become extinct each week, and by most estimates half will fall silent by the end of the century. [5] A United Nations report claims that nine-tenths of the languages now spoken will become extinct in the next hundred years. [6] Most endangered languages have a very small number of speakers. Perhaps a thousand distinct languages are spoken in Papua New Guinea, many by tribes of only a few hundred members. Others are disappearing tribal languages spoken in the Amazon rainforest, the Andes Mountains, or the Siberian taiga. Eighteen languages have only one surviving speaker. It is painful to imagine how the world must look to these individuals. They are orphaned in eternity, wiped clean of memory, their existence reduced to the exigency of the moment.

But are these dying remnants of primitive societies really so different from the rest of us? Mortality stalks most of the peoples of the world - not this year or next, but within the horizon of human reckoning. A good deal of the world seems to have lost the taste for life. Fertility has fallen so far in parts of the industrial world that languages such as Ukrainian and Estonian will be endangered within a century and German, Japanese, and Italian within two. The repudiation of life among advanced countries living in prosperity and peace has no historical precedent, except perhaps in the anomie of Greece in its post-Alexandrian decline and Rome during the first centuries of the Common Era. But Greece fell to Rome, and Rome to the barbarians. In the past, nations that foresaw their own demise fell to the Four Horsemen of the Apocalypse: War, Plague, Famine, and Death. Riding point for the old quartet in today's more civilized world is a Fifth Horseman: loss of faith. Today's cultures are dying of apathy, not by the swords of their enemies.

The Arab suicide bomber is the spiritual cousin of the despondent aboriginal of the Amazon rain forest. And European apathy is the opposite side of the coin of Islamic extremism. Both apathetic Europeans and radical Muslims have lost their connection to the past and their confidence in the future. There is not a great deal of daylight between European resignation to cultural extinction at the hundred-year horizon, and the Islamist boast, "You love life, and we love death." Which brings us to Spengler's Universal Law #2: When the nations of the world see their demise not as a distant prospect over the horizon, but as a foreseeable outcome, they perish of despair. Like the terminally ill patient cashing in his insurance money, a culture that anticipates its own extinction has a different standard of rationality than does conventional political science.

Game theorists have tried to make political strategy into a quantitative discipline. Players with a long-term interest think differently than players with a short-term interest. A swindler who has no expectation of encountering his victim again will take what he can and run; a merchant who wants repeat customers will act honestly as a matter of self-interest. By the same token, the game theorists contend, nations learn that it is in their interest to act as responsible members of the world community, for the long-run advantages of good behavior outweigh the passing benefits of predation.

But what if there isn't any long run - not, at least, for some of the "players" in the "game"? The trouble with applying game theory to the problem of existential war is that the players may not expect to be there for the nth iteration of the game. Entire peoples sometimes find themselves faced with probable extinction, so that no peaceful solution appears to be a solution for them.
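The point can be made concrete with a toy model. The sketch below (my illustration, not the author's) uses the textbook repeated prisoner's dilemma: a player facing a grim-trigger opponent compares defecting once against cooperating indefinitely, where the probability delta that the game continues for another round stands in for expecting to be around for the nth iteration. The payoff values are the conventional ones and are assumptions for illustration only:

```python
# Toy repeated prisoner's dilemma (illustration only, not the author's model).
# A player facing a grim-trigger opponent weighs defecting once against
# cooperating indefinitely; delta is the probability the game continues
# to another round.

T, R, P = 5.0, 3.0, 1.0  # temptation, reward, punishment payoffs (assumed)

def value_of_cooperating(delta: float) -> float:
    """Expected payoff of cooperating forever: R every round, discounted by delta."""
    return R / (1.0 - delta)

def value_of_defecting(delta: float) -> float:
    """Expected payoff of defecting now (T) and being punished (P) ever after."""
    return T + delta * P / (1.0 - delta)

if __name__ == "__main__":
    for delta in (0.2, 0.5, 0.9):
        coop, defect = value_of_cooperating(delta), value_of_defecting(delta)
        choice = "cooperate" if coop >= defect else "defect"
        print(f"continuation probability {delta:.1f}: "
              f"cooperate={coop:.1f}, defect={defect:.1f} -> {choice}")
```

When delta is small - when there is no long run to speak of - defection is the "rational" choice, which is precisely the breakdown described above.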

Situations of this sort have arisen frequently in history, but never as frequently as today, when so many of the world's cultures are not expected to survive the next two centuries. A people facing cultural extinction may well choose war, if war offers even a slim chance of survival. That is just how radical Islamists view the predicament of traditional Muslim society in the face of modernity. The Islamists fear that if they fail, their religion and culture will disappear into the maelstrom of the modern world. Many of them would rather die fighting.

Paradoxically it is possible for wars of annihilation to stem from rational choice, for the range of choices always must be bounded by the supposition that the chooser will continue to exist. Existential criteria, that is, trump the ordinary calculus of success and failure. If one or more of the parties knows that peace implies the end of its existence, it has no motive to return to peace. That is how the radical Islamists of Hamas view the future of Muslim society. A wealthy and successful Jewish state next to a poor and dysfunctional Palestinian state may imply the end of the moral authority of Islam, and some Palestinians would rather fight to the death than embrace such an outcome. Rather than consign their children to the Western milieu of personal freedom and sexual license, radical Muslims will fight to the death.

But why are Muslims - and Europeans, and Japanese - living under a societal death sentence? Why are populations collapsing in the modern world? Demographers have identified several different factors associated with population decline: urbanization, education and literacy, the modernization of traditional societies. Children in traditional society had an economic value, as agricultural labor and as providers for elderly parents; urbanization and pension systems turned children into a cost rather than a source of income. And female literacy is a powerful predictor of population decline among the world's countries. Mainly poor and illiterate women in Mali and Niger bear eight children in a lifetime, while literate and affluent women in the industrial world bear one or two.

But what determines whether it is one child or two? Children also have a spiritual value. That is why the degree of religious faith explains a great deal of the variation in population growth rates among the countries of the world. The industrial world's lowest fertility rates are encountered among the nations of Eastern Europe where atheism was the official ideology for generations. The highest fertility rates are found in countries with a high degree of religious faith, namely the United States and Israel. And demographers have identified religion as a crucial factor in the differences among populations within countries. When faith goes, fertility vanishes, too. The death-spiral of birth rates in most of the industrial world has forced demographers to think in terms of faith. Dozens of new studies document the link between religious belief and fertility.

But why do some religions seem to provide better protection against the sterilizing effects of modernity than others? The fastest demographic decline ever registered in recorded history is taking place today in Muslim countries; demographic winter is descending fastest in the fifth of the world where religion most appears to dominate. And even more puzzling: why does one religion (Christianity) seem to inoculate a people against demographic decline in one place (America) but not in another (Europe)? In many parts of the world, what once looked like an indestructible rock of faith has melted in the hot light of modernity. In others, modernity has only added compost for the growth of faith. Apparently some kinds of faith will survive in the modern world, and others will fail.

Strategic analysts and politicians are poorly equipped to understand these new and disturbing circumstances, with their overarching implications for political strategy and economics. To make sense of the world today we must do better than secular political science, which pigeon-holes faith as one more belief-structure among the other belief-structures in its collection of specimens.

Our political science is uniquely ill-equipped to make sense of a global crisis whose ultimate cause is spiritual. But it was not always so. From the advent of Christianity to the seventeenth-century Enlightenment, the West saw politics through the lens of faith. St Augustine's fifth-century treatise The City of God looked through the state to the underlying civil society, and understood that civil society as a congregation - a body bound together by common loves, as opposed to Cicero's state founded only on common interests. (In the concluding chapter, we will consider Augustine's view as a lodestar for an American foreign policy that realistically addresses the threats created by the imminent demographic collapse of nations.)

We might call Augustine's view "theopolitics." A millennium later, Niccolo Machiavelli and Thomas Hobbes changed the subject, to the individual's desire for power, wealth, and personal survival. Hobbes, the 17th-century grandfather of modern political science, introduced a radically truncated anthropology, centered on the individual's struggle for survival. The state, he argued, was a compact among individuals whose survival prospects were poor in a "state of nature"; thus they ceded their individual rights to a sovereign in return for protection. A century later Montesquieu added differences in climate, terrain, and resources to the mix. The modern view of atomized man motivated only by the pursuit of material advantage is loosely known as "geopolitics".

What prompted this revolution in political thinking that has left modern political theory without the tools to understand the causes and implications of the current demographic collapse? Undoubtedly, the terrible religious wars of the 16th and 17th centuries poisoned the idea of faith-based politics. Europe fought dynastic and political wars under the false flag of religion until the Thirty Years' War of 1618-1648 destroyed almost half the population of Central Europe. The Peace of Westphalia that ended this fearful war forever buried the political model that Christendom had advanced since Augustine: a universal Christian empire that would keep the peace and limit the arbitrary power of kings. Things are not as simple as they seem in the standard account of the violence that soured the West on theopolitics. For - as we shall see - the nation-states that opposed universal empire were founded on a contending kind of faith, a fanatical form of national self-worship whose internal logic was not played out until world war and genocide in the 20th century, and the collapse of faith and fertility in the 21st. But when Thomas Hobbes published his great book Leviathan three years after the end of the Thirty Years' War, it seemed credible that "the papacy is no other than the ghost of the deceased Roman Empire, sitting crowned upon the grave thereof".

One powerful attraction of the Hobbesian revolution in political thinking was the power it promised to intellectuals. If politics reduces to the individual and his material concerns, then it is possible to manipulate the individual through the alteration of his material circumstances. A clever elite could fix all the problems of the world. Immanuel Kant boasted in 1795 that he could write a constitution for a race of devils, "if only they be rational." Europe ignored him and proceeded to destroy itself in the Napoleonic Wars and the two world wars of the past century. Today, as in Kant's time, the great frustration in world affairs is the refusal of some players to act rationally. Something was gained, but much more was lost, in the 17th-century Hobbesian revolution in political thought. To view human beings as creatures concerned solely with power, wealth, and security is an impoverished anthropology. The missing tools - the ones Machiavelli and Hobbes removed from the toolbox - are exactly the ones we need to understand and cope with the dangers inherent in the wholesale collapse of cultures that faces us today.

Secularism in all its forms fails to address the most fundamental human need. Sociologist Eric Kaufmann, who himself bewails the fecundity of the religious and the infertility of the secular, puts it this way: "The weakest link in the secular account of human nature is that it fails to account for people's powerful desire to seek immortality for themselves and their loved ones." Traditional society had to confront infant mortality as well as death by hunger, disease, and war. That shouldn't be too troubling, however: "We may not be able to duck death completely, but it becomes so infrequent that we can easily forget about it."

Has death really become infrequent? Call it Spengler's Universal Law #3: Contrary to what you may have heard from the sociologists, the human mortality rate is still 100%.

We can stick our fingers in our ears and chant "I can't hear you!" only so long in the face of mortality. Religion offers the individual the means to transcend mortality, to survive the fragility of a mortal existence. Homo religiosus confronts death in order to triumph over it. But the world's major religions are distinguished by the different ways in which they confront mortality. We cannot make sense of the role of religion in demographic, economic, and political developments - and of the different roles of different religions in different places and times - without understanding the existential experience of the religious individual. It is challenging to recount this experience to a secular analyst; it is somewhat like describing being in love to someone who never has been in love. One doesn't have to be religious to understand religion, but it helps.

But without understanding humankind's confrontation with its own mortality in religion, political science is confined to analysis on the basis of the survival instinct - which suddenly seems to be failing whole peoples - and rational self-interest - at a time when nations and peoples are not behaving in a conspicuously rational manner.

In the final months of a previous irruption of irrationality - the First World War - a young German soldier at a remote post in Macedonia jotted down his thoughts on army postcards. A small, bespectacled man with a thin mustache, he had been groomed to be one of the mandarins of the German academy, a philosopher whose function was to reinforce the country's confidence in its culture. Just before the war began he had returned to Judaism, after a near conversion to Christianity. As the casualty lists rose in inverse proportion to the hope of victory, the consolations of philosophy seemed hollow. Philosophers, he wrote, were like small children who clapped their hands over their ears and shouted "I can't hear you!" before the fear of death. "From death - from the fear of death - comes all of our knowledge of the All," the soldier began. It was not the individual's fear of death that fascinated the young soldier, but the way entire nations respond to the fear of their collective death. He wrote:
Just as every individual must reckon with his eventual death, the peoples of the world foresee their eventual extinction, be it however distant in time. Indeed, the love of the peoples for their own nationhood is sweet and pregnant with the presentiment of death. Love is only surpassing sweet when it is directed towards a mortal object, and the secret of this ultimate sweetness only is defined by the bitterness of death. Thus the peoples of the world foresee a time when their land with its rivers and mountains still lies under heaven as it does today, but other people dwell there; when their language is entombed in books, and their laws and customs have lost their living power.
The soldier was Franz Rosenzweig, and the postcards would become his great book The Star of Redemption. Awareness of death defines the human condition, so that human beings cannot bear their own mortality without the hope of immortality. And our sense of immortality is social. The culture of a community is what unites the dead with those yet to be born.

The death of a culture is an uncanny event, for it erases not only the future but also the past, that is, the hopes and fears, the sweat and sacrifice of countless generations whose lives no longer can be remembered, for no living being will sing their songs or tell their stories.

The first surviving work of written literature - the Epic of Gilgamesh, written perhaps 3,700 years ago - recounts the Sumerian king's quest for immortality. After a journey beset by hardship and peril, Gilgamesh is told: "The life that you are seeking you will never find. When the gods created man they allotted to him death, but life they retained in their own keeping."

In the pre-Christian world, Rosenzweig points out, the peoples of the world anticipated their eventual extinction. Every nation's love of itself is pregnant with the presentiment of death, for each tribe knows that its time on earth is limited. Some fight to the death. Others cease to breed. Some do both.

Christianity first taught them the Jewish promise of eternal life. To talk of "man's search for meaning" trivializes the problem. What humankind requires is meaning that transcends death. This need explains a great deal of human behavior that otherwise might seem irrational. One does not have to be religious to grasp this fundamental fact of the human condition, but religion helps, because faith makes explicit the human need to transcend mortality. Secular rationalists have difficulty identifying with the motives of existentially challenged peoples - not so much because they lack faith, but because they entertain faith in rationality itself, and believe with the enthusiasm of the convert in the ability of reason to explain all of human experience.

But not only the religious need the hope of immortality. The most atheistic communist hopes that his memory will live on in the heart of a grateful proletariat. Even if we do not believe that our soul will have a place in heaven or that we shall be resurrected in the flesh, we nonetheless believe that something of ourselves will remain, in the form of progeny, memories, or consequences of actions, and that this something will persist as long as people who are like us continue to inhabit the Earth. Humanity perseveres in the consolation that some immortal part of us transcends our death. Sadly, our hope for immortality in the form of remembrance is a fragile and often a vain one. Immortality of this sort depends upon the survival of people who are like us - that is, upon the continuity of our culture. If you truly believe in a supernatural afterlife, to be sure, nothing can really disappoint you. But there is no consolation in being the last Mohican.

And that's because of Spengler's Universal Law #4: The history of the world is the history of humankind's search for immortality. When nations go willingly into that dark night, what should we conclude about human nature?

Human beings may not be the only animals who are sentient of death. (Elephants evidently grieve for their dead, and dogs mourn their dead masters.) But we are the only animals whose sense of continuity depends on culture as much as it does upon genes. Unlike men and women, healthy animals universally show an instinct for self-preservation and the propagation of their species. We do not observe cats deciding not to have kittens the better to pursue their careers as mousers.

I do not mean to suggest that human beings of different cultures belong to different species. On the contrary, the child of a Kalahari Bushman will thrive if raised in the family of a Glaswegian ship's engineer. (As Jared Diamond, the author of Collapse: How Societies Choose to Fail or Succeed, observes, it is easier to be stupid in a modern welfare state than in a hunter-gatherer tribe in New Guinea.)

But culture performs a role among human beings similar to the role species plays for animals. An adult Bushman would never fully adapt to industrial society, any more than a Glaswegian ship's engineer would last a fortnight in the Kalahari. Insofar as an animal can be said to experience an impulse toward the future beyond his own life, that impulse is fulfilled by the propagation of the species. But individual human existence looks forward to the continuation of the culture that nurtures, sustains, and transmits our contribution to future generations. Culture is the stuff out of which we weave the hope of immortality - not merely through genetic transmission but through inter-generational communication.

In the absence of religious faith, if our culture dies, our hope of transcending mere physical existence dies with it. Individuals trapped in a dying culture live in a twilight world. They embrace death through infertility, concupiscence, and war. A dog will crawl into a hole to die. The members of sick cultures do not do anything quite so dramatic, but they cease to have children, dull their senses with alcohol and drugs, become despondent, and too frequently do away with themselves. Or they may make war on the perceived source of their humiliation.

The truth is - to invoke Spengler's Universal Law #5 - Humankind cannot bear mortality without the hope of immortality. When men and women lose the sacred, they lose the desire to live. Despairing of immortality, we stand astonished before the one fact we know with certainty - that someday we must die. This is as true of modern Homo sapiens sapiens as it was of our remotest ancestors. Even Neanderthal burial sites have been unearthed with grave gifts. "Man does not live by bread alone," Moses said on the east bank of the Jordan River. The affluent peoples of the world have all the bread they need, but have lost the appetite for life.

Americans are ill-equipped to empathize with the existential fears of other nations. America is the great exception to the demographic collapse sweeping the modern world. As an immigrant nation we regenerate ourselves. We bear no baggage from a tragic past. The glue that holds us together is a common concept of justice and opportunity. The United States is what John Courtney Murray called "a propositional nation". In our benevolence and optimism we assume that all peoples are like us, forgetting that we are, or descend from, people who chose to abandon the tragic fate of their own nations at the further shore and selected themselves into the American nation. But we have learned that our capacity to influence events in the rest of the world, even in the absence of a competing superpower, is limited, and that the dissipation of our resources can be deadly for us. Our strategic thinking suffers from a failure to take into account the existential problems of other nations. We think in the narrow categories of geopolitics, but we need to study theopolitics - the powerful impact of religious beliefs and aspirations on world events. Even we exceptional Americans must come to grips with the collapse of faith and fertility - especially in the rapidly and dangerously declining Muslim world - in order to prevail in a world in which tragic outcomes are more common than happy endings.

Notes
1. These ratios are based on the Elderly Dependency Ratio calculated by the model of the United Nations World Population Prospects 2010 revision, assuming constant fertility. The model is available at http://esa.un.org/unpd/wpp/unpp/panel_indicators.htm
2. Jared Diamond's 2005 book Collapse: How Societies Choose to Fail or Succeed blames exhaustion of resources and environmental damage. The extinct people of Easter Island and the pre-Columbian Mayans chopped down too many trees, Diamond observes, and thus he argues that environmental damage is the greatest threat to our civilization. (Never mind that America has expanded its forests by 20 million acres during the past quarter century: disaster stories of this sort resonate with a public fed on media reports of global warming and apocalyptic disaster movies.) Easter Island, though, is something of a rarity in world history. The cultures about which we know the most - and from which our own civilization descends - failed from a different cause. Classical Greece and Rome died for the same reason that Western Europe, Japan, and other parts of the modern world are dying today: they lost their motivation to bring children into the world. The infertile Greeks were conquered by Rome's army and the inexhaustible manpower of the farms of the Italian peninsula; as the Romans later grew childless, they were overrun by a small force of barbarian invaders.