
Saturday 2 September 2017

Holy men — theirs and ours

Pervez Hoodbhoy in The Dawn
INDIA and Pakistan have more influential holy men per square mile than anyone has ever counted. Some are just rich, others both powerful and rich. Once upon a time their followers were only the poor, superstitious and illiterate. But after the massive resurgence of religion in both countries this base has expanded to include politicians, film and cricket stars, and college-educated people who speak English and drive posh cars.
It is rare for an Indian holy man to bite the dust, but one just did. The self-styled messenger of God, Ram Rahim Singh of Dera Sacha Sauda, was convicted of two rapes by an Indian court. He is also accused of 52 other rapes, two murders, and storing in his refrigerators 400 pairs of testicles, cut from 400 devotees on the promise of nirvana. An avid Modi supporter, Singh travelled in entourages of 100-plus cars and claims 50-60 million followers. Vote-hungry politicians have touched his feet and done deals with him. After his conviction his crazed followers rioted, convinced of a conspiracy against their God. So far 38 people have died, hundreds have been injured, and cars and public buildings have been set on fire.
But although Singh is one of India’s bigger holy men, he is still small, dispensable fry. The really powerful ones are those who have learned the value of using religion in national politics. Today India is living out the extremist Hindutva ideology of Golwalkar and Savarkar with a head of government who is unabashedly committed to Hindu supremacy. This holy man’s evident role in the communal riots of Gujarat in 2002 led to his being banned from entering the US in 2005. However, no Indian court could find any wrongdoing committed by the then chief minister, now prime minister.
Pakistan’s holy men also come in two sorts. The pir resembles the Hindu and Sikh spiritual guru in some respects. He hands out amulets, prescriptions, and blessings — usually for a hefty price — to credulous mureeds (followers). Pirs allegedly have magical healing powers. For example, Benazir Bhutto was a mureed of the prescient Pir Pinjar, a man who claimed to cure terminally ill patients by spraying water on them with a garden hose. Her husband, ex-president Asif Ali Zardari, had a black goat sacrificed daily on the advice of his pir. But educated Muslims increasingly spurn such practices and the pir is losing out.
The second kind of Pakistani holy man — the mullah — has had a very different trajectory. Once a poor and largely harmless cleric, he was the butt of many a joke. Sought only for funerals and Friday prayers, he eked out an existence by teaching the Quran to children. Allama Iqbal heaped scorn upon him: Teri namaz main baqi jalal hai na jamal (The prayers you lead are empty of grace and grandeur), Teri azaan main nahin meri sehr ka payam (Your azan is cold and uninspiring).
But the Soviet invasion of Afghanistan in 1979 changed the mullah’s fortunes. Indispensable to the US-Pakistan-Saudi grand jihad alliance, this once pathetic figure could now be seen driven around in an SUV, commanding a militia, or screaming through multiple turbo-charged loudspeakers. Some eventually became successful land-grabbers, wheeler-dealers, and shady entrepreneurs. Few Pakistanis will fail to recognise the identities of Maulana Diesel, Maulana Whiskey, and Mullah Disco.
Serious conflict between mullah and state came after 9/11. Gen Musharraf’s apparent surrender to America enraged the mullah, who resolved to seize control of the Pakistani state. Ensconced in the heart of Pakistan’s capital, armed vigilante groups from Islamabad’s Red Mosque and Jamia Hafsa took over a government building in January 2007. They kidnapped ordinary citizens and policemen, and repeated the demands of tribal militants fighting the Pakistan Army. From their FM station they broadcast a message: “We have weapons, grenades and we are expert in manufacturing bombs. We are not afraid of death.” Islamabad turned into a war zone and, by the time the insurrection was finally crushed, 150-200 lives had been lost.
Pakistani courts have failed to convict our holy men (as well as women). For example, Maulana Aziz and Umme Hassan (his wife, who headed Jamia Hafsa) were exonerated of any wrongdoing and are today going about their normal business. The court had ruled that possession of heavy weaponry by the accused could not be proven. It dismissed TV footage that showed Aziz’s students with gas masks firing Kalashnikovs. Weapons seized by the army and placed in a police armoury disappeared mysteriously. Although 10 of Pakistan’s crack SSG commandos died in the crackdown, the army — known for quick action in Balochistan — also did not pursue the case.
Why have Indians and Pakistanis become so tolerant — nay, supportive — of holy men, whether of the spiritual or political kind? Why are those who aspire to power so successful in using religion to motivate their electorates? After all, this is the 21st century, not the 12th.
The culprit could be modernity. Technology has created enormous psychological distress by doing away with traditional ways of living and bringing in a new, uncertain and ever-changing world. Older forms of associations such as the extended family and village community, together with their values, are disappearing. Cramped living conditions, pollution, ugliness all around, and job insecurities are a fact of life for most urban dwellers.
There is enormous nostalgia for the time when the world was supposedly perfect. This is why people looking for simple answers to today’s complex questions eagerly buy the wares peddled by holy men. Just as Hindutva encourages Indian Hindus to dream of the ‘authentic’ India, Muslim clerics tell their followers to dream of reclaiming Islam’s ancient glories.
But this is clutching at straws. It gets far worse when religion is infused into politics. This produces a highly toxic, explosive mix as large masses of people blindly and unquestioningly follow holy men. Instead of dividing people still further, whether inside or outside national boundaries, South Asian states should aspire towards becoming part of a cosmopolitan world society removed from the prejudices of religion, caste and race.

Tuesday 11 July 2017

How economics became a religion

John Rapley in The Guardian



Although Britain has an established church, few of us today pay it much mind. We follow an even more powerful religion, around which we have oriented our lives: economics. Think about it. Economics offers a comprehensive doctrine with a moral code promising adherents salvation in this world; an ideology so compelling that the faithful remake whole societies to conform to its demands. It has its gnostics, mystics and magicians who conjure money out of thin air, using spells such as “derivative” or “structured investment vehicle”. And, like the old religions it has displaced, it has its prophets, reformists, moralists and, above all, its high priests who uphold orthodoxy in the face of heresy.

Over time, successive economists slid into the role we had removed from the churchmen: giving us guidance on how to reach a promised land of material abundance and endless contentment. For a long time, they seemed to deliver on that promise, succeeding in a way few other religions had ever done, our incomes rising thousands of times over and delivering a cornucopia bursting with new inventions, cures and delights.

This was our heaven, and richly did we reward the economic priesthood, with status, wealth and power to shape our societies according to their vision. At the end of the 20th century, amid an economic boom that saw the western economies become richer than humanity had ever known, economics seemed to have conquered the globe. With nearly every country on the planet adhering to the same free-market playbook, and with university students flocking to do degrees in the subject, economics seemed to be attaining the goal that had eluded every other religious doctrine in history: converting the entire planet to its creed.

Yet if history teaches anything, it’s that whenever economists feel certain that they have found the holy grail of endless peace and prosperity, the end of the present regime is nigh. On the eve of the 1929 Wall Street crash, the American economist Irving Fisher advised people to go out and buy shares; in the 1960s, Keynesian economists said there would never be another recession because they had perfected the tools of demand management.

The 2008 crash was no different. Five years earlier, on 4 January 2003, the Nobel laureate Robert Lucas had delivered a triumphal presidential address to the American Economic Association. Reminding his colleagues that macroeconomics had been born in the depression precisely to try to prevent another such disaster ever recurring, he declared that he and his colleagues had reached their own end of history: “Macroeconomics in this original sense has succeeded,” he instructed the conclave. “Its central problem of depression prevention has been solved.”

No sooner do we persuade ourselves that the economic priesthood has finally broken the old curse than it comes back to haunt us all: pride always goes before a fall. Since the crash of 2008, most of us have watched our living standards decline. Meanwhile, the priesthood seemed to withdraw to the cloisters, bickering over who got it wrong. Not surprisingly, our faith in the “experts” has dissipated.

Hubris, never a particularly good thing, can be especially dangerous in economics, because its scholars don’t just observe the laws of nature; they help make them. If the government, guided by its priesthood, changes the incentive-structure of society to align with the assumption that people behave selfishly, for instance, then lo and behold, people will start to do just that. They are rewarded for doing so and penalised for doing otherwise. If you are educated to believe greed is good, then you will be more likely to live accordingly.

The hubris in economics came not from a moral failing among economists, but from a false conviction: the belief that theirs was a science. It neither is nor can be one, and has always operated more like a church. You just have to look at its history to realise that.

The American Economic Association, to which Robert Lucas gave his address, was created in 1885, just when economics was starting to define itself as a distinct discipline. At its first meeting, the association’s founders proposed a platform that declared: “The conflict of labour and capital has brought to the front a vast number of social problems whose solution is impossible without the united efforts of church, state and science.” It would be a long path from that beginning to the market evangelism of recent decades.

Yet even at that time, such social activism provoked controversy. One of the AEA’s founders, Henry Carter Adams, subsequently delivered an address at Cornell University in which he defended free speech for radicals and accused industrialists of stoking xenophobia to distract workers from their mistreatment. Unknown to him, the New York lumber king and Cornell benefactor Henry Sage was in the audience. As soon as the lecture was done, Sage stormed into the university president’s office and insisted: “This man must go; he is sapping the foundations of our society.” When Adams’s tenure was subsequently blocked, he agreed to moderate his views. Accordingly, the final draft of the AEA platform expunged the reference to laissez-faire economics as being “unsafe in politics and unsound in morals”.

 
‘Economics has always operated more like a church’ … Trinity Church seen from Wall Street. Photograph: Alamy Stock Photo

So was set a pattern that has persisted to this day. Powerful political interests – which historically have included not only rich industrialists, but electorates as well – helped to shape the canon of economics, which was then enforced by its scholarly community.

Once a principle is established as orthodox, its observance is enforced in much the same way that a religious doctrine maintains its integrity: by repressing or simply eschewing heresies. In Purity and Danger, the anthropologist Mary Douglas observed the way taboos functioned to help humans impose order on a seemingly disordered, chaotic world. The premises of conventional economics haven’t functioned all that differently. Robert Lucas once noted approvingly that by the late 20th century, economics had so effectively purged itself of Keynesianism that “the audience start(ed) to whisper and giggle to one another” when anyone expressed a Keynesian idea at a seminar. Such responses served to remind practitioners of the taboos of economics: a gentle nudge to a young academic that such shibboleths might not sound so good before a tenure committee. This preoccupation with order and coherence may be less a function of the method than of its practitioners. Studies of personality traits common to various disciplines have discovered that economics, like engineering, tends to attract people with an unusually strong preference for order, and a distaste for ambiguity.

The irony is that, in its determination to make itself a science that can reach hard and fast conclusions, economics has had to dispense with scientific method at times. For starters, it rests on a set of premises about the world not as it is, but as economists would like it to be. Just as any religious service includes a profession of faith, membership in the priesthood of economics entails certain core convictions about human nature. Among other things, most economists believe that we humans are self-interested, rational, essentially individualistic, and prefer more money to less. These articles of faith are taken as self-evident. Back in the 1930s, the great economist Lionel Robbins described his profession in a way that has stood ever since as a cardinal rule for millions of economists. The field’s basic premises came from “deduction from simple assumptions reflecting very elementary facts of general experience” and as such were “as universal as the laws of mathematics or mechanics, and as little capable of ‘suspension’”.

Deducing laws from premises deemed eternal and beyond question is a time-honoured method. For thousands of years, monks in medieval monasteries built a vast corpus of scholarship doing just that, using a method perfected by Thomas Aquinas known as scholasticism. However, this is not the method used by scientists, who tend to require assumptions to be tested empirically before a theory can be built out of them.

But, economists will maintain, this is precisely what they themselves do – what sets them apart from the monks is that they must still test their hypotheses against the evidence. Well, yes, but this statement is actually more problematic than many mainstream economists may realise. Physicists resolve their debates by looking at the data, upon which they by and large agree. The data used by economists, however, is much more disputed. When, for example, Robert Lucas insisted that Eugene Fama’s efficient-markets hypothesis – which maintains that since a free market collates all available information to traders, the prices it yields can never be wrong – held true despite “a flood of criticism”, he did so with as much conviction and supporting evidence as his fellow economist Robert Shiller had mustered in rejecting the hypothesis. When the Swedish central bank had to decide who would win the 2013 Nobel prize in economics, it was torn between Shiller’s claim that markets frequently got the price wrong and Fama’s insistence that markets always got the price right. Thus it opted to split the difference and gave both men the medal – a bit of Solomonic wisdom that would have elicited howls of laughter had it been a science prize. In economic theory, very often, you believe what you want to believe – and as with any act of faith, your choice of heads or tails will as likely reflect sentimental predisposition as scientific assessment.

It’s no mystery why the data used by economists and other social scientists so rarely throws up incontestable answers: it is human data. Unlike people, subatomic particles don’t lie on opinion surveys or change their minds about things. Mindful of that difference, at his own presidential address to the American Economic Association nearly a half-century ago, another Nobel laureate, Wassily Leontief, struck a modest tone. He reminded his audience that the data used by economists differed greatly from that used by physicists or biologists. For the latter, he cautioned, “the magnitude of most parameters is practically constant”, whereas the observations in economics were constantly changing. Data sets had to be regularly updated to remain useful. Some data was simply bad. Collecting and analysing the data requires civil servants with a high degree of skill and a good deal of time, which less economically developed countries may not have in abundance. So, for example, in 2010 alone, Ghana’s government – which probably has one of the better data-gathering capacities in Africa – recalculated its economic output, revising it upward by 60%. Testing your hypothesis before and after that kind of revision would lead to entirely different results.

 
‘The data used by economists rarely throws up incontestable answers’ … traders at the New York Stock Exchange in October 2008. Photograph: Spencer Platt/Getty Images

Leontief wanted economists to spend more time getting to know their data, and less time in mathematical modelling. However, as he ruefully admitted, the trend was already going in the opposite direction. Today, the economist who wanders into a village to get a deeper sense of what the data reveals is a rare creature. Once an economic model is ready to be tested, number-crunching ends up being done largely at computers plugged into large databases. It’s not a method that fully satisfies a sceptic. For, just as you can find a quotation in the Bible that will justify almost any behaviour, you can find human data to support almost any statement you want to make about the way the world works.

That’s why ideas in economics can go in and out of fashion. The progress of science is generally linear. As new research confirms or replaces existing theories, one generation builds upon the next. Economics, however, moves in cycles. A given doctrine can rise, fall and then later rise again. That’s because economists don’t confirm their theories in quite the same way physicists do, by just looking at the evidence. Instead, much as happens with preachers who gather a congregation, a school rises by building a following – among both politicians and the wider public.

For example, Milton Friedman was one of the most influential economists of the late 20th century. But he had been around for decades before he got much of a hearing. He might well have remained a marginal figure had it not been that politicians such as Margaret Thatcher and Ronald Reagan were sold on his belief in the virtue of a free market. They sold that idea to the public, got elected, then remade society according to those designs. An economist who gets a following gets a pulpit. Scientists, in contrast, might appeal to public opinion to boost their careers or attract research funds, but outside of the pseudo-sciences they don’t win support for their theories this way.

However, if you think describing economics as a religion debunks it, you’re wrong. We need economics. It can be – it has been – a force for tremendous good. But only if we keep its purpose in mind, and always remember what it can and can’t do.

The Irish have been known to describe their notionally Catholic land as one where a thin Christian veneer was painted over an ancient paganism. The same might be said of our own adherence to today’s neoliberal orthodoxy, which stresses individual liberty, limited government and the free market. Despite outward observance of a well-entrenched doctrine, we haven’t fully transformed into the economic animals we are meant to be. Like the Christian who attends church but doesn’t always keep the commandments, we behave as economic theory predicts only when it suits us. Contrary to the tenets of orthodox economists, contemporary research suggests that, rather than seeking always to maximise our personal gain, humans still remain reasonably altruistic and selfless. Nor is it clear that the endless accumulation of wealth always makes us happier. And when we do make decisions, especially those to do with matters of principle, we seem not to engage in the sort of rational “utility-maximizing” calculus that orthodox economic models take as a given. The truth is, in much of our daily life we don’t fit the model all that well.




For decades, neoliberal evangelists replied to such objections by saying it was incumbent on us all to adapt to the model, which was held to be immutable – one recalls Bill Clinton’s depiction of neoliberal globalisation, for instance, as a “force of nature”. And yet, in the wake of the 2008 financial crisis and the consequent recession, there has been a turn against globalisation across much of the west. More broadly, there has been a wide repudiation of the “experts”, most notably in the 2016 US election and Brexit referendum.

It would be tempting for anyone who belongs to the “expert” class, and to the priesthood of economics, to dismiss such behaviour as a clash between faith and facts, in which the facts are bound to win in the end. In truth, the clash was between two rival faiths – in effect, two distinct moral tales. So enamoured had the so-called experts become with their scientific authority that they blinded themselves to the fact that their own narrative of scientific progress was embedded in a moral tale. It happened to be a narrative that had a happy ending for those who told it, for it perpetuated the story of their own relatively comfortable position as the reward of life in a meritocratic society that blessed people for their skills and flexibility. That narrative made no room for the losers of this order, whose resentments were derided as being a reflection of their boorish and retrograde character – which is to say, their fundamental vice. The best this moral tale could offer everyone else was incremental adaptation to an order whose caste system had become calcified. For an audience yearning for a happy ending, this was bound to be a tale of woe.

The failure of this grand narrative is not, however, a reason for students of economics to dispense with narratives altogether. Narratives will remain an inescapable part of the human sciences for the simple reason that they are inescapable for humans. It’s funny that so few economists get this, because businesses do. As the Nobel laureates George Akerlof and Robert Shiller write in their recent book, Phishing for Phools, marketers use them all the time, weaving stories in the hopes that we will place ourselves in them and be persuaded to buy what they are selling. Akerlof and Shiller contend that the idea that free markets work perfectly, and the idea that big government is the cause of so many of our problems, are part of a story that is actually misleading people into adjusting their behaviour in order to fit the plot. They thus believe storytelling is a “new variable” for economics, since “the mental frames that underlie people’s decisions” are shaped by the stories they tell themselves.

Economists arguably do their best work when they take the stories we have given them, and advise us on how we can help them to come true. Such agnosticism demands a humility that was lacking in economic orthodoxy in recent years. Nevertheless, economists don’t have to abandon their traditions if they are to overcome the failings of a narrative that has been rejected. Rather they can look within their own history to find a method that avoids the evangelical certainty of orthodoxy.

In his 1971 presidential address to the American Economic Association, Wassily Leontief counselled against the dangers of self-satisfaction. He noted that although economics was starting to ride “the crest of intellectual respectability … an uneasy feeling about the present state of our discipline has been growing in some of us who have watched its unprecedented development over the last three decades”.

Noting that pure theory was making economics more remote from day-to-day reality, he said the problem lay in “the palpable inadequacy of the scientific means” of using mathematical approaches to address mundane concerns. So much time went into model-construction that the assumptions on which the models were based became an afterthought. “But,” he warned – a warning that the sub-prime boom’s fascination with mathematical models, and the bust’s subsequent revelation of their flaws, now reveal to have been prophetic – “it is precisely the empirical validity of these assumptions on which the usefulness of the entire exercise depends.”

Leontief thought that economics departments were increasingly hiring and promoting young economists who wanted to build pure models with little empirical relevance. Even when they did empirical analysis, Leontief said economists seldom took any interest in the meaning or value of their data. He thus called for economists to explore their assumptions and data by conducting social, demographic and anthropological work, and said economics needed to work more closely with other disciplines.


Leontief’s call for humility some 40 years ago stands as a reminder that the same religions that can speak up for human freedom and dignity when in opposition can become obsessed with their rightness and the need to purge others of their wickedness once they attain power. When the church retains its distance from power, and a modest expectation about what it can achieve, it can stir our minds to envision new possibilities and even new worlds. Once economists apply this kind of sceptical scientific method to a human realm in which ultimate reality may never be fully discernible, they will probably find themselves retreating from dogmatism in their claims.

Paradoxically, therefore, as economics becomes more truly scientific, it will become less of a science. Acknowledging these limitations will free it to serve us once more.

Thursday 20 October 2016

The cult of the expert – and how it collapsed

Led by a class of omnipotent central bankers, experts have gained extraordinary political power. Will a populist backlash shatter their technocratic dream?

Sebastian Mallaby in The Guardian

On Tuesday 16 September 2008, early in the afternoon, a self-effacing professor with a neatly clipped beard sat with the president in the Roosevelt Room of the White House. Flanked by a square-shouldered banker who had recently run Goldman Sachs, the professor was there to tell the elected leader of the world’s most powerful country how to rescue its economy. Following the bankruptcy of one of the nation’s storied investment banks, a global insurance company was now on the brink, but drawing on a lifetime of scholarly research, the professor had resolved to commit $85bn of public funds to stabilising it.

The sum involved was extraordinary: $85bn was more than the US Congress spent annually on transportation, and nearly three times as much as it spent on fighting Aids, a particular priority of the president’s. But the professor encountered no resistance. “Sometimes you have to make the tough decisions,” the president reflected. “If you think this has to be done, you have my blessing.”

Later that same afternoon, Federal Reserve chairman Ben Bernanke, the bearded hero of this tale, showed up on Capitol Hill, at the other end of Pennsylvania Avenue. At the White House, he had at least been on familiar ground: he had spent eight months working there. But now Bernanke appeared in the Senate majority leader’s conference room, where he and his ex-Wall Street comrade, Treasury secretary Hank Paulson, would meet the senior leaders of both chambers of Congress. A quiet, balding, unassuming technocrat confronted the lions of the legislative branch, armed with nothing but his expertise in monetary plumbing.

Bernanke repeated his plan to commit $85bn of public money to the takeover of an insurance company.

“Do you have 85bn?” one sceptical lawmaker demanded.

“I have 800bn,” Bernanke replied evenly – a central bank could conjure as much money as it deemed necessary.

But did the Federal Reserve, another lawmaker inquired, have the legal right to take this sort of action unilaterally?

Yes, Bernanke answered: as Fed chairman, he wielded the largest chequebook in the world – and the only counter-signatures required would come from other Fed experts, who were no more elected or accountable than he was. Somehow America’s famous apparatus of democratic checks and balances did not apply to the monetary priesthood. Their authority derived from technocratic virtuosity.

When the history is written of the revolt against experts, September 2008 will be seen as a milestone. The $85bn rescue of the American International Group (AIG) dramatised the power of monetary gurus in all its anti-democratic majesty. The president and Congress could decide to borrow money, or raise it from taxpayers; the Fed could simply create it. And once the AIG rescue had legitimised the broadest possible use of this privilege, the Fed exploited it unflinchingly. Over the course of 2009, it injected a trillion dollars into the economy – a sum equivalent to nearly 30% of the federal budget – via its newly improvised policy of “quantitative easing”. Time magazine anointed Bernanke its person of the year. “The decisions he has made, and those he has yet to make, will shape the path of our prosperity, the direction of our politics and our relationship to the world,” the magazine declared admiringly.

The Fed’s swashbuckling example galvanized central bankers in all the big economies. Soon Europe saw the rise of its own path-shaping monetary chieftain, when Mario Draghi, president of the European Central Bank, defused panic in the eurozone in July 2012 with two magical sentences. “Within our mandate, the ECB is ready to do whatever it takes to preserve the euro,” he vowed, adding, with a twist of Clint Eastwood menace, “And believe me, it will be enough.” For months, Europe’s elected leaders had waffled ineffectually, inviting hedge-fund speculators to test the cohesion of the eurozone. But now Draghi was announcing that he was badder than the baddest hedge-fund goon. Whatever it takes. Believe me.

In the summer of 2013, when Hollywood rolled out its latest Superman film, cartoonists quickly seized upon a gag that would soon become obvious. Caricatures depicted central-bank chieftains decked out in Superman outfits. One showed Bernanke ripping off his banker’s shirt and tie, exposing that thrilling S emblazoned on his vest. Another showed the bearded hero hurtling through space, red cape fluttering, right arm stretched forward, a powerful fist punching at the void in front of him. “Superman and Federal Reserve chairman Ben Bernanke are both mild-mannered,” a financial columnist deadpanned. “They are both calm, even in the face of global disasters. They are both sometimes said to be from other planets.”

At some point towards the middle of the decade, shortly before the cult of the expert smashed into the populist backlash, the shocking power of central banks came to feel normal. Nobody blinked an eye when Haruhiko Kuroda, the head of Japan’s central bank, created money at a rate that made his western counterparts seem timid. Nobody thought it strange when Britain’s government, perhaps emulating the style of the national football team, conducted a worldwide talent search for the new Bank of England chief. Nobody was surprised when the winner of that contest, the telegenic Canadian Mark Carney, quickly appeared in newspaper cartoons in his own superman outfit. And nobody missed a beat when India’s breathless journalists described Raghuram Rajan, the new head of the Reserve Bank of India, as a “rock star”, or when he was pictured as James Bond in the country’s biggest business newspaper. “Clearly I am not a superman,” Rajan modestly responded.



If Bernanke’s laconic “I have 800bn” moment signalled a new era of central-banking power, Rajan’s “I am not a superman” wisecrack marked its apotheosis. And it was a high watermark for a wider phenomenon as well, for the cult of the central banker was only the most pronounced example of a broader cult that had taken shape over the previous quarter of a century: the cult of the expert. Even before Bernanke rescued the global economy, technocrats of all stripes – business leaders, scientists, foreign and domestic policy wonks – were enthralled by the notion that politicians might defer to the authority of experts armed with facts and rational analysis. Those moments when Bernanke faced down Congress, or when Draghi succeeded where bickering politicians had failed, made it seem possible that this technocratic vision, with its apolitical ideal of government, might actually be realised.

The key to the power of the central bankers – and the envy of all the other experts – lay precisely in their ability to escape political interference. Democratically elected leaders had given them a mission – to vanquish inflation – and then let them get on with it. To public-health experts, climate scientists and other members of the knowledge elite, this was the model of how things should be done. Experts had built Microsoft. Experts were sequencing the genome. Experts were laying fibre-optic cable beneath the great oceans. No senator would have his child’s surgery performed by an amateur. So why would he not entrust experts with the economy?

In 1997, the economist Alan Blinder published an essay in Foreign Affairs, the house journal of the American foreign policy establishment. His title posed a curious question: “Is government too political?”

Four years earlier, Blinder had left Princeton University, his academic home for two decades, to do battle in the public square as a member of President Bill Clinton’s Council of Economic Advisers. The way Blinder saw things, this was a responsibility more than a pleasure: experts had a duty to engage in public debates – otherwise, “the quacks would continue to dominate the pond”, as he had once written. Earnest, idealistic, but with a self-deprecating wit, Blinder was out to save the world from returning to that dark period in the Reagan era when supply-side ideologues ruled the roost and “nonsense was worshipped as gospel”. After two years at the White House and another two as vice chairman of the Fed, Blinder wrote the essay as a reflection on his years of service.

His argument reflected the contrast between his two jobs in Washington. At the White House, he had advised a brainy president on budget policy and much else, but turning policy wisdom into law had often proved impossible. Even when experts from both parties agreed what should be done, vested interests in Congress conspired to frustrate enlightened progress. At the Fed, by contrast, experts were gloriously empowered. They could debate the minutiae of the economy among themselves, then manoeuvre the growth rate this way or that, without deferring to anyone.

To Blinder, it was self-evident that the Fed model was superior – not only for the experts, but also in the eyes of the public. The voters did not want their members of Congress micromanaging technical affairs – polls showed declining trust in politicians, and it was only a small stretch to suggest that citizens wanted their political leaders to delegate as much as possible to experts. “Americans increasingly believe that their elected officials are playing games rather than solving problems,” Blinder wrote. “Political debate has too much ‘spin’ and too little straight talk.” In sum, too much meddling by elected politicians was a turn-off for the voters who elected them. It was a paradoxical contention.

Disaffection with the political mainstream in the America of the 1990s had created a yearning for white-hatted outsiders as potential presidential candidates: the billionaire businessman Ross Perot, who ran in 1992 and 1996; the anti-politician Steve Forbes, whose signature proposal was to radically simplify America’s byzantine tax code. But rather than replace politicians with populist outsiders, whose grasp of public policy was suspect, Blinder advanced an alternative idea: the central-bank model of expert empowerment should be extended to other spheres of governance.

Blinder’s proposal was most clearly illustrated by tax policy. Experts from both political parties agreed that the tax system should be stripped of perverse incentives and loopholes. There was no compelling reason, for example, to encourage companies to finance themselves with debt rather than equity, yet the tax code allowed companies to make interest payments to their creditors tax-free, whereas dividend payments to shareholders were taxed twice over. The nation would be better off if Congress left the experts to fix such glitches rather than allowing politics to frustrate progress. Likewise, environmental targets, which balanced economic growth on the one hand and planetary preservation on the other, were surely best left to the scholars who understood how best to reconcile these duelling imperatives. Politicians who spent more of their time dialing for dollars than thinking carefully about policy were not up to these tasks. Better to hand them off to the technicians in white coats who knew what they were doing.



The call to empower experts, and to keep politics to a minimum, failed to trigger a clear shift in how Washington did business. But it did crystallise the assumptions of the late 1990s and early 2000s – a time when sharp criticisms of gridlock and lobbying were broadly accepted, and technocratic work-arounds to political paralysis were frequently proposed, even if seldom adopted. President Barack Obama’s (unsuccessful) attempt to remove the task of tackling long-term budget challenges from Congress by handing it off to the bipartisan Simpson-Bowles commission was emblematic of this same mood. Equally, elected leaders at least paid lip service to the authority of experts in the government’s various regulatory agencies – the Food and Drug Administration, the Securities and Exchange Commission, and so on. If they nonetheless overruled them for political reasons, it was in the dead of night and with a guilty conscience.

And so, by the turn of the 21st century, a new elite consensus had emerged: democracy had to be managed. The will of the people had its place, but that place had to be defined, and not in an expansive fashion. After all, Bill Clinton and Tony Blair, the two most successful political leaders of the time, had proclaimed their allegiance to a “third way”, which proposed that the grand ideological disputes of the cold war had come to an end. If the clashes of abstractions – communism, socialism, capitalism and so on – were finished, all that remained were practical questions, which were less subjects of political choice and more objects of expert analysis. Indeed, at some tacit, unarticulated level, a dark question lurked in educated minds. If all the isms were wasms, if history was over, what good were politicians?

 

Federal Reserve chairman Ben Bernanke testifies before Congress in October 2011. Photograph: Jim Lo Scalzo/EPA

For Blinder and many of his contemporaries, the ultimate embodiment of empowered gurudom was Alan Greenspan, the lugubrious figure with a meandering syntax who presided over the Federal Reserve for almost two decades. Greenspan was a technocrat’s technocrat, a walking, talking cauldron of statistics and factoids, and even though his ideological roots were in the libertarian right, his happy collaboration with Democratic experts in the Clinton administration fitted the end-of-history template perfectly. At Greenspan’s retirement in 2006, Blinder and a co-author summed up his extraordinary standing. They proclaimed him “a living legend”. On Wall Street, “financial markets now view Chairman Greenspan’s infallibility more or less as the Chinese once viewed Chairman Mao’s”.

Greenspan was raised during the Great Depression, and for much of his career, such adulation would have been inconceivable – for him or any central banker. Through most of the 20th century, the men who acted as bankers to the bankers were deliberately low-key. They spurned public attention and doubted their own influence. They fully expected that politicians would bully them into trying to stimulate the economy, even at the risk of inflation. In 1964, in a successful effort to get the central bank to cut interest rates, Lyndon Johnson summoned the Fed chairman William McChesney Martin to his Texas ranch and pushed him around the living room, yelling in his face, “Boys are dying in Vietnam, and Bill Martin doesn’t care!” In democracies, evidently, technocratic power had limits.

Through the 1970s and into the 1980s, central-bank experts continued to be tormented. Richard Nixon and his henchmen once smeared Arthur Burns, the Fed chairman, by planting a fictitious story in the press, insinuating that Burns was simultaneously demanding a huge pay rise for himself and a pay freeze for other Americans. Following in this tradition, the Reagan administration frequently denounced the Fed chief, Paul Volcker, and packed the Fed’s board with pro-Reagan loyalists, who ganged up against their chairman.



When Greenspan replaced Volcker in 1987, the same pattern continued at first. The George HW Bush administration tried everything it could to force Greenspan to cut interest rates, to the point that a White House official put it about that the unmarried, 65-year-old Fed chairman reminded him of Norman Bates, the mother-fixated loner in Hitchcock’s Psycho.

And yet, starting with the advent of the Clinton administration, Greenspan effected a magical shift in the prestige of monetary experts. For the last 13 years of his tenure, running from 1993 to 2006, he attained the legendary status that Blinder recognised and celebrated. There were Alan Greenspan postcards, Alan Greenspan cartoons, Alan Greenspan T-shirts, even an Alan Greenspan doll. “How many central bankers does it take to screw in a lightbulb?” asked a joke of the time. “One,” the answer went: “Greenspan holds the bulb and the world revolves around him.” Through quiet force of intellect, Greenspan seemed to control the American economy with the finesse of a master conductor. He was the “Maestro”, one biographer suggested. The New Yorker’s John Cassidy wrote that Greenspan’s oracular pronouncements became “as familiar and as comforting to ordinary Americans as Prozac and The Simpsons, both of which debuted in 1987, the same year President Reagan appointed him to office”.

Greenspan’s sway in Washington stretched far beyond the Fed’s core responsibility, which was to set interest rates. When the Clinton administration wanted to know how much deficit reduction was necessary, it asked Greenspan for a number, at which point that number assumed a talismanic importance, for no other reason than that Greenspan had endorsed it. When Congress wanted to understand how far deficit reduction would bring bond yields down, it demanded an answer from Greenspan, and his answer duly became a key plank of the case for moving towards budget balance. The Clinton adviser Dick Morris summed up economic policy in this period: “You figure out what Greenspan wants, and then you get it to him.”

Greenspan loomed equally large in the US government’s management of a series of emerging market meltdowns in the 1990s. Formally, the responsibility for responding to foreign crises fell mainly to the Treasury, but the Clinton team relied on Greenspan – for ideas and for political backing. With the Republicans controlling Congress, a Democratic president needed a Republican economist to vouch for his plans – to the press, Congress, and even the conservative talk radio host Rush Limbaugh. “Officials at the notoriously reticent Federal Reserve say they have seldom seen anything like it,” the New York Times reported in January 1995, remarking on the Fed chairman’s metamorphosis from monetary technocrat into rescue salesman. In 1999, anticipating the moment when it anointed Ben Bernanke its man of the year, Time put Greenspan on its cover, with smaller images of the Treasury secretary and deputy Treasury secretary flanking him. Greenspan and his sidemen were “economist heroes”, Time lectured its readers. They had “outgrown ideology”.

By the last years of his tenure, Greenspan’s reputation had risen so high that even fellow experts were afraid of him. When he held forth at the regular gatherings of central bank chiefs in Basel, the distinguished figures at the table, titans in their own fields, took notes with the eagerness of undergraduates. So great was Greenspan’s status that he started to seem irreplaceable. As vice-president Al Gore prepared his run for the White House, he pronounced himself Greenspan’s “biggest fan” and rated the chairman’s performance as “outstanding A-plus-plus”. Not to be outdone, the Republican senator John McCain wished the chairman could stay at his post into the afterlife. “I would do like we did in the movie Weekend at Bernie’s,” McCain joked during a Republican presidential primary debate. “I’d prop him up and put a pair of dark glasses on him and keep him as long as we could.”

How did Greenspan achieve this legendary status, creating the template for expert empowerment on which a generation of technocrats sought to build a new philosophy of anti-politics? The question is not merely of historical interest. With experts now in retreat, in the United States, Britain and elsewhere, the story of their rise may hold lessons for the future.

Part of the answer lies in the circumstances that Greenspan inherited. In the United States and elsewhere, central bankers were given space to determine interest rates without political meddling because the existing model had failed. The bullying of central banks by Johnson and Nixon produced the disastrous inflation of the 1970s, with the result that later politicians wanted to be saved from themselves – they stopped harassing central banks, understanding that doing so damaged economic performance and therefore their own reputations. Paul Volcker was a partial beneficiary of this switch: even though some Reagan officials attacked him, others recognised that he must be given the space to drive down inflation. Following Volcker’s tenure, a series of countries, starting with New Zealand, granted formal independence to their central banks. Britain crossed this Rubicon in 1997. In the United States, the Fed’s independence has never been formal. But the climate of opinion on monetary issues offered a measure of protection.

Healthy economic growth was another factor underpinning Greenspan’s exalted status. Globalisation, coupled with the surge of productivity that followed the personal computer revolution, made the 1990s a boom time. The pro-market policies that Greenspan and his fellow experts had long advocated seemed to be delivering the goods, not only in terms of growth but also in falling inequality, lower rates of crime, and lower unemployment for disadvantaged minorities. The legitimacy of experts relies on their presumed ability to deliver progress. In Greenspan’s heyday, experts over-delivered.

Yet these fortunate circumstances are not the whole story. Greenspan amassed more influence and reputation than anyone else because there was something special about him. He was not the sort of expert who wanted to confine politics to its box. To the contrary, he embraced politics, and loved the game. He understood power, and was not afraid to wield it.



Greenspan is regarded as the ultimate geek: obsessed with obscure numbers, convoluted in his speech, awkward in social settings. Yet he was far more worldly than his technocratic manner suggested. He entered public life when he worked for Nixon’s 1968 campaign – not just as an economic adviser, but as a polling analyst. In Nixon’s war room, he allied himself with the future populist presidential candidate Patrick Buchanan, and his memos to Nixon were peppered with ideas on campaign spin and messaging. In 1971, when Nixon went after the Fed chairman, Arthur Burns, Greenspan was recruited to coax Burns into supporting the president. In the mid-1970s, when Greenspan worked in the Gerald Ford administration, he once sneaked into the White House on a weekend to help rewrite a presidential speech, burying an earlier draft penned by a bureaucratic opponent. At the Republican convention in 1980, Greenspan tried to manoeuvre Ford on to Ronald Reagan’s ticket – an outlandish project to get an ex-president to serve as vice president.

Greenspan’s genius was to combine high-calibre expert analysis with raw political methods. He had more muscle than a mere expert and more influence than a mere politician. The combination was especially potent because the first could be a cover for the second: his political influence depended on the perception that he was an expert, and therefore above the fray, and therefore not really political. Unlike politician-politicians, Greenspan’s advice had the ring of objectivity: he was the man who knew the details of the federal budget, the outlook for Wall Street, the political tides as they revealed themselves through polling data. The more complex the problems confronting the president, the more indispensable Greenspan’s expertise became. “He has the best bedside manner I’ve ever seen,” a jealous Ford administration colleague recalled, remarking on Greenspan’s hypnotic effect on his boss. “Extraordinary. That was his favourite word. He’d go in to see Ford and say, ‘Mr President, this is an extraordinarily complex problem.’ And Ford’s eyes would get big and round and start to go around in circles.”

By the time Greenspan became Fed chairman, he was a master of the dark arts of Washington. He went to extraordinary lengths to cultivate allies, fighting through his natural shyness to attend A-list parties, playing tennis with potentially troublesome financial lobbyists, maintaining his contacts on Wall Street, building up his capital by giving valuable counsel to anyone who mattered. Drawing on the advantage of his dual persona, Greenspan offered economic advice to politicians and political advice to economists. When Laura Tyson, an exuberant Berkeley economist, was appointed to chair Bill Clinton’s Council of Economic Advisers, she was flattered to find that the Fed chairman had tips on her speaking style. Too many hand gestures and facial expressions could undermine her credibility, Greenspan observed. The CEA chairwoman should simply present facts, with as little visual commentary as possible.

Greenspan’s critics frequently complained that he was undermining the independence of the Fed by cosying up to politicians. But the critics were 180 degrees wrong: only by building political capital could Greenspan protect the Fed’s prerogatives. Clinton had no natural love for Greenspan: he would sometimes entertain his advisers with a cruel imitation of him – a cheerless old man droning on about inflation. But after a landmark 1993 budget deal and a 1995 bailout of Mexico, Clinton became a firm supporter of the Fed. Greenspan had proved that he had clout. Clinton wanted to be on the right side of him.

The contrast with Greenspan’s predecessor, the rumpled, egg-headed Paul Volcker, is revealing. Volcker lacked Greenspan’s political skills, which is why the Reagan administration succeeded in packing his board with governors who were ready to outvote him. When Greenspan faced a similar prospect, he had the muscle to fight back: in at least one instance, he let his allies in the Senate know that they should block the president’s candidate. Volcker also lacked Greenspan’s facility in dealing with the press – he refused to court public approval and sometimes pretended not to notice a journalist who had been shown into his office to interview him. Greenspan inhabited the opposite extreme: he courted journalists assiduously, opening presents each Christmas at the home of the Wall Street Journal’s Washington bureau chief, Al Hunt, flattering reporters with private interviews even as he berated other Fed governors for leaking to them. It was only fitting that, halfway through his tenure, Greenspan married a journalist whose source he had once been.

The upshot was that Greenspan maximised a form of power that is invaluable to experts. Because journalists admired him, it was dangerous for politicians to pick a fight with the Fed: in any public dispute, the newspaper columnists and talking heads would take Greenspan’s side of the argument. As a result, the long tradition of Fed-bashing ceased almost completely. Every Washington insider understood that Greenspan was too powerful to touch. People who got on the wrong side of him would find their career prospects dim. They would see their intellectual shortcomings exposed. They would find themselves diminished.


 
Mark Carney, the governor of the Bank of England, in 2015. Photograph: Jonathan Brady/AFP/Getty Images

Of course, the triumph of the expert was bound to be fragile. In democracies, the will of the people can be sidelined only for so long, and 2016 has brought the whirlwind. The Brexit referendum featured Michael Gove’s infamous assertion that “the British people have had enough of experts”. Since the vote, Mark Carney, the Bank of England governor once pictured as superman, has been accused by the government of running dubious monetary experiments that exacerbate inequality – an attack picked up by William Hague, who this week threatened the central bank with the loss of its independence unless it raised interest rates. In the United States, Donald Trump has ripped into intellectuals of all stripes, charging Fed chair Janet Yellen with maintaining a dangerously loose monetary policy in order to help Obama’s poll ratings.







Both Gove and Trump sensed, correctly, that experts were primed for a fall. The inflationary catastrophe sparked by 1970s populism has faded from the public memory, and no longer serves as a cautionary tale. Economies have recovered disappointingly from the 2008 crash – a crash, incidentally, for which Greenspan must share the blame, since he presided over the inflation of the subprime mortgage bubble. What little growth there has been has also passed most people by, since the spoils have been so unequally distributed. If the experts’ legitimacy depends on delivering results, it is hardly surprising that they are on the defensive.

And yet the history of the rise of the experts should remind us of three things. First, the pendulum will swing back, just as it did after the 1970s. The saving grace of anti-expert populists is that they do discredit themselves, simply because policies originating from the gut tend to be lousy. If Donald Trump were to be elected, he would almost certainly cure voters of populism for decades, though the price in the meantime could be frightening. In Britain, which is sliding towards a wreck of a divorce with its most important trading partners, the delusions and confusions of the Brexit camp will probably exact an economic price that will be remembered for a generation.

Second, Alan Blinder had a point: democratic politics is prone to errors and gridlock, and there is much to be said for empowering technocrats. The right balance between democratic accountability and expert input is not impossible to strike: the model of an independent central bank does provide a template. Popularly elected politicians have a mandate to determine the priorities and ambitions of the state, which in turn determine the goals for expert bodies – whether these are central banks, environmental agencies, or the armed forces. But then it behooves the politicians to step back. Democracy is strengthened, not weakened, when it harnesses experts.

Thirdly, however, if the experts want to hasten their comeback, they must study the example of Greenspan’s politicking. It is no use thinking that, in a democracy, facts and analysis are enough to win the day. As the advertising entrepreneur John Kearon has argued, the public has to feel you are correct; the truth has to be sold as well as told; you have to capture the high ground with a brand that is more emotionally compelling than that of your opponents. In this process, as Greenspan’s career demonstrates, the media must be wooed. Enemies must be undermined. And, if you succeed, your face might just appear on a T-shirt.

Two decades ago, in his final and posthumous book, the American cultural critic Christopher Lasch went after contemporary experts. “Elites, who define the issues, have lost touch with the people,” he wrote. “There has always been a privileged class, even in America, but it has never been so dangerously isolated from its surroundings.” These criticisms presciently anticipated the rise of Davos Man – the rootless cosmopolitan elite, unburdened by any sense of obligation to a place of origin, its arrogance enhanced by the conviction that its privilege reflects brains and accomplishment, not luck and inheritance. To survive these inevitable resentments, elites will have to understand that they are not beyond politics – and they will have to demonstrate the skill to earn the public trust, and preserve it by deserving it. Given the alternative, we had better hope that they are up to it.

Tuesday 12 February 2013

Pope resigns: The pope who was not afraid to say sorry


Pope Benedict XVI was a courageous pontiff who made a sincere attempt to restore the good name of the Church

Pope Benedict XVI: though small of stature and delicate as bone china in demeanour, he grew slowly into the dignity of his office  Photo: AP
When Joseph Ratzinger was chosen by his fellow cardinals to be pope in April 2005, he was universally billed as the continuity candidate. He had spent 25 years doing John Paul II’s bidding in charge of the old Holy Office, and most Catholics believed they knew exactly what Benedict XVI stood for. Few expected any surprises. Yet now he has pulled off the biggest surprise of all by becoming the first pope in 600 years to resign.
The flawless logic of his resignation letter demonstrates that there is nothing clouding Benedict’s reason. “To steer the boat of St Peter… both strength of mind and body are necessary,” he explained, before stating that he simply didn’t have the stamina for it any more.

Which isn’t in the least surprising. In any other multinational organisation of 1.3 billion members, the idea that an 85-year-old could continue to exercise absolute authority on a daily basis would be regarded as untenable. For the Pope is not some figurehead, the religious equivalent of Queen Beatrix of the Netherlands, abdicating on her 75th birthday to make way for “the next generation”. He is an absolute monarch.

Logic, though, isn’t the quality most often associated with the papacy. John Paul II and before him Paul VI carried on in office long after their bodies had failed them. They upheld the conviction in Catholicism that being elected pope is a divinely ordained duty, to be carried along a personal Via Dolorosa unto death.

But that is not what canon law stipulates. It explicitly sets out conditions for abdication, and so Benedict has invoked them. There is no mystery, or smoking gun, but rather just extraordinary courage and selflessness. Perhaps it was watching John Paul II, a vigorous athlete of a man when he took office, decline into someone unable to move or to be understood that made Benedict’s decision for him. He did not want to be a lame-duck pope; he knew that is not what the Catholic Church needs.
Yesterday’s announcement inevitably prompts the question of how his eight years on St Peter’s throne are to be viewed. As some kind of extended postscript to John Paul II’s eye-catching, game-changing era? Or as a stand-alone epoch with distinctive policies and preoccupations?

The consensus leans heavily towards the former, but history could well judge Benedict more kindly. He may have lacked his predecessor’s physical and spiritual charisma, and his unmissable presence on the world stage when major events were happening around him (the collapse of the Berlin Wall, two Gulf wars, 9/11), but Benedict has nevertheless shown himself to be very much his own man. Two of his decisions as pope illustrate what a break he made with his predecessor.
Just as they don’t retire, popes also avoid at all costs admitting that they get things wrong, notwithstanding that they are infallible in certain matters of faith and morals. So few can have expected “God’s Rottweiler”, as he was known when he was carrying out John Paul’s orders in relation to dissenters, to start breaking the mould as pope by issuing mea culpas. But that is precisely what he did.

In January 2009, for instance, he wrote to every Catholic bishop in the world to confess to his own mishandling of the case of Bishop Richard Williamson. This self-styled English prelate, a member of the fundamentalist Lefebvrist group excommunicated by John Paul, had been readmitted to the Catholic Church on Benedict’s watch. But days before, Williamson had given a TV interview in which he denied the Holocaust. The international outcry was huge – and magnified because of Benedict’s own brief spell in the Hitler Youth. The Pope’s response was a heartfelt and humble letter of apology.

His second volte-face came over the issue of paedophile priests. Under John Paul, the issue had been shamefully brushed under the carpet. The Polish pontiff, for example, declined to hand over to justice one of his great favourites, Father Marcial Maciel, the Mexican founder of the Legionaries of Christ, a traditionalist religious order. Despite well-documented allegations going back many years about Maciel’s sexual abuse of youngsters in his seminaries, he was treated on papal orders as an honoured guest in the Vatican.

Yet within a month of taking office, Benedict moved to strip Maciel of that protection and to discipline him. He ordered the priest, then in his late eighties, never again to say mass or speak in public. And when Maciel died in 2008, his low-key funeral was followed by a rapid dismantling of the religious organisation he had built.

It was part of a concerted drive that made Benedict the first pope to sincerely attempt to address clerical abuse and restore the good name of the Catholic Church. In March 2009, for example, he sent another letter of apology, this time to Catholics in Ireland. “You have suffered grievously,” he wrote to Irish victims of paedophile priests, “and I am truly sorry. I know that nothing can undo the wrong you have endured. It is understandable that you find it hard to forgive or be reconciled with the Church. In her name, I openly express the shame and remorse that we all feel.”

That is quite a statement coming from a pope. It may be that his own past as a lieutenant of John Paul made him part of the problem, but he was unafraid to look this appalling betrayal of trust in the eye, not least in a series of meetings with victims that he arranged on his travels.

In fact Benedict wasn’t much of a traveller. Global Catholicism and international leaders usually had to come to him in Rome rather than vice versa. Yet, though small of stature and delicate as bone china in demeanour, he grew slowly into the dignity of his office after it had initially threatened to swamp him.

So his 2010 trip to Britain did not, as had been widely predicted, pale beside the enduring and vivid memory of John Paul’s barnstorming 1982 visit. Instead the crowds warmed to this serious man, with his nervous smile and understated humanity, as he kissed babies and waved from his Popemobile. Even sceptics responded positively to his determination to speak his mind about the marginalisation of religion.

There were, inevitably, notable failures in his reign. He was too much the career Vatican insider to shake up the curia, the Church’s central bureaucracy. Its scheming and corruption were exposed for all to see in the “Vatileaks” scandal last year, when Benedict’s own butler, Paolo Gabriele, was convicted of stealing the Pope’s private papers, which revealed squabbling cardinals and unprincipled priests in the papal inner circle.

And Benedict’s chosen “big tent” approach to leadership – which was to make him more German Shepherd than Rottweiler by welcoming dissidents back into the fold – also soon blew away. What remained was a willingness to make concessions to schismatic ultra-conservatives, but paper-thin patience with liberal theologians or grassroots movements such as that demanding genuine doctrinal change in Austria.

Patently more at home in a library or a theological college than on the world political stage, Benedict could be clumsy – as when in September 2006 his return to his alma mater, Regensburg University in Bavaria, was overshadowed by derogatory remarks about the prophet Mohammed which he quoted in his lecture. But he went out of his way to make amends on a trip to Turkey soon afterwards, joining Muslim clerics in prayer in the Blue Mosque in Istanbul. This was only the second time a pope had ever entered a mosque.

For every failure, there was a success. His inaugural encyclical, Deus Caritas Est (“God is Love”), in December 2005 broke new ground, first in being written in such a way that non‑theologians could follow it, and second in celebrating human love without the standard Catholic exemptions for gays, the unmarried and those using contraception. “Sex please, we’re Catholics” was the reaction of the influential Catholic weekly, the Tablet.

Though his decision to opt for retirement will mark out this papacy in history, Benedict’s eight-year rule did not see the Catholic Church perform spectacular U-turns on any major doctrinal questions. Yet it was also so much more than a seamless continuation of what had gone before.

John Paul II may have left his cardinals with little choice other than to elect Joseph Ratzinger as a safe pair of hands. But Benedict XVI has, by the way he has stood down and by his record in office, made it more possible that a moderniser, in touch with the realities of life in the 21st century, will be chosen as the 266th successor to St Peter.
 
Peter Stanford is a former editor of the ‘Catholic Herald’

Tuesday 31 January 2012

Who came up with the model for excessive pay? No, it wasn't the bankers – it was academics

All the focus has been on bankers' bonuses, yet no one has looked at the economists who argued for rewarding bosses by giving them a bigger financial stake in their companies



Take a big step back. Ignore those sterile debates about how Dave screwed up over Stephen Hester's pay and where this leaves Ed. Instead, ask this: which profession has done most to justify the millions handed over to the boss of RBS, his colleagues and counterparts? Which group has been most influential in making the argument that top people deserve top pay? Not the executives themselves – at least, not directly. Nor the headhunters. Try the economists.




The ground rules for the system by which City bankers, Westminster MPs and ordinary taxpayers live today were set by two US economists just a couple of decades ago. In 1990, Michael Jensen and Kevin Murphy published one of the most famous papers in economics, which first appeared in the Journal of Political Economy and then in the Harvard Business Review. Its argument is well summed up by the latter's title: "CEO Incentives: It's Not How Much You Pay, But How."



The way to get better performance out of bosses, argued the economists, was to give them a bigger financial stake in their company’s performance. You couldn’t have asked for a better codification of bonus culture had you stuck a mortar board on Gordon Gekko’s head. So popular, so influential was Jensen and Murphy’s work that it opened the door to a new corporate culture: one where executives routinely scooped millions in stock options, apparently justified by leading research showing they were worth it.



The usual criticism of economists is that they missed the crisis: they preferred their models to reality, and those models took no account of the mischief that could be caused by bankers running wild. Of all explanations, this is the most comforting; all academics need to do next time, presumably, is look a little harder – ideally with a grant from the taxpayer.



But economists didn’t just fail to spot the financial crisis – they helped create it. They provided the intellectual framework and drew up the policies that helped cause the boom – and the bust. Yet rather than facing a full-blown investigation, their active involvement in this crisis and their motivations have barely got a look-in. As Philip Mirowski, one of the world’s leading historians of economic thought, puts it: “The bankers have got off the hook, and gone back to business as usual – and so too have the economists.” It’s the same discipline that spoke all that nonsense about markets always being efficient that is now deciding how to reform the economy.



A few weeks ago, I described the current economic system as a bankocracy run by the banks, for the banks. Mainstream economists play the role of a secularised priesthood, explaining to the laity just how and why the markets' will must be done. Why are they doing this? Luigi Zingales, an economist at Chicago, calls it "economists' capture". Much of the blame for the financial crisis has fallen on regulators for being captured by the bankers, and seeing the world from their point of view. The same thing, he believes, has happened to academics. When Zingales looked at the 150 most downloaded academic papers on executive pay he found that those arguing that bosses should get more (à la Jensen and Murphy) were 55% more likely to get published in the top journals.



Anyone who saw the film Inside Job will recall the scene in which leading economists are shown puffing financial deregulation, or the outlook for Icelandic capital markets, or whatever – and then revealed to have taken hundreds of thousands, sometimes millions, from the very interests they are advocating. But this goes wider than direct payment; many academics also believe those arguments about how markets work best when they are left alone. As the economist Steve Keen puts it: "Most economists are deluded."



Maybe, but it also pays to be deluded. Think about the rewards for toeing the mainstream economic line. Publication in prestigious journals. Early professorships at top universities. The conferences, the consultancies at big banks, the speaking fees. And then: the solicitations of the press, the book contracts. On it goes.



Rob Johnson, director of the Institute of New Economic Thinking, quotes a dictum he was once given by a leading west coast economist. "If you got behind Wall Street," he remembers the professor telling him, "you went to Lake Como every summer. If you left finance alone, you took a nice vacation in California. And if you took on the bankers, you drove a secondhand car."



Were this a corruption of analytical philosophy, say, it might not matter so much. But economics shapes our policy and our public debates – and it warps both. Yesterday, I listened to a discussion of Hester’s bonus (what else?) on the Today programme. Defending Hester, a journalist quoted some American finding that CEO pay had actually halved since 2001.



Chicago economist Steve Kaplan does indeed argue that "CEO pay in 2006 remained below CEO pay in 2000 and 2001". What's missing there is that 2001 was the height of the US dotcom boom, when bosses were getting crazy money. Kaplan also writes papers about how hard it is to be a chief executive. According to his CV, his consulting clients have included Accenture, Goldman Sachs and a bunch of other Wall Street banks. This is the way such arguments are prosecuted: without full disclosure of either evidence or interests. And in such arguments, it's you that loses.

Saturday 17 September 2011

The troubled history of priests, sex and the church may be at a turning point

The biblical foundation for a celibate priesthood is flimsy, and now cracks are beginning to show in the Catholic church's ban on marriage for those in holy orders



In a new autobiography published this week, Father Edward Daly, former bishop of Derry and the handkerchief-waving priest of the famous Bloody Sunday photograph, has called for an end to the celibacy rule for Catholic priests. Pointing to the severe decline in the number of serving clergy (while the worldwide Catholic population has almost doubled since 1970, the number of priests has remained virtually static), Daly believes the crisis could be averted by allowing priests to marry. Many see clerical celibacy as fundamental to the church, but in fact it is a religious tradition rather than a strict scriptural prohibition, and it has been far from universally observed throughout its history.

The biblical foundation for a celibate priesthood is flimsy. While Saint Paul recommended celibacy, he thought anyone who cannot "contain themselves" should marry, "for it is better to marry than to be burnt" (1 Corinthians 7:9). Further, the Gospels spoke of apostles who were married, with no hindrance to their ministry. But the model of Christ's own celibacy (emulated by the priest acting "in persona Christi") marked it out as a higher calling, and ultimately an unmarried priest would be more committed to his religious duties, his celibacy giving him the "power to attend upon the Lord, without impediment" (1 Corinthians 7:35).

The first official attempt to impose celibacy on those in holy orders was made at the Council of Elvira (c 306), and efforts to enforce it followed throughout the middle ages. But how it played out in practice varied enormously, and stories of married clergy and fornicating popes abounded. Pope John XII was accused by a 10th-century synod of having "fornicated with the widow of Rainier, with Stephana his father's concubine, with the widow Anna, and with his own niece, and he made the sacred palace into a whorehouse".

Unperturbed by such examples, the First and Second Lateran Councils in the 12th century decreed that clerical marriages were invalid, but Thomas Aquinas asserted a century later that this was not the decree of God, but merely church law, reversible by papal or conciliar authority. Indeed, in the middle ages the prohibition of marriage had less to do with spiritual concerns than the conservation of church property. Married priests meant legitimate heirs and the loss of church assets through inheritances – something that couldn't be countenanced.

The 16th-century Council of Trent confirmed the celibacy rule (just as the Church of England was abolishing it), but it was only in the 20th century that priestly celibacy, along with all matters of sexual morality, became an obsession for the church hierarchy. Following the reforms of the Second Vatican Council, Pope Paul VI issued the encyclical Sacerdotalis Caelibatus, reaffirming the fundamental value of celibacy as allowing "a closer and more complete relationship with the mystery of Christ and the Church for the good of all mankind".

Yet the encyclical also permitted the possibility of married clergy from other Christian traditions being ordained as Catholic priests, and cracks began to show in the edifice. Although Pope Benedict rejected the idea of married priests in 2006, he has since taken up Paul VI's baton by allowing defecting married Anglican ministers to enter the church.

The absolute prohibition on married Catholic priests has gone, and with suggestions (of debatable credibility) of a link between the church's child abuse crisis and celibacy, last year's plaintive call for the abolition of the rule from Italian women romantically involved with priests, and the proliferation of groups advocating a married priesthood, a new chapter in the troubled history of priests, sex and the church may be opening.

Why the Pope must face justice at The Hague

We survivors of clergy sex abuse have brought our evidence to the ICC so that the Vatican might finally account for its cover-up
Members of Survivors Network of those Abused by Priests (Snap), including Barbara Blaine (third from right), protest at the International Criminal Court (ICC) in The Hague, 13 September 2011. Photograph: Rob Keeris/AP

When it comes to holding the Catholic Church accountable for sexual abuse of children by members of the clergy, all roads lead to Rome. That is what my organisation, Survivors Network of those Abused by Priests (Snap), concluded after years of seeking justice in other venues and being turned away.

On 13 September, we travelled to The Hague to file an 84-page complaint and over 20,000 pages of supporting materials with the International Criminal Court, documenting our charge that the Pope and Vatican officials have tolerated and enabled the systematic and widespread concealing of rape and child sex crimes throughout the world.

Holding childhood photographs that tell a wrenching story of innocence and faith betrayed, and joined by our attorneys from the New York-based Center for Constitutional Rights, we stood up and demanded the justice that has so long been denied. The New York Times called the filing "the most substantive effort yet to hold the pope and the Vatican accountable in an international court for sexual abuse by priests".

No doubt, many people of faith are shocked that we would accuse a world church leader of crimes against humanity – a man considered by many to be infallible. But the man who is infallible must also be accountable.

By the Vatican's own account, "only" about 1.5-5% of Catholic clergy have been involved in sexual violence against children. With a reported 410,593 priests worldwide as of 2009, that means the number of offending priests would range from 6,158 to 20,529. Considering that many offenders have multiple victims, the number of children at risk is likely in the tens, or even hundreds, of thousands.
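
The quoted range follows from simple percentage arithmetic on those figures. As a quick check (a minimal sketch in Python, using only the numbers given in the paragraph above):

    # Sanity-check the range quoted above: 1.5-5% of the 410,593
    # priests reported worldwide as of 2009 (figures from the article).
    priests_worldwide = 410_593

    low = int(priests_worldwide * 0.015)  # truncates to 6,158
    high = int(priests_worldwide * 0.05)  # truncates to 20,529

    print(f"Estimated offending priests: {low:,} to {high:,}")
    # Estimated offending priests: 6,158 to 20,529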

We believe the thousands of pages of evidence we filed this week will substantiate our allegations that an operation has been put in place not only to hide the widespread sexual violence by priests in all parts of the world, but also to obstruct investigation, move suspects out of criminal jurisdictions and do everything possible to silence victims, discredit whistleblowers, intimidate witnesses, stonewall prosecutors and keep a tighter lid than ever on clergy sex crimes and cover-ups. The result of this systematic effort is that, despite a flood of well-publicised cases, many thousands of children remain vulnerable to abuse.

While many paedophile priests have been suspended in recent years, few have been criminally charged and even fewer defrocked. Worse, no one who ignored, concealed or enabled these predators has suffered any consequences. At the head of this hierarchy of denial and secrecy is the Pope, who has served as an enabler of these men. We believe the Vatican must face investigation to determine whether these incidents have been knowingly concealed and clergymen deliberately protected when their crimes have come to light.

I know this story well, because I was sexually abused by a parish priest from my time in junior high school until graduation. Because of the shame and trauma, several years passed before I was able to tell anyone. By that time, it was too late to file criminal charges. Church officials refused to restrict that priest's access to children or take action against him for several more years, despite other victims coming forward.

Indeed, powerful factors prevent all but the most assertive, healthy and lucky victims from seeking justice. Many others succumb to drugs, anorexia, depression or suicide when the pain of innocence betrayed becomes too much to bear. A recent investigation in Australia revealed a case in which 26 among the numerous victims of a particular priest had committed suicide.

For the safety of children and the prevention of yet more heinous wrongdoing, the International Criminal Court may be the only real hope. What other institution could possibly bring prosecutorial scrutiny to bear on the largest private institution on the planet?

Our journey for justice has been a long one, and it's not over yet. But we know where it must end: with justice at The Hague.