
Showing posts with label decline. Show all posts

Thursday, 15 December 2022

The strange case of Britain’s demise

A country that prided itself on stability has seemed to be in free-fall. Whodunnit? asks The Economist

GRANTHAM

The driveway dips as you approach Belton House, the gold-hued façade rising before you as the road tilts up again. Passing through a marble-floored hall to the ornate saloon, early visitors would have admired a portrait of the original master’s daughter with a black attendant. For a while, says Fiona Hall of the National Trust, a heritage charity that these days owns the property, servants came and went from the kitchen wing through a discreet tunnel. A magnificent staircase led finally to a rooftop cupola, and views of an estate that stretched beyond the horizon.

Built in the 1680s, the idyllic mansion embodies a costume-drama view of Britain’s past that is widely cherished at home and abroad. Its location in Lincolnshire makes it emblematic in another way: in the heart of England, in a region that in 2016 voted decisively for Brexit, and on the outskirts of Grantham, a typical market town that was the birthplace of Margaret Thatcher, the country’s most important post-war prime minister. Previously the venue for a murder-mystery evening featuring suspects in period dress, this history-laden spot is an apt place to ponder a different sort of mystery. Who nobbled Britain?

Alas, the victim is in a parlous state. A country that likes to think of itself as a model of phlegmatic common sense and good-humoured stability has become an international laughing stock: three prime ministers in as many months, four chancellors of the exchequer and a carousel of resigning ministers, some of them repeat offenders. “The programme of the Conservative Party,” declared Benjamin Disraeli in 1872, “is to maintain the constitution of the country.” The latest bunch of party leaders have broken their own laws, sidelined official watchdogs, disrespected Parliament and dishonoured treaties.

Not just a party, or a government, but Britain itself can seem to be kaput. England’s union with Scotland, cemented not long after Belton House was built, is fraying. Real incomes have flatlined since the crash of 2008, with more years of stagnation to come as the economy limps behind those of most other rich countries. The reckless tax-slashing mini-budget in September threatened to deliver the coup de grâce. The pound tanked, markets applied a “moron premium” to British sovereign debt and the Bank of England stepped in to save the government from itself.

Today the economy is entering recession, inflation is rampant and pay strikes are disrupting railways, schools and even hospitals. The National Health Service (NHS), the country’s most cherished institution, is buckling. Millions of people are waiting for treatment in hospitals. Ambulances are perilously scarce.

In Grantham, a town of neat red-brick terraced houses, half-timbered pubs and 45,000 residents, the malaise shows up in a penumbra of hardship. Amid staff shortages in the NHS—and an uproar—the local emergency-care service has been cut back. Immured in stacks of nappies and cornflakes at the food bank he runs, Brian Hanbury says demand is up by 50% on last year, and is set to rocket as heating bills bite. Rachel Duffey of PayPlan, a debt-solutions firm that is one of the biggest local employers, predicts that need for help with debts is “about to explode” nationwide, as people already feeling the pinch come to the end of fixed-rate mortgage deals. As for the mini-budget: “It was a shambles,” laments Jonathan Cammack, steward of Grantham Conservative Club.

Natural causes


Whodunnit? A rich cast of suspects is implicated in the debacle. Some are obvious, others lurk in the shadows of history, seeping poison rather than dealing sudden blows. A few are outsiders, but as in many of the spookiest mysteries, most come from inside the house.

To begin with, Britons with long memories may detect a familiar condition: a government that has reached decrepit old age. A parliamentary remark in October about soon-to-quit Liz Truss—“the prime minister is not under a desk”—brought to mind immortal lines from the death-spiral of the Labour administration that lasted from 1997 to 2010. Then the chancellor referred to the prime minister’s henchmen as “the forces of hell”; “Home secretary’s husband put porn on expenses”, newspapers reported. In the mid-1990s, at the fag-end of Tory rule that began in 1979, a run of MPs were caught with their pants down or their fingers in the till in another relay of shame.

Britain seems trapped in a doom loop of superannuated governments which, after a term or two of charismatic leadership and reformist vim, wind up bereft of talent, sinking in their own mistakes and wracked by backbench rebellions; in office but barely in power. Eventually routed at the polls, it then takes the guilty parties several parliamentary terms to recover. In opposition, both Labour and the Tories have determinedly learned the wrong lessons from defeat before alighting on the right ones. In a system with two big parties, for either to lose its mind is dangerous. For both to do so at once—as happened when, amid recent Tory convulsions, Labour was led by Jeremy Corbyn, a hard-left throwback—is a calamity.

“A family with the wrong members in control,” George Orwell wrote of the English. Yet a repeating cycle of senile governments does not, by itself, explain the national plight. Those previous administrations never plumbed the depth of disarray the current lot has reached. Something else has struck a country that has spewed out ruinous policies and a sequence of leaders resembling a reverse ascent of man: from plausible but glib David Cameron, to out-of-her-depth Theresa May, disgraceful Boris Johnson and then Ms Truss, probably the worst premier in modern history. Philip Cowley of Queen Mary University of London says that, in bygone days, Rishi Sunak would at this stage of his career have been a junior Treasury minister, rather than the latest prime minister.

Violence has been inflicted on the body politic—most brazenly, by Brexit, which carried the referendum with 52% of the vote. Parties in power for over a decade are bound to scrape the bottom of the talent barrel. In this case, much of the Tory barrel was poured down the drain when support for Brexit became a prerequisite for office. The outcome has been rule by chancers and cranks. Mr Johnson’s Brexit machinations put him in Downing Street; the tribalism that the campaign fostered kept him there for much longer than he deserved. Brexit has wrecked the Tory party—and yet it is, broadly speaking, the side that won.

Brexit has also institutionalised lying in British politics, as the dishonesty of Brexiteer promises segued into the pretence that they are being fulfilled. They are not. “Nothing much has changed,” says Mr Cammack in Grantham. “Life just keeps going on.” But some things have changed for the worse. Investment is down and inflation higher than it would have been inside the European Union. Labour, skilled and otherwise, is scarce. Farmers are losing crops for want of workers. In Lincolnshire, says Johanna Musson of the National Farmers Union, tulip-growers are especially fretful. The county’s exports have fallen as, across Britain, Brexit-induced red tape leads some businesses to give up on European markets altogether.

In 1975, during an earlier strike-hit era, Britain held another referendum on its relationship with Europe. Roy Jenkins, a pro-Europe statesman, predicted that, if it left, it would wind up in “an old people’s home for faded nations”. Give or take a detour to the lunatic asylum, that judgment looks prescient. The economy is floundering and the country’s international prestige is plummeting: precisely the future Brexit was meant to avoid.

Still, as any murder-mystery aficionado knows, the obvious suspect is rarely the right one. In the curious case of Britain’s decline, Brexit is as much a weapon as the ultimate culprit.

The hand of history


Many of the factors behind the decision to leave have roughed up other countries, too. Lots of people on both sides of the Atlantic crave simple answers to complex questions, and populists have provided them. Faith in mainstream parties has waned, even as expectations of government have risen. The line between politics and entertainment has blurred, aggravating, in Britain, an old reluctance to take things too seriously, and a weakness for wits and eccentrics who cock a snook at convention. That is less damaging when there is substance behind their insouciance and discipline beneath the panache.

Ben Page, the boss of Ipsos, a global research firm, points to what he terms the “loss of the future”, common across the West but acute in Britain. In 2008, as the financial crisis struck, only 12% of Britons thought youngsters would have a worse quality of life than their parents, Mr Page notes. Now that figure is 41%. As elsewhere, people worry about immigration and feel threatened by globalisation. All this makes Britain’s predicament seem less an inside job than part of a wider takedown of democracy.

But other likely suspects lurk in the attic of British history. One grew up down the road from Belton House. The grocer’s shop in Grantham above which Margaret Roberts, later Thatcher, was born is now a chiropractor and beautician. A statue of her put up earlier this year was quickly egged and defaced (she endured worse in real life). Her legend still looms over the country—particularly her Conservative Party.

Thatcher’s 11-year rule was an amalgam of caution, patience, luck and boldness. But among some Tories it is often misremembered as a prolonged ecstasy of tax-cutting, fight-picking, union-bashing and shouting “No, no, no” at Brussels. The rows over Europe that erupted on her watch rumbled on till the referendum of 2016. For some, she bequeathed a hunch that if economic policy doesn’t hurt, it isn’t working. Her ousting nurtured a lasting taste for party bloodletting. To court Tory members, Ms Truss even seemed to mimic Thatcher’s wardrobe. (It took just 81,326 of them to put her in Downing Street.)

Peer deeper into the past and more evidence comes to light. Recall, for instance, that painting in the saloon at Belton House, of the girl and her black attendant, possibly a slave. Her family, the Brownlows, had links to both Caribbean plantations and the East India Company, which helps explain the house’s splendid collection of Asian porcelain. The wider legacy of Britain’s former empire, runs a plausible theory, is a gnawing sense of unmet expectations and a fatal delusion of grandeur over the country’s place in the world.

For Sathnam Sanghera, author of “Empireland”, a powerful book about the largely unspoken effects of imperialism, “the original sin behind Brexit is empire.” The circumstances in which that empire was lost may have redoubled the psychic blow: in the wake of the second world war, during which, at least in the popular memory, Britain stood nobly alone against the Nazi onslaught. Afterwards it found itself diminished, broke and outdone by erstwhile foes, nurturing entwined feelings of greatness and grievance and haunted by phantom invasions. As the Irish author Fintan O’Toole has quipped, “England never got over winning the war.” In his view, Brexit was “imperial England’s last last stand”.

Perhaps not quite the last. Even now you can hear an echo of imperial hubris in the tendency of some British politicians to talk to EU negotiators, or the international bond markets, as if they were waiters in a Mediterranean bistro, liable to comply if only you repeat yourself loudly enough. It resounds in hollow boasts about having the best health care or army (or football team) in the world, in the yen to “punch above our weight”, and in the pursuit of a pure sort of sovereignty which, in an age of climate change, pandemics and imported gas, no longer exists.

“Until we face up to our history,” thinks Mr Sanghera, “we’re just going to carry on being dysfunctional.” On this analysis, the unravelling of Britain is a kind of karma.

In the 18th century, with a shrug


Maybe. Yet imperialism, greatness and all that have always been more an elite preoccupation than a popular one. In his enlightening new book, “The Strange Survival of Liberal Britain”, Vernon Bogdanor of King’s College London cites a survey of Britons conducted in 1951, when the loss of empire ought to have been most raw. Half of respondents couldn’t name a single colony (one suggested Lincolnshire). Odd as it is to say of a country that for centuries ruled swathes of the world, it may not be ruptures like the end of empire or Brexit that have done in modern Britain, but, less dramatically, a kind of long-term drift; not violence, in other words, but neglect.

Think back to the era in which Belton House was built. After the execution of Charles I in 1649 and the short-lived English Commonwealth, the monarchy had been restored. Compared with other European nations, the English got their big revolution done early—but then thought better of it, afterwards nudging forwards to constitutional monarchy and democracy. This piecemeal approach has characterised the country’s political evolution ever since. Walter Bagehot, a great Victorian editor of The Economist, noted the habit of compromising on thorny constitutional issues—or ducking them. “The hesitating line of a half-drawn battle was left to stand for a perpetual limit,” he wrote of such botches, and “succeeding generations fought elsewhere.”

Booby traps were often left behind. One lies in the fuzzy and weak restraints on the British executive. As Lord Hailsham, a Tory grandee, warned in 1976, a government with a secure majority in the House of Commons has an inbuilt tendency towards “elective dictatorship”. The House of Lords, which is meant to scrutinise legislation, is the fudge par excellence. In an absurd backroom deal of 1999, the hereditary peers who once dominated it were ejected—except for 92 of them. They are still there; when one dies, another is elected to replace him. Those are the only elections to Parliament’s upper chamber.

It is hard to see many other countries tolerating such a farrago. Meanwhile, a gentlemanly understanding that leaders would regulate their personal behaviour, once known as the “good chaps” theory of government, did not survive contact with Mr Johnson. As when a mob realises the rule of law is a confidence trick, it turned out that a few good shoves could dispense with much of the flimflam of oversight.

Or consider the myopic attitudes of successive governments to devolution. When it created the Scottish Parliament, Sir Tony Blair’s Labour administration did not fully anticipate the subsequent surge in English nationalism. Nor did it foresee how, after taking office in Edinburgh, the canny, pro-independence Scottish National Party (SNP) would enjoy both the dignity of power and the sheen of opposition to Westminster. Now Brexit is inflicting more casual vandalism on the union, undermining support for it in Scotland and Northern Ireland, which both voted to remain in the EU.

Whereas once Scottish independence was an in-or-out proposition, says Sir John Curtice of the University of Strathclyde, it has become a choice between competing unions, British and European. As the SNP vows to rejoin the EU, some Scottish Remainers who had rejected independence are embracing the idea. For some in Northern Ireland, explains Katy Hayward of Queen’s University Belfast, the mere fact of Brexit made a united Ireland more desirable; the region’s awkward post-Brexit position has led still more to think unification is likelier than it was before. Across Britain, a majority thinks the union will fall apart. It is not on the cards yet, but one day Britain may dissolve itself by accident.

Drift and neglect have undermined more than the constitution and the union. David Kynaston, the pre-eminent historian of 20th-century England, invokes Sir Siegmund Warburg, a German-born banker who helped shake up the City (on the slide as an equity market in the aftermath of Brexit). Warburg detested the British fondness for the phrase, “We’ll cross that bridge when we come to it.” As Mr Kynaston observes, Britain is not a place that is “good at grasping the nettle”.

With some glaring, uncharacteristic exceptions—Thatcher’s battle with the coal miners, the bust-up over Brexit—Britain tends to dislike confrontation, especially the ideological kind, perhaps a legacy of the civil war. It prefers irony to ideas and douses plain-speaking in good manners; its people have a quaint instinct to apologise when a stranger steps on their foot. Alongside this squeamishness, says Mr Kynaston, runs a “deep-dyed anti-intellectual empiricism”, and an inclination to tackle problems “pragmatically, as and when they arise, not looking for trouble in advance”.

This reticence has costs, not least through its complicity in the underpowered economy. Consider the glacial planning regime, or—an even more venerable problem—the skewed education system. It produces a narrow elite, dominated for too long by the alumni of a few private schools: Brexit and the mini-budget can both be traced to the playing fields of Eton, attended by Mr Johnson, Mr Cameron, who botched the referendum, and Kwasi Kwarteng, very briefly the chancellor. Less conspicuous, but at least as damaging, is the country’s long educational tail.

It has recently made some progress in international education rankings, but a stubborn quarter or so of 11-year-olds in England are unable to read at the expected level. A higher share of teenage boys are not in work, education or training than in most other rich countries. As for those who stay in the classroom: the “greater part of what is taught in schools and universities…does not seem to be the most proper preparation” for “the business which is to employ [students] during the remainder of their days.” That was Adam Smith in “The Wealth of Nations”, published in 1776. Employers make similar complaints in 2022.

In a post-imperial, post-industrial, ever-more competitive world, all that contributes to a skills shortage and a long-term productivity gap with other advanced economies. The fat years under Sir Tony and Gordon Brown disguised these shortcomings—until the crash, when it became clear that the boom they oversaw was over-reliant on financial services and public and private debt. Using the fruits of Thatcherite economics to fund a more generous state had seemed a political elixir; it turned out to be a fair-weather formula. In the kindest of circumstances, New Labour left some of the hardest problems unsolved. Most new jobs went to foreign-born workers. The number of working-age adults receiving welfare benefits barely shifted.

The cradle of the Industrial Revolution has not yet found a secure niche in the 21st-century economy. Nor has it figured out how to pay sustainably for the sort of public services that Britons expect. If, in the matter of Britain’s meltdown, Thatcher is an accessory before the fact, so is Sir Tony.

The country-house red herring


In the upstairs-downstairs, country-house vision of Britain, the country is a museum of class, with overlords surveying their lands and minions scurrying below stairs as they once did at Belton House. Famously, Disraeli wrote of “two nations”, the rich and the poor, as distinct as “inhabitants of different planets”. England, especially, is indeed a class-ridden place, whose denizens still make snap judgments about each other’s backgrounds based on accents, shoes and haircuts. Too many at the bottom of the ladder cannot see a way up it. Some at the top still benefit from unearned deference. Politicians often share this binary outlook, thinking the business of government is to squeeze the rich and comfort the poor, or vice versa.

But Disraeli’s formulation is too crude for 21st-century Britain. After generations of muddling through, it is in large part a country of people who are not exactly poor but are by no means rich. Instead they are “just about managing”, as Mrs May, the last prime minister but two, described them.

Take Grantham, a constituency in which the average income in 2020 was £25,600 ($32,900), just below the national median. (This year, Britain’s GDP per person will be more than 25% lower than America’s, measured at purchasing-power parity.) Amid the cost-of-living squeeze, says Mr Hanbury at the food bank, not only households that rely on welfare benefits but nurses and teachers are coming unstuck: “People live so close to the edge.”

It is only a 70-minute train ride to London, but power in Westminster seems remote, reflects Father Stuart Cradduck of St Wulfram’s, a lovely medieval church behind Grantham’s low-slung high street. Lincolnshire, he says, feels like a “forgotten county”. Kelham Cooke, the leader of the local council, says young people who leave for university often don’t come back. Regional inequality is another old, hard problem that successive British governments have only desultorily tackled, watching on as London sucked in talent and capital and other places fell behind.

There is something to be said for drift; or, to put it another way, gradualism. A “highly original quality of the English”, Orwell wrote in 1947, “is their habit of not killing one another.” By slowly expanding the franchise and incorporating the labour movement into democratic politics, Britain avoided continental-style extremism in the 19th and 20th centuries. When liberalism perished elsewhere in Europe in the 1930s, observes Mr Bogdanor, it survived in Britain. Compared with places such as France or Italy, where the far right is resurgent—or with ultrapolarised America—it is healthy in Britain still. Ms Truss’s stint in Downing Street was inglorious, but, Mr Bogdanor notes, she was removed quietly and efficiently, without riots or fuss. The flawed parliamentary system worked.

So drift can be benign. But it can also take you into a cul-de-sac—or off a cliff. In Britain it has led to economic mediocrity and disgruntlement, which in turn contributed to the yelp of Brexit and the desperate magical thinking of the mini-budget. Senile governments, self-inflicted wounds, the blowback of empire, corrosive global trends, the spectres of bygone leaders: they are all accomplices. But the main cause of Britain’s woe belongs less at a crime scene than in a school report. In the end, it didn’t try hard enough. 

Saturday, 7 January 2017

I'm a junior doctor in the NHS, and I'm terrified for this winter

Aislinn Macklin-Doherty in The Guardian


Widespread concerns that the NHS will face the “toughest winter ever” are not exaggerated or unfounded – just look at the terrible news today from Worcestershire. We really should be worried for ourselves and our relatives. As a junior doctor and a researcher looking after cancer patients in the NHS, I am terrified by the prospect of what the next few months will bring. But we must not forget this is entirely preventable.





Our current crisis is down to the almost clockwork-like series of reshuffling, rebranding and top-down disorganisation of the services by government. It’s led to an inexorable decline in the quality of care.


I have also become aware of an insidious “takeover” by the private sector. It is both literal – in the provision of services – and ideological, with an overwhelming prevalence of business-speak being absorbed into our collective psyche. But the British public (and even many staff) remain largely unaware that this is happening.

Where the consultant physician or surgeon was once general, they now increasingly play second fiddle to chief executives and clinical business unit managers. Junior doctors such as myself (many of whom have spent 10-15 years practising medicine and have completed PhDs) must also fall in line to comply with business models and corporate strategy put forward by those with no clinical training or experience with patients.

It is this type of decision-making (based on little evidence) and seemingly unaccountable policymaking that means patient care is suffering. Blame cannot be laid at the feet of a population of demanding and ageing patients, nor the “health tourists” who are too often scapegoated.

The epitome of such changes is known as the “sustainability and transformation plans”. These will bring about some of the biggest shifts in how NHS frontline services are funded and run in recent history, and yet, worryingly, most of my own colleagues have not even heard of them. Even fewer feel able to influence them.

Sustainability and transformation plans will see almost a third of regions having an A&E closed or downgraded, and nearly half will see reductions in inpatient bed numbers. This is all part of the overarching five-year plan to drive through £22bn in efficiency savings in the NHS. But with overwhelming cuts in social services and community care and with GPs under immense pressure, people are forced to go to A&E because they quite simply do not have any other options.

I have been on the phone with patients with cancer who need to come into hospital with life-threatening conditions such as sepsis, and I have been forced to tell them: “We have no beds here; you need to go to another local A&E.” Responses such as, “Please doctor, don’t make me go there – last time there were people backed up down the corridors,” break my heart.

According to the King’s Fund, our NHS leaders are choosing to spend less year-on-year on healthcare (as a proportion of GDP) than at any other time in NHS history, and yet we are the fifth richest economy in the world. Simultaneously private sector involvement increases and astronomical interest rates from private finance initiatives must be paid, with hospitals such as St Bartholomew’s in London having to pay up to £2m per week in interest alone. No wonder nearly all hospitals are now in dire straits.

This is all the result of intentional policies being made at the top with minimal consultation of those on the frontline. With such policies accumulating over the years we are now seeing the crisis come to a climax. The UK has fewer beds per person and fewer doctors per person than most countries in Europe. Fewer ambulances are now able to reach the highest-category emergencies, which means people having asthma attacks, heart attacks and traffic accidents are being left to wait longer in situations where minutes really matter.

The sustainability and transformation plans for my local area in south-west London show that they plan to cut 44% of inpatient bed stays over the next four years. This is dangerous. It is likely that St Helier hospital in Sutton, which takes many emergencies in the area, will close; patients will then not only have access to critically reduced services, but will also have to travel further to hospital, having waited longer for the ambulance to get to them.

This will be the straw that breaks the camel’s back. I cannot stand by while patients’ lives are put at unnecessary risk this winter. And neither should you.

Saturday, 30 April 2016

Trump says what no other candidate will: the US is no longer exceptional

With his slogan ‘Make America Great Again’, Trump is the first leader of recent times to attack American exceptionalism. In fact, he claims it is the opposite

 
The slogan that changed the trajectory of American political discourse? Only time will tell.


Tom Engelhardt for Tom Dispatch


“Low-energy Jeb”. “Little Marco”. “Lyin’ Ted”. “Crooked Hillary”. Give Donald Trump credit: he has a memorable way with insults. His have a way of etching themselves on the brain. And they’ve garnered media coverage, analysis and commentary almost beyond imagining.

Memorable as they might be however, they won’t be what lasts of Trump’s 2016 election run. That’s surely reserved for a single slogan that will sum up his candidacy when it’s all over (no matter how it ends). He arrived with it on that Trump Tower escalator in the first moments of his campaign, and it now headlines his website, where it’s also emblazoned on an array of products from hats to T-shirts.




You already know which line I mean: “Make America Great Again!”

That exclamation point ensures you won’t miss the hyperbolic, Trumpian nature of its promise to return the country to its former glory days. In it lies the essence of his campaign, of what he’s promising his followers and Americans generally – and yet, strangely enough, of all his lines it’s the one most taken for granted, the one that’s been given the least thought and analysis. And that’s a shame, because it represents something new in our American age. The problem, I suspect, is that what first catches the eye is the phrase “make America great” and then, of course, the exclamation point, while the single most important word in the slogan, historically speaking, is barely noted: again.

With that word, Trump crossed a line in American politics that until his escalator moment represented a kind of psychological taboo for politicians of any stripe and of either party, including presidents and potential candidates for that position. He is the first American leader or potential leader of recent times not to feel the need or obligation to insist that the US, the “sole” superpower of Planet Earth, is an “exceptional” nation, an “indispensable” country, or even in an unqualified sense a “great” one. His claim is the opposite: that, at present, America is anything but exceptional, indispensable or great, though he alone could make it “great again”.

In that claim lies a curiosity that, in a court of law, might be considered an admission of guilt. Yes, it says, if one man is allowed to enter the White House in January 2017, this could be a different country, but – and herein lies the originality of the slogan – it is not great now.

Trump, in other words, is the first person to run openly and without apology on a platform of American decline. Think about that for a moment. “Make America Great Again!” is indeed an admission, in the form of a boast.

As he tells his audiences repeatedly, America, the formerly great, is today a punching bag for China, Mexico ... well, you know the pitch. You don’t have to agree with him on the specifics. What’s interesting is the overall vision of a country lacking in its former greatness.

Perhaps a little history of American greatness and presidents (as well as presidential candidates) is in order here.

‘City upon a hill’


John F Kennedy simply assumed America was great.

Once upon a time, in a distant America, the words “greatest”, “exceptional” and “indispensable” weren’t part of the political vocabulary.

American presidents didn’t bother to claim any of them for this country, largely because American wealth and global preeminence were so indisputable. We’re talking about the 1950s and early 1960s, the post-second world war and pre-Vietnam “golden” years of American power. Despite a certain hysteria about the supposed dangers of domestic communists, few Americans then doubted the singularly unchallengeable power and greatness of the country. It was such a given, in fact, that it was simply too self-evident for presidents to cite, hail or praise.

So if you look, for instance, at the speeches of John F Kennedy, you won’t find them littered with exceptionals, indispensables or their equivalents.

In a pre-inaugural speech he gave in January 1961 on the kind of government he planned to bring to Washington, for instance, he did cite the birth of a “great republic” and quoted Puritan John Winthrop on the desirability of creating a country that would be “a city upon a hill” to the rest of the world, with all of humanity’s eyes upon us.

In his inaugural address (“Ask not what your country can do for you”) he invoked a kind of unspoken greatness, saying: “We shall pay any price, bear any burden, meet any hardship, support any friend, oppose any foe to assure the survival and the success of liberty.”

It was then common to speak of the US with pride as a “free nation” (as opposed to the “enslaved” ones of the communist bloc) rather than an exceptional one. His only use of “great” was to invoke the US-led and Soviet Union-led blocs as “two great and powerful groups of nations”.

Kennedy could even fall back on a certain modesty in describing the US role in the world (which in those years, from Guatemala to Iran to Cuba, all too often did not carry over into actual policy), saying in one speech: “We must face the fact that the United States is neither omnipotent nor omniscient – that we are only 6% of the world’s population – that we cannot impose our will upon the other 94% of mankind – that we cannot right every wrong or reverse each adversity – and that therefore there cannot be an American solution to every world problem.” In that same speech, he typically spoke of America as “a great power” – but not “the greatest power”.

If you didn’t grow up in that era, you may not grasp that none of this in any way implied a lack of national self-esteem. Quite the opposite: it implied a deep and abiding confidence in the overwhelming power and presence of this country, a confidence so unshakeable that there was no need to speak of it.

If you want a pop cultural equivalent for this, consider America’s movie heroes of that time, actors such as John Wayne and Gary Cooper, whose westerns and, in the case of Wayne, war movies were iconic. What’s striking when you look back at them from the present moment is this: while neither of those actors was anything but an imposing figure, they were also remarkably ordinary looking. They were in no way over-muscled, nor were they over-armed in the modern fashion. It was only in the years after the Vietnam war, when the country had absorbed what felt like a grim defeat, been wracked by oppositional movements, riots and assassinations, when a general sense of loss had swept over the polity, that the over-muscled hero, the exceptional killing machine, made the scene. (Think: Rambo.)

Consider this then if you want a definition of decline: when you have to state openly (and repeatedly) what previously had been too obvious to say, you’re heading, as the opinion polls always like to phrase it, in the wrong direction; in other words, once you have to say it, especially in an overemphatic way, you no longer have it.


The Reagan reboot



What better way to attest to America’s greatness than its military might? Photograph: Scott Stewart/AP

That note of defensiveness first crept into the American political lexicon with the unlikeliest of politicians: Ronald Reagan, the man who seemed like the least defensive, most genial guy on the planet. On this subject at least, think of him as Trumpian before the advent of the Donald, or at least as the man who (thanks to his ad writers) invented the political use of the word “again”. It was, after all, employed in 1984 in the seminal ad of his political run for a second term in office. While that bucolic-looking TV commercial was titled “Prouder, Stronger, Better”, its first line ever so memorably went: “It’s morning again in America.” (“Why would we ever want to return to where we were less than four short years ago?”)

Think of this as part of a post-Vietnam Reagan reboot, a time when the US in Rambo-esque fashion was quite literally muscling up and over-arming in a major way. Reagan presided over “the biggest peacetime defense build-up in history” against what, referencing Star Wars, he called an “evil empire” – the Soviet Union. In those years he also worked to rid the country of what was then termed “the Vietnam syndrome” in part by rebranding that war a “noble cause”.

In a time when loss and decline were much on American minds, he dismissed them both, even as he set the country on a path toward the present moment of 1% dysfunction in a country that no longer invests fully in its own infrastructure, whose wages are stagnant, whose poor are a growth industry, whose wealth now flows eternally upward in a political environment awash in the money of the ultra-wealthy, and whose over-armed military continues to pursue a path of endless failure in the greater Middle East.

Reagan, who spoke directly about American declinist thinking in his time – “Let’s reject the nonsense that America is doomed to decline” – was hardly shy about his superlatives when it came to this country. He didn’t hesitate to re-channel classic American rhetoric, ranging from Winthrop’s “shining city upon a hill” (perhaps cribbed from Kennedy) in his farewell address to Lincoln-esque (“the last best hope of man on Earth”) invocations such as “here in the heartland of America lives the hope of the world” or “in a world wracked by hatred, economic crisis and political tension, America remains mankind’s best hope”.

And yet in the 1980s there were still limits to what needed to be said about America. Surveying the planet, you didn’t yet have to refer to us as the “greatest” country of all or as the planet’s sole truly “exceptional” country. Think of such repeated superlatives of our own moment as defensive markers on the declinist slope. The now commonplace adjective “indispensable” as a stand-in for American greatness globally, for instance, didn’t even arrive until Bill Clinton’s secretary of state, Madeleine Albright, began using it in 1996.

It only became an indispensable part of the rhetorical arsenal of American politicians, from Barack Obama on down, a decade into the 21st century, when the country’s eerie dispensability (unless you were a junkie for failed states and regional chaos) became ever more apparent.

As for the US being the planet’s “exceptional” nation, a phrase that now seems indelibly part of the American grain and that no president or presidential candidate avoids, it’s surprising how late it entered the lexicon.

As John Gans Jr wrote in the Atlantic in 2011: “Obama has talked more about American exceptionalism than Presidents Reagan, George HW Bush, Bill Clinton, and George W Bush combined: a search on UC Santa Barbara’s exhaustive presidential records library finds that no president from 1981 to today uttered the phrase ‘American exceptionalism’ except Obama.”


Barack Obama: the only president to use the term ‘American exceptionalism’, according to research. Photograph: Rex Features

As US News’s Robert Schlesinger has also noted, “American exceptionalism” is not a traditional part of the presidential vocabulary. According to his search of public records, Obama is the only president in 82 years to use the term.

And yet in recent years it has become a commonplace of Republicans and Democrats alike. As the country has become politically shakier, the rhetoric about its greatness has only escalated in an American version of “the lady doth protest too much”. Such descriptors have become the political equivalent of litmus tests: you couldn’t be president or much of anything else without eternally testifying to your unwavering belief in American greatness.

This, of course, is the line that Trump crossed in a curiously unnoticed fashion in this election campaign. He did so by initially upping the rhetorical ante, adding that exclamation point (which even Reagan avoided). Yet in the process of being more patriotically correct than thou, he somehow also waded straight into American decline so bluntly that his own audience could hardly miss it – even if his critics did.

Think of it as an irony, if you wish, but in promoting his own rise the ultimate American narcissist has also openly promoted a version of decline to striking numbers of Americans. For his followers, a major political figure has quit with the defensive BS and started saying it the way it is.

Of course, don’t furl the flag or shut down those offshore accounts or start writing the complete history of American decline quite yet. After all, the US still looms “lone” on an ever more chaotic planet. Its wealth remains stunning, its economic clout something to behold, its tycoons the envy of the world, and its military beyond compare when it comes to how much and how destructive, even if not how successful. Still, make no mistake about it – Trump is a harbinger, however bizarre, of a new American century in which this country will indeed no longer be “the greatest” or, for all but a shrinking crew, exceptional.

Mark your calendars: 2016 is the year the US first went public as a declinist power, and for that you can thank Donald (or rather Donald!) Trump.

Thursday, 16 May 2013

Depression and physical decline: why retirement can seriously damage your health


Retirement can cause a drastic decline in health, according to a study released today.

Research found that both mental and physical health can suffer, said the Institute of Economic Affairs and the Age Endeavour Fellowship, which argue that the Government should help people work longer and raise the state pension age.

The study - Work Longer, Live Healthier: The Relationship Between Economic Activity, Health And Government Policy - shows there is a small boost in health immediately after retirement but that, over the longer term, there is a significant deterioration.

It suggests retirement increases the likelihood of suffering from clinical depression by 40% and the chance of having at least one diagnosed physical condition by about 60%. The probability of taking medication for such a condition rises by about 60% as well, according to the findings. People who are retired are 40% less likely than others to describe themselves as being in very good or excellent health.

The length of time spent in retirement can also cause further disadvantages, the study found.

It concluded that, for men and women alike, "there seem to exist longer-term health benefits of employment among older people".

Its authors said: "This, in turn, indicates that politicians do not face a trade-off between improving the health of the older population, increasing economic growth, decreasing health spending among the elderly and producing solvent pension systems.

"The policy implication is that impediments to continuing paid work in old age should be decreased. This does not necessarily mean that people should be expected to work full-time until they die, but rather that public policy should remove the strong financial incentives to retire at earlier ages."

Philip Booth, editorial and programme director at the Institute of Economic Affairs, said: "Over several decades, governments have failed to deal with the 'demographic time bomb'.

"There is now general agreement that state pension ages should be raised. The Government should take firmer action here and also deregulate labour markets. Working longer will not only be an economic necessity, it also helps people to live healthier lives."

Edward Datnow, chairman of the Age Endeavour Fellowship, said: "In highlighting the positive link between work and health in old age, this research is a wake-up call for the UK's extensive and well-funded retirement lobbies.

"More emphasis needs to be given to ways of enabling a work-life balance beyond today's normal retirement age with legislative discouragements to extending working life being replaced with incentives. There should be no 'normal' retirement age in future.

"More employers need to consider how they will capitalise on Britain's untapped grey potential and those seeking to retire should think very hard about whether it is their best option."

Tuesday, 14 February 2012

'Losing' the world: American decline in perspective, part 1


US foreign policy 'experts' only ever provide an echo chamber for American imperial power. A longer, broader view is necessary

Significant anniversaries are solemnly commemorated – Japan's attack on the US naval base at Pearl Harbor, for example. Others are ignored, and we can often learn valuable lessons from them about what is likely to lie ahead. Right now, in fact.

At the moment, we are failing to commemorate the 50th anniversary of President John F Kennedy's decision to launch the most destructive and murderous act of aggression of the post-second world war period: the invasion of South Vietnam, later all of Indochina, leaving millions dead and four countries devastated, with casualties still mounting from the long-term effects of drenching South Vietnam with some of the most lethal carcinogens known, undertaken to destroy ground cover and food crops.

The prime target was South Vietnam. The aggression later spread to the North, then to the remote peasant society of northern Laos, and finally to rural Cambodia, which was bombed at the stunning level of all allied air operations in the Pacific region during the second world war, including the two atom bombs dropped on Hiroshima and Nagasaki. In this, Henry Kissinger's orders were being carried out – "anything that flies on anything that moves" – a call for genocide that is rare in the historical record. Little of this is remembered. Most was scarcely known beyond narrow circles of activists.

When the invasion was launched 50 years ago, concern was so slight that there were few efforts at justification, hardly more than the president's impassioned plea that "we are opposed around the world by a monolithic and ruthless conspiracy that relies primarily on covert means for expanding its sphere of influence", and if the conspiracy achieves its ends in Laos and Vietnam, "the gates will be opened wide."

Elsewhere, he warned further that "the complacent, the self-indulgent, the soft societies are about to be swept away with the debris of history [and] only the strong … can possibly survive," in this case reflecting on the failure of US aggression and terror to crush Cuban independence.

By the time protest began to mount half a dozen years later, the respected Vietnam specialist and military historian Bernard Fall, no dove, forecast that "Vietnam as a cultural and historic entity … is threatened with extinction … [as] … the countryside literally dies under the blows of the largest military machine ever unleashed on an area of this size." He was again referring to South Vietnam.

When the war ended eight horrendous years later, mainstream opinion was divided between those who described the war as a "noble cause" that could have been won with more dedication, and at the opposite extreme, the critics, to whom it was "a mistake" that proved too costly. By 1977, President Carter aroused little notice when he explained that we owe Vietnam "no debt" because "the destruction was mutual."

There are important lessons in all this for today, even apart from another reminder that only the weak and defeated are called to account for their crimes. One lesson is that to understand what is happening, we should attend not only to critical events of the real world, often dismissed from history, but also to what leaders and elite opinion believe, however tinged with fantasy. Another lesson is that alongside the flights of fancy concocted to terrify and mobilize the public (and perhaps believed by some who are trapped in their own rhetoric), there is also geo-strategic planning based on principles that are rational and stable over long periods because they are rooted in stable institutions and their concerns. That is true in the case of Vietnam, as well. I will return to that, only stressing here that the persistent factors in state action are generally well concealed.

The Iraq war is an instructive case. It was marketed to a terrified public on the usual grounds of self-defense against an awesome threat to survival: the "single question", George W Bush and Tony Blair declared, was whether Saddam Hussein would end his programs of developing weapons of mass destruction. When the single question received the wrong answer, government rhetoric shifted effortlessly to our "yearning for democracy", and educated opinion duly followed course; all routine.

Later, as the scale of the US defeat in Iraq was becoming difficult to suppress, the government quietly conceded what had been clear all along. In 2007-2008, the administration officially announced that a final settlement must grant the US military bases and the right of combat operations, and must privilege US investors in the rich energy system – demands later reluctantly abandoned in the face of Iraqi resistance. And all well kept from the general population.

Gauging American decline

With such lessons in mind, it is useful to look at what is highlighted in the major journals of policy and opinion today. Let us keep to the most prestigious of the establishment journals, Foreign Affairs. The headline blaring on the cover of the December 2011 issue reads in bold face: "Is America Over?"

The title article calls for "retrenchment" in the "humanitarian missions" abroad that are consuming the country's wealth, so as to arrest the American decline that is a major theme of international affairs discourse, usually accompanied by the corollary that power is shifting to the East, to China and (maybe) India.

The lead articles are on Israel-Palestine. The first, by two high Israeli officials, is entitled "The Problem is Palestinian Rejection": the conflict cannot be resolved because Palestinians refuse to recognize Israel as a Jewish state – thereby conforming to standard diplomatic practice: states are recognized, but not privileged sectors within them. The demand is hardly more than a new device to deter the threat of political settlement that would undermine Israel's expansionist goals.

The opposing position, defended by an American professor, is entitled "The Problem Is the Occupation." The subtitle reads "How the Occupation is Destroying the Nation." Which nation? Israel, of course. The paired articles appear under the heading "Israel under Siege".

The January 2012 issue features yet another call to bomb Iran now, before it is too late. Warning of "the dangers of deterrence", the author suggests that:
"[S]keptics of military action fail to appreciate the true danger that a nuclear-armed Iran would pose to US interests in the Middle East and beyond. And their grim forecasts assume that the cure would be worse than the disease – that is, that the consequences of a US assault on Iran would be as bad as or worse than those of Iran achieving its nuclear ambitions. But that is a faulty assumption. The truth is that a military strike intended to destroy Iran's nuclear program, if managed carefully, could spare the region and the world a very real threat and dramatically improve the long-term national security of the United States."

Others argue that the costs would be too high, and at the extremes, some even point out that an attack would violate international law – as does the stand of the moderates, who regularly deliver threats of violence, in violation of the UN Charter.

Let us review these dominant concerns in turn.

American decline is real, though the apocalyptic vision reflects the familiar ruling-class perception that anything short of total control amounts to total disaster. Despite the piteous laments, the US remains the world's dominant power by a large margin, and no competitor is in sight, not only in the military dimension, in which, of course, the US reigns supreme.

China and India have recorded rapid (though highly inegalitarian) growth, but remain very poor countries, with enormous internal problems not faced by the West. China is the world's major manufacturing center, but largely as an assembly plant for the advanced industrial powers on its periphery and for western multinationals. That is likely to change over time. Manufacturing regularly provides the basis for innovation, often breakthroughs, as is now sometimes happening in China. One example that has impressed western specialists is China's takeover of the growing global solar panel market, not on the basis of cheap labor, but by coordinated planning and, increasingly, innovation.

But the problems China faces are serious. Some are demographic, reviewed in Science, the leading US science weekly. The study shows that mortality sharply decreased in China during the Maoist years, "mainly a result of economic development and improvements in education and health services, especially the public hygiene movement that resulted in a sharp drop in mortality from infectious diseases." This progress ended with the initiation of the capitalist reforms 30 years ago, and the death rate has since increased.

Furthermore, China's recent economic growth has relied substantially on a "demographic bonus", a very large working-age population. "But the window for harvesting this bonus may close soon," with a "profound impact on development": "Excess cheap labor supply, which is one of the major factors driving China's economic miracle, will no longer be available."

Demography is only one of many serious problems ahead. For India, the problems are far more severe.

Not all prominent voices foresee American decline. Among international media, there is none more serious and responsible than the London Financial Times. It recently devoted a full page to the optimistic expectation that new technology for extracting North American fossil fuels might allow the US to become energy-independent, hence to retain its global hegemony for a century. There is no mention of the kind of world the US would rule in this happy event, but not for lack of evidence.

At about the same time, the International Energy Agency reported that, with rapidly increasing carbon emissions from fossil fuel use, the limit of safety will be reached by 2017, if the world continues on its present course. "The door is closing," the IEA chief economist said, and very soon it "will be closed forever".

Shortly before, the US Department of Energy had reported the most recent carbon dioxide emissions figures, which "jumped by the biggest amount on record" to a level higher than the worst-case scenario anticipated by the Intergovernmental Panel on Climate Change (IPCC). That came as no surprise to many scientists, including the MIT program on climate change, which for years has warned that the IPCC predictions are too conservative.

Such critics of the IPCC predictions receive virtually no public attention, unlike the fringe of denialists who are supported by the corporate sector, along with huge propaganda campaigns that have driven Americans off the international spectrum in dismissal of the threats. Business support also translates directly to political power. Denialism is part of the catechism that must be intoned by Republican candidates in the farcical election campaign now in progress, and in Congress, they are powerful enough to abort even efforts to inquire into the effects of global warming, let alone do anything serious about it.

In brief, American decline can perhaps be stemmed if we abandon hope for decent survival – prospects that are all too real, given the balance of forces in the world.

'Losing' China and Vietnam

Putting such unpleasant thoughts aside, a close look at American decline shows that China indeed plays a large role, as it has for 60 years. The decline that now elicits such concern is not a recent phenomenon. It traces back to the end of the second world war, when the US had half the world's wealth and incomparable security and global reach. Planners were naturally well aware of the enormous disparity of power, and intended to keep it that way.

The basic viewpoint was outlined with admirable frankness in a major state paper of 1948 (PPS 23). The author was one of the architects of the "new world order" of the day, the chair of the State Department policy planning staff, the respected statesman and scholar George Kennan, a moderate dove within the planning spectrum. He observed that the central policy goal was to maintain the "position of disparity" that separated our enormous wealth from the poverty of others. To achieve that goal, he advised, "We should cease to talk about vague and … unreal objectives such as human rights, the raising of the living standards, and democratization," and must "deal in straight power concepts", not "hampered by idealistic slogans" about "altruism and world-benefaction."

Kennan was referring specifically to Asia, but the observations generalize, with exceptions, for participants in the US-run global system. It was well understood that the "idealistic slogans" were to be displayed prominently when addressing others, including the intellectual classes, who were expected to promulgate them.

The plans that Kennan helped formulate and implement took for granted that the US would control the western hemisphere, the Far East, the former British empire (including the incomparable energy resources of the Middle East), and as much of Eurasia as possible, crucially its commercial and industrial centers. These were not unrealistic objectives, given the distribution of power. But decline set in at once.

In 1949, China declared independence, an event known in Western discourse as "the loss of China" – in the US, with bitter recriminations and conflict over who was responsible for that loss. The terminology is revealing. It is only possible to lose something that one owns. The tacit assumption was that the US owned China, by right, along with most of the rest of the world, much as postwar planners assumed.

The "loss of China" was the first major step in "America's decline". It had major policy consequences. One was the immediate decision to support France's effort to reconquer its former colony of Indochina, so that it, too, would not be "lost".

Indochina itself was not a major concern, despite claims about its rich resources by President Eisenhower and others. Rather, the concern was the "domino theory", which is often ridiculed when dominoes don't fall, but remains a leading principle of policy because it is quite rational. To adopt Henry Kissinger's version, a region that falls out of control can become a "virus" that will "spread contagion", inducing others to follow the same path.

In the case of Vietnam, the concern was that the virus of independent development might infect Indonesia, which really does have rich resources. And that might lead Japan – the "superdomino" as it was called by the prominent Asia historian John Dower – to "accommodate" to an independent Asia as its technological and industrial center in a system that would escape the reach of US power. That would mean, in effect, that the US had lost the Pacific phase of the second world war, fought to prevent Japan's attempt to establish such a new order in Asia.

The way to deal with such a problem is clear: destroy the virus and "inoculate" those who might be infected. In the Vietnam case, the rational choice was to destroy any hope of successful independent development and to impose brutal dictatorships in the surrounding regions. Those tasks were successfully carried out – though history has its own cunning, and something similar to what was feared has since been developing in East Asia, much to Washington's dismay.

The most important victory of the Indochina wars was in 1965, when a US-backed military coup in Indonesia led by General Suharto carried out massive crimes that were compared by the CIA to those of Hitler, Stalin, and Mao. The "staggering mass slaughter", as the New York Times described it, was reported accurately across the mainstream, and with unrestrained euphoria.

It was "a gleam of light in Asia", as the noted liberal commentator James Reston wrote in the Times. The coup ended the threat of democracy by demolishing the mass-based political party of the poor, established a dictatorship that went on to compile one of the worst human rights records in the world, and threw the riches of the country open to western investors. Small wonder that, after many other horrors, including the near-genocidal invasion of East Timor, Suharto was welcomed by the Clinton administration in 1995 as "our kind of guy".

Years after the great events of 1965, Kennedy-Johnson national security adviser McGeorge Bundy reflected that it would have been wise to end the Vietnam war at that time, with the "virus" virtually destroyed and the primary domino solidly in place, buttressed by other US-backed dictatorships throughout the region.

Similar procedures have been routinely followed elsewhere. Kissinger was referring specifically to the threat of socialist democracy in Chile. That threat was ended on another forgotten date, what Latin Americans call "the first 9/11", which in violence and bitter effects far exceeded the 9/11 commemorated in the west. A vicious dictatorship was imposed in Chile, one part of a plague of brutal repression that spread through Latin America, reaching Central America under Reagan. Viruses have aroused deep concern elsewhere as well, including the Middle East, where the threat of secular nationalism has often concerned British and US planners, inducing them to support radical Islamic fundamentalism to counter it.

The concentration of wealth and American decline

Despite such victories, American decline continued. By 1970, US share of world wealth had dropped to about 25%, roughly where it remains, still colossal but far below the end of the second world war. By then, the industrial world was "tripolar": US-based North America, German-based Europe, and East Asia, already the most dynamic industrial region, at the time Japan-based, but by now including the former Japanese colonies Taiwan and South Korea, and, more recently, China.

At about that time, American decline entered a new phase: conscious self-inflicted decline. From the 1970s, there has been a significant change in the US economy, as planners, private and state, shifted it toward financialization and the offshoring of production, driven in part by the declining rate of profit in domestic manufacturing. These decisions initiated a vicious cycle in which wealth became highly concentrated (dramatically so in the top 0.1% of the population), yielding concentration of political power, hence legislation to carry the cycle further: taxation and other fiscal policies, deregulation, changes in the rules of corporate governance allowing huge gains for executives, and so on.

Meanwhile, for the majority, real wages largely stagnated, and people were able to get by only by sharply increased workloads (far beyond Europe), unsustainable debt, and repeated bubbles since the Reagan years, creating paper wealth that inevitably disappeared when they burst (and the perpetrators were bailed out by the taxpayer). In parallel, the political system has been increasingly shredded as both parties are driven deeper into corporate pockets with the escalating cost of elections – the Republicans to the level of farce, the Democrats (now largely the former "moderate Republicans") not far behind.

A recent study by the Economic Policy Institute, which has been the major source of reputable data on these developments for years, is entitled Failure by Design. The phrase "by design" is accurate. Other choices were certainly possible. And as the study points out, the "failure" is class-based. There is no failure for the designers. Far from it. Rather, the policies are a failure for the large majority, the 99% in the imagery of the Occupy movements – and for the country, which has declined and will continue to do so under these policies.

One factor is the offshoring of manufacturing. As the solar panel example mentioned earlier illustrates, manufacturing capacity provides the basis and stimulus for innovation leading to higher stages of sophistication in production, design, and invention. That, too, is being outsourced, not a problem for the "money mandarins" who increasingly design policy, but a serious problem for working people and the middle classes, and a real disaster for the most oppressed, African Americans, who have never escaped the legacy of slavery and its ugly aftermath, and whose meager wealth virtually disappeared after the collapse of the housing bubble in 2008, setting off the most recent financial crisis, the worst so far.

American Decline in Perspective, Part 2
By Noam Chomsky

In the years of conscious, self-inflicted decline at home, “losses” continued to mount elsewhere.  In the past decade, for the first time in 500 years, South America has taken successful steps to free itself from western domination, another serious loss. The region has moved towards integration, and has begun to address some of the terrible internal problems of societies ruled by mostly Europeanized elites, tiny islands of extreme wealth in a sea of misery.  Its countries have also rid themselves of all U.S. military bases and of IMF controls.  A newly formed organization, CELAC, includes all countries of the hemisphere apart from the U.S. and Canada.  If it actually functions, that would be another step in American decline, in this case in what has always been regarded as “the backyard.”

Even more serious would be the loss of the MENA countries -- Middle East/North Africa -- which have been regarded by planners since the 1940s as “a stupendous source of strategic power, and one of the greatest material prizes in world history.” Control of MENA energy reserves would yield “substantial control of the world,” in the words of the influential Roosevelt advisor A.A. Berle.
To be sure, if the projections of a century of U.S. energy independence based on North American energy resources turn out to be realistic, the significance of controlling MENA would decline somewhat, though probably not by much: the main concern has always been control more than access.  However, the likely consequences to the planet’s equilibrium are so ominous that discussion may be largely an academic exercise.

The Arab Spring, another development of historic importance, might portend at least a partial “loss” of MENA.  The US and its allies have tried hard to prevent that outcome -- so far, with considerable success.  Their policy towards the popular uprisings has kept closely to the standard guidelines: support the forces most amenable to U.S. influence and control.

Favored dictators are supported as long as they can maintain control (as in the major oil states).  When that is no longer possible, then discard them and try to restore the old regime as fully as possible (as in Tunisia and Egypt).  The general pattern is familiar: Somoza, Marcos, Duvalier, Mobutu, Suharto, and many others.  In one case, Libya, the three traditional imperial powers intervened by force to participate in a rebellion to overthrow a mercurial and unreliable dictator, opening the way, it is expected, to more efficient control over Libya’s rich resources (oil primarily, but also water, of particular interest to French corporations), to a possible base for the U.S. Africa Command (so far restricted to Germany), and to the reversal of growing Chinese penetration.  As far as policy goes, there have been few surprises.
Crucially, it is important to reduce the threat of functioning democracy, in which popular opinion will significantly influence policy.  That again is routine, and quite understandable.  A look at the studies of public opinion undertaken by U.S. polling agencies in the MENA countries easily explains the western fear of authentic democracy.

Israel and the Republican Party

Similar considerations carry over directly to the second major concern addressed in the issue of Foreign Affairs cited in part one of this piece: the Israel-Palestine conflict.   Fear of democracy could hardly be more clearly exhibited than in this case.  In January 2006, an election took place in Palestine, pronounced free and fair by international monitors. 

The instant reaction of the U.S. (and of course Israel), with Europe following along politely, was to impose harsh penalties on Palestinians for voting the wrong way.
That is no innovation.  It is quite in accord with the general and unsurprising principle recognized by mainstream scholarship: the U.S. supports democracy if, and only if, the outcomes accord with its strategic and economic objectives, the rueful conclusion of neo-Reaganite Thomas Carothers, the most careful and respected scholarly analyst of “democracy promotion” initiatives.

More broadly, for 35 years the U.S. has led the rejectionist camp on Israel-Palestine, blocking an international consensus calling for a political settlement in terms too well known to require repetition.  The western mantra is that Israel seeks negotiations without preconditions, while the Palestinians refuse.  The opposite is more accurate.  The U.S. and Israel demand strict preconditions, which are, furthermore, designed to ensure that negotiations will lead either to Palestinian capitulation on crucial issues, or nowhere.

The first precondition is that the negotiations must be supervised by Washington, which makes about as much sense as demanding that Iran supervise the negotiation of Sunni-Shia conflicts in Iraq.  Serious negotiations would have to be under the auspices of some neutral party, preferably one that commands some international respect, perhaps Brazil.  The negotiations would seek to resolve the conflicts between the two antagonists: the U.S.-Israel on one side, most of the world on the other.

The second precondition is that Israel must be free to expand its illegal settlements in the West Bank.  Theoretically, the U.S. opposes these actions, but with a very light tap on the wrist, while continuing to provide economic, diplomatic, and military support.  When the U.S. does have some limited objections, it very easily bars the actions, as in the case of the E-1 project linking Greater Jerusalem to the town of Ma’aleh Adumim, virtually bisecting the West Bank, a very high priority for Israeli planners (across the spectrum), but raising some objections in Washington, so that Israel has had to resort to devious measures to chip away at the project.

The pretense of opposition reached the level of farce last February when Obama vetoed a Security Council resolution calling for implementation of official U.S. policy (also adding the uncontroversial observation that the settlements themselves are illegal, quite apart from expansion).  Since that time there has been little talk about ending settlement expansion, which continues, with studied provocation.

Thus, as Israeli and Palestinian representatives prepared to meet in Jordan in January 2012, Israel announced new construction in Pisgat Ze’ev and Har Homa, West Bank areas that it has declared to be within the greatly expanded area of Jerusalem, annexed, settled, and constructed as Israel’s capital, all in violation of direct Security Council orders.  Other moves carry forward the grander design of separating whatever West Bank enclaves will be left to Palestinian administration from the cultural, commercial, and political center of Palestinian life in the former Jerusalem.

It is understandable that Palestinian rights should be marginalized in U.S. policy and discourse.  Palestinians have no wealth or power.  They offer virtually nothing to U.S. policy concerns; in fact, they have negative value, as a nuisance that stirs up “the Arab street.”

Israel, in contrast, is a valuable ally.  It is a rich society with a sophisticated, largely militarized high-tech industry.  For decades, it has been a highly valued military and strategic ally, particularly since 1967, when it performed a great service to the U.S. and its Saudi ally by destroying the Nasserite “virus,” establishing the “special relationship” with Washington in the form that has persisted since.  It is also a growing center for U.S. high-tech investment.  In fact, high tech and particularly military industries in the two countries are closely linked.

Apart from such elementary considerations of great power politics as these, there are cultural factors that should not be ignored.  Christian Zionism in Britain and the U.S. long preceded Jewish Zionism, and has been a significant elite phenomenon with clear policy implications (including the Balfour Declaration, which drew from it).  When General Allenby conquered Jerusalem during World War I, he was hailed in the American press as Richard the Lion-Hearted, who had at last won the Crusades and driven the pagans out of the Holy Land.

The next step was for the Chosen People to return to the land promised to them by the Lord.  Articulating a common elite view, President Franklin Roosevelt’s Secretary of the Interior Harold Ickes described Jewish colonization of Palestine as an achievement “without comparison in the history of the human race.” Such attitudes find their place easily within the Providentialist doctrines that have been a strong element in popular and elite culture since the country’s origins: the belief that God has a plan for the world and the U.S. is carrying it forward under divine guidance, as articulated by a long list of leading figures.

Moreover, evangelical Christianity is a major popular force in the U.S.  Further toward the extremes, End Times evangelical Christianity also has enormous popular outreach, invigorated by the establishment of Israel in 1948, revitalized even more by the conquest of the rest of Palestine in 1967 -- all signs that End Times and the Second Coming are approaching.

These forces have become particularly significant since the Reagan years, as the Republicans have abandoned the pretense of being a political party in the traditional sense, while devoting themselves in virtual lockstep uniformity to servicing a tiny percentage of the super-rich and the corporate sector.  However, the small constituency that is primarily served by the reconstructed party cannot provide votes, so they have to turn elsewhere.

The only choice is to mobilize tendencies that have always been present, though rarely as an organized political force: primarily nativists trembling in fear and hatred, and religious elements that are extremists by international standards but not in the U.S.  One outcome is reverence for alleged Biblical prophecies, hence not only support for Israel and its conquests and expansion, but passionate love for Israel, another core part of the catechism that must be intoned by Republican candidates -- with Democrats, again, not too far behind.

These factors aside, it should not be forgotten that the “Anglosphere” -- Britain and its offshoots -- consists of settler-colonial societies, which rose on the ashes of indigenous populations, suppressed or virtually exterminated.  Past practices must have been basically correct, in the U.S. case even ordained by Divine Providence.  Accordingly there is often an intuitive sympathy for the children of Israel when they follow a similar course.  But primarily, geostrategic and economic interests prevail, and policy is not graven in stone.

The Iranian “Threat” and the Nuclear Issue

Let us turn finally to the third of the leading issues addressed in the establishment journals cited earlier, the “threat of Iran.” Among elites and the political class this is generally taken to be the primary threat to world order -- though not among populations.  In Europe, polls show that Israel is regarded as the leading threat to peace.  In the MENA countries, that status is shared with the U.S., to the extent that in Egypt, on the eve of the Tahrir Square uprising, 80% felt that the region would be more secure if Iran had nuclear weapons.  The same polls found that only 10% regard Iran as a threat -- unlike the ruling dictators, who have their own concerns.

In the United States, before the massive propaganda campaigns of the past few years, a majority of the population agreed with most of the world that, as a signatory of the Non-Proliferation Treaty, Iran has a right to carry out uranium enrichment.  And even today, a large majority favors peaceful means for dealing with Iran.  There is even strong opposition to military engagement if Iran and Israel are at war.  Only a quarter regard Iran as an important concern for the U.S. altogether.  But it is not unusual for there to be a gap, often a chasm, dividing public opinion and policy.

Why exactly is Iran regarded as such a colossal threat? The question is rarely discussed, but it is not hard to find a serious answer -- though not, as usual, in the fevered pronouncements.  The most authoritative answer is provided by the Pentagon and the intelligence services in their regular reports to Congress on global security.  They report that Iran does not pose a military threat.  Its military spending is very low even by the standards of the region, minuscule of course in comparison with the U.S.

Iran has little capacity to deploy force.  Its strategic doctrines are defensive, designed to deter invasion long enough for diplomacy to set in.  If Iran is developing nuclear weapons capability, they report, that would be part of its deterrence strategy.  No serious analyst believes that the ruling clerics are eager to see their country and possessions vaporized, the immediate consequence of their coming even close to initiating a nuclear war.  And it is hardly necessary to spell out the reasons why any Iranian leadership would be concerned with deterrence, under existing circumstances.

The regime is doubtless a serious threat to much of its own population -- and regrettably, is hardly unique on that score.  But the primary threat to the U.S. and Israel is that Iran might deter their free exercise of violence.  A further threat is that the Iranians clearly seek to extend their influence to neighboring Iraq and Afghanistan, and beyond as well.  Those “illegitimate” acts are called “destabilizing” (or worse).  In contrast, forceful imposition of U.S. influence halfway around the world contributes to “stability” and order, in accord with traditional doctrine about who owns the world.

It makes very good sense to try to prevent Iran from joining the nuclear weapons states, including the three that have refused to sign the Non-Proliferation Treaty -- Israel, India, and Pakistan, all of which have been assisted in developing nuclear weapons by the U.S., and are still being assisted by it.  It is not impossible to approach that goal by peaceful diplomatic means.  One approach, which enjoys overwhelming international support, is to undertake meaningful steps towards establishing a nuclear weapons-free zone in the Middle East, including Iran and Israel (and applying as well to U.S. forces deployed there), better still extending to South Asia.

Support for such efforts is so strong that the Obama administration has been compelled to formally agree, but with reservations: crucially, that Israel’s nuclear program must not be placed under the auspices of the International Atomic Energy Agency, and that no state (meaning the U.S.) should be required to release information about “Israeli nuclear facilities and activities, including information pertaining to previous nuclear transfers to Israel.” Obama also accepts Israel’s position that any such proposal must be conditional on a comprehensive peace settlement, which the U.S. and Israel can continue to delay indefinitely.

This survey comes nowhere near being exhaustive, needless to say. Among major topics not addressed is the shift of U.S. military policy towards the Asia-Pacific region, with new additions to the huge military base system underway right now, in Jeju Island off South Korea and Northwest Australia, all elements of the policy of “containment of China.” Closely related is the issue of U.S. bases in Okinawa, bitterly opposed by the population for many years, and a continual crisis in U.S.-Tokyo-Okinawa relations.
Revealing how little fundamental assumptions have changed, U.S. strategic analysts describe the result of China’s military programs as a “classic 'security dilemma,' whereby military programs and national strategies deemed defensive by their planners are viewed as threatening by the other side,” writes Paul Godwin of the Foreign Policy Research Institute.  The security dilemma arises over control of the seas off China’s coasts.  The U.S. regards its policies of controlling these waters as “defensive,” while China regards them as threatening; correspondingly, China regards its actions in nearby areas as “defensive” while the U.S. regards them as threatening.   No such debate is even imaginable concerning U.S. coastal waters.  This “classic security dilemma” makes sense, again, on the assumption that the U.S. has a right to control most of the world, and that U.S. security requires something approaching absolute global control.
While the principles of imperial domination have undergone little change, the capacity to implement them has markedly declined as power has become more broadly distributed in a diversifying world.  Consequences are many.  It is, however, very important to bear in mind that -- unfortunately -- none lifts the two dark clouds that hover over all consideration of global order: nuclear war and environmental catastrophe, both literally threatening the decent survival of the species.
Quite the contrary: both threats are ominous, and increasing.