
Friday 16 June 2023

Fallacies of Capitalism 1: Inevitability of Inequality

How does the 'inevitability of inequality' fallacy ignore the role of social and institutional factors in perpetuating the unequal distribution of wealth and opportunities in a capitalist system?


The "inevitability of inequality" fallacy suggests that inequality is a natural and unavoidable outcome of a capitalist system, implying that it is inherently fair and just. However, this fallacy ignores the significant role of social and institutional factors that contribute to the unequal distribution of wealth and opportunities. Let me break it down with some simple examples:

  1. Unequal starting points: In a capitalist system, individuals have different starting points due to factors like family wealth, education, and social connections. These disparities make it harder for those with fewer resources to compete on an equal footing. For instance, imagine two children who want to become doctors. One child comes from a wealthy family with access to the best schools and tutors, while the other child comes from a low-income family and attends underfunded schools. The unequal starting points put the second child at a significant disadvantage, limiting their opportunities for success.

  2. Discrimination and bias: Social factors such as discrimination based on race, gender, or socioeconomic status can perpetuate inequality. Discrimination may lead to unequal treatment in hiring practices, education, or access to resources. For example, imagine a qualified job applicant who is denied a position because of their gender or ethnicity, while a less qualified candidate from a privileged background is chosen. Discrimination hinders individuals' ability to succeed and reinforces inequality in society.

  3. Power imbalances: Capitalist systems often concentrate power and wealth in the hands of a few individuals or corporations. These powerful entities can influence policies, regulations, and institutions to their advantage, further perpetuating inequality. For instance, consider a large corporation that has significant political influence. They may lobby for policies that favour their interests, such as tax breaks or deregulation, while undermining measures that could reduce inequality, such as progressive taxation or workers' rights.

  4. Lack of social mobility: Inequality can persist if social and institutional factors make it difficult for individuals to move up the social ladder. For example, imagine a society where access to quality education is primarily determined by wealth. If children from low-income families are unable to receive a good education, it becomes challenging for them to break the cycle of poverty and improve their economic prospects. This lack of social mobility reinforces existing inequalities over generations.

These examples demonstrate that the "inevitability of inequality" fallacy overlooks the social and institutional factors that contribute to the unequal distribution of wealth and opportunities in a capitalist system. By recognising these factors and working towards creating a more equitable society, we can address and reduce the systemic barriers that perpetuate inequality.

Thursday 18 February 2021

Why economists kept getting the policies wrong

 Philip Stephens in The FT


The other week I caught sight of a headline declaring that the IMF was warning against cuts in public spending and borrowing. The report stopped me in my tracks. After half a century or so as keeper of the sacred flame of fiscal prudence, the IMF was telling policymakers in rich industrial nations they should not fret overmuch about huge build-ups of public debt during the Covid-19 crisis. John Maynard Keynes had been disinterred, and the world turned upside down. 

To be clear, there is nothing irresponsible about the IMF’s advice that policymakers in advanced economies should prioritise a restoration of growth after the deflationary shock of the pandemic. The fund presaged the shift last year, and most people would say it was common sense to allow economic recovery to take hold. Nations such as Britain might have learned that lesson from the damage inflicted by the ill-judged austerity programme imposed by David Cameron’s government after the 2008 financial crash. 

And yet. This was the IMF speaking — the hallowed (for some, hated) institution that, as many Brits will recall, formally read the last rites over Keynesianism when in 1976 it forced James Callaghan’s Labour government to impose politically calamitous cuts in spending and borrowing. This is the organisation that in the intervening years had a few simple answers to any economic problem you care to think of: fiscal retrenchment, a smaller state and/or market liberalisation. The prescription became known as the Washington consensus, after the IMF’s location.  

My first job after joining the Financial Times during the early 1980s was to learn the language of the new economic orthodoxy. Kindly officials at the UK Treasury explained to me that the technique of using fiscal policy to manage demand, put to rest in 1976, had been replaced by a new theory. Monetarism decreed that as long as the authorities kept control of the money supply, and thus inflation, everything would be fine. 

The snag was that every time the Treasury alighted on a particular measure of the money supply to target — sterling M3, PSL2 and M0 come to mind — it ceased to be a reliable guide to price changes. This came to be called Goodhart’s law, after the economist Charles Goodhart. By the end of the 1980s, monetarism had been ditched, and targeting the exchange rate had become the holy grail. If sterling’s rate was fixed against the Deutschmark, the UK would import stability from Germany.  

It was about this time that a senior aide to the chancellor took me to one side to explain that one of the great skills of the Treasury was to perform perfect U-turns while persuading the world it had deviated not a jot from previous policy. This proved its worth again when the exchange rate policy was blown up by sterling’s ejection from the European exchange rate mechanism in 1992. The currency peg was quickly replaced by an inflation target as the new infallible lodestar of policy. 

The eternal truths amid the missteps and swerves were that public spending and borrowing were bad, tax cuts were good, and market liberalisation was the route to sunlit uplands. The pound’s ERM debacle was followed by a ferocious budgetary squeeze, and, across the Channel, the eurozone was designed to fit a fiscal straitjacket. Financial market deregulation, we were told, oiled the wheels of globalisation. If madcap profits and bonuses at big financial institutions prompted unease, the answer was that markets would self-correct. Britain’s Labour government backed “light-touch” regulation in the 2000s. The Bank of England reduced its oversight of systemic financial stability. 

The abiding sin threaded through it all was that of certitude. Perfectly plausible but untested theories, whether about the money supply, fiscal balances and debt levels, or market risk, were elevated to the level of irrefutable facts. Economics, essentially a faith-based discipline, represented itself as a hard science. The real world was reduced by the 1990s to a set of complex mathematical equations that no one, least of all democratically elected politicians, dared challenge. 

Thus detached from reality, economic policy swept away the postwar balance between the interests of society and markets. Arid econometrics replaced a measured understanding of political economy. It scarcely mattered that the gains of globalisation were scooped up by the super-rich, that markets became casinos and that fiscal fundamentalism was widening social divisions. Nothing counted above the equations. And now? After Donald Trump, Brexit and Covid-19, it seems we are back at the beginning. Time to dust off Keynes’s General Theory.

Sunday 22 October 2017

Oxbridge bashing is an empty ritual if we ignore wider social inequities

Priyamvada Gopal in The Guardian

The numbers are clearly unacceptable. Several colleges in both Oxford and Cambridge frequently admit cohorts with no black students in them at all. Roughly 1.5% of total offers are made to black British applicants and more than 80% of offers are made to the children of the top two social classes. With offers made overwhelmingly to those in London and a handful of the home counties, both universities are consistently excluding entire ethnic and regional demographics. They also continue to admit a grotesquely disproportionate number of privately schooled students. In effect, the two ancients are running a generous quota scheme for white students, independent schools and the offspring of affluent south-eastern English parents. 

There is undoubtedly a great deal that both institutions can and must do to remedy this. Our admissions processes at Cambridge are not sufficiently responsive to the gravity of the situation. Despite periodic panics in response to such media “revelations” or staged political scolding, and notwithstanding the good intentions of many involved in admissions, questions of diversity and inclusion are not taken seriously enough in their own right.

The focus on educational achievement, itself defined in purely numerical terms and worsened by internal league tables, means there is little sense of meaningful diversity as an educational and community good in its own right. Despite having contextual indicators that would allow us to diversify our admissions, we balk at non-traditional attainment profiles for fear that the student will not be able to cope once here.

For any Oxbridge college not to have a single black student at any given point in time – when it would rightly not tolerate comparably low numbers of women – does not just look institutionally racist; it also impoverishes the educational and social environment we provide. The same holds true for regional and class exclusions.

When I first came to Cambridge in 2001, having taught at different institutions in the US, I was struck by the relative whiteness and sheer cultural homogeneity of this university. Even the minimal improvements I’ve seen in the years since – more students from ethnic minority backgrounds, more young women from northern comprehensives – have made a huge difference both to me as a teacher and, more importantly, to what students are able to learn from each other.

Not all of them will get first-class marks, but they both gain a lot from, and have a great deal to give to, the educational environment here, not least by expanding the definition of what counts as achievement. We need more of them. (At Cambridge in recent years, a contingent of vocal BME students, along with students from northern comprehensives, has demanded change, often to good effect. There is some cause for hope.)

There is also undoubtedly a culture of denial when it comes to matters of race and racism, which students speak of both in class and privately, and which I have experienced when I’ve tried to draw attention to them. And more than one student from a northern comprehensive has told me about being discouraged by teachers from applying, feeling amazed to have received an offer, and then feeling alienated by the stultifying class conformity of the affluent south-east once they got here.

It is simply not good enough for Oxford and Cambridge to say that they are welcoming of diversity and in effect blame certain demographics for not applying despite their outreach programmes. It is Oxbridge that must change more substantially to provide a better environment for a diverse student body. The two ancients must be held to account; homogeneity must fall.

But should they be the only ones held to account? In having a necessary conversation about elitism and exclusion, are we forgetting – or being encouraged to forget – to have a larger one about wider deprivation and systemic inequality? It is striking that some of the quarters only too happy to attack Oxbridge periodically for its failings, from rightwing tabloids to Tory ministers, are rarely interested in the roots of the inequality and lack of opportunity of which Oxbridge exclusion is a symptom, not the origin.

We should be careful that a headline-friendly focus on these two institutions alone does not become an easy way to avoid even more painful and challenging questions. It seems somewhat selective and inadequate to focus on what David Lammy rightly calls “social apartheid” at Oxbridge without discussing the widespread and worsening economic apartheid in this country.

We know that access to university education in general is sharply determined by school achievement that, in turn, is shaped by parental income and education levels. In an economically stratified society, it is inevitable that most young people from economically deprived backgrounds have a substantially lower chance of achieving the kind of marks that enable access to higher education.

Hence it is incoherent to have a discussion about access to higher education without having one simultaneously about economic disadvantage, which, in some cases, including British Caribbean and Bangladeshi communities, has an added ethnic minority dimension to it. In a context of worsening economic fault lines, there’s a whiff of something convenient about only attacking the admissions failings of top universities.

The other obvious missing dimension to this discussion is the existence and encouragement of independent schools. It’s somewhat contradictory to encourage a market culture where money can buy a deluxe education and then feel shocked when the well-off get their money’s worth by easily meeting the requirements for offers from high-status institutions. It’s worth saying that as long as independent schools – hardly bastions of ethnic diversity – exist, there will remain a fundamental apartheid between two kinds of students.

Oxbridge, or even the Russell Group of universities more broadly, can only do so much to mitigate this state of affairs, which lifting the tuition fee cap will only worsen. Lammy notes that more offers are made to Eton than to students on free school meals.

But why not also question the very existence of Eton, and the lamentable state of an economic order that necessitates free school meals for many? Add to this the parlous condition of state education, with its chronic underfunding, inflated class sizes, an undermining culture of testing and targets, and difficulties in recruiting and retaining good teachers.

The same politicians who rightly point to Oxbridge’s demographic narrowness are rarely willing to grasp the nettle of a two-tier educational structure in which some are destined to do much better than others. Who, for instance, would be willing to call for the abolition of private schooling, subject as such a suggestion would be to shrill denunciations about how individual choice, personal aspiration and the workings of the market are being interfered with?

There are other tough discussions that could be had if the aim truly is to address and undo inequalities in university demographics. Would politicians and institutions be willing, for instance, to impose representational quotas for both ethnic minorities and state-educated students that reflect the national pie-chart?

Currently, the Office for Fair Access (Offa) makes some toothless demands around “widening participation” – a rather feeble phrase – which are not accompanied by penalties for failure. Lammy, whose suggestion that admissions be centralised has some merit, not least towards undoing the unhelpful internal collegiate caste system at Oxbridge, has also drawn a comparison between Oxbridge’s abysmal intake of black students and Harvard’s healthy numbers.

Would the political and intellectual classes be willing to have a discussion about something like “affirmative action” in the US, a process of “positive discrimination” by which underrepresented ethnic minorities and disadvantaged groups are given special consideration? We must hope so. Failing a wide-ranging discussion aimed at radical measures, all the huffing and puffing about Oxbridge is destined to remain a yearly ritual, each controversial headline simply making way for the same unsurprising ones the following year.

Friday 16 June 2017

With Grenfell Tower, we’ve seen what ‘ripping up red tape’ really looks like

George Monbiot in The Guardian

For years successive governments have built what they call a bonfire of regulations. They have argued that “red tape” impedes our freedom and damages productivity. Britain, they have assured us, would be a better place with fewer forms to fill in, fewer inspections and less enforcement.

But what they call red tape often consists of essential public protections that defend our lives, our futures and the rest of the living world. The freedom they celebrate is highly selective: in many cases it means the freedom of the rich to exploit the poor, of corporations to exploit their workers, of landlords to exploit their tenants and of industry of all kinds to use the planet as its dustbin. As RH Tawney remarked, “Freedom for the pike is death for the minnows.”

It will be a long time before we know exactly what caused the horrific fire at Grenfell Tower, and why it was able to rage so freely, with such devastating loss of life. But at this stage it seems likely that the rapidity with which the fire spread was either caused or exacerbated by the cladding with which the tower was refurbished.

There have been plenty of warnings that cladding can present a severe fire risk. To give just one example, in 1999 the House of Commons select committee on environment, transport and rural affairs published a report entitled Potential Risk of Fire Spread in Buildings Via External Cladding Systems.

But both Conservative and New Labour governments have been highly reluctant to introduce new public protections, even when the need is pressing. They have been highly amenable to tearing down existing protections at the behest of trade associations and corporate lobbyists. Deregulation of this kind is a central theme of the neoliberal ideology to which both the Conservatives and Labour under Tony Blair succumbed.

In 2014, the then housing minister (who is now the immigration minister), Brandon Lewis, rejected calls to force construction companies to fit sprinklers in the homes they built on the following grounds:


“In our commitment to be the first Government to reduce regulation, we have introduced the one in, two out rule for regulation … Under that rule, when the Government introduce a regulation, we will identify two existing ones to be removed. The Department for Communities and Local Government has gone further and removed an even higher proportion of regulations. In that context, Members will understand why we want to exhaust all non-regulatory options before we introduce any new regulations.”

In other words, though he accepted that sprinklers “are an effective way of controlling fires and of protecting lives and property”, to oblige builders to introduce them would conflict with the government’s deregulatory agenda. Instead, it would be left to the owners of buildings to decide how best to address the fire risk: “Those with responsibility for ensuring fire safety in their businesses, in their homes or as landlords, should and must make informed decisions on how best to manage the risks in their own properties,” Lewis said.

This calls to mind the Financial Times journalist Willem Buiter’s famous remark that “self-regulation stands in relation to regulation the way self-importance stands in relation to importance”. Case after case, across all sectors, demonstrates that self-regulation is no substitute for consistent rules laid down, monitored and enforced by government.

Crucial public protections have long been derided in the billionaire press as “elf ’n’ safety gone mad”. It’s not hard to see how ruthless businesses can cut costs by cutting corners, and how this gives them an advantage over their more scrupulous competitors.



The “pollution paradox” (those corporations whose practices are most offensive to voters have to spend the most money on politics, with the result that their demands come to dominate political life) ensures that our protections are progressively dismantled by governments courting big donors.

Conservative MPs see Brexit as an excellent opportunity to strip back regulations. The speed with which the “great repeal bill” will have to pass through parliament (assuming that any of Theresa May’s programme can now be implemented) provides unprecedented scope to destroy the protections guaranteed by European regulations. The bill will rely heavily on statutory instruments, which permit far less parliamentary scrutiny than primary legislation. Unnoticed and undebated, crucial elements of public health and safety, workers’ rights and environmental protection could be made to disappear.

Too many times we have seen what the bonfire of regulations, which might sound like common sense when issuing from the mouths of ministers, looks like in the real world. The public protections that governments describe as red tape are what make the difference between a good society and barbarism. It is time to bring the disastrous deregulatory agenda to an end, and put public safety and other basic decencies ahead of corner-cutting and greed.

Saturday 15 April 2017

Telling children 'hard work gets you to the top' is simply a lie

Hashi Mohamed in The Guardian


I know about social mobility: I went to underperforming state schools, and am now a barrister. Could somebody take the same route today? It’s highly unlikely




It is a common promise made to the next generation. “If you work hard, and do the right thing, you will be able to get on in life.” I believe that it is a promise that we have no capacity to fulfil. And that’s because its underlying assumptions must be revisited.

Imagine a life lived in quads. You attend a highly prestigious school in which you dash from one quad to the next for your classes. You then continue to yet another prestigious institution for your tertiary education, say Oxford or Cambridge, and yet more quads with manicured lawns. Then you end up in the oasis of Middle Temple working as a barrister: more manicured lawns and, yes, you guessed it, more quads. You have clearly led a very square and straight life, effortlessly gliding from one world to the next with clear continuity, familiarity and ease.

Now contrast the above oasis with the overcrowded and under-performing schools of inner cities, going home to a bedroom which you share with many other siblings. A home you are likely to vacate when the council can’t house you there anymore. Perhaps a single-parent household where you have caring duties at a young age, or a household where no one works. A difficult neighbourhood where the poverty of ambition is palpable, stable families a rarity, and role models very scarce.


The former trajectory, in some or all of its forms, is much more likely to lend itself to a successful life in Britain. On the latter, you may have the grades and talent, despite the odds, but you still lack the crucial ingredients essential to succeeding. I don’t have to imagine much of this. I have experienced both of these extremes in my short lifetime.

My mother gave birth to 12 children. I arrived in London at the age of nine, speaking practically no English. I attended some of the worst performing schools in inner-city London and was raised exclusively on state benefits. Many years later I was lucky enough to attend Oxford on a full scholarship for my postgraduate degree. Now as a barrister I am a lifetime member of The Honourable Society of Lincoln’s Inn.

Is my route possible for anyone in the next generation with whom I share a similar background? I believe not. And this is not because they are any less able or less dedicated to succeed.

What I have learned in this short period of time is that the pervasive narrative of “if you work hard you will get on” is a complete myth. It’s not true, and we need to stop saying it. “Working hard, and doing the right thing” barely gets you to the starting line, and it means something completely different depending on the context in which it is applied. So much more is required.

I have come to understand that the systems that underpin the top professions in Britain are set up to serve only a certain section of society: those readily identifiable by privileged backgrounds, particular schools and accents. To some this may seem obvious, so writing it may be superfluous. But it wasn’t obvious to me growing up, and it isn’t obvious to many others. The unwritten rules are rarely shared, and drives for “diversity” and “open recruitment”, however well intentioned, have made little if any difference.

Those inside the system then naturally recruit in their own image. This then entrenches the lack of any potential for upward mobility and means that the vast majority are excluded.

As a form of short-term distraction, we are obsessed with elevating token success stories that distort the overall picture: the Somali boy who got a place at Eton, or the girl from the East End who is now going to MIT. These stories may seem inspiring at first blush, but they skew the complex picture that exists in deprived communities. They perpetuate the simple notion that all that’s required is working hard, and that everything afterwards falls neatly into place. This ritual we seem constantly to engage in is therefore as much about setting up false hopes for other children as it is about privileged, middle-class-led institutions making themselves feel good.

The reality is that there are many like them trying hard to do better, but lacking the environment to fully realise their potential. Are they worth less? When they are told to “dream big” and it will happen, who will tell them that failure had nothing to do with their lack of vision – that real success, especially from their starting point, often boils down to a complex combination of circumstances: luck, sustained stability, the right teachers at the right time, and even the absence of grief at crucial, destabilising junctures?

Improving educational attainment is critical, and much progress has been made over the years. But it is not enough. Employers must see hiring youngsters from poorer backgrounds as good for business as well as for a fairer society. Those youngsters must be given a real chance to succeed, in a non-judgmental and inclusive environment. Employers must do more to focus on potential rather than polish. More leadership and more risk-taking are required on this front.

Perversely, class and accent remain an overwhelmingly important way of judging intelligence. In France or Germany, for example, your accent rarely matters: your vocabulary and conjugation will give much more away, but never your accent, except perhaps a regional one. I don’t see this mindset shifting, so my advice to youngsters has remained: you need to adapt. You need to find the right way to speak to different people, at different times, in different contexts. This is not compromising who you are, but rather adapting to the relevant surroundings.

We need to double down on improving the environments, both at home and at school, that continuously constrain potential. If the adage that hard work truly matters is to ring true, then we must do more – at all levels of society – to make it a reality.

Wednesday 4 January 2017

Supreme Court brings Indian cricket into the 21st century

Suresh Menon in The Hindu


The world’s most successful secret society has been given a lesson in transparency and that is cause for celebration.

No tears need be wasted on the panjandrums who have been running the Board of Control for Cricket in India and its State associations like personal fiefdoms.

The Supreme Court finally reeled in the long rope it had given the BCCI, and so tripped up its senior officials. If there was contrition among the officials, it remained unexpressed. Yesterday’s powerhouses will be tomorrow’s forgotten men, their frown and wrinkled lip and sneer of cold command erased forever.

Inevitably, some good men will be thrown out with the bad, and there will be much churning as the old order makes way for the new. The saner elements of the board will wonder if it had to come to this, when, with greater maturity and common sense, the BCCI might have emerged with some dignity.

For the BCCI brought about its own downfall, aided by nothing more than its hubris and cavalier disregard for the laws of the land. You cannot ignore a Supreme Court judgement, as the BCCI did, and hope that nothing will change. It wasn’t just arrogance, it was foolishness of the highest order.

Would past presidents like Chinnaswamy and Sriraman, Gaekwad and Bindra, Dungarpur and Dalmiya have allowed things to come to this pass? It is convenient to believe they wouldn’t. But there is false memory at play here, a harking back to a golden era that never existed. Ghulam Ahmed, former off spinner and board vice-president, put it succinctly, “There are no values in the board.”

The Anurag Thakurs and Ajay Shirkes are paying the price for the culture that men like those mentioned had brought into the BCCI. These men ran the best sports body in the country, and somehow believed that they had a divine right to do so. Players kowtowed to them, politicians and businessmen chased them, and they clung on to power with a touching desperation.

The current dispensation extended that culture and refined it. They, like their predecessors, failed to understand the connection between actions and consequences.

At any time in the BCCI’s eight-decade history, the Supreme Court could have stepped in and ruled as it did now. Accountability and transparency were never in the BCCI’s handbook for officials, but public scrutiny was not as intense as it is now, and in some cases the good that an official did outweighed the bad, and all was forgiven.

Brinkmanship — a tactic much favoured by the BCCI to bring other cricket boards and indeed the International Cricket Council to its knees — is not a strategy guaranteed to impress the Supreme Court. That the highest court gave the BCCI more than six months to comply with its order when it could have acted even as deadlines were ignored is a testimony to its benevolence.

But how did a three-time Member of Parliament, which is what Anurag Thakur is, and sundry other luminaries, misjudge the seriousness of the situation? Was this a proxy war fought on behalf of his political masters by Thakur, or was the board, recognising the inevitable, preparing for a scorched earth response? The first will have to remain in the realm of speculation till a lead actor in the drama spills the beans. We shall soon know about the second.

The BCCI’s death wish has been one of the features of the whole saga. Thakur came in as the bright, young face of the board. There was an energy about him that makes his fall a disappointment. At 42, he was the man who replaced the old guard. Yet, within weeks, the cosy club he had tried to break up when N. Srinivasan was in charge quickly reshaped itself into a new cosy club.

His fall is a cautionary tale for those who set out to change the system but are absorbed by it instead. The Supreme Court’s ruling will also impact other sports bodies which, like the BCCI, have been resisting change. And that is good news for Indian sport.

The domestic season has been unaffected by the BCCI’s problems. This has traditionally been the case, and it is one of the true blessings of Indian cricket. There are enough dedicated officials to ensure that the show goes on.

A generational change has been forced upon the BCCI, which is otherwise happy to continue with sons and nephews (never daughters and nieces) and other relatives keeping everything in the family.

Now State associations will have to change their registrations where necessary, holding general body meetings in order to advance this. Legal procedures need to be followed. There is a temptation to believe that cricketers make the best administrators. This is a common fallacy. There are cricketers who have made excellent administrators, but being able to play the square cut is no guarantee of managerial skills. The names of corrupt cricketer-officials are well known.

There is a long road ahead, mostly uncharted. But a start has been made. The new system may not be perfect, but it is better than the old one. Accountability ensures that.

Tuesday 3 March 2015

We’re desperate to believe in something. But bringing God into economics is risky

Eliza Filby in The Guardian

With just over two months to go until polling day, it is becoming clear that the most interesting ideas are emanating from those not seeking election. The Anglican bishops have issued a pastoral letter which, despite being mauled by leading Conservatives, legitimately aims to move the debate beyond the old market-v-state model towards a new vision, one that incorporates themes of civil society, interdependency, human dignity and the common good.

Meanwhile, leading Conservative thinkers Tim Montgomerie and Stephan Shakespeare have launched their “good right” initiative, which hopes to succeed where David Cameron has so obviously failed: to detoxify the Conservative brand. Making the Tories electable again is certainly the aim, but at its core is an even more ambitious endeavour: to re-establish the moral credibility of the free market. To this chorus of extra-parliamentary voices we might also add “blue Labour” Maurice Glasman and “red Tory” Phillip Blond and, for that matter, Russell Brand. Even if their ideas are unlikely to feature in forthcoming party manifestos, a movement is clearly afoot. This disparate group may differ on the remedy but share a diagnosis: the neoliberal revolution is politically and morally defunct. One way or another, they are all dancing on Thatcher’s grave.

But to those seeking a new moral vision for Britain, Thatcherism itself offers a cautionary tale. It was, much like now, a response to widespread disillusionment and a redundant political consensus. Like the “good right”, Conservatives in the 70s also sought to disconnect the association of collectivism with virtue and reinstate the moral integrity of the “invisible hand”. Margaret Thatcher would eventually cast herself as the shepherd leading the British people out of the dark days of decline towards the path of economic and social enlightenment. Ultimately, however, it was a story of false idols and unintended consequences – one where the mix of God, economics and single-minded vision proved to be toxic. The paradox of Thatcherism is that, like all political ideologies, there was a complete discrepancy between its aims and outcomes.

“Economics is the method; the object is to change the soul,” Margaret Thatcher declared in 1981, revealing the way in which Thatcherism for her was always about transforming values rather than simply GDP. A strong religious basis to her outlook stemmed from her father – the greengrocer, councillor and Wesleyan lay preacher, Alf Roberts.

If we were sourcing the origins of Thatcherism, we would find them not in the pages of Hayek’s Road to Serfdom or Milton Friedman’s monetarist theory but in Roberts’ sermon notes, now housed in Thatcher’s personal archive at Churchill College, Cambridge. Contained in them is the theological basis of Thatcherism: an individualistic interpretation of the Bible, a nod to the spiritual dangers of avarice, the Protestant work ethic, praise of the godly virtues of thrift and self-reliance and, finally, a divine justification for individual liberty and the free market. In short, Thatcherism always owed more to Methodism than to monetarism.

Thatcher herself had been a preacher before she entered politics, and even though she transferred this missionary energy from pulpit to podium, her religious values remained crucial. On becoming Conservative leader, she saw it as her chief mission to discredit the assumed moral superiority of socialism and reconnect the broken link between Protestant and capitalist values in Britain. Preaching from the pulpit on several occasions – most famously to the Church of Scotland’s General Assembly in 1988 – Thatcher unashamedly asserted the Biblical case for the sovereignty of individual liberty and the ‘invisible hand’. Thatcher’s pledge, of course, was that greater wealth would not encourage selfishness but neighbourliness. With more money in our pocket and less dependency on the state, we would be free to exercise our moral virtue and perform our duty as Good Samaritans.

We would not walk by on the other side, nor would we need state-imposed traffic lights to guide us there.

In the end, though, even she was prepared to admit she had failed in her crusade. When asked by Frank Field what her greatest regret in office was, she replied: “I cut taxes and I thought we would get a giving society, and we haven’t.” She was right. A survey conducted by the Charities Aid Foundation in 1989 revealed that those who gave the most to causes were not from the prosperous south but were disproportionately located in those areas that benefited least from the Thatcher boom.



Thatcher’s naivety was perhaps her greatest flaw: her understanding of capitalism, for example, was provincial rather than global; the image of market transaction in her mind was Alf Roberts behind the counter of his grocery shop rather than the yuppie on the trading floor. It is little wonder, then, that she could not understand the world she had created, in which the nation’s homes and household budgets were entwined with a global financial services sector that made up an ever-growing percentage of Britain’s GDP – largely internationally owned, in the hands of speculators concerned with short-term gain, and distant from the deals and lives they were gambling on. In private, Thatcher used to rage against bankers and their bonuses. Why, she would cry, did they not follow the example of the army, which in her view was the model demonstration of responsibility to one’s fellow man?

Reared in a home where profligacy was a vice and thrift a virtue, Thatcher also could not fathom why so many Britons struggled with debt. Yet paradoxically it was her government that did most to encourage it. What might be termed the “democratisation of debt” – in the form of credit and store cards, personal loans and, of course, mortgages – fundamentally reordered the nation’s psyche and our attitudes towards money and the state. In short, we transferred our dependency from the government to the money-lenders. The notion of deferred gratification, or thrift – saving for something before consuming it – became an alien concept in Britain’s “grab now, pay later” society. Total consumer credit more than doubled, and the number of credit cards nearly tripled, in the 1980s; both would spiral to unimaginable levels over the next two decades. The credit culture also trickled down the social scale: as the government squeezed the benefits system, low-income households turned to credit companies that asked few questions. In 1980, 22% of households were using credit; by 1989 that had more than trebled to 69%, with an estimated 50% of those loans going on essentials. As the New Economics Foundation’s 2002 report into debt recognised, this led to the absurd situation whereby “what the taxpayer was providing in terms of benefits, the lender was often taking away – with interest”. It is doubtful that even Thatcher considered Britain’s record personal debt part of her plan for “setting the people free”.

Thatcherism laid the foundations for a culture in which individualism and self-reliance could thrive, but ultimately it created a culture in which only selfishness and excess were rewarded. Thatcher liked to quote John Wesley’s mantra, “Earn all you can, save all you can and give all you can,” and yet it was only ever the first instruction that was sufficiently encouraged. While Cameron and Osborne have spoken at length about paying off the ‘nation’s credit card’, they have consciously avoided entreating individuals to pay off their own. Tellingly, it is now a vote-winner to talk of governmental thrift but political suicide to talk of personal thrift. That is the true legacy of Thatcherite economics.

When Thatcher uttered those now immortal words that there was “no such thing as society”, it was not a negative or flippant statement but a naive rallying cry for individual moral responsibility. Perhaps the flaw in her thinking was not that she did not believe in society but that she had too much faith in man.

Thatcher seemed to have forgotten the key doctrine in both Conservative philosophy and the Bible: the Fall. Thatcherism was a challenge to individual moral virtue, yet in Thatcher’s Eden, when given the choice, we – of course – ate the fruit. Where critics tend to go wrong in their assessment of Thatcher is that they do not consider that there was any moral, rather than merely economic, thinking behind it; where Thatcher’s admirers go wrong is that they do not admit that there was a fundamental discrepancy between her aims and outcomes.

It is, of course, wrong to heap all the blame on Thatcher. This culture was encouraged and this behaviour continued unabated under New Labour. Much like a gangster’s wife who enjoys the lifestyle but does not question how her husband gets his money, Blair and Brown were content to pocket a significant share of the profits to fund their schools and hospitals.

By 2008 the world seemed on the precipice of something fundamental, but one of the remarkable features of the last seven years is how little has changed. Perhaps Thatcher’s great mistake was that, as Alfred Sherman said, “she saw life in primary colours”.

So there is credibility and value in dreaming up an alternative where Thatcher insisted that there was none. Given the contemporary disillusionment with capitalism, voters are still in desperate need of something to believe in. What the neoliberal experiment of the last 30 years teaches us is not that religion and politics do not mix, but that the politics of certainty is where danger lies.

Sunday 8 February 2015

Depression, suicide and the fragility of the strong, silent male

Yvonne Roberts in The Guardian
On Thursday, the bruised and tearful face of former footballer and chairman of the Professional Footballers’ Association, Clarke Carlisle, 35, appeared on the front page of the Sun. He was released from psychiatric hospital two weeks ago. In a clip on the paper’s website, he appears so raw and vulnerable that to watch it provokes thoughts of a modern-day version of Bedlam with us as Hogarthian gawpers treating the mentally fragile as entertainment.
The paper’s headline read: “I leapt in front of a lorry hoping to die.” Carlisle, a father of three, has suffered from depression for 18 months. He explained that the end of his career, the curtailment of his contract as a TV sports pundit and a struggle with alcohol led to financial problems. He felt the lack of “a sense of worth and value in life”.
He said strangers would comment: “Didn’t you used to be Clarke Carlisle?”, as if, once off the television screen and football pitch, he had passed into no-man’s-land. Throwing himself in front of a lorry became the “perfect answer”. Carlisle survived, unlike the 12 men who will kill themselves in England and Wales today, as 12 do every day.
Just before his death, the psychiatrist Anthony Clare wrote a thoughtful book, On Men: Masculinity in Crisis. He concluded with a plea to men to place “a greater value on love, family and personal relationships and less on power, possessions and achievement… to find meaning and fulfilment”.
Except that redefining what it means to be a man in contemporary society isn’t a job for men alone. It’s a dynamic process of cultural and social change that repeatedly judders to a halt. And it will continue to be impeded for a variety of reasons (better the stereotype you know) and as long as some women hold fast to a hierarchy of need.
This is the kind of thinking that says: if male fragility is addressed, women’s needs are marginalised. Men may hog resources, but the two sets of needs are interlocked. Until male violence can be defused, for instance, women’s refuges will continue to overflow.
In the main, support for Carlisle’s honesty has been strong, as it has been for Nick Baber, 48, chief operating officer at KPMG, who said last week that during bouts of severe depression he would pretend he had flu. He has called for more senior executives to speak out. But then what? As Dr Margaret McCartney explains in The Patient Paradox, the severely depressed are too ill to make plans to end their life; it is when a patient begins to recover that suicide becomes an option, particularly if the patient is male. Thoreau wrote: “The mass of men lead lives of quiet desperation.” Talk to parents from Papyrus, the charity that campaigns to prevent young suicide, and again and again they say they had no idea that their sons were depressed, let alone suicidal. Their sons, they felt, had so much to live for.
According to the charity Campaign Against Living Miserably (Calm), men account for more than three-quarters of all suicides in England and Wales – 4,590 deaths, the single biggest cause of death among males under 50. Three out of four had no contact with mental health professionals. As the Men’s Health Forum constantly points out, men are reluctant to go to their GP and fail to identify their own symptoms of depression. When Carlisle’s wife, Gemma, was diagnosed with postnatal depression, he advised her to “get a grip”; then he took Goldberg’s depression test and recognised his own symptoms. They include lack of energy, sadness, negativity and self-destructiveness. A survey by Calm revealed that 69% of men said they preferred to deal with problems themselves, and 56% didn’t want to burden others. “The traditional strong, silent response to adversity is increasingly failing to protect men from themselves,” said Jane Powell, Calm’s chief executive.
Last year, the charity issued a much-needed four-point charter to encourage change for the better. It includes a shift in thinking about the needs of males in schools, work and public services and a fuller range of expression of masculinity in the media and advertising. Too often, still, while depression in women is wrongly viewed as an inevitable part of being female, it’s precisely this alleged association with female fragility that underscores the notion that the male sufferer is less of a man; he has a weakness, not an illness best kept secret. So, as the suicide rate has risen, the taboos and social “norms” stay in place.
Change, however, is possible. Last month, a new policy on suicide prevention was launched: the Stop Suicide pledge. It is based on the work of Dr Ed Coffey in Detroit, which enrols as many members of the public as possible with the aim of ending the stigma and the secrecy. In four years, Detroit’s suicide rate dropped by 75%.
The UK “zero suicide” pilots ask the whole community to look out for each other, recognise warning signs and offer help, not exclusion. The pledge, with a badge, is, “I’d ask”. (Although what you ask is trickier. “Is everything OK?” is bound to get a positive response in a well-trained man.)
The New Economics Foundation says the five foundation stones of wellbeing are: connect, be active, take notice, keep learning and give. The female sphere, even when it involves working 10 hours a day as well as mothering and acting as a carer, has all those aspects woven into it (and, paradoxically, at extremes can be the cause of female depression and breakdown). The male stereotypes of protector, provider, toughie and top dog shove wellbeing well down the list.
Kurt Cobain, desperately in need of help for years, in his poignant suicide note to Boddah, his imaginary childhood friend, quoted a Neil Young song: “… better to burn out than fade away…” The tragedy for too many men is that society doesn’t yet allow them to let down their guard so they can value and enjoy the infinity of choices that lie between those two extremes.

Wednesday 15 October 2014

The age of loneliness is killing us


George Monbiot in The Guardian

For the most social of creatures, the mammalian bee, there’s no such thing now as society. This will be our downfall

What do we call this time? It’s not the information age: the collapse of popular education movements left a void filled by marketing and conspiracy theories. Like the stone age, iron age and space age, the digital age says plenty about our artefacts but little about society. The anthropocene, in which humans exert a major impact on the biosphere, fails to distinguish this century from the previous 20. What clear social change marks out our time from those that precede it? To me it’s obvious. This is the Age of Loneliness.
When Thomas Hobbes claimed that in the state of nature, before authority arose to keep us in check, we were engaged in a war “of every man against every man”, he could not have been more wrong. We were social creatures from the start, mammalian bees, who depended entirely on each other. The hominins of east Africa could not have survived one night alone. We are shaped, to a greater extent than almost any other species, by contact with others. The age we are entering, in which we exist apart, is unlike any that has gone before.
Three months ago we read that loneliness has become an epidemic among young adults. Now we learn that it is just as great an affliction of older people. A study by Independent Age shows that severe loneliness in England blights the lives of 700,000 men and 1.1m women over 50, and is rising with astonishing speed.
Ebola is unlikely ever to kill as many people as this disease strikes down. Social isolation is as potent a cause of early death as smoking 15 cigarettes a day; loneliness, research suggests, is twice as deadly as obesity. Dementia, high blood pressure, alcoholism and accidents – all these, like depression, paranoia, anxiety and suicide, become more prevalent when connections are cut. We cannot cope alone.
Yes, factories have closed, people travel by car instead of buses, use YouTube rather than the cinema. But these shifts alone fail to explain the speed of our social collapse. These structural changes have been accompanied by a life-denying ideology, which enforces and celebrates our social isolation. The war of every man against every man – competition and individualism, in other words – is the religion of our time, justified by a mythology of lone rangers, sole traders, self-starters, self-made men and women, going it alone. For the most social of creatures, who cannot prosper without love, there is no such thing as society, only heroic individualism. What counts is to win. The rest is collateral damage.
British children no longer aspire to be train drivers or nurses – more than a fifth say they “just want to be rich”: wealth and fame are the sole ambitions of 40% of those surveyed. A government study in June revealed that Britain is the loneliness capital of Europe. We are less likely than other Europeans to have close friends or to know our neighbours. Who can be surprised, when everywhere we are urged to fight like stray dogs over a dustbin?
We have changed our language to reflect this shift. Our most cutting insult is loser. We no longer talk about people. Now we call them individuals. So pervasive has this alienating, atomising term become that even the charities fighting loneliness use it to describe the bipedal entities formerly known as human beings. We can scarcely complete a sentence without getting personal. Personally speaking (to distinguish myself from a ventriloquist’s dummy), I prefer personal friends to the impersonal variety and personal belongings to the kind that don’t belong to me. Though that’s just my personal preference, otherwise known as my preference.
One of the tragic outcomes of loneliness is that people turn to their televisions for consolation: two-fifths of older people report that the one-eyed god is their principal company. This self-medication aggravates the disease. Research by economists at the University of Milan suggests that television helps to drive competitive aspiration. It strongly reinforces the income-happiness paradox: the fact that, as national incomes rise, happiness does not rise with them.
Aspiration, which increases with income, ensures that the point of arrival, of sustained satisfaction, retreats before us. The researchers found that those who watch a lot of TV derive less satisfaction from a given level of income than those who watch only a little. TV speeds up the hedonic treadmill, forcing us to strive even harder to sustain the same level of satisfaction. You have only to think of the wall-to-wall auctions on daytime TV, Dragons’ Den, The Apprentice and the myriad forms of career-making competition the medium celebrates, the generalised obsession with fame and wealth, the pervasive sense, in watching it, that life is somewhere other than where you are, to see why this might be.
So what’s the point? What do we gain from this war of all against all? Competition drives growth, but growth no longer makes us wealthier. Figures published this week show that, while the income of company directors has risen by more than a fifth, wages for the workforce as a whole have fallen in real terms over the past year. The bosses earn – sorry, I mean take – 120 times more than the average full-time worker. (In 2000, it was 47 times). And even if competition did make us richer, it would make us no happier, as the satisfaction derived from a rise in income would be undermined by the aspirational impacts of competition.
The top 1% own 48% of global wealth, but even they aren’t happy. A survey by Boston College of people with an average net worth of $78m found that they too were assailed by anxiety, dissatisfaction and loneliness. Many of them reported feeling financially insecure: to reach safe ground, they believed, they would need, on average, about 25% more money. (And if they got it? They’d doubtless need another 25%). One respondent said he wouldn’t get there until he had $1bn in the bank.
For this, we have ripped the natural world apart, degraded our conditions of life, surrendered our freedoms and prospects of contentment to a compulsive, atomising, joyless hedonism, in which, having consumed all else, we start to prey upon ourselves. For this, we have destroyed the essence of humanity: our connectedness.
Yes, there are palliatives, clever and delightful schemes like Men in Sheds and Walking Football developed by charities for isolated older people. But if we are to break this cycle and come together once more, we must confront the world-eating, flesh-eating system into which we have been forced.
Hobbes’s pre-social condition was a myth. But we are entering a post-social condition our ancestors would have believed impossible. Our lives are becoming nasty, brutish and long.

Saturday 10 May 2014

University economics teaching isn't an education: it's a £9,000 lobotomy


Economics took a battering after the financial crisis, but faculties are refusing to teach alternative views. It's as if there's only one way to run an economy

"I don't care who writes a nation's laws – or crafts its treatises – if I can write its economics textbooks," said Paul Samuelson. The Nobel prizewinner grasped that what was true of gadgets was also true for economies: he who produces the instruction manual defines how the object will be used, and to what ends.
Samuelson's axiom held good until the collapse of Lehman Brothers, which triggered both an economic crisis and a crisis in economics. In the six years since, the reputations of those high priests of capitalism, academic economists, have taken a battering.
The Queen herself asked why hardly any of them saw the crash coming, while the Bank of England's Andy Haldane has noted how it rendered his colleagues' enchantingly neat models as good as useless: "The economy in crisis behaved more like slime descending a warehouse wall than Newton's pendulum." And this week, economics students from Kolkata to Manchester have gone on the warpath demanding radical changes in what they're taught.
In a manifesto signed by 42 university economics associations from 19 countries, the students decry a "dramatic narrowing of the curriculum" that presents the economy "in a vacuum". The result is that the generation next in line to run our economy, from Whitehall departments or corporate corner-offices, discuss policy without touching on "broader social impacts and moral implications of economic decisions".
The problem is summed up by one of the manifesto's coordinators, Faheem Rokadiya, at the University of Glasgow: "Whenever I sit an economics exam, I have to turn myself into a robot." But he and his fellow reformers aren't seeking to skimp on algebra, or calling for a bonfire of the works of the Chicago school. They simply object to the notion that there is one true way to do economics, especially after that apparently scientific method has been found so badly wanting.
In their battle to open up economics, Rokadiya et al have one hell of a fight on their hands, for the same reason that it has proved so hard to democratise so many aspects of the post-crash order: the forces of conservatism are just too powerful. To see how fiercely the academics fight back, take a look at the University of Manchester.
Since last autumn, members of the university's Post-Crash Economics Society have been campaigning for reform of their narrow syllabus. They've put on their own lectures from non-mainstream, heterodox economists, even organising evening classes on bubbles, panics and crashes. You might think academics would be delighted to see such undergraduate engagement, or that economists would be swift to respond to the market.
Not a bit of it. Manchester's economics faculty recently announced that it wouldn't renew the contract of the temporary lecturer of the bubbles course, and that students who wanted to learn about the crash would have to go to the business school.
The most significant economics event of our lifetime isn't being taught in any depth at one of the largest economics faculties in the country. So what exactly is a Russell Group university teaching our future economists? Last month the Post-Crash members published a report on the deficiencies of the teaching they receive. It is thorough and thoughtful, and reports: "Tutorials consist of copying problem sets off the board rather than discussing economic ideas, and 18 out of 48 modules have 50% or more marks given by multiple choice." Students point out that they are trained to digest economic theory and regurgitate it in exams, but never to question the assumptions that underpin it. This isn't an education: it's a nine-grand lobotomy.
The Manchester example is part of a much broader trend in which non-mainstream economists have been evicted from economics faculties and now hole up in geography departments or business schools. "Intellectual talibanisation" is how one renowned economist describes it in private. This isn't just bad for academia: the logical extension of the argument that you can only study economics in one way is that you can only run the economy in one way.
Mainstream economics still has debates, but they tend to be technical in nature. The Nobel prizewinner Paul Krugman has pointed to the recent work of Thomas Piketty as proof that mainstream economics is plenty wide-ranging enough. Yet when Piketty visited the Guardian last week, he complained that economists generate "sophisticated models with very little or no empirical basis … there's a lot of ideology and self-interest".
Like so many other parts of the post-crash order, mainstream economists are liberal in theory but can be authoritarian in practice. The reason for that is brilliantly summed up by that non-economist Upton Sinclair: "It is difficult to get a man to understand something when his salary depends on his not understanding it."